File _service:obs_scm:iteration_utilities-0.12.1.obscpio of Package python-iteration_utilities
==============================================================================================
.github/CONTRIBUTING.rst
------------------------

Hello potential contributor
===========================

.. contents:: Table of Contents

Questions
---------

- If it's about ``iteration_utilities``: open a new issue and ask.
- If it's not, then look for another place to ask. :-)

Bug reports
-----------

- Make sure it's a bug and that it's a bug in ``iteration_utilities``.
- Open a new issue.
- Always include a *minimal* and *standalone* example that shows the bug.
- Include a description of why you think this is a bug.

Feature requests
----------------

- Open a new issue.
- Explain what you need.
- Explain why you want this and whether it could be relevant for other packages.
- Explain why ``iteration_utilities`` is the right package for this feature.

Implementing a new function
---------------------------

- Create a fork.
- Create a new branch. **Never** use your master branch for a pull request.
- Write the function.
- Don't forget to include it in the ``__all__`` list!
- Write a docstring for the function. Do not forget to include interesting examples.
- Write tests for the function. 100% coverage is a requirement.
- If the new function is a C function, include tests for potential memory leaks!
- Include a link in the appropriate section of the narrative documentation.
- Make a pull request.

Modifying existing functions
----------------------------

- Create a fork.
- Create a new branch. **Never** use your master branch for a pull request.
- Modify the function.
- Update the docstring if necessary.
- Include tests for new code parts. 100% coverage is a requirement.
- Make a pull request.

Fixing a bug
------------

- Create a fork.
- Create a new branch. **Never** use your master branch for a pull request.
- Fix the bug.
- Include a regression test.
- Make a pull request.

Modifying the documentation
---------------------------

- Create a fork.
- Create a new branch. **Never** use your master branch for a pull request.
- Fix the documentation/docstring.
- Make a pull request.

.github/ISSUE_TEMPLATE.rst
--------------------------

Describe the problem/question.

```
Minimal example.
```

Explain what happened and how that differed from what you expected.

.github/PULL_REQUEST_TEMPLATE.rst
---------------------------------

General:

- [ ] Fixes #issuenumber (if someone opened an issue requesting the update)
- [ ] Supersedes #issuenumber - optional: only if this pull request replaces another
- [ ] Description of the PR
- [ ] If this is your first PR for this package, did you include yourself in the "docs/Authors.rst" file (it should be sorted lexically by last name)?

The following shows some rough checklists depending on which kind of pull request this is:

For bugfixes:

- [ ] Did you add a regression test?
- [ ] Did you add a changelog entry in "docs/CHANGES.rst"?

For new functions/classes:

- [ ] Did you include the file and function/class in the C module setup?
- [ ] Did you include it in the ``__all__`` of the Python module?
- [ ] Did you include a meaningful docstring with Parameters, Returns and Examples?
- [ ] Does the docstring contain a ``.. versionadded:: {version}`` directive?
- [ ] Did you include tests to ensure 100% coverage (except for memory errors)?
- [ ] Did you add the new function/class to the narrative documentation?
- [ ] Did you add it to the appropriate `Iterable` class as a method? If not, why?
- [ ] Did you add a changelog entry in "docs/CHANGES.rst"?
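To make the checklist for new functions concrete, here is a minimal pure-Python sketch of the shape such an addition takes. The function name ``first_or_none`` and its behavior are hypothetical illustrations, not part of the package:

```python
# Hypothetical module sketch illustrating the "new functions/classes" checklist:
# a name in __all__, a docstring with Parameters/Returns/Examples, and a
# versionadded directive. The function itself is an invented example.

__all__ = ["first_or_none"]  # new names must be listed in __all__


def first_or_none(iterable, pred=None):
    """Return the first item of `iterable` matching `pred`, or None.

    Parameters
    ----------
    iterable : iterable
        The iterable to search.
    pred : callable or None, optional
        A predicate; if None, the first item is returned.

    Returns
    -------
    item : any type
        The first (matching) item, or None if there is none.

    Examples
    --------
    >>> first_or_none([1, 2, 3], pred=lambda x: x % 2 == 0)
    2

    .. versionadded:: {version}
    """
    for item in iterable:
        if pred is None or pred(item):
            return item
    return None
```

A real submission would additionally need tests giving 100% coverage and a link from the narrative documentation, as the checklist states.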
For new options in existing functions/classes:

- [ ] Did you explain the new functionality in the existing docstring, with examples?
- [ ] Does the docstring contain a ``.. versionchanged:: {version}`` directive?
- [ ] Did you include tests to ensure 100% coverage (except for memory errors)?
- [ ] Did you also change the appropriate `Iterable` method? If not, why?
- [ ] Did you add a changelog entry in "docs/CHANGES.rst"?

For documentation updates:

- [ ] If this is your first PR for this package, did you include yourself in the "docs/Authors.rst" file (it should be sorted lexically by last name)?

.github/workflows/cpython-benchmarks.yml
----------------------------------------

name: Benchmark (CPython)

on:
  pull_request:
    branches:
      - '*'
  push:
    branches:
      - master

jobs:
  run-benchmarks:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install dependencies
        run: |
          python -m pip install pip setuptools wheel --upgrade
      - name: Install package
        run: |
          python -m pip install .
      - name: Install benchmark dependencies
        run: |
          python -m pip install simple_benchmark[optional]
      - name: Install other packages dependencies
        run: |
          python -m pip install cython
      - name: Install other packages
        run: |
          python -m pip install more-itertools toolz cytoolz pydash
      - name: Run benchmarks
        run: |
          python ./ci/collect_benchmarks.py
      - name: Upload results
        uses: actions/upload-artifact@v4
        with:
          name: py3.12-benchmarks
          path: ./.benchmark_results/

.github/workflows/cpython-coverage.yml
--------------------------------------

name: Coverage (CPython)

on:
  pull_request:
    branches:
      - '*'
  push:
    branches:
      - master

jobs:
  coverage:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    strategy:
      matrix:
        python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install pip setuptools --upgrade
      - name: Install package
        run: |
          CFLAGS="-coverage" python -m pip install .
      - name: Install test dependencies
        run: |
          python -m pip install pytest pytest-cov
      - name: Run tests
        run: |
          python -m pytest tests/ --cov=iteration_utilities --cov-report=xml --cov-config=./pyproject.toml
      - name: Upload Coverage report
        # It would probably be better to use the codecov-action but that's very slow:
        # https://github.com/codecov/codecov-action/issues/21
        env:
          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
        run: |
          curl -s https://codecov.io/bash | bash -s -- -t $CODECOV_TOKEN -F unittests -n ubuntu-${{ matrix.python-version }}

.github/workflows/cpython.yml
-----------------------------

name: Python (CPython)

on:
  pull_request:
    branches:
      - '*'
  push:
    branches:
      - master

jobs:
  build-clang:
    runs-on: macos-latest
    timeout-minutes: 10
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install dependencies
        run: |
          python -m pip install pip setuptools --upgrade
      - name: Install package
        run: |
          CC=clang python -m pip install . --no-deps -vv
      - name: Import package
        run: |
          python -c "import iteration_utilities"
      - name: Install test dependencies
        run: |
          python -m pip install pytest
      - name: Run tests
        run: |
          python -m pytest tests/

  build-python-with-debug:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Download Python 3.12
        run: |
          wget https://www.python.org/ftp/python/3.12.2/Python-3.12.2.tgz -q
          python3 ci/verify_checksum.py Python-3.12.2.tgz 4e64a004f8ad9af1a75607cfd0d5a8c8
      - name: Install Python 3.12
        run: |
          tar xzf Python-3.12.2.tgz
          cd Python-3.12.2
          ./configure --with-pydebug
          sudo make altinstall -s -j2
      - name: Remove download
        run: |
          sudo python3.12 -c "import os; os.remove('./Python-3.12.2.tgz'); import shutil; shutil.rmtree('./Python-3.12.2/')"
      - name: Install dependencies
        run: |
          python3.12 -m pip install pip setuptools wheel --upgrade --user --no-warn-script-location
      - name: Create wheel
        run: |
          python3.12 -m pip wheel . --no-deps --wheel-dir=./wheelhouse/ -vv
      - name: Install package
        run: |
          python3.12 -m pip install iteration_utilities --no-index --find-links=./wheelhouse/ --user -vv
      - name: Import package
        run: |
          python3.12 -c "import iteration_utilities"
      - name: Install test dependencies
        run: |
          python3.12 -m pip install pytest --user --no-warn-script-location
      - name: Run tests
        run: |
          python3.12 -m pytest tests/

  build-sdist:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install dependencies
        run: |
          python -m pip install pip setuptools wheel build --upgrade
      - name: Create source distribution
        run: |
          python -m build
      - name: Delete wheel
        run: |
          rm ./dist/*.whl
      - name: Install package
        run: |
          python -m pip install ./dist/iteration_utilities-0.12.1.tar.gz -vv
      - name: Import package
        run: |
          python -c "import iteration_utilities"
      - name: Install test dependencies
        run: |
          python -m pip install pytest
      - name: Run tests
        run: |
          python -m pytest tests/
      - name: Upload dist
        uses: actions/upload-artifact@v4
        with:
          name: py_sdist
          path: ./dist/

  build-wheels:
    name: Build wheels on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest, macos-latest]
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install cibuildwheel
        run: python -m pip install cibuildwheel==2.16.5
      - name: Build wheels
        run: python -m cibuildwheel --output-dir wheelhouse
        env:
          CIBW_ARCHS_MACOS: x86_64 arm64
          CIBW_TEST_COMMAND: >
            python -c "import iteration_utilities" &&
            pytest {package}/tests
          CIBW_TEST_REQUIRES: pytest
      - uses: actions/upload-artifact@v4
        with:
          name: wheels-${{ matrix.os }}
          path: ./wheelhouse/*.whl

  build-docs:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install dependencies
        run: |
          python -m pip install pip setuptools --upgrade
      - name: Install package
        run: |
          python -m pip install . --no-deps -vv
      - name: Install doc dependencies
        run: |
          python -m pip install sphinx numpydoc
      - name: Build doc
        run: |
          sphinx-build -b html -W -a -n docs/ build/sphinx/html/
      - name: Upload documentation
        uses: actions/upload-artifact@v4
        with:
          name: docs
          path: ./build/sphinx/html/

.github/workflows/pypy.yml
--------------------------

name: Python (Pypy)

on:
  pull_request:
    branches:
      - '*'
  push:
    branches:
      - master

jobs:
  build-pypy-7-3-13:
    runs-on: ${{ matrix.os }}
    timeout-minutes: 10
    strategy:
      matrix:
        os: [ubuntu-latest]
    steps:
      - name: Check out source
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Test installation (Ubuntu)
        if: matrix.os == 'ubuntu-20.04'
        run: |
          python3 -c "import urllib.request; urllib.request.urlretrieve('https://downloads.python.org/pypy/pypy3.10-v7.3.15-linux64.tar.bz2', './pypy.tar.bz2')"
          python3 -c "import tarfile; tar = tarfile.open('./pypy.tar.bz2', 'r:bz2'); tar.extractall('.'); tar.close()"
          ./pypy3.10-v7.3.15-linux64/bin/pypy3 -m ensurepip
          ./pypy3.10-v7.3.15-linux64/bin/pypy3 -m pip install pip setuptools --upgrade
          ./pypy3.10-v7.3.15-linux64/bin/pypy3 -m pip install . --no-deps -vv
          ./pypy3.10-v7.3.15-linux64/bin/pypy3 -c "import iteration_utilities"
          ./pypy3.10-v7.3.15-linux64/bin/pypy3 -m pip install pytest
          ./pypy3.10-v7.3.15-linux64/bin/pypy3 -m pytest tests

.gitignore
----------

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# PyCharm-stuff
.idea/

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# IPython Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# dotenv
.env

# virtualenv
venv/
ENV/

# Spyder project settings
.spyderproject

# Rope project settings
.ropeproject

# Visual Studio
.vs
.vscode

# pytest
.pytest_cache

# mypy
.mypy_cache

# Benchmarks
.benchmark_results

.readthedocs.yaml
-----------------

# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"

# Build documentation in the "docs/" directory with Sphinx
sphinx:
  configuration: docs/conf.py

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
#   - pdf
#   - epub

# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - method: pip
      path: .
      extra_requirements:
        - documentation

LICENSE.txt
-----------

Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the 
Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. 
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. 
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "{}" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.

Copyright 2016 Michael Seifert

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

MANIFEST.in
-----------

# Legal stuff
include README.rst
include LICENSE.txt
recursive-include licenses *

# Source code
graft src

# Source code
graft tests

global-exclude *.py[cod] __pycache__/* *.so *.dylib

README.rst
----------

Iteration utilities
-------------------

Utilities based on Python's iterators and generators.
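As a flavor of what these utilities do, here is a rough pure-Python sketch of one of them, ``unique_everseen``, based on the well-known itertools recipe. The package itself implements this in C; the sketch below is an illustration, not the package's actual code:

```python
# Pure-Python sketch of unique_everseen (itertools-recipe style).
# iteration_utilities provides a C implementation with more options.

def unique_everseen(iterable, key=None):
    """Yield unique elements, preserving order; remember all elements ever seen."""
    seen = set()
    for element in iterable:
        k = element if key is None else key(element)
        if k not in seen:
            seen.add(k)
            yield element

print(list(unique_everseen("AAAABBBCCDAABBB")))
# -> ['A', 'B', 'C', 'D']
```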
The full list of functions and classes included in this package:

- `accumulate <https://iteration-utilities.readthedocs.io/en/latest/generated/accumulate.html>`_
- `all_distinct <https://iteration-utilities.readthedocs.io/en/latest/generated/all_distinct.html>`_
- `all_equal <https://iteration-utilities.readthedocs.io/en/latest/generated/all_equal.html>`_
- `all_isinstance <https://iteration-utilities.readthedocs.io/en/latest/generated/all_isinstance.html>`_
- `all_monotone <https://iteration-utilities.readthedocs.io/en/latest/generated/all_monotone.html>`_
- `always_iterable <https://iteration-utilities.readthedocs.io/en/latest/generated/always_iterable.html>`_
- `any_isinstance <https://iteration-utilities.readthedocs.io/en/latest/generated/any_isinstance.html>`_
- `applyfunc <https://iteration-utilities.readthedocs.io/en/latest/generated/applyfunc.html>`_
- `argmax <https://iteration-utilities.readthedocs.io/en/latest/generated/argmax.html>`_
- `argmin <https://iteration-utilities.readthedocs.io/en/latest/generated/argmin.html>`_
- `argsorted <https://iteration-utilities.readthedocs.io/en/latest/generated/argsorted.html>`_
- `chained <https://iteration-utilities.readthedocs.io/en/latest/generated/chained.html>`_
- `clamp <https://iteration-utilities.readthedocs.io/en/latest/generated/clamp.html>`_
- `combinations_from_relations <https://iteration-utilities.readthedocs.io/en/latest/generated/combinations_from_relations.html>`_
- `complement <https://iteration-utilities.readthedocs.io/en/latest/generated/complement.html>`_
- `constant <https://iteration-utilities.readthedocs.io/en/latest/generated/constant.html>`_
- `consume <https://iteration-utilities.readthedocs.io/en/latest/generated/consume.html>`_
- `count_items <https://iteration-utilities.readthedocs.io/en/latest/generated/count_items.html>`_
- `deepflatten <https://iteration-utilities.readthedocs.io/en/latest/generated/deepflatten.html>`_
- `dotproduct <https://iteration-utilities.readthedocs.io/en/latest/generated/dotproduct.html>`_
- `double <https://iteration-utilities.readthedocs.io/en/latest/generated/double.html>`_
- `duplicates <https://iteration-utilities.readthedocs.io/en/latest/generated/duplicates.html>`_
- `empty <https://iteration-utilities.readthedocs.io/en/latest/generated/empty.html>`_
- `first <https://iteration-utilities.readthedocs.io/en/latest/generated/first.html>`_
- `flatten <https://iteration-utilities.readthedocs.io/en/latest/generated/flatten.html>`_
- `flip <https://iteration-utilities.readthedocs.io/en/latest/generated/flip.html>`_
- `getitem <https://iteration-utilities.readthedocs.io/en/latest/generated/getitem.html>`_
- `groupedby <https://iteration-utilities.readthedocs.io/en/latest/generated/groupedby.html>`_
- `grouper <https://iteration-utilities.readthedocs.io/en/latest/generated/grouper.html>`_
- `InfiniteIterable <https://iteration-utilities.readthedocs.io/en/latest/generated/InfiniteIterable.html>`_
- `insert <https://iteration-utilities.readthedocs.io/en/latest/generated/insert.html>`_
- `intersperse <https://iteration-utilities.readthedocs.io/en/latest/generated/intersperse.html>`_
- `ipartition <https://iteration-utilities.readthedocs.io/en/latest/generated/ipartition.html>`_
- `is_even <https://iteration-utilities.readthedocs.io/en/latest/generated/is_even.html>`_
- `is_iterable <https://iteration-utilities.readthedocs.io/en/latest/generated/is_iterable.html>`_
- `is_None <https://iteration-utilities.readthedocs.io/en/latest/generated/is_None.html>`_
- `is_not_None <https://iteration-utilities.readthedocs.io/en/latest/generated/is_not_None.html>`_
- `is_odd <https://iteration-utilities.readthedocs.io/en/latest/generated/is_odd.html>`_
- `ItemIdxKey <https://iteration-utilities.readthedocs.io/en/latest/generated/ItemIdxKey.html>`_
- `iter_except <https://iteration-utilities.readthedocs.io/en/latest/generated/iter_except.html>`_
- `Iterable <https://iteration-utilities.readthedocs.io/en/latest/generated/Iterable.html>`_
- `itersubclasses <https://iteration-utilities.readthedocs.io/en/latest/generated/itersubclasses.html>`_
- `last <https://iteration-utilities.readthedocs.io/en/latest/generated/last.html>`_
- `ManyIterables <https://iteration-utilities.readthedocs.io/en/latest/generated/ManyIterables.html>`_
- `merge <https://iteration-utilities.readthedocs.io/en/latest/generated/merge.html>`_
- `minmax <https://iteration-utilities.readthedocs.io/en/latest/generated/minmax.html>`_
- `ncycles <https://iteration-utilities.readthedocs.io/en/latest/generated/ncycles.html>`_
- `nth <https://iteration-utilities.readthedocs.io/en/latest/generated/nth.html>`_
- `nth_combination <https://iteration-utilities.readthedocs.io/en/latest/generated/nth_combination.html>`_
- `one <https://iteration-utilities.readthedocs.io/en/latest/generated/one.html>`_
- `packed <https://iteration-utilities.readthedocs.io/en/latest/generated/packed.html>`_
- `pad <https://iteration-utilities.readthedocs.io/en/latest/generated/pad.html>`_
- `partial <https://iteration-utilities.readthedocs.io/en/latest/generated/partial.html>`_
- `partition <https://iteration-utilities.readthedocs.io/en/latest/generated/partition.html>`_
- `Placeholder <https://iteration-utilities.readthedocs.io/en/latest/generated/Placeholder.html>`_
- `powerset <https://iteration-utilities.readthedocs.io/en/latest/generated/powerset.html>`_
- `radd <https://iteration-utilities.readthedocs.io/en/latest/generated/radd.html>`_
- `random_combination <https://iteration-utilities.readthedocs.io/en/latest/generated/random_combination.html>`_
- `random_permutation <https://iteration-utilities.readthedocs.io/en/latest/generated/random_permutation.html>`_
- `random_product <https://iteration-utilities.readthedocs.io/en/latest/generated/random_product.html>`_
- `rdiv <https://iteration-utilities.readthedocs.io/en/latest/generated/rdiv.html>`_
- `reciprocal <https://iteration-utilities.readthedocs.io/en/latest/generated/reciprocal.html>`_
- `remove <https://iteration-utilities.readthedocs.io/en/latest/generated/remove.html>`_
- `repeatfunc <https://iteration-utilities.readthedocs.io/en/latest/generated/repeatfunc.html>`_
- `replace <https://iteration-utilities.readthedocs.io/en/latest/generated/replace.html>`_
- `replicate <https://iteration-utilities.readthedocs.io/en/latest/generated/replicate.html>`_
- `return_called <https://iteration-utilities.readthedocs.io/en/latest/generated/return_called.html>`_
- `return_False <https://iteration-utilities.readthedocs.io/en/latest/generated/return_False.html>`_
- `return_first_arg <https://iteration-utilities.readthedocs.io/en/latest/generated/return_first_arg.html>`_
- `return_identity <https://iteration-utilities.readthedocs.io/en/latest/generated/return_identity.html>`_
- `return_None <https://iteration-utilities.readthedocs.io/en/latest/generated/return_None.html>`_
- `return_True <https://iteration-utilities.readthedocs.io/en/latest/generated/return_True.html>`_
- `rfdiv <https://iteration-utilities.readthedocs.io/en/latest/generated/rfdiv.html>`_
- `rmod <https://iteration-utilities.readthedocs.io/en/latest/generated/rmod.html>`_
- `rmul <https://iteration-utilities.readthedocs.io/en/latest/generated/rmul.html>`_
- `roundrobin <https://iteration-utilities.readthedocs.io/en/latest/generated/roundrobin.html>`_
- `rpow <https://iteration-utilities.readthedocs.io/en/latest/generated/rpow.html>`_
- `rsub <https://iteration-utilities.readthedocs.io/en/latest/generated/rsub.html>`_
- `second <https://iteration-utilities.readthedocs.io/en/latest/generated/second.html>`_
- `Seen <https://iteration-utilities.readthedocs.io/en/latest/generated/Seen.html>`_
- `sideeffects <https://iteration-utilities.readthedocs.io/en/latest/generated/sideeffects.html>`_
- `split <https://iteration-utilities.readthedocs.io/en/latest/generated/split.html>`_
- `square <https://iteration-utilities.readthedocs.io/en/latest/generated/square.html>`_
- `starfilter <https://iteration-utilities.readthedocs.io/en/latest/generated/starfilter.html>`_
- `successive <https://iteration-utilities.readthedocs.io/en/latest/generated/successive.html>`_
- `tabulate <https://iteration-utilities.readthedocs.io/en/latest/generated/tabulate.html>`_
- `tail <https://iteration-utilities.readthedocs.io/en/latest/generated/tail.html>`_
- `tee_lookahead <https://iteration-utilities.readthedocs.io/en/latest/generated/tee_lookahead.html>`_
- `third <https://iteration-utilities.readthedocs.io/en/latest/generated/third.html>`_
- `unique_everseen <https://iteration-utilities.readthedocs.io/en/latest/generated/unique_everseen.html>`_
- `unique_justseen <https://iteration-utilities.readthedocs.io/en/latest/generated/unique_justseen.html>`_

But also some convenience classes providing a lazy and chainable interface for function evaluation:

- `Iterable <https://iteration-utilities.readthedocs.io/en/latest/generated/Iterable.html>`_
- `InfiniteIterable <https://iteration-utilities.readthedocs.io/en/latest/generated/InfiniteIterable.html>`_
- `ManyIterables <https://iteration-utilities.readthedocs.io/en/latest/generated/ManyIterables.html>`_

.. image:: https://img.shields.io/pypi/pyversions/iteration_utilities.svg
   :target: https://www.python.org/
   :alt: Supported Python versions

Documentation
^^^^^^^^^^^^^

.. image:: https://readthedocs.org/projects/iteration-utilities/badge/?version=stable
   :target: http://iteration-utilities.readthedocs.io/en/stable/?badge=stable
   :alt: Documentation Status

.. image:: https://readthedocs.org/projects/iteration-utilities/badge/?version=latest
   :target: http://iteration-utilities.readthedocs.io/en/latest/?badge=latest
   :alt: Documentation Status

Downloads
^^^^^^^^^

.. image:: https://img.shields.io/pypi/v/iteration_utilities.svg
   :target: https://pypi.python.org/pypi/iteration_utilities
   :alt: PyPI Project

.. image:: https://img.shields.io/github/release/MSeifert04/iteration_utilities.svg
   :target: https://github.com/MSeifert04/iteration_utilities/releases
   :alt: GitHub Project

.. image:: https://anaconda.org/conda-forge/iteration_utilities/badges/version.svg
   :target: https://anaconda.org/conda-forge/iteration_utilities
   :alt: Anaconda-Server Badge

Test status
^^^^^^^^^^^

.. image:: https://ci.appveyor.com/api/projects/status/7dcitqxmh82d0x0m?svg=true
   :target: https://ci.appveyor.com/project/MSeifert04/iteration-utilities
   :alt: AppVeyor Status

.. image:: https://codecov.io/gh/MSeifert04/iteration_utilities/branch/master/graph/badge.svg
   :target: https://codecov.io/gh/MSeifert04/iteration_utilities
   :alt: Coverage Status
image:: https://img.shields.io/badge/benchmarked%20by-asv-green.svg?style=flat :target: https://mseifert04.github.io/iutils_benchmarks/ :alt: Benchmarks 0707010000000E000081A400000000000000000000000165E3BCDA000002AB000000000000000000000000000000000000002800000000iteration_utilities-0.12.1/appveyor.ymlimage: Visual Studio 2022 environment: matrix: # For Python versions available on Appveyor, see # https://www.appveyor.com/docs/windows-images-software/#python - PYTHON: "C:\\Python37" - PYTHON: "C:\\Python38" - PYTHON: "C:\\Python39" - PYTHON: "C:\\Python310" - PYTHON: "C:\\Python311" - PYTHON: "C:\\Python311-x64" install: - "%PYTHON%\\python.exe -m pip install pip setuptools --upgrade" - "%PYTHON%\\python.exe -m pip install . --no-deps -vv" build: off before_test: - "%PYTHON%\\python.exe -c \"import iteration_utilities\"" - "%PYTHON%\\python.exe -m pip install pytest" test_script: - "%PYTHON%\\python.exe -m pytest tests/" 0707010000000F000041ED00000000000000000000001265E3BCDA00000000000000000000000000000000000000000000002600000000iteration_utilities-0.12.1/benchmarks07070100000010000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003100000000iteration_utilities-0.12.1/benchmarks/accumulate07070100000011000081A400000000000000000000000165E3BCDA0000019C000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/benchmarks/accumulate/accumulate.pyimport iteration_utilities import itertools def bench_iu_accumulate(iterable, func=iteration_utilities.accumulate): iteration_utilities.consume(func(iterable), None) def bench_itertools_accumulate(iterable, func=itertools.accumulate): iteration_utilities.consume(func(iterable), None) def args_list_length(): for exponent in range(2, 16): size = 2**exponent yield size, [1] * size 07070100000012000081A400000000000000000000000165E3BCDA000002D3000000000000000000000000000000000000004500000000iteration_utilities-0.12.1/benchmarks/accumulate/accumulate_binop.pyfrom operator import add 
import iteration_utilities import itertools import toolz import cytoolz def bench_iu_accumulate(iterable, func=iteration_utilities.accumulate): iteration_utilities.consume(func(iterable, add), None) def bench_itertools_accumulate(iterable, func=itertools.accumulate): iteration_utilities.consume(func(iterable, add), None) def bench_toolz_accumulate(iterable, func=toolz.accumulate): iteration_utilities.consume(func(add, iterable), None) def bench_cytoolz_accumulate(iterable, func=cytoolz.accumulate): iteration_utilities.consume(func(add, iterable), None) def args_list_length(): for exponent in range(2, 20): size = 2**exponent yield size, [1] * size 07070100000013000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/benchmarks/all_distinct07070100000014000081A400000000000000000000000165E3BCDA000001CE000000000000000000000000000000000000004300000000iteration_utilities-0.12.1/benchmarks/all_distinct/all_distinct.pyimport iteration_utilities import toolz import cytoolz def bench_iu_all_distinct(iterable, func=iteration_utilities.all_distinct): return func(iterable) def bench_toolz_isdistinct(iterable, func=toolz.isdistinct): return func(iterable) def bench_cytoolz_isdistinct(iterable, func=cytoolz.isdistinct): return func(iterable) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, list(range(size)) 07070100000015000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003000000000iteration_utilities-0.12.1/benchmarks/all_equal07070100000016000081A400000000000000000000000165E3BCDA0000016F000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/benchmarks/all_equal/all_equal.pyimport iteration_utilities import more_itertools def bench_iu_all_equal(iterable, func=iteration_utilities.all_equal): return func(iterable) def bench_more_itertools_all_equal(iterable, func=more_itertools.all_equal): return func(iterable) def 
args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [1] * size 07070100000017000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/benchmarks/all_monotone07070100000018000081A400000000000000000000000165E3BCDA000001A9000000000000000000000000000000000000004300000000iteration_utilities-0.12.1/benchmarks/all_monotone/all_monotone.pyimport iteration_utilities import more_itertools def bench_iu(iterable, func=iteration_utilities.all_monotone): return func(iterable) def bench_mi(iterable, func=more_itertools.is_sorted): return func(iterable) def bench_sorted(iterable, func=sorted): return func(iterable) == iterable def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, list(range(size)) 07070100000019000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003000000000iteration_utilities-0.12.1/benchmarks/applyfunc0707010000001A000081A400000000000000000000000165E3BCDA000002D9000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/benchmarks/applyfunc/applyfunc.pyimport iteration_utilities import toolz import cytoolz import more_itertools def bench_iu_applyfunc(n, func=iteration_utilities.applyfunc): iteration_utilities.consume(func(iteration_utilities.return_True, 1), n) def bench_more_itertools_iterate(n, func=more_itertools.iterate): iteration_utilities.consume(func(iteration_utilities.return_True, 1), n) def bench_toolz_iterate(n, func=toolz.iterate): iteration_utilities.consume(func(iteration_utilities.return_True, 1), n) def bench_cytoolz_iterate(n, func=cytoolz.iterate): iteration_utilities.consume(func(iteration_utilities.return_True, 1), n) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, size 
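The ``applyfunc`` benchmark above repeatedly applies a function to its own result and consumes the stream. As a rough pure-Python sketch of the behaviour ``iteration_utilities.applyfunc`` provides (the shipped implementation is a C extension, so treat this only as an illustration):

```python
from itertools import islice

def applyfunc_sketch(func, value):
    """Pure-Python sketch of iteration_utilities.applyfunc:
    lazily yields func(value), func(func(value)), ... forever."""
    while True:
        value = func(value)
        yield value

# First five results of repeatedly doubling 1.
print(list(islice(applyfunc_sketch(lambda x: x * 2, 1), 5)))  # [2, 4, 8, 16, 32]
```

Because the generator is infinite, the benchmarks bound it with ``consume(..., n)``; here ``islice`` plays the same role.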
0707010000001B000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/benchmarks/count_items0707010000001C000081A400000000000000000000000165E3BCDA00000169000000000000000000000000000000000000004100000000iteration_utilities-0.12.1/benchmarks/count_items/count_items.pyimport iteration_utilities import more_itertools def bench_iu_count_items(iterable, func=iteration_utilities.count_items): return func(iterable) def bench_more_itertools_ilen(iterable, func=more_itertools.ilen): return func(iterable) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [1] * size 0707010000001D000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/benchmarks/deepflatten0707010000001E000081A400000000000000000000000165E3BCDA00000205000000000000000000000000000000000000004100000000iteration_utilities-0.12.1/benchmarks/deepflatten/deepflatten.pyimport iteration_utilities import more_itertools import pydash def bench_iu(iterable, func=iteration_utilities.deepflatten): return iteration_utilities.consume(func(iterable), None) def bench_mi(iterable, func=more_itertools.collapse): return iteration_utilities.consume(func(iterable), None) def bench_pd(iterable, func=pydash.flatten_deep): return func(iterable) def args_list_length(): for exponent in range(0, 15): size = 2**exponent yield size * 4, [[[(0, ) * 2]] * 2] * size 0707010000001F000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003000000000iteration_utilities-0.12.1/benchmarks/groupedby07070100000020000081A400000000000000000000000165E3BCDA00000237000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/benchmarks/groupedby/groupedby.pyimport iteration_utilities import toolz import cytoolz def bench_iu_groupedby(iterable, func=iteration_utilities.groupedby): return func(iterable, iteration_utilities.return_identity) def 
bench_toolz_groupby(iterable, func=toolz.groupby): return func(iteration_utilities.return_identity, iterable) def bench_cytoolz_groupby(iterable, func=cytoolz.groupby): return func(iteration_utilities.return_identity, iterable) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [i % 10 for i in range(size)] 07070100000021000081A400000000000000000000000165E3BCDA0000027D000000000000000000000000000000000000004900000000iteration_utilities-0.12.1/benchmarks/groupedby/groupedby_with_reduce.pyimport operator import iteration_utilities import toolz import cytoolz def bench_iu_groupedby(iterable, func=iteration_utilities.groupedby): return func(iterable, iteration_utilities.return_identity, reduce=operator.add) def bench_toolz_reduceby(iterable, func=toolz.reduceby): return func(iteration_utilities.return_identity, operator.add, iterable) def bench_cytoolz_reduceby(iterable, func=cytoolz.reduceby): return func(iteration_utilities.return_identity, operator.add, iterable) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [i % 10 for i in range(size)] 07070100000022000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002E00000000iteration_utilities-0.12.1/benchmarks/grouper07070100000023000081A400000000000000000000000165E3BCDA00000493000000000000000000000000000000000000003C00000000iteration_utilities-0.12.1/benchmarks/grouper/grouper_n2.pyimport iteration_utilities import more_itertools import toolz import cytoolz import pydash def bench_iu_grouper(iterable, func=iteration_utilities.grouper): iteration_utilities.consume(func(iterable, 2), None) def bench_more_itertools_grouper(iterable, func=more_itertools.grouper): iteration_utilities.consume(func(iterable, 2), None) def bench_more_itertools_chunked(iterable, func=more_itertools.chunked): iteration_utilities.consume(func(iterable, 2), None) def bench_toolz_partition(iterable, func=toolz.partition): 
iteration_utilities.consume(func(2, iterable), None) def bench_cytoolz_partition(iterable, func=cytoolz.partition): iteration_utilities.consume(func(2, iterable), None) def bench_toolz_partition_all(iterable, func=toolz.partition_all): iteration_utilities.consume(func(2, iterable), None) def bench_cytoolz_partition_all(iterable, func=cytoolz.partition_all): iteration_utilities.consume(func(2, iterable), None) def bench_pd(iterable, func=pydash.chunk): func(iterable, 2) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [i % 10 for i in range(size)] 07070100000024000081A400000000000000000000000165E3BCDA0000049B000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/benchmarks/grouper/grouper_n50.pyimport iteration_utilities import more_itertools import toolz import cytoolz import pydash def bench_iu_grouper(iterable, func=iteration_utilities.grouper): iteration_utilities.consume(func(iterable, 50), None) def bench_more_itertools_grouper(iterable, func=more_itertools.grouper): iteration_utilities.consume(func(iterable, 50), None) def bench_more_itertools_chunked(iterable, func=more_itertools.chunked): iteration_utilities.consume(func(iterable, 50), None) def bench_toolz_partition(iterable, func=toolz.partition): iteration_utilities.consume(func(50, iterable), None) def bench_cytoolz_partition(iterable, func=cytoolz.partition): iteration_utilities.consume(func(50, iterable), None) def bench_toolz_partition_all(iterable, func=toolz.partition_all): iteration_utilities.consume(func(50, iterable), None) def bench_cytoolz_partition_all(iterable, func=cytoolz.partition_all): iteration_utilities.consume(func(50, iterable), None) def bench_pd(iterable, func=pydash.chunk): func(iterable, 50) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [i % 10 for i in range(size)] 
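The two grouper benchmarks above chunk an iterable into fixed-size groups of 2 and 50 items. A pure-Python sketch of what ``iteration_utilities.grouper`` does with its default settings (no fill value, final group may be shorter; the shipped implementation is a C extension):

```python
from itertools import islice

def grouper_sketch(iterable, n):
    """Pure-Python sketch of iteration_utilities.grouper with defaults:
    yield tuples of n items; the final tuple may be shorter."""
    it = iter(iterable)
    while True:
        chunk = tuple(islice(it, n))
        if not chunk:
            return
        yield chunk

print(list(grouper_sketch(range(7), 2)))  # [(0, 1), (2, 3), (4, 5), (6,)]
```

The ``toolz.partition`` variants benchmarked alongside it differ in how they treat that shorter final group (``partition`` drops it, ``partition_all`` keeps it).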
07070100000025000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/benchmarks/intersperse07070100000026000081A400000000000000000000000165E3BCDA00000328000000000000000000000000000000000000004100000000iteration_utilities-0.12.1/benchmarks/intersperse/intersperse.pyimport iteration_utilities import more_itertools import toolz import cytoolz import pydash def bench_iu_intersperse(iterable, func=iteration_utilities.intersperse): iteration_utilities.consume(func(iterable, 2), None) def bench_more_itertools_intersperse(iterable, func=more_itertools.intersperse): iteration_utilities.consume(func(2, iterable), None) def bench_toolz_interpose(iterable, func=toolz.interpose): iteration_utilities.consume(func(2, iterable), None) def bench_cytoolz_interpose(iterable, func=cytoolz.interpose): iteration_utilities.consume(func(2, iterable), None) def bench_pd(iterable, func=pydash.intersperse): func(iterable, 2) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [i % 10 for i in range(size)] 07070100000027000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002C00000000iteration_utilities-0.12.1/benchmarks/merge07070100000028000081A400000000000000000000000165E3BCDA00000354000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/benchmarks/merge/merge.pyimport iteration_utilities import more_itertools import toolz import cytoolz import heapq import itertools def bench_iu_merge(iterables, func=iteration_utilities.merge): iteration_utilities.consume(func(*iterables), None) def bench_heapq_merge(iterables, func=heapq.merge): iteration_utilities.consume(func(*iterables), None) def bench_builtin_sorted(iterables, func=sorted): return func(itertools.chain.from_iterable(iterables)) def bench_toolz_merge_sorted(iterables, func=toolz.merge_sorted): iteration_utilities.consume(func(*iterables), None) def bench_cytoolz_merge_sorted(iterables, 
func=cytoolz.merge_sorted): iteration_utilities.consume(func(*iterables), None) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield (size, [list(range(size // 2)), list(range(size // 2))]) 07070100000029000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003100000000iteration_utilities-0.12.1/benchmarks/roundrobin0707010000002A000081A400000000000000000000000165E3BCDA0000032E000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/benchmarks/roundrobin/roundrobin.pyimport iteration_utilities import more_itertools import toolz import cytoolz import pydash def bench_iu_roundrobin(iterables, func=iteration_utilities.roundrobin): iteration_utilities.consume(func(*iterables), None) def bench_more_itertools_roundrobin(iterables, func=more_itertools.roundrobin): iteration_utilities.consume(func(*iterables), None) def bench_toolz_interleave(iterables, func=toolz.interleave): iteration_utilities.consume(func(iterables), None) def bench_cytoolz_interleave(iterables, func=cytoolz.interleave): iteration_utilities.consume(func(iterables), None) def bench_pd(iterables, func=pydash.interleave): func(*iterables) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [[0] * (size // 2), [1] * (size // 2)] 0707010000002B000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002C00000000iteration_utilities-0.12.1/benchmarks/split0707010000002C000081A400000000000000000000000165E3BCDA000001D0000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/benchmarks/split/split.pyimport iteration_utilities import more_itertools def bench_iu_split(iterable, func=iteration_utilities.split): iteration_utilities.consume(func(iterable, lambda x: x % 5 == 0), None) def bench_more_itertools_split_at(iterable, func=more_itertools.split_at): iteration_utilities.consume(func(iterable, lambda x: x % 5 == 0), None) def args_list_length(): for exponent in 
range(2, 18): size = 2**exponent yield size, list(range(size)) 0707010000002D000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003100000000iteration_utilities-0.12.1/benchmarks/successive0707010000002E000081A400000000000000000000000165E3BCDA0000034A000000000000000000000000000000000000004200000000iteration_utilities-0.12.1/benchmarks/successive/successive_n2.pyimport iteration_utilities import toolz import cytoolz import more_itertools def bench_iu_successive(iterable, func=iteration_utilities.successive): iteration_utilities.consume(func(iterable), None) def bench_toolz_sliding_window(iterable, func=toolz.sliding_window): iteration_utilities.consume(func(2, iterable), None) def bench_cytoolz_sliding_window(iterable, func=cytoolz.sliding_window): iteration_utilities.consume(func(2, iterable), None) def bench_more_itertools_pairwise(iterable, func=more_itertools.pairwise): iteration_utilities.consume(func(iterable), None) def bench_more_itertools_windowed(iterable, func=more_itertools.windowed): iteration_utilities.consume(func(iterable, 2), None) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, [0] * size 0707010000002F000081A400000000000000000000000165E3BCDA000002CF000000000000000000000000000000000000004300000000iteration_utilities-0.12.1/benchmarks/successive/successive_n50.pyimport iteration_utilities import toolz import cytoolz import more_itertools def bench_iu_successive(iterable, func=iteration_utilities.successive): iteration_utilities.consume(func(iterable, 50), None) def bench_toolz_sliding_window(iterable, func=toolz.sliding_window): iteration_utilities.consume(func(50, iterable), None) def bench_cytoolz_sliding_window(iterable, func=cytoolz.sliding_window): iteration_utilities.consume(func(50, iterable), None) def bench_more_itertools_windowed(iterable, func=more_itertools.windowed): iteration_utilities.consume(func(iterable, 50), None) def args_list_length(): for exponent in range(2, 
18): size = 2**exponent yield size, [0] * size 07070100000030000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/benchmarks/unique_everseen07070100000031000081A400000000000000000000000165E3BCDA0000030A000000000000000000000000000000000000005700000000iteration_utilities-0.12.1/benchmarks/unique_everseen/unique_everseen_no_duplicates.pyimport iteration_utilities import toolz import cytoolz import more_itertools import pydash def bench_iu_unique_everseen(iterable, func=iteration_utilities.unique_everseen): iteration_utilities.consume(func(iterable), None) def bench_more_itertools_unique_everseen(iterable, func=more_itertools.unique_everseen): iteration_utilities.consume(func(iterable), None) def bench_toolz_unique(iterable, func=toolz.unique): iteration_utilities.consume(func(iterable), None) def bench_cytoolz_unique(iterable, func=cytoolz.unique): iteration_utilities.consume(func(iterable), None) def bench_pd(iterable, func=pydash.uniq): func(iterable) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, list(range(size)) 07070100000032000081A400000000000000000000000165E3BCDA00000303000000000000000000000000000000000000005900000000iteration_utilities-0.12.1/benchmarks/unique_everseen/unique_everseen_only_duplicates.pyimport iteration_utilities import toolz import cytoolz import more_itertools import pydash def bench_iu_unique_everseen(iterable, func=iteration_utilities.unique_everseen): iteration_utilities.consume(func(iterable), None) def bench_more_itertools_unique_everseen(iterable, func=more_itertools.unique_everseen): iteration_utilities.consume(func(iterable), None) def bench_toolz_unique(iterable, func=toolz.unique): iteration_utilities.consume(func(iterable), None) def bench_cytoolz_unique(iterable, func=cytoolz.unique): iteration_utilities.consume(func(iterable), None) def bench_pd(iterable, func=pydash.uniq): func(iterable) def args_list_length(): for
exponent in range(2, 18): size = 2**exponent yield size, [0] * size 07070100000033000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/benchmarks/unique_justseen07070100000034000081A400000000000000000000000165E3BCDA000001C8000000000000000000000000000000000000004900000000iteration_utilities-0.12.1/benchmarks/unique_justseen/unique_justseen.pyimport iteration_utilities import more_itertools def bench_iu_unique_justseen(iterable, func=iteration_utilities.unique_justseen): iteration_utilities.consume(func(iterable), None) def bench_more_itertools_unique_justseen(iterable, func=more_itertools.unique_justseen): iteration_utilities.consume(func(iterable), None) def args_list_length(): for exponent in range(2, 18): size = 2**exponent yield size, sorted(range(size)) 07070100000035000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000001E00000000iteration_utilities-0.12.1/ci07070100000036000081A400000000000000000000000165E3BCDA00000416000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/ci/collect_benchmarks.pyimport pathlib import shutil import subprocess import sys if __name__ == '__main__': benchmark_directory = pathlib.Path('./benchmarks/') result_folder = pathlib.Path('./.benchmark_results/') # Create result folder and remove all png files there result_folder.mkdir(exist_ok=True) for path in result_folder.glob('**/*.png'): path.unlink() # Run the benchmarks benchmark_paths = list(benchmark_directory.glob('**/*.py')) print([str(path) for path in benchmark_paths]) for path in benchmark_paths: inputfile = str(path) outputfile = str(result_folder.joinpath(path.with_suffix('.png').name)) # Using sys.executable ensures that at least the same executable is used # as the one that executed this file. This avoids having to hardcode "python" # which may not work when only "python3" is present or a virtualenv # is used.
subprocess.run([sys.executable, "-m", "simple_benchmark", inputfile, outputfile, '-v', '--time-per-benchmark', '0.05'], check=True) 07070100000037000081A400000000000000000000000165E3BCDA00000188000000000000000000000000000000000000003100000000iteration_utilities-0.12.1/ci/verify_checksum.pyimport hashlib import sys if __name__ == '__main__': filename = sys.argv[1] required_md5 = sys.argv[2] with open(filename, 'rb') as f: file_md5 = hashlib.md5(f.read()).hexdigest() print('filename: ', filename) print('file md5: ', file_md5) print('req. md5: ', required_md5) print('match: ', file_md5 == required_md5) assert file_md5 == required_md5 07070100000038000041ED00000000000000000000000365E3BCDA00000000000000000000000000000000000000000000002000000000iteration_utilities-0.12.1/docs07070100000039000081A400000000000000000000000165E3BCDA0000006A000000000000000000000000000000000000002C00000000iteration_utilities-0.12.1/docs/AUTHORS.rstContributors ------------ - Michael Seifert (owner) - Solomon Ucko (@sollyucko) - matsjoyce (@matsjoyce) 0707010000003A000081A400000000000000000000000165E3BCDA0000205D000000000000000000000000000000000000002C00000000iteration_utilities-0.12.1/docs/CHANGES.rstChangelog for "iteration_utilities" ----------------------------------- Version 0.12.1 (2024-03-03) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ - Source distribution now includes all files needed to run the tests from the tarball Version 0.12.0 (2023-10-13) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ - Improve performance of ``ipartition`` by evaluating the predicate only once per item. - Add benchmarks comparing some functions with other libraries. - Python 3.12 compatibility - Dropped Python 3.5 and 3.6 compatibility - The top level ``__version__`` property was removed. ``importlib.metadata.version`` from the Python standard library should be used if you need the version of ``iteration_utilities``. 
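Since the 0.12.0 entry above notes that the top-level ``__version__`` property was removed, the installed version can be queried through the standard library instead; a small sketch:

```python
from importlib.metadata import version, PackageNotFoundError

try:
    # Returns the installed distribution's version string, e.g. "0.12.1".
    v = version("iteration_utilities")
except PackageNotFoundError:
    # The package is not installed in this environment.
    v = None
```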
Version 0.11.0 (2020-11-19) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ - The functions implemented in C now parse boolean arguments based on their truth value instead of their integer value. This should be unnoticeable in typical usage. - Added ``always_iterable`` which wraps non-iterable inputs with an iterable. - Added ``empty`` as a singleton representing an empty iterable. - The type of ``Placeholder``, which was previously accessible as ``PlaceholderType``, is now private. - Added Python 3.9 support. Version 0.10.1 (2019-11-20) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ - Fixed reference counting bugs in ``merge``, ``minmax``, and ``sideeffects``. Version 0.10.0 (2019-11-16) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ - The exception message of ``one`` now shows the first two elements if the *iterable* contains more than one element. - Fixed a bug in ``grouper`` that led to a fatal exception. Version 0.9.0 (2019-11-02) ^^^^^^^^^^^^^^^^^^^^^^^^^^ This library does not support Python 2.7 anymore. Attempting to install ``iteration_utilities`` on Python <= 3.4 should not work anymore. - Add the statistics functions introduced in Python 3.6 and 3.8 as methods on the ``Iterable`` class. This includes: - ``fmean`` (Python >= 3.8) - ``geometric_mean`` (Python >= 3.8) - ``harmonic_mean`` (Python >= 3.6) - ``multimode`` (Python >= 3.8) - ``quantiles`` (Python >= 3.8) - Add a new recipe from the ``itertools`` documentation: ``nth_combination``. Version 0.8.0 (2019-10-27) ^^^^^^^^^^^^^^^^^^^^^^^^^^ Support for Python 3.3 and 3.4 was dropped, but Python 3.7 and 3.8 are now supported. This will be the last release supporting Python 2. - ``Iterable.islice`` will raise a more appropriate ``TypeError`` when called without arguments. - ``partial`` only allows plain ``str`` as keyword-names in CPython 3.8. - Some constants that were intended to be private were available in the module namespace. These have been removed. This includes ``EQ_PY2``, ``GE_PY3`` and similar constants. - Added support for PyPy (3.5 and 3.6).
- Use experimental vectorcall protocol (PEP 590) in a few places. Contributors: - Solomon Ucko (@sollyucko) Version 0.7.0 (2018-01-28) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - Add ``__sizeof__`` method for ``partial`` to return a more accurate size for the instance. - Fixed a problem when creating ``merge`` instances depending on the compiler. Version 0.6.1 (2017-04-15) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - The ``__next__`` or ``next`` method of the processed iterator is not cached in the ``iteration_utilities`` iterators anymore. This correctly handles the (rare) case that this method is reassigned or deleted. Version 0.6.0 (2017-04-08) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - Renamed ``func`` parameter name of ``partition`` to ``pred``. - Renamed ``function`` parameter name of ``tabulate`` to ``func``. - The ``key`` attribute of ``ItemIdxKey`` throws an ``AttributeError`` if it is not set and an attempt is made to get or delete it. - Added several attributes to classes. - Fixed a Bug in ``deepflatten`` when ``isinstance`` fails for the classes given as ``types`` or ``ignore`` parameter. - Changed internal package structure (shouldn't affect end-users that imported everything from ``iteration_utilities`` directly). - improved performance of ``all_isinstance`` and ``any_isinstance``. - improved performance of ``replicate``. - ``replicate`` now throws an exception if the ``times`` argument is smaller or equal to 1. - corrected handling of exceptions and overflow in ``__length_hint__`` methods. Version 0.5.2 (2017-03-30) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - fix release (again). Version 0.5.1 (2017-03-30) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - fixed major mistake that made 0.5.0 unusable. (``%R`` formatter isn't allowed in ``PyErr_Format``). Version 0.5.0 (2017-03-30) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - minor speedup for ``next(merge)``. - removed **unused** ``key`` parameter from ``combinations_from_relations``. - replaced ``Iterable.as_string`` parameter ``seperaror`` (sic!) by ``seperator``. 
- included signature for ``__reduce__``, ``__setstate__`` and ``__length_hint__`` methods. - fixed ``Seen.contains_add`` method signature. - fixed potential segfault in ``ItemIdxKey.__repr__``. - removed unnecessary ``__setstate__`` method for ``ItemIdxKey``. - various ``__setstate__`` and ``__reduce__`` methods were changed so they can't be used to cause segmentation faults, ``SystemError`` or blatantly wrong behaviour. However, serializing or copying such an instance can be significantly slower as a result of this change. Unpickling these instances from previous versions could be impossible and ``copy.copy`` is **not** supported (and probably never will be, because ``itertools.tee`` interacts with ``__copy__`` methods). Affected iterators: ``chained``, ``deepflatten``, ``duplicates``, ``grouper``, ``intersperse``, ``merge``, ``roundrobin``, ``sideeffects``, ``split``, ``successive``, ``unique_everseen``, ``unique_justseen``. - added ``__repr__`` method for ``chained``, ``complement``, ``constant``, ``flip``, ``nth`` and ``packed``. - fixed a bug with ``partial`` when the function kept the arguments and a call provided exactly as many arguments as there are placeholders in the partial. - Applying ``flip`` on another ``flip`` instance now simply returns the original function. - ``chained`` now unwraps (if possible) other ``chained`` instances when creating a new instance. This is only done if this **won't** change the current behaviour. Version 0.4.0 (2017-03-20) ^^^^^^^^^^^^^^^^^^^^^^^^^^ - ``Seen`` and ``ItemIdxKey`` can detect recursive objects in their ``repr``. - The representation for ``Seen`` and ``ItemIdxKey`` now uses the classname even for subclasses. - added ``partial`` callback class, which is essentially ``functools.partial`` but also allows positional placeholders. - several functions now interpret ``None`` as if that argument for the function wasn't given: - ``key`` argument for ``minmax``, ``merge``, ``argmin`` and ``argmax``.
  - the ``reduce`` argument for ``groupedby``.
  - all arguments for ``Seen.__new__``.

Version 0.3.0 (2017-03-09)
^^^^^^^^^^^^^^^^^^^^^^^^^^

- Implemented the ``__length_hint__`` method for ``clamp``. However, sensible results (!= 0) are only possible if ``remove=False`` or neither ``low`` nor ``high`` was set.
- Fixed a ``SystemError`` in several functions when accessing the next item of the iterable resulted in an exception other than ``StopIteration``.
- Added the ``starfilter`` iterator.
- Added the ``packed`` callback class.
- Fixed a segfault in the ``complement.__call__`` method when the function raised an exception.
- Fixed a segfault in ``partition`` when ``bool(item)`` raised an exception.
- Included a missing ``ValueError`` in ``split`` when two of the ``keep*`` parameters are True. The case where all three were given already raised the correct exception.
- ``clamp`` now interprets ``low=None`` or ``high=None`` as if the corresponding value wasn't given. Previously it tried to compare the items with ``None``.

Version 0.2.1 (2017-03-01)
^^^^^^^^^^^^^^^^^^^^^^^^^^

- Fixed a segfault in ``nth`` when ``retpred=True``.

Version 0.2.0 (2017-02-27)
^^^^^^^^^^^^^^^^^^^^^^^^^^

- Added the ``remove`` parameter to ``clamp``.
- Made ``deepflatten`` string-aware. For other recursive-iterable classes a ``RecursionError`` (or ``RuntimeError`` on Python < 3.5) is raised instead of freezing.

Version 0.1.0 (2017-01-25)
^^^^^^^^^^^^^^^^^^^^^^^^^^

- Initial release.

0707010000003B000081A400000000000000000000000165E3BCDA00001D25000000000000000000000000000000000000002900000000iteration_utilities-0.12.1/docs/Makefile# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # User-friendly check for sphinx-build ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) $(error The '$(SPHINXBUILD)' command was not found.
Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) endif # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest coverage gettext help: @echo "Please use \`make <target>' where <target> is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " applehelp to make an Apple Help Book" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " xml to make Docutils-native XML files" @echo " pseudoxml to make pseudoxml-XML files for display purposes" @echo " linkcheck to check all external links for integrity" 
@echo " doctest to run all doctests embedded in the documentation (if enabled)" @echo " coverage to run coverage check of the documentation (if enabled)" clean: rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/iteration_utilities.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/iteration_utilities.qhc" applehelp: $(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp @echo @echo "Build finished. The help book is in $(BUILDDIR)/applehelp." @echo "N.B. You won't be able to view it unless you put it in" \ "~/Library/Documentation/Help or install it in your application" \ "bundle." devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." 
@echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/iteration_utilities" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/iteration_utilities" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." latexpdfja: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through platex and dvipdfmx..." $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." 
changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." coverage: $(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage @echo "Testing of coverage in the sources finished, look at the " \ "results in $(BUILDDIR)/coverage/python.txt." xml: $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml @echo @echo "Build finished. The XML files are in $(BUILDDIR)/xml." pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." 0707010000003C000081A400000000000000000000000165E3BCDA00000041000000000000000000000000000000000000002800000000iteration_utilities-0.12.1/docs/api.rstAPI --- .. toctree:: :titlesonly: :glob: generated/* 0707010000003D000081A400000000000000000000000165E3BCDA00000394000000000000000000000000000000000000002E00000000iteration_utilities-0.12.1/docs/callbacks.rstCallbacks --------- Some constant and some very simple callbacks. Constant callbacks ^^^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.constant`, factory that creates an instance, that always returns the initial value. Equivalent to ``lambda x: value``. - :py:func:`~iteration_utilities.return_True`, equivalent to ``lambda x: True``. - :py:func:`~iteration_utilities.return_False`, equivalent to ``lambda x: False``. - :py:func:`~iteration_utilities.return_None`, equivalent to ``lambda x: None``. Simple callbacks ^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.packed`, equivalent to ``lambda func: lambda x: func(*x)``. 
- :py:func:`~iteration_utilities.return_identity`, equivalent to ``lambda x: x``. - :py:func:`~iteration_utilities.return_first_arg`, equivalent to ``lambda *args, **kwargs: args[0]``. - :py:func:`~iteration_utilities.return_called`, equivalent to ``lambda x: x()``. 0707010000003E000081A400000000000000000000000165E3BCDA00002D27000000000000000000000000000000000000002800000000iteration_utilities-0.12.1/docs/conf.py#!/usr/bin/env python3 # -*- coding: utf-8 -*- import sys import os import importlib.metadata # ############################################################################# # Get version of the package from the package itself. # ############################################################################# project = 'iteration_utilities' # ############################################################################# # Custom stuff # ############################################################################# author = 'Michael Seifert' project_description = 'Utilities based on Python's iterators and generators.'
project_category = 'Miscellaneous' project_startyear = '2016' # ############################################################################# # Official options # ############################################################################# # sys.path.insert(0, os.path.abspath('.')) # -- General configuration ---------------------------------------------------- extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'sphinx.ext.mathjax', 'numpydoc', ] source_suffix = '.rst' # source_encoding # source_parsers master_doc = 'index' exclude_patterns = ['_build', '_templates'] templates_path = ['_templates'] # template_bridge # rst_epilog # rst_prolog # primary_domain # default_role # keep_warnings # suppress_warnings # needs_sphinx = '1.0' # needs_extensions nitpicky = True # nitpick_ignore # numfig # numfig_format # numfig_secnum_depth # tls_verify # tls_cacerts # -- Project information ------------------------------------------------------ project = 'iteration_utilities' copyright = project_startyear + ', ' + author version = importlib.metadata.version("iteration_utilities") release = importlib.metadata.version("iteration_utilities") # today # today_fmt # highlight_language # highlight_options pygments_style = 'sphinx' # add_function_parentheses # add_module_names # show_authors # modindex_common_prefix # trim_footnote_reference_space # trim_doctest_flags # -- Options for internationalization ----------------------------------------- language = 'en' # locale_dirs # gettext_compact # gettext_uuid # gettext_location # gettext_auto_build # gettext_additional_targets # figure_language_filename # -- Options for HTML output -------------------------------------------------- html_theme = 'nature' # html_theme_options # html_theme_path # html_style # html_title # html_short_title # html_context # html_logo # html_favicon # html_static_path = ['_static'] # html_extra_path # html_last_updated_fmt # html_use_smartypants # html_add_permalinks # html_sidebars # 
html_additional_pages # html_domain_indices # html_use_modindex # html_use_index # html_split_index # html_copy_source # html_show_sourcelink # html_sourcelink_suffix # html_use_opensearch # html_file_suffix # html_link_suffix # html_translator_class # html_show_copyright # html_show_sphinx # html_output_encoding # html_compact_lists # html_secnumber_suffix # html_search_language # html_search_options # html_search_scorer # html_scaled_image_link htmlhelp_basename = project + 'doc' # -- Options for Apple Help output -------------------------------------------- # applehelp_bundle_name # applehelp_bundle_id # applehelp_dev_region # applehelp_bundle_version # applehelp_icon # applehelp_kb_product # applehelp_kb_url # applehelp_remote_url # applehelp_index_anchors # applehelp_min_term_length # applehelp_stopwords # applehelp_locale # applehelp_title # applehelp_codesign_identity # applehelp_codesign_flags # applehelp_indexer_path # applehelp_codesign_path # applehelp_disable_external_tools # -- Options for epub output -------------------------------------------------- # epub_basename # epub_theme # epub_theme_options epub_title = project # epub_description epub_author = author # epub_contributor # epub_language epub_publisher = author epub_copyright = copyright # epub_identifier # epub_scheme # epub_uid # epub_cover # epub_guide # epub_pre_files # epub_post_files epub_exclude_files = ['search.html'] # epub_tocdepth # epub_tocdup # epub_tocscope # epub_fix_images # epub_max_image_width # epub_show_urls # epub_use_index # epub_writing_mode # epub3_page_progression_direction # -- Options for LaTeX output ------------------------------------------------- # latex_engine latex_documents = [(master_doc, # startdocname project + '.tex', # targetname project.replace('_', '\\_') + ' Documentation', # title author, # author 'manual', # documentclass False), # toctree_only ] # latex_logo # latex_toplevel_sectioning # latex_use_parts # latex_appendices # latex_domain_indices # 
latex_use_modindex # latex_show_pagerefs # latex_show_urls # latex_keep_old_macro_names # latex_elements # latex_docclass # latex_additional_files # latex_preamble # latex_paper_size # latex_font_size # -- Options for text output -------------------------------------------------- # text_newlines # text_sectionchars # -- Options for manual page output ------------------------------------------- man_pages = [(master_doc, # startdocname project, # name project + ' Documentation', # description author, # authors 1), # section ] # man_show_urls # -- Options for Texinfo output ----------------------------------------------- texinfo_documents = [(master_doc, # startdocname project, # targetname project + ' Documentation', # title author, # author project, # dir_entry project_description, # description project_category, # category False), # toctree_only ] # texinfo_appendices # texinfo_domain_indices # texinfo_show_urls # texinfo_no_detailmenu # texinfo_elements # -- Options for the linkcheck builder ---------------------------------------- # linkcheck_ignore # linkcheck_retries # linkcheck_timeout # linkcheck_workers # linkcheck_anchors # linkcheck_anchors_ignore # -- Options for the XML builder ---------------------------------------------- # xml_pretty # -- Options for the C++ domain ----------------------------------------------- # cpp_index_common_prefix # cpp_id_attributes # cpp_paren_attributes # ############################################################################# # sphinx.ext.autodoc # ############################################################################# autoclass_content = "both" # autodoc_member_order autodoc_default_options = {'members': True, 'inherited-members': True} autodoc_docstring_signature = True # autodoc_mock_imports # ############################################################################# # sphinx.ext.autosummary # ############################################################################# # Workaround for 
https://github.com/sphinx-doc/sphinx/issues/6695 # otherwise we could also use = False here. autosummary_generate = [] # ############################################################################# # sphinx.ext.coverage # ############################################################################# # coverage_ignore_modules # coverage_ignore_functions # coverage_ignore_classes # coverage_c_path # coverage_c_regexes # coverage_ignore_c_items # coverage_write_headline # coverage_skip_undoc_in_source # ############################################################################# # sphinx.ext.doctest # ############################################################################# # doctest_default_flags # doctest_path # doctest_global_setup # doctest_global_cleanup # doctest_test_doctest_blocks # ############################################################################# # sphinx.ext.extlinks # ############################################################################# # extlinks # ############################################################################# # sphinx.ext.graphviz # ############################################################################# # graphviz_dot # graphviz_dot_args # graphviz_output_format # ############################################################################# # sphinx.ext.inheritance_diagram # ############################################################################# # inheritance_graph_attrs # inheritance_node_attrs # inheritance_edge_attrs # ############################################################################# # sphinx.ext.intersphinx # ############################################################################# intersphinx_mapping = {'python': ('https://docs.python.org/3.8/', # target None), # inventory } # intersphinx_cache_limit # intersphinx_timeout # ############################################################################# # sphinx.ext.linkcode # 
############################################################################# # linkcode_resolve # ############################################################################# # All math # ############################################################################# # math_number_all # ############################################################################# # sphinx.ext.imgmath # ############################################################################# # imgmath_image_format # imgmath_latex # imgmath_dvipng # imgmath_dvisvgm # imgmath_latex_args # imgmath_latex_preamble # imgmath_dvipng_args # imgmath_dvisvgm_args # imgmath_use_preview # imgmath_add_tooltips # imgmath_font_size # ############################################################################# # sphinx.ext.mathjax # ############################################################################# # mathjax_path # ############################################################################# # sphinx.ext.jsmath # ############################################################################# # jsmath_path # ############################################################################# # sphinx.ext.todo # ############################################################################# # todo_include_todos # todo_emit_warnings # todo_link_only # ############################################################################# # sphinx.ext.napoleon # ############################################################################# # napoleon_google_docstring # napoleon_numpy_docstring # napoleon_include_init_with_doc # napoleon_include_private_with_doc # napoleon_include_special_with_doc # napoleon_use_admonition_for_examples # napoleon_use_admonition_for_notes # napoleon_use_admonition_for_references # napoleon_use_ivar # napoleon_use_param # napoleon_use_keyword # napoleon_use_rtype # ############################################################################# # numpydoc # 
############################################################################# # numpydoc_use_plots numpydoc_show_class_members = False # numpydoc_show_inherited_class_members # numpydoc_class_members_toctree 0707010000003F000081A400000000000000000000000165E3BCDA0000044F000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/copy_iterators.rstIterators and copy ------------------ Often it is simply not necessary to make a copy of an iterator. However, there are certain cases where a copy is wanted. In these cases there are three major options: - :py:func:`itertools.tee` - :py:mod:`pickle` - :py:func:`copy.deepcopy` Given these three possibilities, the iterators in :py:mod:`iteration_utilities` should **never** be copied using :py:func:`copy.copy`. Generally :py:func:`itertools.tee` should be the best choice except when file serialization is needed; then :py:mod:`pickle` should be used. The :py:func:`copy.deepcopy` function makes, like :py:mod:`pickle`, a deep (recursive) copy but doesn't support file serialization. A simple example using :py:func:`itertools.tee`:: >>> from iteration_utilities import roundrobin >>> from itertools import tee >>> # Create an iterator >>> a = roundrobin([1, 2, 3], [4, 5], [6]) >>> # Create a new iterator >>> # Overwrite "a" because the input for "tee" shouldn't be reused after >>> # the call to "tee". >>> a, b = tee(a, 2) >>> next(a) 1 >>> next(b) 1 07070100000040000081A400000000000000000000000165E3BCDA00000480000000000000000000000000000000000000002E00000000iteration_utilities-0.12.1/docs/functools.rstFunctools --------- Some classes that are applied to functions and change the way the function is called. Some of those listed are present in the ``functools`` [0]_ module. Functools for one function ^^^^^^^^^^^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.complement`, negate the return value of the function. Equivalent to ``lambda func: lambda *args, **kwargs: not func(*args, **kwargs)``.
- :py:func:`~iteration_utilities.flip`, reverse the order of the arguments passed to the function. Equivalent to ``lambda func: lambda *args, **kwargs: func(*list(reversed(args)), **kwargs)`` - :py:func:`functools.lru_cache`, cache the return value of a function. - :py:func:`functools.partial`, partially set the arguments of a function. - :py:func:`~iteration_utilities.partial`, partially set the arguments of a function; also accepting placeholders. - :py:class:`functools.partialmethod`, same as :py:func:`functools.partial` but works on methods. Functools for several functions ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.chained`, apply several functions (successively). References ~~~~~~~~~~ .. [0] https://docs.python.org/library/functools.html 07070100000041000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002A00000000iteration_utilities-0.12.1/docs/generated07070100000042000081A400000000000000000000000165E3BCDA00000149000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/docs/generated/InfiniteIterable.rstInfiniteIterable ================ .. currentmodule:: iteration_utilities .. autoclass:: InfiniteIterable .. method:: __getitem__(idx) See :py:func:`~iteration_utilities.getitem`. If the `idx` is a :py:class:`slice` then it is appropriately converted for the :py:func:`~iteration_utilities.getitem` call. 07070100000043000081A400000000000000000000000165E3BCDA00000130000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/ItemIdxKey.rstItemIdxKey ========== .. currentmodule:: iteration_utilities .. autoclass:: ItemIdxKey .. method:: __lt__(other) Compare to other :py:class:`~iteration_utilities.ItemIdxKey` instance. .. 
method:: __gt__(other) Compare to other :py:class:`~iteration_utilities.ItemIdxKey` instance.07070100000044000081A400000000000000000000000165E3BCDA000001C3000000000000000000000000000000000000003700000000iteration_utilities-0.12.1/docs/generated/Iterable.rstIterable ======== .. currentmodule:: iteration_utilities .. autoclass:: Iterable .. method:: __getitem__(idx) See :py:func:`~iteration_utilities.getitem`. If the `idx` is a :py:class:`slice` then it is appropriately converted for the :py:func:`~iteration_utilities.getitem` call. .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000045000081A400000000000000000000000165E3BCDA00000061000000000000000000000000000000000000003C00000000iteration_utilities-0.12.1/docs/generated/ManyIterables.rstManyIterables ============= .. currentmodule:: iteration_utilities .. autoclass:: ManyIterables07070100000046000081A400000000000000000000000165E3BCDA0000015B000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/Placeholder.rstPlaceholder =========== .. currentmodule:: iteration_utilities .. data:: Placeholder The object that is recognized as placeholder for :py:class:`iteration_utilities.partial`. .. versionchanged:: 0.11.0 The type of this instance (previously accessible as ``iteration_utilities.PlaceholderType``) is now considered private. 07070100000047000081A400000000000000000000000165E3BCDA0000028F000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/Seen.rstSeen ==== .. currentmodule:: iteration_utilities .. autoclass:: Seen .. method:: __contains__(x) Returns if :py:class:`~iteration_utilities.Seen` contains `x`; either in :py:attr:`.seenset` if `x` is hashable or :py:attr:`.seenlist` if not. .. method:: __len__() Returns the number of items in :py:attr:`.seenset` and :py:attr:`.seenlist`. .. 
method:: __eq__(other) Check if the `other` :py:class:`~iteration_utilities.Seen` instance contains the same elements. .. method:: __ne__(other) Check if the `other` :py:class:`~iteration_utilities.Seen` instance contains different elements. 07070100000048000081A400000000000000000000000165E3BCDA000000EB000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/accumulate.rstaccumulate ========== .. currentmodule:: iteration_utilities .. autoclass:: accumulate .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000049000081A400000000000000000000000165E3BCDA00000061000000000000000000000000000000000000003B00000000iteration_utilities-0.12.1/docs/generated/all_distinct.rstall_distinct ============ .. currentmodule:: iteration_utilities .. autofunction:: all_distinct0707010000004A000081A400000000000000000000000165E3BCDA00000058000000000000000000000000000000000000003800000000iteration_utilities-0.12.1/docs/generated/all_equal.rstall_equal ========= .. currentmodule:: iteration_utilities .. autofunction:: all_equal0707010000004B000081A400000000000000000000000165E3BCDA00000067000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/docs/generated/all_isinstance.rstall_isinstance ============== .. currentmodule:: iteration_utilities .. autofunction:: all_isinstance0707010000004C000081A400000000000000000000000165E3BCDA00000061000000000000000000000000000000000000003B00000000iteration_utilities-0.12.1/docs/generated/all_monotone.rstall_monotone ============ .. currentmodule:: iteration_utilities .. autofunction:: all_monotone0707010000004D000081A400000000000000000000000165E3BCDA0000006A000000000000000000000000000000000000003E00000000iteration_utilities-0.12.1/docs/generated/always_iterable.rstalways_iterable =============== .. currentmodule:: iteration_utilities .. 
autofunction:: always_iterable0707010000004E000081A400000000000000000000000165E3BCDA00000067000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/docs/generated/any_isinstance.rstany_isinstance ============== .. currentmodule:: iteration_utilities .. autofunction:: any_isinstance0707010000004F000081A400000000000000000000000165E3BCDA00000055000000000000000000000000000000000000003800000000iteration_utilities-0.12.1/docs/generated/applyfunc.rstapplyfunc ========= .. currentmodule:: iteration_utilities .. autoclass:: applyfunc07070100000050000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/argmax.rstargmax ====== .. currentmodule:: iteration_utilities .. autofunction:: argmax07070100000051000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/argmin.rstargmin ====== .. currentmodule:: iteration_utilities .. autofunction:: argmin07070100000052000081A400000000000000000000000165E3BCDA00000058000000000000000000000000000000000000003800000000iteration_utilities-0.12.1/docs/generated/argsorted.rstargsorted ========= .. currentmodule:: iteration_utilities .. autofunction:: argsorted07070100000053000081A400000000000000000000000165E3BCDA000002EB000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/chained.rstchained ======= .. currentmodule:: iteration_utilities .. autoclass:: chained .. 
method:: __call__(*args, **kwargs) Depending on the `reverse` and `all` argument the function returns: ======= ===== =========================================================== reverse all returns ======= ===== =========================================================== False False ``func_1(...(func_n(*args, **kwargs)))`` True False ``func_n(...(func_1(*args, **kwargs)))`` False True ``(func_1(*args, **kwargs), ..., func_n(*args, **kwargs))`` True True ``(func_n(*args, **kwargs), ..., func_1(*args, **kwargs))`` ======= ===== ===========================================================07070100000054000081A400000000000000000000000165E3BCDA000000DC000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/clamp.rstclamp ===== .. currentmodule:: iteration_utilities .. autoclass:: clamp .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000055000081A400000000000000000000000165E3BCDA0000008E000000000000000000000000000000000000004A00000000iteration_utilities-0.12.1/docs/generated/combinations_from_relations.rstcombinations_from_relations =========================== .. currentmodule:: iteration_utilities .. autofunction:: combinations_from_relations07070100000056000081A400000000000000000000000165E3BCDA000000C7000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/complement.rstcomplement ========== .. currentmodule:: iteration_utilities .. autoclass:: complement .. method:: __call__(*args, **kwargs) Returns ``not func(*args, **kwargs)`` using :py:attr:`.func`.07070100000057000081A400000000000000000000000165E3BCDA000000BB000000000000000000000000000000000000003700000000iteration_utilities-0.12.1/docs/generated/constant.rstconstant ======== .. currentmodule:: iteration_utilities .. autoclass:: constant .. 
method:: __call__(*args, **kwargs) Returns :attr:`.item` given when creating the instance.07070100000058000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/consume.rstconsume ======= .. currentmodule:: iteration_utilities .. autofunction:: consume07070100000059000081A400000000000000000000000165E3BCDA0000005E000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/count_items.rstcount_items =========== .. currentmodule:: iteration_utilities .. autofunction:: count_items0707010000005A000081A400000000000000000000000165E3BCDA0000005B000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/deepflatten.rstdeepflatten =========== .. currentmodule:: iteration_utilities .. autoclass:: deepflatten0707010000005B000081A400000000000000000000000165E3BCDA0000005B000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/dotproduct.rstdotproduct ========== .. currentmodule:: iteration_utilities .. autofunction:: dotproduct0707010000005C000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/double.rstdouble ====== .. currentmodule:: iteration_utilities .. autofunction:: double0707010000005D000081A400000000000000000000000165E3BCDA00000058000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/duplicates.rstduplicates ========== .. currentmodule:: iteration_utilities .. autoclass:: duplicates0707010000005E000081A400000000000000000000000165E3BCDA00000079000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/empty.rstempty ===== .. currentmodule:: iteration_utilities .. data:: empty An empty iterator. .. 
versionadded:: 0.11.0 0707010000005F000081A400000000000000000000000165E3BCDA000000E1000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/first.rstfirst ===== .. currentmodule:: iteration_utilities .. function:: first(iterable[, default, pred, truthy, retpred, retidx]) This callable is equivalent to ``nth(0)``. .. seealso:: :py:class:`~iteration_utilities.nth`07070100000060000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/flatten.rstflatten ======= .. currentmodule:: iteration_utilities .. autofunction:: flatten07070100000061000081A400000000000000000000000165E3BCDA000000C2000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/flip.rstflip ==== .. currentmodule:: iteration_utilities .. autoclass:: flip .. method:: __call__(*args, **kwargs) Returns ``func(*tuple(reversed(args)), **kwargs)`` using :py:attr:`.func`.07070100000062000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/getitem.rstgetitem ======= .. currentmodule:: iteration_utilities .. autofunction:: getitem07070100000063000081A400000000000000000000000165E3BCDA00000058000000000000000000000000000000000000003800000000iteration_utilities-0.12.1/docs/generated/groupedby.rstgroupedby ========= .. currentmodule:: iteration_utilities .. autofunction:: groupedby07070100000064000081A400000000000000000000000165E3BCDA000000E2000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/grouper.rstgrouper ======= .. currentmodule:: iteration_utilities .. autoclass:: grouper .. method:: __length_hint__() Tries to estimate the length of the instance (returns ``0`` if an estimation is not possible). 
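The ``__length_hint__`` protocol mentioned above is what the standard library's :py:func:`operator.length_hint` consults when ``__len__`` is unavailable. A minimal sketch, using a hypothetical ``Chunks`` class (not the library's actual implementation):

```python
from operator import length_hint

# Hypothetical class illustrating the __length_hint__ protocol; it is
# NOT iteration_utilities' implementation. length_hint() falls back to
# __length_hint__ when the object defines no __len__.
class Chunks:
    def __init__(self, n_items, size):
        self.n_items = n_items
        self.size = size

    def __length_hint__(self):
        # Estimated number of chunks still to come; a class would
        # return 0 here when no estimation is possible.
        return -(-self.n_items // self.size)  # ceiling division

print(length_hint(Chunks(10, 3)))  # 4
```

Consumers such as ``list()`` may use this hint to pre-allocate storage, which is why an honest estimate (or ``0``) matters.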
07070100000065000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/insert.rstinsert ====== .. currentmodule:: iteration_utilities .. autofunction:: insert07070100000066000081A400000000000000000000000165E3BCDA000000EE000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/intersperse.rstintersperse =========== .. currentmodule:: iteration_utilities .. autoclass:: intersperse .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000067000081A400000000000000000000000165E3BCDA0000005B000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/ipartition.rstipartition ========== .. currentmodule:: iteration_utilities .. autofunction:: ipartition07070100000068000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/is_None.rstis_None ======= .. currentmodule:: iteration_utilities .. autofunction:: is_None07070100000069000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/is_even.rstis_even ======= .. currentmodule:: iteration_utilities .. autofunction:: is_even0707010000006A000081A400000000000000000000000165E3BCDA0000005E000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/is_iterable.rstis_iterable =========== .. currentmodule:: iteration_utilities .. autofunction:: is_iterable0707010000006B000081A400000000000000000000000165E3BCDA0000005E000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/is_not_None.rstis_not_None =========== .. currentmodule:: iteration_utilities .. 
autofunction:: is_not_None0707010000006C000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/is_odd.rstis_odd ====== .. currentmodule:: iteration_utilities .. autofunction:: is_odd0707010000006D000081A400000000000000000000000165E3BCDA0000005B000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/iter_except.rstiter_except =========== .. currentmodule:: iteration_utilities .. autoclass:: iter_except0707010000006E000081A400000000000000000000000165E3BCDA00000067000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/docs/generated/itersubclasses.rstitersubclasses ============== .. currentmodule:: iteration_utilities .. autofunction:: itersubclasses0707010000006F000081A400000000000000000000000165E3BCDA000000DF000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/last.rstlast ==== .. currentmodule:: iteration_utilities .. function:: last(iterable[, default, pred, truthy, retpred, retidx]) This callable is equivalent to ``nth(-1)``. .. seealso:: :py:class:`~iteration_utilities.nth`07070100000070000081A400000000000000000000000165E3BCDA000000DC000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/merge.rstmerge ===== .. currentmodule:: iteration_utilities .. autoclass:: merge .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000071000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/minmax.rstminmax ====== .. currentmodule:: iteration_utilities .. autofunction:: minmax07070100000072000081A400000000000000000000000165E3BCDA00000053000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/ncycles.rstncycles ======= .. currentmodule:: iteration_utilities .. 
autofunction:: ncycles 07070100000073000081A400000000000000000000000165E3BCDA000000BA000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/docs/generated/nth.rstnth === .. currentmodule:: iteration_utilities .. autoclass:: nth .. method:: __call__(iterable[, default, pred, truthy, retpred, retidx]) Find the :py:attr:`.n`-th element.07070100000074000081A400000000000000000000000165E3BCDA00000059000000000000000000000000000000000000003E00000000iteration_utilities-0.12.1/docs/generated/nth_combination.rstnth_combination =============== .. currentmodule:: iteration_utilities .. autofunction:: nth_combination 07070100000075000081A400000000000000000000000165E3BCDA00000046000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/docs/generated/one.rstone === .. currentmodule:: iteration_utilities .. autofunction:: one07070100000076000081A400000000000000000000000165E3BCDA000000B0000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/packed.rstpacked ====== .. currentmodule:: iteration_utilities .. autoclass:: packed .. method:: __call__(x, **kwargs) Returns ``func(*x, **kwargs)`` using :py:attr:`.func`.07070100000077000081A400000000000000000000000165E3BCDA00000046000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/docs/generated/pad.rstpad === .. currentmodule:: iteration_utilities .. autofunction:: pad07070100000078000081A400000000000000000000000165E3BCDA00000348000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/partial.rstpartial ======= .. currentmodule:: iteration_utilities .. autoclass:: partial .. method:: __call__(*additional_args, **additional_kwargs) Returns ``func(*all_args, **all_kwargs)`` using :py:attr:`.func`. The ``all_args`` are the :py:attr:`.args` combined with the ``additional_args``. Likewise, ``all_kwargs`` is a mapping created by merging :py:attr:`.keywords` with the ``additional_kwargs``. .. 
method:: __sizeof__() Returns size of the instance in memory, in bytes. .. attribute:: _ (:py:data:`~iteration_utilities.Placeholder`) Allows easy access to a placeholder without having to import :py:data:`~iteration_utilities.Placeholder`. .. attribute:: __dict__ (:py:class:`dict`) instances have a normal ``__dict__`` member and support instance attributes.07070100000079000081A400000000000000000000000165E3BCDA00000058000000000000000000000000000000000000003800000000iteration_utilities-0.12.1/docs/generated/partition.rstpartition ========= .. currentmodule:: iteration_utilities .. autofunction:: partition0707010000007A000081A400000000000000000000000165E3BCDA00000055000000000000000000000000000000000000003700000000iteration_utilities-0.12.1/docs/generated/powerset.rstpowerset ======== .. currentmodule:: iteration_utilities .. autofunction:: powerset0707010000007B000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/radd.rstradd ==== .. currentmodule:: iteration_utilities .. autofunction:: radd0707010000007C000081A400000000000000000000000165E3BCDA00000073000000000000000000000000000000000000004100000000iteration_utilities-0.12.1/docs/generated/random_combination.rstrandom_combination ================== .. currentmodule:: iteration_utilities .. autofunction:: random_combination0707010000007D000081A400000000000000000000000165E3BCDA00000073000000000000000000000000000000000000004100000000iteration_utilities-0.12.1/docs/generated/random_permutation.rstrandom_permutation ================== .. currentmodule:: iteration_utilities .. autofunction:: random_permutation0707010000007E000081A400000000000000000000000165E3BCDA00000067000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/docs/generated/random_product.rstrandom_product ============== .. currentmodule:: iteration_utilities .. 
autofunction:: random_product0707010000007F000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/rdiv.rstrdiv ==== .. currentmodule:: iteration_utilities .. autofunction:: rdiv07070100000080000081A400000000000000000000000165E3BCDA0000005B000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/reciprocal.rstreciprocal ========== .. currentmodule:: iteration_utilities .. autofunction:: reciprocal07070100000081000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/remove.rstremove ====== .. currentmodule:: iteration_utilities .. autofunction:: remove07070100000082000081A400000000000000000000000165E3BCDA0000005B000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/repeatfunc.rstrepeatfunc ========== .. currentmodule:: iteration_utilities .. autofunction:: repeatfunc07070100000083000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003600000000iteration_utilities-0.12.1/docs/generated/replace.rstreplace ======= .. currentmodule:: iteration_utilities .. autofunction:: replace07070100000084000081A400000000000000000000000165E3BCDA000000E8000000000000000000000000000000000000003800000000iteration_utilities-0.12.1/docs/generated/replicate.rstreplicate ========= .. currentmodule:: iteration_utilities .. autoclass:: replicate .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000085000081A400000000000000000000000165E3BCDA000000E1000000000000000000000000000000000000003B00000000iteration_utilities-0.12.1/docs/generated/return_False.rstreturn_False ============ .. currentmodule:: iteration_utilities .. function:: return_False(*args, **kwargs) This callable is equivalent to ``constant(False)``. .. 
seealso:: :py:class:`~iteration_utilities.constant`07070100000086000081A400000000000000000000000165E3BCDA000000DD000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/return_None.rstreturn_None =========== .. currentmodule:: iteration_utilities .. function:: return_None(*args, **kwargs) This callable is equivalent to ``constant(None)``. .. seealso:: :py:class:`~iteration_utilities.constant`07070100000087000081A400000000000000000000000165E3BCDA000000DD000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/return_True.rstreturn_True =========== .. currentmodule:: iteration_utilities .. function:: return_True(*args, **kwargs) This callable is equivalent to ``constant(True)``. .. seealso:: :py:class:`~iteration_utilities.constant`07070100000088000081A400000000000000000000000165E3BCDA00000064000000000000000000000000000000000000003C00000000iteration_utilities-0.12.1/docs/generated/return_called.rstreturn_called ============= .. currentmodule:: iteration_utilities .. autofunction:: return_called07070100000089000081A400000000000000000000000165E3BCDA0000006D000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/docs/generated/return_first_arg.rstreturn_first_arg ================ .. currentmodule:: iteration_utilities .. autofunction:: return_first_arg0707010000008A000081A400000000000000000000000165E3BCDA0000006A000000000000000000000000000000000000003E00000000iteration_utilities-0.12.1/docs/generated/return_identity.rstreturn_identity =============== .. currentmodule:: iteration_utilities .. autofunction:: return_identity0707010000008B000081A400000000000000000000000165E3BCDA0000004C000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/rfdiv.rstrfdiv ===== .. currentmodule:: iteration_utilities .. 
autofunction:: rfdiv0707010000008C000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/rmod.rstrmod ==== .. currentmodule:: iteration_utilities .. autofunction:: rmod0707010000008D000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/rmul.rstrmul ==== .. currentmodule:: iteration_utilities .. autofunction:: rmul0707010000008E000081A400000000000000000000000165E3BCDA000000EB000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/roundrobin.rstroundrobin ========== .. currentmodule:: iteration_utilities .. autoclass:: roundrobin .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 0707010000008F000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/rpow.rstrpow ==== .. currentmodule:: iteration_utilities .. autofunction:: rpow07070100000090000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/rsub.rstrsub ==== .. currentmodule:: iteration_utilities .. autofunction:: rsub07070100000091000081A400000000000000000000000165E3BCDA000000E4000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/second.rstsecond ====== .. currentmodule:: iteration_utilities .. function:: second(iterable[, default, pred, truthy, retpred, retidx]) This callable is equivalent to ``nth(1)``. .. seealso:: :py:class:`~iteration_utilities.nth`07070100000092000081A400000000000000000000000165E3BCDA000000EE000000000000000000000000000000000000003A00000000iteration_utilities-0.12.1/docs/generated/sideeffects.rstsideeffects =========== .. currentmodule:: iteration_utilities .. autoclass:: sideeffects .. 
method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000093000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/split.rstsplit ===== .. currentmodule:: iteration_utilities .. autoclass:: split07070100000094000081A400000000000000000000000165E3BCDA0000004F000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/docs/generated/square.rstsquare ====== .. currentmodule:: iteration_utilities .. autofunction:: square07070100000095000081A400000000000000000000000165E3BCDA00000058000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/starfilter.rststarfilter ========== .. currentmodule:: iteration_utilities .. autoclass:: starfilter07070100000096000081A400000000000000000000000165E3BCDA000000EB000000000000000000000000000000000000003900000000iteration_utilities-0.12.1/docs/generated/successive.rstsuccessive ========== .. currentmodule:: iteration_utilities .. autoclass:: successive .. method:: __length_hint__() Tries to estimate for the length of the instance (returns ``0`` if an estimation is not possible). 07070100000097000081A400000000000000000000000165E3BCDA00000052000000000000000000000000000000000000003700000000iteration_utilities-0.12.1/docs/generated/tabulate.rsttabulate ======== .. currentmodule:: iteration_utilities .. autoclass:: tabulate07070100000098000081A400000000000000000000000165E3BCDA00000049000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/docs/generated/tail.rsttail ==== .. currentmodule:: iteration_utilities .. autofunction:: tail07070100000099000081A400000000000000000000000165E3BCDA00000064000000000000000000000000000000000000003C00000000iteration_utilities-0.12.1/docs/generated/tee_lookahead.rsttee_lookahead ============= .. currentmodule:: iteration_utilities .. 
autofunction:: tee_lookahead0707010000009A000081A400000000000000000000000165E3BCDA000000E1000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/docs/generated/third.rstthird ===== .. currentmodule:: iteration_utilities .. function:: third(iterable[, default, pred, truthy, retpred, retidx]) This callable is equivalent to ``nth(2)``. .. seealso:: :py:class:`~iteration_utilities.nth`0707010000009B000081A400000000000000000000000165E3BCDA00000067000000000000000000000000000000000000003E00000000iteration_utilities-0.12.1/docs/generated/unique_everseen.rstunique_everseen =============== .. currentmodule:: iteration_utilities .. autoclass:: unique_everseen0707010000009C000081A400000000000000000000000165E3BCDA00000067000000000000000000000000000000000000003E00000000iteration_utilities-0.12.1/docs/generated/unique_justseen.rstunique_justseen =============== .. currentmodule:: iteration_utilities .. autoclass:: unique_justseen0707010000009D000081A400000000000000000000000165E3BCDA00000FA0000000000000000000000000000000000000002F00000000iteration_utilities-0.12.1/docs/generators.rstGenerators ---------- Generators are lazy-evaluating data structures. The values are generated on demand, which allows processing iterables without loading all of the iterable into memory at once. This makes chaining several generators very efficient. .. warning:: Generators have one disadvantage over data structures such as lists or tuples: they can only be processed once. Once exhausted, a generator cannot be iterated again. Generators can be created in very different contexts; in this section they are grouped into three categories: processing an iterable, processing a value and from a function. 
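The single-pass behaviour described in the warning is easy to demonstrate with a plain generator expression:

```python
# A generator yields its values once; iterating a second time
# produces nothing because the generator is already exhausted.
squares = (x * x for x in range(3))

print(list(squares))  # [0, 1, 4]
print(list(squares))  # [] -- exhausted
```

If the values are needed more than once, materialize them first (e.g. with ``list()``) or recreate the generator.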
Processing an iterable ^^^^^^^^^^^^^^^^^^^^^^ - :py:func:`itertools.accumulate` - :py:func:`~iteration_utilities.accumulate` - :py:meth:`itertools.chain.from_iterable` (implemented as `flatten` in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.clamp` - :py:func:`itertools.combinations` - :py:func:`~iteration_utilities.combinations_from_relations` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`itertools.combinations_with_replacement` - :py:func:`itertools.compress` - :py:func:`~iteration_utilities.consume` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`itertools.cycle` - :py:func:`~iteration_utilities.deepflatten` - :py:func:`itertools.dropwhile` - :py:func:`~iteration_utilities.duplicates` - :py:func:`enumerate` - :py:func:`filter` - :py:func:`itertools.filterfalse` - :py:func:`~iteration_utilities.flatten` - :py:func:`~iteration_utilities.getitem` - :py:func:`itertools.groupby` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.grouper` - :py:func:`~iteration_utilities.insert` - :py:func:`~iteration_utilities.intersperse` - :py:func:`~iteration_utilities.ipartition` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`itertools.islice` - :py:func:`iter` (one argument) (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.ncycles` - :py:func:`~iteration_utilities.pad` - :py:func:`itertools.permutations` - :py:func:`~iteration_utilities.powerset` - :py:func:`~iteration_utilities.remove` - :py:func:`~iteration_utilities.replace` - :py:func:`~iteration_utilities.replicate` - :py:func:`reversed` - :py:func:`~iteration_utilities.sideeffects` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.split` - :py:func:`~iteration_utilities.starfilter` - :py:func:`itertools.starmap` - :py:func:`~iteration_utilities.successive` - 
:py:func:`~iteration_utilities.tail` - :py:func:`itertools.takewhile` - :py:func:`itertools.tee` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.tee_lookahead` (not implemented in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.unique_everseen` - :py:func:`~iteration_utilities.unique_justseen` Processing several iterables ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ .. note:: These functions are implemented as methods for :py:class:`~iteration_utilities.ManyIterables`. - :py:func:`itertools.chain` - :py:func:`map` - :py:func:`heapq.merge` - :py:func:`~iteration_utilities.merge` - :py:func:`itertools.product` - :py:func:`~iteration_utilities.roundrobin` - :py:func:`zip` - :py:func:`itertools.zip_longest` Processing a value ^^^^^^^^^^^^^^^^^^ - :py:func:`itertools.count` - :py:func:`~iteration_utilities.itersubclasses` - :py:func:`itertools.repeat` From a function ^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.applyfunc` - :py:func:`iter` (two arguments) (implemented as `from_iterfunc_sentinel` in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.iter_except` (implemented as `from_iterfunc_exception` in :py:class:`~iteration_utilities.Iterable`) - :py:func:`~iteration_utilities.repeatfunc` - :py:func:`~iteration_utilities.tabulate` 0707010000009E000081A400000000000000000000000165E3BCDA00001BFD000000000000000000000000000000000000002A00000000iteration_utilities-0.12.1/docs/index.rstWelcome to iteration_utilities's documentation! =============================================== .. module:: iteration_utilities :synopsis: Utilities based on Pythons iterators and generators. :py:mod:`iteration_utilities` is a general purpose collection around the concept of functional programming based on and utilizing iterators and generators. 
Most of the functions presented here are inspired by the :py:mod:`itertools` module, especially the "recipes" section [0]_, but also by the ``toolz`` [1]_ package. There are many more packages with similar functions, for example ``more-itertools`` [2]_ and ``pydash`` [3]_. Large parts of the code are implemented in C to achieve very good overall performance. However, this library intentionally does not compete with specialized libraries such as ``NumPy``, ``pandas`` or ``SciPy``. .. note:: The documentation also presents functionality from several built-in modules and functions to provide a general overview of the available functionality. .. warning:: This library is under ongoing development and may change its API! Overview -------- There are three classes providing a sequential functional interface for several built-in and additional functions: - :py:func:`~iteration_utilities.Iterable` - :py:func:`~iteration_utilities.InfiniteIterable` - :py:func:`~iteration_utilities.ManyIterables` And a complete list of all functionality provided in this package: ================================================ =========================================================== ================================================== ============================================== :py:func:`~iteration_utilities.accumulate` :py:func:`~iteration_utilities.all_distinct` :py:func:`~iteration_utilities.all_equal` :py:func:`~iteration_utilities.all_isinstance` :py:func:`~iteration_utilities.all_monotone` :py:func:`~iteration_utilities.always_iterable` :py:func:`~iteration_utilities.any_isinstance` :py:func:`~iteration_utilities.applyfunc` :py:func:`~iteration_utilities.argmax` :py:func:`~iteration_utilities.argmin` :py:func:`~iteration_utilities.argsorted` :py:func:`~iteration_utilities.chained` :py:func:`~iteration_utilities.clamp` :py:func:`~iteration_utilities.combinations_from_relations` :py:func:`~iteration_utilities.complement` 
:py:func:`~iteration_utilities.constant` :py:func:`~iteration_utilities.consume` :py:func:`~iteration_utilities.count_items` :py:func:`~iteration_utilities.deepflatten` :py:func:`~iteration_utilities.dotproduct` :py:func:`~iteration_utilities.double` :py:func:`~iteration_utilities.duplicates` :py:func:`~iteration_utilities.empty` :py:func:`~iteration_utilities.first` :py:func:`~iteration_utilities.flatten` :py:func:`~iteration_utilities.flip` :py:func:`~iteration_utilities.getitem` :py:func:`~iteration_utilities.groupedby` :py:func:`~iteration_utilities.grouper` :py:func:`~iteration_utilities.InfiniteIterable` :py:func:`~iteration_utilities.insert` :py:func:`~iteration_utilities.intersperse` :py:func:`~iteration_utilities.ipartition` :py:func:`~iteration_utilities.is_even` :py:func:`~iteration_utilities.is_iterable` :py:func:`~iteration_utilities.is_None` :py:func:`~iteration_utilities.is_not_None` :py:func:`~iteration_utilities.is_odd` :py:func:`~iteration_utilities.ItemIdxKey` :py:func:`~iteration_utilities.iter_except` :py:func:`~iteration_utilities.Iterable` :py:func:`~iteration_utilities.itersubclasses` :py:func:`~iteration_utilities.last` :py:func:`~iteration_utilities.ManyIterables` :py:func:`~iteration_utilities.merge` :py:func:`~iteration_utilities.minmax` :py:func:`~iteration_utilities.ncycles` :py:func:`~iteration_utilities.nth` :py:func:`~iteration_utilities.nth_combination` :py:func:`~iteration_utilities.one` :py:func:`~iteration_utilities.packed` :py:func:`~iteration_utilities.pad` :py:func:`~iteration_utilities.partial` :py:func:`~iteration_utilities.partition` :py:func:`~iteration_utilities.Placeholder` :py:func:`~iteration_utilities.powerset` :py:func:`~iteration_utilities.radd` :py:func:`~iteration_utilities.random_combination` :py:func:`~iteration_utilities.random_permutation` :py:func:`~iteration_utilities.random_product` :py:func:`~iteration_utilities.rdiv` :py:func:`~iteration_utilities.reciprocal` :py:func:`~iteration_utilities.remove` 
:py:func:`~iteration_utilities.repeatfunc` :py:func:`~iteration_utilities.replace` :py:func:`~iteration_utilities.replicate` :py:func:`~iteration_utilities.return_called` :py:func:`~iteration_utilities.return_False` :py:func:`~iteration_utilities.return_first_arg` :py:func:`~iteration_utilities.return_identity` :py:func:`~iteration_utilities.return_None` :py:func:`~iteration_utilities.return_True` :py:func:`~iteration_utilities.rfdiv` :py:func:`~iteration_utilities.rmod` :py:func:`~iteration_utilities.rmul` :py:func:`~iteration_utilities.roundrobin` :py:func:`~iteration_utilities.rpow` :py:func:`~iteration_utilities.rsub` :py:func:`~iteration_utilities.second` :py:func:`~iteration_utilities.Seen` :py:func:`~iteration_utilities.sideeffects` :py:func:`~iteration_utilities.split` :py:func:`~iteration_utilities.square` :py:func:`~iteration_utilities.starfilter` :py:func:`~iteration_utilities.successive` :py:func:`~iteration_utilities.tabulate` :py:func:`~iteration_utilities.tail` :py:func:`~iteration_utilities.tee_lookahead` :py:func:`~iteration_utilities.third` :py:func:`~iteration_utilities.unique_everseen` :py:func:`~iteration_utilities.unique_justseen` ================================================ =========================================================== ================================================== ============================================== Contents: .. toctree:: :maxdepth: 2 installation iterable generators reduce functools random callbacks operators tipps copy_iterators misc .. toctree:: :maxdepth: 1 api license CHANGES AUTHORS .. [0] https://docs.python.org/library/itertools.html#itertools-recipes .. [1] https://toolz.readthedocs.io .. [2] https://more-itertools.readthedocs.io/en/latest/ .. 
[3] https://pydash.readthedocs.io/en/latest/ Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` 0707010000009F000081A400000000000000000000000165E3BCDA00000AEB000000000000000000000000000000000000003100000000iteration_utilities-0.12.1/docs/installation.rstInstallation ------------ The :py:mod:`iteration_utilities` package is a C extension Python package supporting the following Python versions: - Python 3.7+ Using pip ^^^^^^^^^ The package can be installed with ``pip`` [0]_:: python -m pip install iteration_utilities or to install the development version:: python -m pip install git+https://github.com/MSeifert04/iteration_utilities.git@master Using conda ^^^^^^^^^^^ It can be installed with ``conda`` [2]_ from the ``conda-forge`` channel:: conda install -c conda-forge iteration_utilities Manual installation ^^^^^^^^^^^^^^^^^^^ To install the package manually, download the development version from ``git`` [1]_ and install it:: git clone https://github.com/MSeifert04/iteration_utilities.git cd iteration_utilities python -m pip install . With the ``git`` clone one can also run the tests after the installation:: python -m pytest tests/ Or build the documentation:: sphinx-build -b html -W -a -n docs/ build/sphinx/html/ # local documentation build Testing debug installation ^^^^^^^^^^^^^^^^^^^^^^^^^^ The best way to test against a debug build is to use a Python installation that has been compiled in debug mode. One can use a ``Dockerfile`` such as the following. .. code-block:: docker FROM gcc:latest RUN \ wget https://www.python.org/ftp/python/3.8.0/Python-3.8.0.tgz -q && \ tar xzf Python-3.8.0.tgz && \ cd Python-3.8.0 && \ ./configure --with-pydebug && \ make altinstall -s -j4 && \ cd .. 
&& \ python3.8 -c "import os; os.remove('./Python-3.8.0.tgz'); import shutil; shutil.rmtree('./Python-3.8.0/')" && \ python3.8 -m pip install pip --upgrade --user && \ python3.8 -m pip install setuptools wheel --upgrade --user && \ python3.8 -m pip install pytest --user This uses Python 3.8.0; adapt it for the Python version you actually want to build. Build the image, install the library, and run the tests: .. code-block:: none docker build -t pythondebug . docker run -it --rm -v INSERTDIRECTORYHERE:/io pythondebug python3.8 -m pip install /io python3.8 -m pytest /io/tests -s -v Dependencies ^^^^^^^^^^^^ Installation: - Python 3.7+ - setuptools - C compiler Developer Dependencies ^^^^^^^^^^^^^^^^^^^^^^ All dependencies can be installed using:: python -m pip install iteration_utilities[all] Or individual dependencies:: python -m pip install iteration_utilities[test] python -m pip install iteration_utilities[documentation] Tests: - pytest Documentation: - sphinx - numpydoc References ~~~~~~~~~~ .. [0] https://pypi.python.org/pypi/iteration_utilities .. [1] https://github.com/MSeifert04/iteration_utilities .. [2] https://www.continuum.io/ 070701000000A0000081A400000000000000000000000165E3BCDA00000F6E000000000000000000000000000000000000002D00000000iteration_utilities-0.12.1/docs/iterable.rstIterable, InfiniteIterable and ManyIterables -------------------------------------------- .. warning:: :py:class:`~iteration_utilities.Iterable`, :py:class:`~iteration_utilities.InfiniteIterable` and :py:class:`~iteration_utilities.ManyIterables` are currently experimental. :py:mod:`iteration_utilities` introduces these three classes that can be used as wrappers for Python iterables. These classes implement the generators present in the Python built-ins, the :py:mod:`itertools` module and :py:mod:`iteration_utilities` as methods. 
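The idea of exposing generators as chainable methods can be sketched in a few lines of plain Python. The ``MiniIterable`` class below is a hypothetical illustration, not the library's actual ``Iterable`` implementation:

```python
# Minimal sketch of a fluent iterable wrapper: each method wraps a
# lazy built-in (map/filter) and returns a new wrapper, so calls chain
# without evaluating anything until the result is iterated.
class MiniIterable:
    def __init__(self, iterable):
        self._iterable = iterable

    def map(self, func):
        return MiniIterable(map(func, self._iterable))

    def filter(self, pred):
        return MiniIterable(filter(pred, self._iterable))

    def __iter__(self):
        return iter(self._iterable)

result = list(MiniIterable('12314253').map(int).filter(lambda x: x < 3))
print(result)  # [1, 2, 1, 2]
```

Nothing is computed until ``list()`` consumes the chain, which mirrors how the real classes stay lazy across chained operations.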
These can be broadly classified into 4 categories: Creating an Iterable ^^^^^^^^^^^^^^^^^^^^ The constructor allows wrapping a specified `iterable` like a :py:class:`list` or :py:class:`range` object. But it also has several staticmethods for creating an :py:class:`~iteration_utilities.Iterable` by other means; these have the prefix ``from_``. For example the :py:meth:`iteration_utilities.Iterable.from_repeat` allows creating an :py:class:`~iteration_utilities.Iterable` using :py:func:`itertools.repeat`. Modifying and chaining operations ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ As soon as the :py:class:`~iteration_utilities.Iterable` is created one can process it. Each of the normal (non-prefixed) methods returns the result of the operation (as generator!) so these can be arbitrarily chained. This allows chaining several operations sequentially. This can be demonstrated best with an actual example. Suppose we have a :py:class:`list` of strings of numbers and we want to convert each character to an integer and then sum the numbers below 3:: >>> # Python example >>> from itertools import chain >>> def less_than_three(x): ... return x < 3 >>> inp = ['12314253', '12368412612', '7812358', '12381531'] >>> sum(filter(less_than_three, map(int, chain.from_iterable(inp)))) 23 >>> # Example with Iterable >>> from iteration_utilities import Iterable >>> sum(Iterable(inp).flatten().map(int).filter(less_than_three)) 23 Conversion methods ^^^^^^^^^^^^^^^^^^ The :py:class:`~iteration_utilities.Iterable` implements the iteration protocol so it's possible to use it everywhere where an iterable is needed. For example with ``for item in ...`` or to construct containers, e.g. ``list()``. For convenience (and to prevent some problems with infinite iterables) finite :py:class:`~iteration_utilities.Iterable` objects also have methods to convert them to the desired class. These are prefixed with ``as_``.
:py:class:`~iteration_utilities.InfiniteIterable` **doesn't** have these to avoid creating an infinitely long :py:class:`list`. .. warning:: However :py:class:`~iteration_utilities.InfiniteIterable` also implements the iteration protocol and could be passed to :py:class:`list`, with severe consequences. So use the ``as_*`` and ``get_*`` methods, which will still throw an :py:class:`AttributeError` on infinite iterables, but at least they won't create a :py:class:`MemoryError` or freeze your computer! You have been warned! Currently folding methods like :py:func:`sum` are implemented with the prefix ``get_``. .. note:: See the documentation of :py:class:`~iteration_utilities.Iterable` to see which methods are possible or read the next chapters for more background information. Operating on several iterables ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The :py:class:`~iteration_utilities.ManyIterables` class implements the methods that operate on several iterables and return a single :py:class:`~iteration_utilities.Iterable` or :py:class:`~iteration_utilities.InfiniteIterable`. However it is very important that the `iterables` given to :py:class:`~iteration_utilities.ManyIterables` clearly indicate if they are infinite, otherwise the methods won't know if the result should be finite or infinite. These infinite iterables should be wrapped in :py:class:`~iteration_utilities.InfiniteIterable` or created by the ``Iterable.from_*`` methods. 070701000000A1000081A400000000000000000000000165E3BCDA000001D0000000000000000000000000000000000000002C00000000iteration_utilities-0.12.1/docs/license.rstLicenses -------- License of iteration_utilities ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ :py:mod:`iteration_utilities` is licensed under the **Apache License Version 2.0**. .. literalinclude:: ../LICENSE.txt Additional licenses ^^^^^^^^^^^^^^^^^^^ Some parts of the library were taken from the Python documentation and source code and are therefore licensed under the **Python Software Foundation License**. ..
literalinclude:: ../licenses/LICENSE_PYTHON.rst 070701000000A2000081A400000000000000000000000165E3BCDA00001B5F000000000000000000000000000000000000002900000000iteration_utilities-0.12.1/docs/make.bat@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^<target^>` where ^<target^> is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. xml to make Docutils-native XML files echo. pseudoxml to make pseudoxml-XML files for display purposes echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled echo. 
coverage to run coverage check of the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) REM Check if sphinx-build is available and fallback to Python version if any %SPHINXBUILD% 2> nul if errorlevel 9009 goto sphinx_python goto sphinx_ok :sphinx_python set SPHINXBUILD=python -m sphinx.__init__ %SPHINXBUILD% 2> nul if errorlevel 9009 ( echo. echo.The 'sphinx-build' command was not found. Make sure you have Sphinx echo.installed, then set the SPHINXBUILD environment variable to point echo.to the full path of the 'sphinx-build' executable. Alternatively you echo.may add the Sphinx directory to PATH. echo. echo.If you don't have Sphinx installed, grab it from echo.http://sphinx-doc.org/ exit /b 1 ) :sphinx_ok if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. 
goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\iteration_utilities.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\iteration_utilities.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdf" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf cd %~dp0 echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdfja" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf-ja cd %~dp0 echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. 
echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) if "%1" == "coverage" ( %SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage if errorlevel 1 exit /b 1 echo. echo.Testing of coverage in the sources finished, look at the ^ results in %BUILDDIR%/coverage/python.txt. goto end ) if "%1" == "xml" ( %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml if errorlevel 1 exit /b 1 echo. echo.Build finished. The XML files are in %BUILDDIR%/xml. goto end ) if "%1" == "pseudoxml" ( %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml if errorlevel 1 exit /b 1 echo. echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. goto end ) :end 070701000000A3000081A400000000000000000000000165E3BCDA0000029E000000000000000000000000000000000000002900000000iteration_utilities-0.12.1/docs/misc.rstMiscellanea ----------- :py:mod:`iteration_utilities` provides some general utilities that were useful in some of the implementations. Some of these might even be useful in other contexts, so these are summarized here. - :py:func:`~iteration_utilities.ItemIdxKey`, a class to facilitate stable sorting supporting `reverse` and `key`. 
- :py:func:`~iteration_utilities.Seen`, a class that wraps a :py:class:`set` and a :py:class:`list` supporting :py:meth:`in <iteration_utilities.Seen.__contains__>` operations and a :py:meth:`~iteration_utilities.Seen.contains_add` method to facilitate keeping track of already seen objects (even if they are unhashable). 070701000000A4000081A400000000000000000000000165E3BCDA0000262B000000000000000000000000000000000000002E00000000iteration_utilities-0.12.1/docs/operators.rstOperators --------- The Python operator module [0]_ contains a large variety of operators and :py:mod:`iteration_utilities` only tries to fill in some missing ones: Reverse arithmetic operators ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.reciprocal`, equivalent to ``lambda x: 1 / x`` +-------------------------------------------+--------------------------+ | Function | Equivalent | +===========================================+==========================+ | :py:func:`~iteration_utilities.radd` | ``lambda x, y: y + x`` | +-------------------------------------------+--------------------------+ | :py:func:`~iteration_utilities.rsub` | ``lambda x, y: y - x`` | +-------------------------------------------+--------------------------+ | :py:func:`~iteration_utilities.rmul` | ``lambda x, y: y * x`` | +-------------------------------------------+--------------------------+ | :py:func:`~iteration_utilities.rdiv` | ``lambda x, y: y / x`` | +-------------------------------------------+--------------------------+ | :py:func:`~iteration_utilities.rfdiv` | ``lambda x, y: y // x`` | +-------------------------------------------+--------------------------+ | :py:func:`~iteration_utilities.rpow` | ``lambda x, y: y ** x`` | +-------------------------------------------+--------------------------+ | :py:func:`~iteration_utilities.rmod` | ``lambda x, y: y % x`` | +-------------------------------------------+--------------------------+ Math operators ^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.double`, 
equivalent to ``lambda x: x * 2`` - :py:func:`~iteration_utilities.square`, equivalent to ``lambda x: x ** 2`` - :py:func:`~iteration_utilities.reciprocal`, equivalent to ``lambda x: 1 / x`` And of course the standard operators from the operator module: +------------------------------------------+--------------------------+ | Function | Equivalent | +==========================================+==========================+ | :py:func:`operator.add` (iadd) | ``lambda x, y: x + y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.sub` (isub) | ``lambda x, y: x - y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.mul` (imul) | ``lambda x, y: x * y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.truediv` (itruediv) | ``lambda x, y: x / y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.floordiv` (ifloordiv) | ``lambda x, y: x // y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.pow` (ipow) | ``lambda x, y: x ** y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.mod` (imod) | ``lambda x, y: x % y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.matmul` (imatmul) | ``lambda x, y: x @ y`` | +------------------------------------------+--------------------------+ | :py:func:`operator.abs` | ``lambda x: abs(x)`` | +------------------------------------------+--------------------------+ | :py:func:`operator.pos` | ``lambda x: +x`` | +------------------------------------------+--------------------------+ | :py:func:`operator.neg` | ``lambda x: -x`` | +------------------------------------------+--------------------------+ And the bitwise operators: +--------------------------------------+--------------------------+ | Function | Equivalent | 
+======================================+==========================+ | :py:func:`operator.lshift` (ilshift) | ``lambda x, y: x << y`` | +--------------------------------------+--------------------------+ | :py:func:`operator.rshift` (irshift) | ``lambda x, y: x >> y`` | +--------------------------------------+--------------------------+ | :py:func:`operator.and_` (iand) | ``lambda x, y: x & y`` | +--------------------------------------+--------------------------+ | :py:func:`operator.or_` (ior) | ``lambda x, y: x | y`` | +--------------------------------------+--------------------------+ | :py:func:`operator.xor` (ixor) | ``lambda x, y: x ^ y`` | +--------------------------------------+--------------------------+ | :py:func:`operator.inv` | ``lambda x: ~x`` | +--------------------------------------+--------------------------+ .. note:: The :mod:`math` module contains several more! Comparison operators ^^^^^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.is_even`, equivalent to ``lambda x: (x % 2) == 0``. - :py:func:`~iteration_utilities.is_odd`, equivalent to ``lambda x: (x % 2) != 0``. - :py:func:`~iteration_utilities.is_None`, equivalent to ``lambda x: x is None``. - :py:func:`~iteration_utilities.is_not_None`, equivalent to ``lambda x: x is not None``. - :py:func:`~iteration_utilities.is_iterable`, roughly equivalent to ``lambda x: isinstance(x, collections.abc.Iterable)``.
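The predicate helpers above are essentially the small lambdas shown next to them. A minimal pure-Python sketch (stand-ins only; :py:mod:`iteration_utilities` ships faster C implementations) shows how such predicates combine with :py:func:`filter`:

```python
# Pure-Python stand-ins for the predicate helpers listed above -- a
# sketch only, not the actual C implementations.
def is_even(x):
    return (x % 2) == 0

def is_None(x):
    return x is None

values = [1, 2, None, 4, 7, None]

# Keep only the even numbers (skipping the None entries first).
evens = [x for x in values if x is not None and is_even(x)]
print(evens)  # [2, 4]

# Count the None entries using the is_None predicate with filter().
print(len(list(filter(is_None, values))))  # 2
```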
And the comparison operators from the Python library: +-----------------------------------+-----------------------------+ | Function | Equivalent | +===================================+=============================+ | :py:func:`operator.lt` | ``lambda x, y: x < y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.le` | ``lambda x, y: x <= y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.eq` | ``lambda x, y: x == y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.ne` | ``lambda x, y: x != y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.ge` | ``lambda x, y: x >= y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.gt` | ``lambda x, y: x > y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.is_` | ``lambda x, y: x is y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.is_not` | ``lambda x, y: x is not y`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.truth` | ``lambda x: not not x`` | +-----------------------------------+-----------------------------+ | :py:func:`operator.not_` | ``lambda x: not x`` | +-----------------------------------+-----------------------------+ Misc ^^^^ And some miscellaneous operators: +------------------------------------------+----------------------------------------------------+ | Function | Equivalent | +==========================================+====================================================+ | :py:func:`operator.index` * | ``lambda x: x.__index__()`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.concat` (iconcat) * | ``lambda x, y: x + y`` |
+------------------------------------------+----------------------------------------------------+ | :py:func:`operator.contains` | ``lambda x, y: y in x`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.countOf` * | ``lambda x, y: x.count(y)`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.indexOf` * | ``lambda x, y: x.index(y)`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.getitem` | ``lambda x, y: x[y]`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.setitem` | ``lambda x, y, z: x[y] = z`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.delitem` | ``lambda x, y: del x[y]`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.itemgetter` * | ``lambda x: lambda y: y[x]`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.attrgetter` * | ``lambda x: lambda y: y.x`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.methodcaller` * | ``lambda x: lambda y: y.x()`` | +------------------------------------------+----------------------------------------------------+ | :py:func:`operator.length_hint` * | ``lambda x, y: len(x) or x.__length_hint__() or y``| +------------------------------------------+----------------------------------------------------+ Marked (``*``) functions only have a rough equivalent and may be more sophisticated! References ~~~~~~~~~~ ..
[0] https://docs.python.org/library/operator.html 070701000000A5000081A400000000000000000000000165E3BCDA000001B9000000000000000000000000000000000000002B00000000iteration_utilities-0.12.1/docs/random.rstRandom ------ Some convenience functions based on functions of the :py:mod:`random` and :py:mod:`itertools` modules. Combine random and itertools ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - :py:func:`~iteration_utilities.random_combination`, return a random combination. - :py:func:`~iteration_utilities.random_permutation`, return a random permutation. - :py:func:`~iteration_utilities.random_product`, return a random value from each iterable. 070701000000A6000081A400000000000000000000000165E3BCDA00001D59000000000000000000000000000000000000002B00000000iteration_utilities-0.12.1/docs/reduce.rstFold functions -------------- Fold functions [0]_ reduce an iterable to a single value. Built-ins ^^^^^^^^^ There are several instances of fold functions in the Python library: - :py:func:`all`, reduces the iterable based on the truthiness of all elements. - :py:func:`any`, reduces the iterable based on the truthiness of any element. - :py:func:`len`, reduces the iterable to the number of all elements. Does not work with generators! - :py:func:`max`, reduces the iterable to the maximum of all elements. - :py:func:`min`, reduces the iterable to the minimum of all elements. - :py:func:`sum`, reduces the iterable to the sum of all elements. and also several fold operators: - the boolean ``and`` and ``or`` operator. - the mathematical operators ``+``, ``-``, ``*``, ``/``, ``//``, ``%`` and ``**``. - the bitwise operators ``<<``, ``>>``, ``|``, ``^`` and ``&``. - the comparison operators ``<``, ``<=``, ``==``, ``!=``, ``>=``, ``>``. Builtin Library functions ^^^^^^^^^^^^^^^^^^^^^^^^^ - :py:func:`functools.reduce`, reduces the iterable by successively applying a binary function. :py:func:`functools.reduce` is probably the most general function that could be used to recreate all the builtin functions.
For example: - ``reduce(lambda x, y: x and y, iterable)`` is equivalent to ``all()`` - ``reduce(lambda x, y: x or y, iterable)`` is equivalent to ``any()`` - ``reduce(lambda x, y: x + y, iterable)`` is equivalent to ``sum()`` - ``reduce(lambda x, y: x if x < y else y, iterable)`` is equivalent to ``min()`` - ``reduce(lambda x, y: x if x > y else y, iterable)`` is equivalent to ``max()`` - ``reduce(lambda x, y: x + 1, iterable, 0)`` is equivalent to ``len()`` .. warning:: These :py:func:`functools.reduce` functions are much slower than the built-ins! There are several other fold functions in the standard library and in third-party packages, most notably: - :py:func:`math.fsum` - ``statistics`` [1]_ - ``operator`` [4]_ - ``NumPy`` [2]_ - ``pandas`` [3]_ Additional ^^^^^^^^^^ The :py:mod:`iteration_utilities` package includes some additional fold functions: - :py:func:`~iteration_utilities.all_distinct`, reduces the iterable to a boolean value indicating if all the items are distinct. - :py:func:`~iteration_utilities.all_equal`, reduces the iterable to a boolean value indicating if all the items are equal. - :py:func:`~iteration_utilities.all_monotone`, reduces the iterable to a boolean value indicating if all the items are (strictly) bigger or smaller than their predecessor. - :py:func:`~iteration_utilities.argmax`, reduces the iterable to the index of the maximum. - :py:func:`~iteration_utilities.argmin`, reduces the iterable to the index of the minimum. - :py:func:`~iteration_utilities.count_items`, reduces the iterable to the number of (matching) items. - :py:func:`~iteration_utilities.minmax`, reduces the iterable to a tuple containing the minimum and maximum value. - :py:func:`~iteration_utilities.nth`, reduces the iterable to its nth value. - :py:func:`~iteration_utilities.first`, reduces the iterable to its first value. See also :py:func:`~iteration_utilities.nth`. - :py:func:`~iteration_utilities.second`, reduces the iterable to its second value.
See also :py:func:`~iteration_utilities.nth`. - :py:func:`~iteration_utilities.third`, reduces the iterable to its third value. See also :py:func:`~iteration_utilities.nth`. - :py:func:`~iteration_utilities.last`, reduces the iterable to its last value. See also :py:func:`~iteration_utilities.nth`. - :py:func:`~iteration_utilities.nth_combination`, creates the *nth* combination from the elements in the iterable without having to create the previous combinations. Helper functions ^^^^^^^^^^^^^^^^ Included in the :py:mod:`iteration_utilities` package are several helper functions that mirror patterns easily written in normal Python code but evaluate faster than those alternatives: - :py:func:`~iteration_utilities.all_isinstance`, reduces the iterable to the truthiness of :py:func:`isinstance` applied to all items. - :py:func:`~iteration_utilities.any_isinstance`, reduces the iterable to the truthiness of :py:func:`isinstance` applied to any item. - :py:func:`~iteration_utilities.dotproduct`, reduces two iterables to the result of the dotproduct. Fold to other data structure ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Most fold functions reduce an iterable by discarding most of its elements. However :py:mod:`iteration_utilities` includes functions that discard no elements or only a few: - :py:func:`~iteration_utilities.argsorted`, create a list of indices that would sort the iterable. - :py:func:`~iteration_utilities.groupedby`, create a dictionary containing lists representing the groups of values of the iterable. - :py:func:`heapq.nlargest`, create a list containing the `n` largest items. - :py:func:`heapq.nsmallest`, create a list containing the `n` smallest items. - :py:func:`~iteration_utilities.partition`, create a list containing the items which do not fulfill some predicate and one containing the items that do. - :py:func:`sorted`, create a sorted list from an iterable. This list contains some builtin Python functions for completeness.
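Two of the folds above are easy to sketch in plain Python. The following stand-ins are illustrative simplifications only (the real :py:func:`~iteration_utilities.partition` and :py:func:`~iteration_utilities.groupedby` are C implementations with more options), but they show what these "fold to another data structure" operations compute:

```python
# Plain-Python sketches of two "fold to another data structure"
# operations -- illustrations only, not the actual C implementations.
def partition(iterable, pred):
    # First list: items failing the predicate; second list: items passing it.
    false_items, true_items = [], []
    for item in iterable:
        (true_items if pred(item) else false_items).append(item)
    return false_items, true_items

def groupedby(iterable, key):
    # Map each key(item) to the list of items that produced that key.
    groups = {}
    for item in iterable:
        groups.setdefault(key(item), []).append(item)
    return groups

print(partition([1, 2, 3, 4], lambda x: x % 2 == 0))
# ([1, 3], [2, 4])
print(groupedby(["apple", "ant", "bee"], lambda s: s[0]))
# {'a': ['apple', 'ant'], 'b': ['bee']}
```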
Short-circuit functions ^^^^^^^^^^^^^^^^^^^^^^^ Short-circuit functions [5]_ stop as soon as the exit condition is met. These functions can yield significant speedups over functions that eagerly process the entire iterable. There are several instances of short-circuit functions in the Python library: - :py:func:`all`, stops as soon as one item in the iterable is falsy. - :py:func:`any`, stops as soon as one item in the iterable is truthy. - :py:func:`next`, get the next item of an iterable. and also two short-circuit operators: - ``and``, evaluates the right side only if the left side is truthy. - ``or``, evaluates the right side only if the left side is falsy. :py:mod:`iteration_utilities` includes some additional short-circuit functions: - :py:func:`~iteration_utilities.all_distinct`, stops as soon as a duplicate item is found. - :py:func:`~iteration_utilities.all_equal`, stops as soon as a deviating item is found. - :py:func:`~iteration_utilities.all_monotone`, stops as soon as an item is found that violates monotonicity. - :py:func:`~iteration_utilities.one`, get the one and only item of an iterable. - :py:func:`~iteration_utilities.nth`, stops after the nth item. - :py:func:`~iteration_utilities.first`, like :py:func:`~iteration_utilities.nth` this function stops after the first item. - :py:func:`~iteration_utilities.second`, like :py:func:`~iteration_utilities.nth` this function stops after the second item. - :py:func:`~iteration_utilities.third`, like :py:func:`~iteration_utilities.nth` this function stops after the third item. Included in the :py:mod:`iteration_utilities` package are several helper functions that mirror patterns easily written in normal Python code but evaluate faster than those alternatives: - :py:func:`~iteration_utilities.all_isinstance`, stops as soon as one item is not an instance of the specified types. - :py:func:`~iteration_utilities.any_isinstance`, stops as soon as one item is an instance of the specified types. References ~~~~~~~~~~ ..
[0] https://en.wikipedia.org/wiki/Fold_(higher-order_function) .. [1] https://docs.python.org/library/statistics.html .. [2] http://www.numpy.org/ .. [3] http://pandas.pydata.org/ .. [4] https://docs.python.org/library/operator.html .. [5] https://en.wikipedia.org/wiki/Short-circuit_evaluation 070701000000A7000081A400000000000000000000000165E3BCDA00000D02000000000000000000000000000000000000002A00000000iteration_utilities-0.12.1/docs/tipps.rstTips and Tricks ---------------- Sometimes it is not hard to speed up some simple tasks. This page shows some templates that might help to improve the performance of your code. .. note:: This page is more or less a stub right now. If you have any interesting facts to share please open a Pull Request or Issue. Map ^^^ :py:func:`map` can be much faster than list comprehensions or generator expressions *if and only if* the `function` is implemented in C without Python attribute lookup. All Python built-ins are written in C, and most of them (there are exceptions, like :py:func:`abs`) perform really fast with :py:func:`map`: .. code:: python >>> import random >>> l1 = [random.randint(0, 1000) for _ in range(20000)] >>> l2 = [random.randint(0, 1000) for _ in range(20000)] >>> l3 = [random.randint(0, 1000) for _ in range(20000)] >>> %timeit [min(i) for i in zip(l1, l2, l3)] # doctest: +SKIP 100 loops, best of 3: 4.94 ms per loop >>> %timeit list(map(min, l1, l2, l3)) # doctest: +SKIP 100 loops, best of 3: 3.24 ms per loop Sometimes it is not possible to use such a function directly with :py:func:`map` but before you use :py:func:`functools.partial` you can always use :py:func:`itertools.repeat`! .. code:: python >>> from itertools import repeat >>> lst = [0]*100000 >>> %timeit [isinstance(i, int) for i in lst] # doctest: +SKIP 100 loops, best of 3: 17.4 ms per loop >>> %timeit list(map(isinstance, lst, repeat(int))) # doctest: +SKIP 100 loops, best of 3: 7.99 ms per loop ..
note:: Using :py:func:`itertools.repeat` is only faster for very **few** functions. :py:func:`isinstance` is one of those! Predicate functions ^^^^^^^^^^^^^^^^^^^ Sometimes one needs a predicate function to filter out some items. One little (although sometimes impossible!) trick is to use methods as predicates: .. code:: python >>> import random >>> from iteration_utilities import consume >>> lst = [random.random() for _ in range(200000)] >>> %timeit consume((i for i in lst if i > 0.5), None) # doctest: +SKIP 100 loops, best of 3: 9.51 ms per loop >>> %timeit consume(filter((0.5).__lt__, lst), None) # doctest: +SKIP 100 loops, best of 3: 8.03 ms per loop This shows only a slight improvement but it's not always possible to use a generator expression or list comprehension. If you do the same with :py:func:`operator.lt` and :py:func:`functools.partial` or with a custom function you'll see the performance increase: .. code:: python >>> from functools import partial >>> from operator import lt >>> partial_gt_05 = partial(lt, 0.5) >>> %timeit consume(filter(lambda x: x > 0.5, lst), None) # doctest: +SKIP 10 loops, best of 3: 22.3 ms per loop >>> %timeit consume(filter(partial_gt_05, lst), None) # doctest: +SKIP 100 loops, best of 3: 17 ms per loop .. warning:: Using the ``__lt__`` and equivalent methods is not always possible because this bypasses Python's data model. For example the following will fail: ``(5).__lt__(10.2)`` because integers don't implement the comparison with floats themselves. In that case you need to use: ``(5.0).__lt__(10.2)``. However public methods are always available as well as several special methods like: ``__len__``, ``__contains__``, ...
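The failure mode described in the warning is easy to demonstrate: ``int.__lt__`` does not raise for a ``float`` argument, it returns the ``NotImplemented`` singleton, which makes such a bound method a silently wrong predicate. A small sketch:

```python
# int.__lt__ does not handle float arguments itself; it returns the
# NotImplemented singleton (instead of raising), so using it as a
# predicate silently misbehaves.
result = (5).__lt__(10.2)
print(result)                    # NotImplemented
print(result is NotImplemented)  # True

# The float method handles the mixed comparison correctly:
print((5.0).__lt__(10.2))        # True

# Bound float methods therefore work fine as filter predicates:
print(list(filter((0.5).__lt__, [0.2, 0.7, 0.4, 0.9])))  # [0.7, 0.9]
```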
070701000000A8000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002200000000iteration_utilities-0.12.1/helper070701000000A9000081A400000000000000000000000165E3BCDA000007BC000000000000000000000000000000000000003C00000000iteration_utilities-0.12.1/helper/create_functions_table.py# Licensed under Apache License Version 2.0 - see LICENSE """This is a helper that prints the content of the function overview tables . - docs/index.rst - README.rst Both contain a table of functions defined in iteration_utilities and manually updating them is a pain. Therefore this file can be executed and the contents can be copy pasted there. Just use:: >>> python helper/create_functions_table.py Unfortunately the header lines of these tables have to be removed manually, I haven't found a way to remove them programmatically using the astropy.io.ascii.RST class. It's actually important to call this helper from the main repo directory so the file resolution works correctly. """ def _create_overview_table(repo_path, readme=False): """Creates an RST table to insert in the "Readme.rst" file for the complete overview of the package. Requires `astropy`! """ from iteration_utilities import Iterable from astropy.table import Table from astropy.io.ascii import RST import pathlib p = pathlib.Path(repo_path).joinpath('docs', 'generated') funcs = sorted([file.name.split('.rst')[0] for file in p.glob('*.rst')], key=str.lower) if readme: rtd_link = ('`{0} <https://iteration-utilities.readthedocs.io/' 'en/latest/generated/{0}.html>`_') else: rtd_link = ':py:func:`~iteration_utilities.{0}`' it = (Iterable(funcs) # Create a Sphinx link from function name and module .map(rtd_link.format) # Group into 4s so we get a 4 column Table .grouper(4, fillvalue='') # Convert to list because Table expects it. 
.as_list()) print('\n'.join(RST().write(Table(rows=it)))) if __name__ == '__main__': import pathlib repo_path = pathlib.Path.cwd() _create_overview_table(repo_path=repo_path, readme=False) print('\n\n\n') _create_overview_table(repo_path=repo_path, readme=True) 070701000000AA000081A400000000000000000000000165E3BCDA00000110000000000000000000000000000000000000004100000000iteration_utilities-0.12.1/helper/git_clean_keep_ide_settings.pyimport subprocess # Keep the following directories # - .vscode (Visual Studio code project settings) # - .vs (Visual Studio project files) # - .idea (PyCharm project files) subprocess.run(["git", "clean", "-dfx", "-e", ".vscode", "-e", ".idea", "-e", ".vs"], check=True) 070701000000AB000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002400000000iteration_utilities-0.12.1/licenses070701000000AC000081A400000000000000000000000165E3BCDA00000964000000000000000000000000000000000000003700000000iteration_utilities-0.12.1/licenses/LICENSE_PYTHON.rst1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and the Individual or Organization ("Licensee") accessing and otherwise using Python 3.5.2 software in source or binary form and its associated documentation. 2. Subject to the terms and conditions of this License Agreement, PSF hereby grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python 3.5.2 alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright © 2001-2016 Python Software Foundation; All Rights Reserved" are retained in Python 3.5.2 alone or in any derivative version prepared by Licensee. 3. 
In the event Licensee prepares a derivative work that is based on or incorporates Python 3.5.2 or any part thereof, and wants to make the derivative work available to others as provided herein, then Licensee hereby agrees to include in any such work a brief summary of the changes made to Python 3.5.2. 4. PSF is making Python 3.5.2 available to Licensee on an "AS IS" basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 3.5.2 WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON 3.5.2 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 3.5.2, OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. 6. This License Agreement will automatically terminate upon a material breach of its terms and conditions. 7. Nothing in this License Agreement shall be deemed to create any relationship of agency, partnership, or joint venture between PSF and Licensee. This License Agreement does not grant permission to use PSF trademarks or trade name in a trademark sense to endorse or promote products or services of Licensee, or any third party. 8. By copying, installing or otherwise using Python 3.5.2, Licensee agrees to be bound by the terms and conditions of this License Agreement.070701000000AD000081A400000000000000000000000165E3BCDA0000077A000000000000000000000000000000000000002A00000000iteration_utilities-0.12.1/pyproject.toml[build-system] requires = ["setuptools", "wheel"] build-backend = "setuptools.build_meta" [project] name = "iteration_utilities" version = "0.12.1" description = "Utilities based on Pythons iterators and generators." 
readme = "README.rst" requires-python = ">=3.7" authors = [ {name = "Michael Seifert", email = "michaelseifert04@yahoo.de"}, ] license = {text = "Apache License Version 2.0"} classifiers = [ "Development Status :: 5 - Production/Stable", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3.11", "Programming Language :: Python :: 3.12", "Operating System :: MacOS :: MacOS X", "Operating System :: Microsoft :: Windows", "Operating System :: POSIX :: Linux", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Utilities", ] keywords = ["functional", "functools", "generator", "itertools", "iteration", "iterator", "operators", "performance", "reduce", "utility"] [project.urls] Homepage = "https://github.com/MSeifert04/iteration_utilities" Documentation = "https://iteration-utilities.readthedocs.io/en/latest/" Repository = "https://github.com/MSeifert04/iteration_utilities.git" Changelog = "https://github.com/MSeifert04/iteration_utilities/blob/master/docs/CHANGES.rst" [project.optional-dependencies] test = ["pytest"] documentation = ["sphinx>=2.2", "numpydoc"] [tool.pytest.ini_options] addopts = "--doctest-glob='docs/*.rst' --ignore='setup.py'" testpaths = [ "tests", "docs" ] [tool.coverage.run] branch = true omit = [ "setup.py", "tests/*", "*_iteration_utilities*" ] [tool.coverage.report] show_missing = true precision = "2" 070701000000AE000081A400000000000000000000000165E3BCDA000004CB000000000000000000000000000000000000002400000000iteration_utilities-0.12.1/setup.pyfrom setuptools import setup, Extension from os import path import sys files = [ 'accumulate.c', 'alldistinct.c', 'allequal.c', 'allisinstance.c', 'allmonotone.c', 'always_iterable.c', 'anyisinstance.c', 'applyfunc.c', 'argminmax.c', 
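The ``addopts``/``testpaths`` settings above make pytest collect the ``>>>`` examples from the RST docs as doctests. Independent of pytest, the same kind of example can be checked with the standard-library ``doctest`` machinery; a minimal sketch (the ``double`` function is a made-up placeholder, not part of the package):

```python
import doctest

def double(x):
    """Return twice the value.

    >>> double(3)
    6
    >>> double(-1)
    -2
    """
    return 2 * x

# Extract the examples embedded in the docstring and run them -- this
# is essentially what pytest's doctest collection does under the hood.
finder = doctest.DocTestFinder()
runner = doctest.DocTestRunner(verbose=False)
for test in finder.find(double, name="double", globs={"double": double}):
    runner.run(test)

failed, attempted = runner.summarize(verbose=False)
print(failed, attempted)  # 0 2
```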
'chained.c', 'clamp.c', 'complement.c', 'constant.c', 'countitems.c', 'deepflatten.c', 'dotproduct.c', 'duplicates.c', 'empty.c', 'exported_helper.c', 'flip.c', 'groupedby.c', 'grouper.c', 'helper.c', 'intersperse.c', 'isx.c', 'itemidxkey.c', 'iterexcept.c', 'mathematical.c', 'merge.c', 'minmax.c', 'nth.c', 'one.c', 'packed.c', 'partial.c', 'partition.c', 'placeholder.c', 'replicate.c', 'returnx.c', 'roundrobin.c', 'seen.c', 'sideeffect.c', 'split.c', 'starfilter.c', 'successive.c', 'tabulate.c', 'uniqueever.c', 'uniquejust.c', '_iteration_utilities.c' ] setup( ext_modules=[ Extension( 'iteration_utilities._iteration_utilities', sources=[path.join('src', 'iteration_utilities', '_iteration_utilities', filename) for filename in files] ) ] ) 070701000000AF000041ED00000000000000000000000365E3BCDA00000000000000000000000000000000000000000000001F00000000iteration_utilities-0.12.1/src070701000000B0000041ED00000000000000000000000365E3BCDA00000000000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/src/iteration_utilities070701000000B1000081A400000000000000000000000165E3BCDA0000010B000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/src/iteration_utilities/__init__.py# Licensed under Apache License Version 2.0 - see LICENSE """Utilities based on Pythons iterators and generators.""" from ._iteration_utilities import * from ._convenience import * from ._recipes import * from ._additional_recipes import * from ._classes import * 070701000000B2000081A400000000000000000000000165E3BCDA0000546F000000000000000000000000000000000000004A00000000iteration_utilities-0.12.1/src/iteration_utilities/_additional_recipes.py# Licensed under Apache License Version 2.0 - see LICENSE """ API: Additional recipes ----------------------- """ # Built-ins from collections import OrderedDict from itertools import chain, islice, repeat, product, combinations from operator import itemgetter # This module from iteration_utilities import nth, unique_justseen, 
chained from ._recipes import tail __all__ = ['argsorted', 'combinations_from_relations', 'getitem', 'insert', 'itersubclasses', 'pad', 'remove', 'replace'] def argsorted(iterable, key=None, reverse=False): """Returns the indices that would sort the `iterable`. Parameters ---------- iterable : iterable The `iterable` to sort. key : callable, None, optional If ``None`` sort the items in the `iterable`, otherwise sort the ``key(items)``. Default is ``None``. reverse : :py:class:`bool`, optional If ``False`` sort the `iterable` in increasing order, otherwise in decreasing order. Default is ``False``. Returns ------- sortindices : :py:class:`list` The indices that would sort the `iterable`. Notes ----- See :py:func:`sorted` for more explanation of the parameters. Examples -------- To get the indices that would sort a sequence in increasing order:: >>> from iteration_utilities import argsorted >>> argsorted([3, 1, 2]) [1, 2, 0] It also works when sorting in decreasing order:: >>> argsorted([3, 1, 2], reverse=True) [0, 2, 1] And when applying a `key` function:: >>> argsorted([3, 1, -2], key=abs) [1, 2, 0] """ if key is None: key = itemgetter(1) else: key = chained(itemgetter(1), key) return [i[0] for i in sorted(enumerate(iterable), key=key, reverse=reverse)] def combinations_from_relations(dictionary, r): """Yield combinations where only one item (or None) of each equivalence class is present. Parameters ---------- dictionary : :py:class:`dict` with iterable values or convertible to one. A dictionary defining the equivalence classes; each key should contain all equivalent items as its value. .. warning:: Each ``value`` in the `dictionary` must be iterable. .. note:: If the `dictionary` isn't ordered then the order in the combinations and their order of appearance is non-deterministic. .. note:: If the `dictionary` isn't :py:class:`dict`-like it will be converted to an :py:class:`collections.OrderedDict`.
r : :py:class:`int` or None, optional The size of each combination; if ``None`` it defaults to the length of the `dictionary`. Returns ------- combinations : generator The combinations from the dictionary. Examples -------- In general the :py:class:`collections.OrderedDict` should be used to call the function. But it will also be automatically converted to one if one inserts an iterable that is convertible to a dict:: >>> from iteration_utilities import combinations_from_relations >>> classes = [('a', [1, 2]), ('b', [3, 4]), ('c', [5, 6])] >>> for comb in combinations_from_relations(classes, 2): ... print(comb) (1, 3) (1, 4) (2, 3) (2, 4) (1, 5) (1, 6) (2, 5) (2, 6) (3, 5) (3, 6) (4, 5) (4, 6) This is equivalent to creating the :py:class:`collections.OrderedDict` manually:: >>> from collections import OrderedDict >>> odct = OrderedDict(classes) >>> for comb in combinations_from_relations(odct, 3): ... print(comb) (1, 3, 5) (1, 3, 6) (1, 4, 5) (1, 4, 6) (2, 3, 5) (2, 3, 6) (2, 4, 5) (2, 4, 6) """ if not isinstance(dictionary, dict): dictionary = OrderedDict(dictionary) for keycomb in combinations(dictionary, r): yield from product(*itemgetter(*keycomb)(dictionary)) def itersubclasses(cls, seen=None): """Iterate over the subclasses of `cls`. Recipe based on the snippet of Gabriel Genellina ([0]_) but modified. Parameters ---------- cls : :py:class:`type` The class for which to iterate over the subclasses. seen : set, None, optional Classes to exclude from iteration or ``None`` if all subclasses should be returned. Default is ``None``. Returns ------- subclasses : generator The subclasses of `cls`.
Examples -------- It works with any class and also handles diamond inheritance structures:: >>> class A: pass >>> class B(A): pass >>> class C(B): pass >>> class D(C): pass >>> class E(C): pass >>> class F(D, E): pass >>> list(i.__name__ for i in itersubclasses(A)) ['B', 'C', 'D', 'F', 'E'] There is mostly no need to specify `seen` but this can be used to exclude a class and all of its subclasses:: >>> [i.__name__ for i in itersubclasses(A, seen={C})] ['B'] And it also works for objects subclassing :py:class:`type`:: >>> class Z(type): pass >>> class Y(Z): pass >>> [i.__name__ for i in itersubclasses(Z)] ['Y'] The reverse operation, iterating over the superclasses, is possible using the ``class_to_test.__mro__`` attribute:: >>> [i.__name__ for i in F.__mro__] ['F', 'D', 'E', 'C', 'B', 'A', 'object'] References ---------- .. [0] http://code.activestate.com/recipes/576949/ """ if seen is None: seen = set() try: subs = cls.__subclasses__() except TypeError: # fails if cls is "type" subs = cls.__subclasses__(cls) # This part is some combination of unique_everseen and flatten, however # I did not find a way to use these here. for sub in subs: if sub not in seen: seen.add(sub) yield sub yield from itersubclasses(sub, seen) def pad(iterable, fillvalue=None, nlead=0, ntail=0): """Pad the `iterable` with `fillvalue` in front and behind. Parameters ---------- iterable : iterable The `iterable` to pad. fillvalue : any type, optional The padding value. Default is ``None``. nlead, ntail : :py:class:`int` or None, optional The number of times to pad in front (`nlead`) and after (`ntail`) the `iterable`. If `ntail` is ``None`` pad indefinitely (not possible for `nlead`). Default is ``0``. Returns ------- padded_iterable : generator The padded `iterable`.
Examples -------- >>> from iteration_utilities import pad, getitem >>> list(pad([1,2,3], 0, 5)) [0, 0, 0, 0, 0, 1, 2, 3] >>> list(pad([1,2,3], 0, ntail=5)) [1, 2, 3, 0, 0, 0, 0, 0] >>> list(pad([1,2,3], 0, nlead=5, ntail=5)) [0, 0, 0, 0, 0, 1, 2, 3, 0, 0, 0, 0, 0] >>> list(getitem(pad([1,2,3], 0, ntail=None), stop=10)) [1, 2, 3, 0, 0, 0, 0, 0, 0, 0] .. warning:: This will return an infinitely long generator if ``ntail`` is ``None``, so do not try to do something like ``list(pad([], ntail=None))``! """ prepend = repeat(fillvalue, nlead) if ntail is None: append = repeat(fillvalue) else: append = repeat(fillvalue, ntail) return chain(prepend, iterable, append) # ============================================================================= # List-like interface methods # # getitem: list[x] # insert: list[x:x] = item # replace: list[x:y] = item # remove: del list[x:y] # # ============================================================================= def getitem(iterable, idx=None, start=None, stop=None, step=None): """Get the item at `idx` or the items specified by `start`, `stop` and `step`. Parameters ---------- iterable : iterable The iterable from which to extract the items. idx : positive :py:class:`int`, -1, tuple/list thereof, or None, optional If not ``None``, get the item at `idx`. If it's a tuple or list get all the items specified in the tuple (they will be sorted so the specified indices are retrieved). Default is ``None``. .. note:: This parameter must not be ``None`` if also `start`, `stop` and `step` are ``None``. start : :py:class:`int` or None, optional If ``None`` then take all items before `stop`, otherwise take only the items starting by `start`. Default is ``None``. .. note:: This parameter is ignored if `idx` is not ``None``. stop : :py:class:`int` or None, optional If ``None`` then take all items starting by `start`, otherwise only take the items before `stop`. Default is ``None``. .. note:: This parameter is ignored if `idx` is not ``None``. 
step : positive :py:class:`int` or None, optional If ``None`` take successive items, otherwise take only every `step`-th item. Default is ``None``. .. note:: This parameter is ignored if `idx` is not ``None``. Returns ------- items : any type or generator If `idx` was not ``None`` then it returns the item, otherwise it returns the items specified by `start`, `stop` and `step`. Examples -------- The main bulk of examples is in :py:meth:`~iteration_utilities.Iterable.getitem` because that's where this function was originally implemented. """ if idx is None and start is None and stop is None and step is None: raise TypeError('one of "idx", "start" or "stop" must be given.') it = iter(iterable) if idx is not None: if not isinstance(idx, (tuple, list)): if idx < -1: raise ValueError('index must be -1 or bigger.') return nth(idx)(iterable) elif not idx: return [] else: # A list of indices, we sort it (insert -1 at the end because it's # the last one) and then extract all the values. idx = sorted(idx, key=lambda x: x if x != -1 else float('inf')) if idx[0] < -1: raise ValueError('index must be -1 or bigger.') current = 0 ret = [] for i in unique_justseen(idx): ret.append(nth(i-current)(it)) current = i+1 return ret start_gt_0 = start is None or start > 0 step_gt_0 = step is None or step > 0 start_lt_0 = start is not None and start < 0 stop_lt_0 = stop is not None and stop < 0 step_lt_0 = step is not None and step < 0 # Several possibilities: # - start None, stop None, step None = self # if start is None and stop is None and step is None: # return iterable # - start None or > 0, stop None, step None or > 0 = islice if start_gt_0 and stop is None and step_gt_0: return islice(iterable, start, stop, step) # - start None or > 0, stop > 0, step None or > 0 = finite islice elif start_gt_0 and stop is not None and stop > 0 and step_gt_0: return islice(iterable, start, stop, step) # There could be valid cases with negative steps, for example if # reversed can be applied.
But I won't go down that road! elif step_lt_0: raise ValueError('negative "step" is not possible.') # Any other combination requires the start to be not None and # negative. elif start_lt_0: # - start < 0, stop < 0, step None or > 0 = tail then islice. if stop_lt_0 and step_gt_0: it = tail(iterable, -start) it = islice(it, 0, stop-start, step) return it # - start < 0, stop None, step None = tail elif stop is None and step is None: it = tail(iterable, -start) return it # - start < 0, stop None, step > 0 = tail and islice elif stop is None and step > 0: it = tail(iterable, -start) it = islice(it, 0, None, step) return it else: raise ValueError('this combination of negative "start", "stop" or ' '"step" is not supported.') def insert(iterable, element, idx, unpack=False): """Insert one `element` into `iterable`. Parameters ---------- iterable : iterable The `iterable` in which to insert the `element`. element : any type The `element` to insert into the `iterable`. idx : positive :py:class:`int` or :py:class:`str` The index at which to insert the `element`. If it's a string it must be ``'start'`` if the `element` should be prepended to `iterable` or ``'end'`` if it should be appended. unpack : :py:class:`bool`, optional If ``False`` the `element` is inserted as it is. If ``True`` then the `element` must be an iterable and it is unpacked into the `iterable`. Default is ``False``. Returns ------- inserted : generator The `element` inserted into `iterable` at `idx` as generator.
Examples -------- To prepend a value:: >>> from iteration_utilities import insert >>> list(insert(range(10), 100, 'start')) # 'start' is equivalent to 0 [100, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9] To append a value:: >>> list(insert(range(10), 100, 'end')) [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 100] Or to insert it at a given index:: >>> list(insert(range(10), 100, 2)) [0, 1, 100, 2, 3, 4, 5, 6, 7, 8, 9] It is also possible to unpack another iterable into another one with the `unpack` argument:: >>> list(insert(range(10), [1, 2, 3], 0, unpack=True)) [1, 2, 3, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9] If the `unpack` argument is not given the iterable is inserted as it is:: >>> list(insert(range(10), [1, 2, 3], 0)) [[1, 2, 3], 0, 1, 2, 3, 4, 5, 6, 7, 8, 9] """ if not unpack: element = [element] it = iter(iterable) # TODO: Implement multiple indices at which to insert the item, this is # quite nontrivial while supporting "start" and "end"... if idx == 'start': return chain(element, it) elif idx == 'end': return chain(it, element) else: return chain(islice(it, idx), element, it) def replace(iterable, element, idx=None, start=None, stop=None, unpack=False): """Removes the item at `idx` or from `start` (inclusive) to `stop` (exclusive) and then inserts the `element` there. Parameters ---------- iterable : iterable The iterable in which to replace the item(s). element : any type The element to insert after removing. idx : positive :py:class:`int`, list/tuple thereof, None, optional If not ``None``, remove the item at `idx` and insert `element` there. If it's a tuple or list the `element` is inserted at each of the indices in the `idx` (the values are sorted before, so the element is always inserted at the given indices). Default is ``None``. .. note:: This parameter must not be ``None`` if also `start` and `stop` are ``None``. start : positive :py:class:`int` or None, optional If ``None`` then remove all items before `stop`, otherwise remove only the items starting by `start`. Default is ``None``. 
.. note:: This parameter is ignored if `idx` is not ``None``. stop : positive :py:class:`int` or None, optional If ``None`` then remove all items starting by `start`, otherwise only remove the items before `stop`. Default is ``None``. .. note:: This parameter is ignored if `idx` is not ``None``. unpack : :py:class:`bool`, optional If ``False`` the `element` is inserted as it is. If ``True`` then the `element` must be an iterable and it is unpacked into the `iterable`. Default is ``False``. Returns ------- replaced : generator The `iterable` with the specified items removed and `element` inserted in their place. Examples -------- To replace one item:: >>> from iteration_utilities import replace >>> list(replace(range(10), 100, idx=2)) [0, 1, 100, 3, 4, 5, 6, 7, 8, 9] To replace multiple items:: >>> list(replace(range(10), 100, (3, 5, 1))) [0, 100, 2, 100, 4, 100, 6, 7, 8, 9] To replace slices:: >>> list(replace(range(10), 100, start=2)) [0, 1, 100] >>> list(replace(range(10), 100, stop=2)) [100, 2, 3, 4, 5, 6, 7, 8, 9] >>> list(replace(range(10), 100, start=2, stop=5)) [0, 1, 100, 5, 6, 7, 8, 9] """ if idx is None and start is None and stop is None: raise TypeError('one of "idx", "start" or "stop" must be given.') if not unpack: element = [element] it = iter(iterable) if idx is not None: if not isinstance(idx, (list, tuple)): return chain(islice(it, idx), element, islice(it, 1, None)) elif not idx: return iterable else: idx = sorted(idx) ret = [] current = 0 for num, i in enumerate(unique_justseen(idx)): if not num: ret.append(islice(it, i)) else: ret.append(islice(it, 1, i-current)) ret.append(element) current = i ret.append(islice(it, 1, None)) return chain.from_iterable(ret) if start is not None and stop is not None: range_ = stop - start if range_ <= 0: raise ValueError('"stop" must be greater than "start".') return chain(islice(it, start), element, islice(it, range_, None)) elif start is not None: return chain(islice(it, start), element) else: # elif stop is 
not None! return chain(element, islice(it, stop, None)) def remove(iterable, idx=None, start=None, stop=None): """Removes the item at `idx` or from `start` (inclusive) to `stop` (exclusive). Parameters ---------- iterable : iterable The iterable in which to remove the item(s). idx : positive :py:class:`int`, list/tuple thereof, None, optional If not ``None``, remove the item at `idx`. If it's a tuple or list then remove all the given indices (they will be sorted so exactly the specified indices are removed). Default is ``None``. .. note:: This parameter must not be ``None`` if also `start` and `stop` are ``None``. start : positive :py:class:`int` or None, optional If ``None`` then remove all items before `stop`, otherwise remove only the items starting by `start`. Default is ``None``. .. note:: This parameter is ignored if `idx` is not ``None``. stop : positive :py:class:`int` or None, optional If ``None`` then remove all items starting by `start`, otherwise only remove the items before `stop`. Default is ``None``. .. note:: This parameter is ignored if `idx` is not ``None``. Returns ------- removed : generator The `iterable` with the specified items removed.
Examples -------- To remove one item:: >>> from iteration_utilities import remove >>> list(remove(range(10), idx=2)) [0, 1, 3, 4, 5, 6, 7, 8, 9] To remove several items just provide a tuple as idx (the values are sorted, so exactly the specified elements are removed):: >>> list(remove(range(10), (4, 6, 8, 5, 1))) [0, 2, 3, 7, 9] To remove a slice:: >>> list(remove(range(10), start=2)) [0, 1] >>> list(remove(range(10), stop=2)) [2, 3, 4, 5, 6, 7, 8, 9] >>> list(remove(range(10), start=2, stop=5)) [0, 1, 5, 6, 7, 8, 9] """ if idx is None and start is None and stop is None: raise TypeError('one of "idx", "start" or "stop" must be given.') it = iter(iterable) if idx is not None: if not isinstance(idx, (list, tuple)): return chain(islice(it, idx), islice(it, 1, None)) elif not idx: return iterable else: idx = sorted(idx) ret = [] current = 0 for num, i in enumerate(unique_justseen(idx)): if not num: ret.append(islice(it, i)) else: ret.append(islice(it, 1, i-current)) current = i ret.append(islice(it, 1, None)) return chain.from_iterable(ret) if start is not None and stop is not None: range_ = stop - start if range_ < 0: raise ValueError('"stop" must be greater than or equal to ' '"start".') return chain(islice(it, start), islice(it, range_, None)) elif start is not None: return islice(it, start) else: return islice(it, stop, None) 070701000000B3000081A400000000000000000000000165E3BCDA000142A4000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/src/iteration_utilities/_classes.py# Licensed under Apache License Version 2.0 - see LICENSE """ API: Chainable iteration_utilities ---------------------------------- """ # Built-ins from collections import Counter, OrderedDict from functools import reduce from heapq import nlargest, nsmallest from itertools import (chain, combinations, combinations_with_replacement, compress, count, cycle, dropwhile, filterfalse, islice, permutations, product, repeat, starmap, takewhile, zip_longest) from math import fsum 
from operator import length_hint import statistics # This module from iteration_utilities._utils import _default # - generators from iteration_utilities import (accumulate, always_iterable, applyfunc, clamp, deepflatten, duplicates, empty, flatten, getitem, grouper, insert, intersperse, itersubclasses, iter_except, ncycles, pad, powerset, remove, repeatfunc, replace, replicate, split, starfilter, successive, tabulate, tail, unique_everseen, unique_justseen) # - folds from iteration_utilities import (all_distinct, all_equal, all_monotone, argmax, argmin, argsorted, count_items, first, groupedby, last, minmax, nth, one, partition, second, third) # - multiple_iterables from iteration_utilities import merge, roundrobin # - helper from iteration_utilities import all_isinstance, any_isinstance # - private helpers (must be imported from the private module!!!) from ._iteration_utilities import _parse_args, _parse_kwargs __all__ = ['Iterable', 'InfiniteIterable', 'ManyIterables'] class _Base: """Base class for method definitions that are shared by :py:class:`.Iterable` and :py:class:`.InfiniteIterable`. """ __slots__ = ('_iterable',) def __init__(self, iterable): self._iterable = iterable def __iter__(self): return iter(self._iterable) def __getitem__(self, idx): """see `get`.""" if isinstance(idx, (int, tuple, list)): return getitem(self._iterable, idx=idx) elif isinstance(idx, slice): if (isinstance(self, InfiniteIterable) and any(x is not None and x < 0 for x in [idx.start, idx.stop, idx.step])): raise TypeError('subscripting InfiniteIterables requires ' '"start", "stop" and "step" to be positive ' 'integers or None.') if idx.stop is not None and idx.stop > 0: meth = self._call_finite else: meth = self._call return meth(getitem, 0, start=idx.start, stop=idx.stop, step=idx.step) raise TypeError('can only subscript {0} with integers and slices.' 
''.format(self.__class__.__name__)) def __repr__(self): return '<{0.__class__.__name__}: {0._iterable!r}>'.format(self) def _call(self, fn, pos, *args, **kwargs): args = _parse_args(args, self._iterable, pos) _parse_kwargs(kwargs, _default) return self.__class__(fn(*args, **kwargs)) def _call_finite(self, *args, **kwargs): res = self._call(*args, **kwargs) if isinstance(res, Iterable): return res return Iterable(res._iterable) def _call_infinite(self, *args, **kwargs): res = self._call(*args, **kwargs) if isinstance(res, InfiniteIterable): # There is no use-case to wrap an already infinite iterable with # something that newly creates an infinite iterable. # For example cycle(count()) makes no sense because we never end # with count so cycle never triggers. # That may change but I found no useful combination so there is # this Exception. raise TypeError('impossible to wrap an infinite iterable with ' 'another infinite iterable.') return InfiniteIterable(res._iterable) @staticmethod def from_count(start=_default, step=_default): """See :py:func:`itertools.count`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable.from_count().islice(10).as_list() [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] >>> Iterable.from_count(4, 3).islice(10).as_list() [4, 7, 10, 13, 16, 19, 22, 25, 28, 31] >>> Iterable.from_count(start=4, step=3).islice(10).as_list() [4, 7, 10, 13, 16, 19, 22, 25, 28, 31] .. warning:: This returns an :py:class:`.InfiniteIterable`. """ kwargs = {} if start is not _default: kwargs['start'] = start if step is not _default: kwargs['step'] = step return InfiniteIterable(count(**kwargs)) @staticmethod def from_empty(): """Creates an empty :py:class:`Iterable`. .. 
versionadded:: 0.11.0 Examples -------- >>> from iteration_utilities import Iterable >>> Iterable.from_empty().as_list() [] """ return Iterable(empty) @staticmethod def from_maybe_iterable(obj, excluded_types=_default, empty_if_none=_default): """See :py:func:`~iteration_utilities.always_iterable`. .. versionadded:: 0.11.0 Examples -------- >>> from iteration_utilities import Iterable >>> Iterable.from_maybe_iterable([1, 2, 3]).as_list() [1, 2, 3] >>> Iterable.from_maybe_iterable(1).as_list() [1] >>> Iterable.from_maybe_iterable([1, 2, 3], excluded_types=list).as_list() [[1, 2, 3]] >>> Iterable.from_maybe_iterable(None, empty_if_none=True).as_list() [] """ kwargs = {'excluded_types': excluded_types, 'empty_if_none': empty_if_none} _parse_kwargs(kwargs, _default) return Iterable(always_iterable(obj, **kwargs)) @staticmethod def from_repeat(object, times=_default): """See :py:func:`itertools.repeat`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable.from_repeat(5).islice(10).as_list() [5, 5, 5, 5, 5, 5, 5, 5, 5, 5] >>> Iterable.from_repeat(5, 5).as_list() [5, 5, 5, 5, 5] >>> Iterable.from_repeat(object=5, times=5).as_list() [5, 5, 5, 5, 5] .. warning:: This returns an :py:class:`.InfiniteIterable` if `times` is not given. """ if times is not _default: return Iterable(repeat(object, times)) else: return InfiniteIterable(repeat(object)) @staticmethod def from_itersubclasses(object): """See \ :py:func:`~iteration_utilities.itersubclasses`. 
Examples -------- >>> from iteration_utilities import Iterable >>> class A: pass >>> class B(A): pass >>> class C(A): pass >>> class D(C): pass >>> Iterable.from_itersubclasses(A).as_list() [<class 'iteration_utilities._classes.B'>, \ <class 'iteration_utilities._classes.C'>, \ <class 'iteration_utilities._classes.D'>] >>> Iterable.from_itersubclasses(C).as_list() [<class 'iteration_utilities._classes.D'>] """ return Iterable(itersubclasses(object)) @staticmethod def from_applyfunc(func, initial): """See :py:func:`~iteration_utilities.applyfunc`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable.from_applyfunc(lambda x: x*2, 10).islice(5).as_list() [20, 40, 80, 160, 320] >>> Iterable.from_applyfunc(func=lambda x: x*2, ... initial=10).islice(5).as_list() [20, 40, 80, 160, 320] .. warning:: This returns an :py:class:`.InfiniteIterable`. """ return InfiniteIterable(applyfunc(func=func, initial=initial)) @staticmethod def from_iterfunc_sentinel(func, sentinel): """See :py:func:`python:iter`. Examples -------- >>> from iteration_utilities import Iterable >>> class Func: ... def __init__(self): ... self.val = 0 ... def __call__(self): ... self.val += 1 ... return 4 if self.val < 8 else 10 >>> Iterable.from_iterfunc_sentinel(Func(), 10).as_list() [4, 4, 4, 4, 4, 4, 4] """ # TODO: Update example to something useful return Iterable(iter(func, sentinel)) @staticmethod def from_iterfunc_exception(func, exception, first=_default): """See :py:func:`~iteration_utilities.iter_except`. Examples -------- >>> from iteration_utilities import Iterable >>> class Func: ... def __init__(self): ... self.val = 0 ... def setlim(self, val=3): ... self.val = val ... return 'init' ... def __call__(self): ... self.val += 1 ... if self.val < 8: ... return 3 ... 
raise ValueError() >>> Iterable.from_iterfunc_exception(Func(), ValueError).as_list() [3, 3, 3, 3, 3, 3, 3] >>> f = Func() >>> Iterable.from_iterfunc_exception(f, ValueError, f.setlim).as_list() ['init', 3, 3, 3, 3] """ if first is _default: return Iterable(iter_except(func, exception)) else: return Iterable(iter_except(func, exception, first=first)) @staticmethod def from_repeatfunc(func, *args, **times): """See :py:func:`~iteration_utilities.repeatfunc`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable.from_repeatfunc(int).islice(10).as_list() [0, 0, 0, 0, 0, 0, 0, 0, 0, 0] >>> Iterable.from_repeatfunc(int, times=10).as_list() [0, 0, 0, 0, 0, 0, 0, 0, 0, 0] >>> import random >>> # Something more useful: Creating 10 random integers >>> Iterable.from_repeatfunc(random.randint, 0, 5, times=10).as_list() # doctest: +SKIP [1, 3, 1, 3, 5, 2, 4, 1, 0, 1] .. warning:: This returns an :py:class:`.InfiniteIterable` if `times` is not given. """ if times: return Iterable(repeatfunc(func, *args, **times)) else: return InfiniteIterable(repeatfunc(func, *args)) @staticmethod def from_tabulate(func, start=_default): """See :py:func:`~iteration_utilities.tabulate`. Examples -------- >>> from iteration_utilities import Iterable, chained >>> roundint = chained(round, int) >>> import operator >>> Iterable.from_tabulate(operator.neg).islice(8).as_list() [0, -1, -2, -3, -4, -5, -6, -7] >>> from math import gamma >>> Iterable.from_tabulate(gamma, 1).islice(8).map(roundint).as_tuple() (1, 1, 2, 6, 24, 120, 720, 5040) >>> Iterable.from_tabulate(func=gamma, start=2).islice(7).map(roundint).as_tuple() (1, 2, 6, 24, 120, 720, 5040) .. warning:: This returns an :py:class:`.InfiniteIterable`. """ if start is _default: return InfiniteIterable(tabulate(func)) else: return InfiniteIterable(tabulate(func, start=start)) def accumulate(self, func=_default, start=_default): """See :py:func:`~iteration_utilities.accumulate`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).accumulate().as_list() [1, 3, 6, 10, 15, 21, 28, 36, 45] >>> from operator import mul >>> Iterable(range(1, 10)).accumulate(mul, 2).as_list() [2, 4, 12, 48, 240, 1440, 10080, 80640, 725760] >>> Iterable(range(1, 10)).accumulate(func=mul, start=3).as_list() [3, 6, 18, 72, 360, 2160, 15120, 120960, 1088640] """ return self._call(accumulate, 0, func=func, start=start) def clamp(self, low=_default, high=_default, inclusive=_default, remove=_default): """See :py:func:`~iteration_utilities.clamp`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).clamp(2, 7, True).as_list() [3, 4, 5, 6] >>> Iterable(range(10)).clamp(low=2, high=7, inclusive=True).as_list() [3, 4, 5, 6] >>> Iterable(range(10)).clamp(low=2, high=7, remove=False).as_list() [2, 2, 2, 3, 4, 5, 6, 7, 7, 7] """ return self._call(clamp, 0, low=low, high=high, inclusive=inclusive, remove=remove) def combinations(self, r): """See :py:func:`itertools.combinations`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 4)).combinations(2).as_list() [(1, 2), (1, 3), (2, 3)] >>> Iterable(range(1, 4)).combinations(r=2).as_list() [(1, 2), (1, 3), (2, 3)] """ return self._call(combinations, 0, r=r) def combinations_with_replacement(self, r): """See :py:func:`itertools.combinations_with_replacement`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 4)).combinations_with_replacement(2).as_list() [(1, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 3)] >>> Iterable(range(1, 4)).combinations_with_replacement(r=2).as_list() [(1, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 3)] """ return self._call(combinations_with_replacement, 0, r=r) def compress(self, selectors): """See :py:func:`itertools.compress`. 
Examples -------- >>> from iteration_utilities import Iterable >>> sel = [0, 1, 0, 1, 0, 1, 1, 1, 0] >>> Iterable(range(1, 10)).compress(sel).as_list() [2, 4, 6, 7, 8] >>> Iterable(range(1, 10)).compress(selectors=sel).as_list() [2, 4, 6, 7, 8] """ return self._call(compress, 0, selectors=selectors) def cycle(self): """See :py:func:`itertools.cycle`. Examples -------- >>> from iteration_utilities import Iterable >>> it = Iterable([1, 2]).cycle() >>> for item in it.islice(5): ... print(item) 1 2 1 2 1 """ return self._call_infinite(cycle, 0) def deepflatten(self, depth=_default, types=_default, ignore=_default): """See \ :py:func:`~iteration_utilities.deepflatten`. Examples -------- >>> from iteration_utilities import Iterable >>> lst = [1, 2, 3, [1, 2, 3, [1, 2, 3]]] >>> Iterable(lst).deepflatten().as_list() [1, 2, 3, 1, 2, 3, 1, 2, 3] >>> Iterable(lst).deepflatten(1, list, str).as_list() [1, 2, 3, 1, 2, 3, [1, 2, 3]] >>> Iterable(lst).deepflatten(depth=1, ... types=list, ignore=str).as_list() [1, 2, 3, 1, 2, 3, [1, 2, 3]] """ return self._call(deepflatten, 0, depth=depth, types=types, ignore=ignore) def dropwhile(self, predicate): """See :py:func:`itertools.dropwhile`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).dropwhile(lambda x: x < 5).as_list() [5, 6, 7, 8, 9] >>> Iterable(range(1, 10)).dropwhile( ... predicate=lambda x: x < 3).as_list() [3, 4, 5, 6, 7, 8, 9] """ return self._call(dropwhile, 1, predicate) def duplicates(self, key=_default): """See :py:func:`~iteration_utilities.duplicates`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 1, 2, 1]).duplicates().as_list() [1, 1] >>> Iterable([1, -1, 2, 1]).duplicates(abs).as_list() [-1, 1] >>> Iterable([1, -1, 2, 1]).duplicates(key=abs).as_list() [-1, 1] """ return self._call(duplicates, 0, key=key) def enumerate(self, start=_default): """See :py:func:`python:enumerate`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 8)).enumerate().as_list() [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)] >>> Iterable(range(1, 8)).enumerate(4).as_list() [(4, 1), (5, 2), (6, 3), (7, 4), (8, 5), (9, 6), (10, 7)] >>> Iterable(range(1, 8)).enumerate(start=2).as_list() [(2, 1), (3, 2), (4, 3), (5, 4), (6, 5), (7, 6), (8, 7)] """ return self._call(enumerate, 0, start=start) def filter(self, function): """See :py:func:`python:filter`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).filter(None).as_list() [1, 2, 3, 4, 5, 6, 7, 8, 9] >>> from iteration_utilities import is_even >>> Iterable(range(1, 10)).filter(is_even).as_list() [2, 4, 6, 8] >>> Iterable(range(1, 10)).filter(function=is_even).as_list() [2, 4, 6, 8] """ return self._call(filter, 1, function) def filterfalse(self, predicate): """See :py:func:`itertools.filterfalse`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).filterfalse(None).as_list() [] >>> from iteration_utilities import is_odd >>> Iterable(range(1, 10)).filterfalse(is_odd).as_list() [2, 4, 6, 8] >>> Iterable(range(1, 10)).filterfalse(predicate=is_odd).as_list() [2, 4, 6, 8] """ return self._call(filterfalse, 1, predicate) def flatten(self): """See :py:func:`~iteration_utilities.flatten`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([(1, 2, 3), [3, 2, 1]]).flatten().as_list() [1, 2, 3, 3, 2, 1] """ return self._call(flatten, 0) def getitem(self, item): """See \ :py:func:`~iteration_utilities.getitem`. Parameters ---------- item : :py:class:`int` or :py:class:`slice` The item or items to retrieve Returns ------- parts : any type or generator If `item` was an integer the return is a single item, otherwise it returns a generator of the items. 
Examples -------- With integers:: >>> from iteration_utilities import Iterable >>> it = Iterable(range(10)) >>> it[2] 2 >>> it[-1] # -1 is the **only** allowed negative integer. 9 With a tuple of integers (they will be sorted internally!):: >>> Iterable(range(100))[-1, 8, 3, 10, 46] # -1 indicates last [3, 8, 10, 46, 99] >>> Iterable(range(100))[3, 8, 10, 46] [3, 8, 10, 46] With slices:: >>> it[1:].as_list() [1, 2, 3, 4, 5, 6, 7, 8, 9] >>> it[1:8:2].as_list() [1, 3, 5, 7] Slices with negative values (only these cases are possible!):: >>> # start and stop negative; step None >>> it[-5:-2].as_list() [5, 6, 7] >>> # start and stop negative; step positive >>> it[-6:-1:2].as_list() [4, 6, 8] >>> # start negative, stop and step None >>> it[-6:].as_list() [4, 5, 6, 7, 8, 9] >>> # start negative, stop None, step positive >>> it[-6::2].as_list() [4, 6, 8] It's also possible to use the ``getitem`` method directly, but you have to pass in the appropriate value(s) or :py:class:`slice`:: >>> Iterable(range(10)).getitem(3) 3 >>> Iterable(range(10)).getitem(slice(5, 8)).as_tuple() (5, 6, 7) .. note:: This function might also turn an :py:class:`.InfiniteIterable` into an :py:class:`.Iterable` if the slice has a positive stop. >>> Iterable.from_count()[:4] # doctest: +ELLIPSIS <Iterable: <itertools.islice object at ...>> """ return self[item] def grouper(self, n, fillvalue=_default, truncate=_default): """See :py:func:`~iteration_utilities.grouper`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).grouper(2).as_list() [(1, 2), (3, 4), (5, 6), (7, 8), (9,)] >>> Iterable(range(1, 10)).grouper(2, None).as_list() [(1, 2), (3, 4), (5, 6), (7, 8), (9, None)] >>> Iterable(range(1, 10)).grouper(n=2, fillvalue=None).as_list() [(1, 2), (3, 4), (5, 6), (7, 8), (9, None)] >>> Iterable(range(1, 10)).grouper(n=2, truncate=True).as_list() [(1, 2), (3, 4), (5, 6), (7, 8)] """ return self._call(grouper, 0, n=n, fillvalue=fillvalue, truncate=truncate) def islice(self, *args): """See :py:func:`itertools.islice`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).islice(2).as_list() [1, 2] >>> Iterable(range(1, 10)).islice(2, 6).as_list() [3, 4, 5, 6] >>> Iterable(range(1, 10)).islice(2, 6, 2).as_list() [3, 5] >>> Iterable([1, 2, 3, 4]).islice(1, None).as_list() [2, 3, 4] >>> Iterable([1, 2, 3, 4]).islice(None).as_list() [1, 2, 3, 4] .. note:: This method converts an :py:class:`.InfiniteIterable` to a normal :py:class:`.Iterable` if a `stop` is given. """ nargs = len(args) meth = self._call if nargs == 1: if args[0] is not None: meth = self._call_finite elif nargs > 1: if args[1] is not None: meth = self._call_finite return meth(islice, 0, *args) def insert(self, element, idx, unpack=_default): """See :py:func:`~iteration_utilities.insert` Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).insert(100, 2).as_list() [0, 1, 100, 2, 3, 4, 5, 6, 7, 8, 9] .. warning:: This returns an :py:class:`.InfiniteIterable` if ``unpack=True`` and the `element` is an :py:class:`.InfiniteIterable`. 
>>> Iterable(range(10)).insert(Iterable.from_count(), 3, unpack=True) \ # doctest: +ELLIPSIS <InfiniteIterable: <itertools.chain object at ...>> """ if unpack and isinstance(element, InfiniteIterable): meth = self._call_infinite else: meth = self._call return meth(insert, 0, element=element, idx=idx, unpack=unpack) def intersperse(self, e): """See :py:func:`~iteration_utilities.intersperse`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).intersperse(0).as_list() [1, 0, 2, 0, 3, 0, 4, 0, 5, 0, 6, 0, 7, 0, 8, 0, 9] >>> Iterable(range(1, 10)).intersperse(e=0).as_list() [1, 0, 2, 0, 3, 0, 4, 0, 5, 0, 6, 0, 7, 0, 8, 0, 9] """ return self._call(intersperse, 0, e=e) def map(self, function): """See :py:func:`python:map`. Examples -------- >>> from iteration_utilities import Iterable, square >>> Iterable(range(1, 10)).map(square).as_list() [1, 4, 9, 16, 25, 36, 49, 64, 81] >>> Iterable(range(1, 10)).map(function=square).as_list() [1, 4, 9, 16, 25, 36, 49, 64, 81] """ return self._call(map, 1, function) def ncycles(self, n): """See :py:func:`~iteration_utilities.ncycles`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 4)).ncycles(3).as_list() [1, 2, 3, 1, 2, 3, 1, 2, 3] >>> Iterable(range(1, 4)).ncycles(n=3).as_list() [1, 2, 3, 1, 2, 3, 1, 2, 3] """ return self._call(ncycles, 0, n=n) def pad(self, fillvalue=_default, nlead=_default, ntail=_default): """See :py:func:`~iteration_utilities.pad`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([2]).pad(None, ntail=None).islice(10).as_list() [2, None, None, None, None, None, None, None, None, None] >>> Iterable([2]).pad(nlead=9).as_list() [None, None, None, None, None, None, None, None, None, 2] >>> Iterable([2]).pad(0, ntail=9).as_list() [2, 0, 0, 0, 0, 0, 0, 0, 0, 0] >>> Iterable([2]).pad(0, 1, 2).as_list() [0, 2, 0, 0] .. warning:: This returns an :py:class:`.InfiniteIterable` if ``ntail=None``. 
""" if ntail is None: meth = self._call_infinite else: meth = self._call return meth(pad, 0, fillvalue=fillvalue, nlead=nlead, ntail=ntail) def permutations(self, r=_default): """See :py:func:`itertools.permutations`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 4)).permutations().as_list() [(1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)] >>> Iterable(range(1, 4)).permutations(2).as_list() [(1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2)] >>> Iterable(range(1, 4)).permutations(r=2).as_list() [(1, 2), (1, 3), (2, 1), (2, 3), (3, 1), (3, 2)] """ return self._call(permutations, 0, r=r) def powerset(self): """See :py:func:`~iteration_utilities.powerset`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 4)).powerset().as_list() [(), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)] """ return self._call(powerset, 0) def remove(self, idx=_default, start=_default, stop=_default): """See :py:func:`~iteration_utilities.remove`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).remove(idx=2).as_list() [0, 1, 3, 4, 5, 6, 7, 8, 9] .. note:: This function might also turn an :py:class:`.InfiniteIterable` into an :py:class:`.Iterable` if `idx` and `stop` are ``None``. >>> Iterable.from_count().remove(start=4) # doctest: +ELLIPSIS <Iterable: <itertools.islice object at ...>> """ if ((idx is _default or idx is None) and (stop is None or stop is _default)): meth = self._call_finite else: meth = self._call return meth(remove, 0, idx=idx, start=start, stop=stop) def replace(self, element, idx=_default, start=_default, stop=_default, unpack=_default): """See :py:func:`~iteration_utilities.replace`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).replace(10, idx=2).as_list() [0, 1, 10, 3, 4, 5, 6, 7, 8, 9] .. 
warning:: This returns an :py:class:`.InfiniteIterable` if ``unpack=True`` and the `element` is an :py:class:`.InfiniteIterable`. >>> Iterable(range(10)).replace(Iterable.from_count(), 4, unpack=True)\ # doctest: +ELLIPSIS <InfiniteIterable: <itertools.chain object at ...>> .. note:: But this function might also turn an :py:class:`.InfiniteIterable` into an :py:class:`.Iterable` if `idx` and `stop` are ``None``. >>> Iterable.from_count().replace(10, start=4) # doctest: +ELLIPSIS <Iterable: <itertools.chain object at ...>> """ if unpack and isinstance(element, InfiniteIterable): meth = self._call_infinite elif ((idx is _default or idx is None) and (stop is None or stop is _default)): meth = self._call_finite else: meth = self._call return meth(replace, 0, element=element, idx=idx, start=start, stop=stop, unpack=unpack) def replicate(self, times): """See :py:func:`~iteration_utilities.replicate`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 4)).replicate(3).as_list() [1, 1, 1, 2, 2, 2, 3, 3, 3] >>> Iterable(range(1, 4)).replicate(times=4).as_list() [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3] """ return self._call(replicate, 0, times=times) def split(self, key, maxsplit=_default, keep=_default, keep_before=_default, keep_after=_default, eq=_default): """See :py:func:`~iteration_utilities.split`. Examples -------- >>> from iteration_utilities import Iterable, is_even >>> Iterable(range(1, 10)).split(is_even).as_list() [[1], [3], [5], [7], [9]] >>> Iterable(range(1, 10)).split(is_even, 2).as_list() [[1], [3], [5, 6, 7, 8, 9]] >>> Iterable(range(1, 10)).split(3, 1, True, False, False, True).as_list() [[1, 2], [3], [4, 5, 6, 7, 8, 9]] >>> Iterable(range(1, 10)).split(3, 1, False, True, False, True).as_list() [[1, 2, 3], [4, 5, 6, 7, 8, 9]] >>> Iterable(range(1, 10)).split(3, 1, False, False, True, True).as_list() [[1, 2], [3, 4, 5, 6, 7, 8, 9]] >>> Iterable(range(1, 10)).split(key=2, maxsplit=1, ... 
keep=True, eq=True).as_list() [[1], [2], [3, 4, 5, 6, 7, 8, 9]] """ return self._call(split, 0, key=key, maxsplit=maxsplit, keep=keep, keep_before=keep_before, keep_after=keep_after, eq=eq) def starfilter(self, pred): """See :py:func:`iteration_utilities.starfilter`. Examples -------- >>> from iteration_utilities import Iterable >>> from operator import eq >>> Iterable([1] * 20).enumerate().starfilter(eq).as_list() [(1, 1)] """ return self._call(starfilter, 1, pred) def starmap(self, function): """See :py:func:`itertools.starmap`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).enumerate().starmap(pow).as_list() [0, 1, 8, 81, 1024, 15625, 279936, 5764801, 134217728] """ return self._call(starmap, 1, function) def successive(self, times): """See :py:func:`~iteration_utilities.successive`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).successive(2).as_list() [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (7, 8), (8, 9)] >>> Iterable(range(1, 10)).successive(times=2).as_list() [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (7, 8), (8, 9)] """ return self._call(successive, 0, times=times) def tail(self, n): """See :py:func:`~iteration_utilities.tail`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).tail(2).as_list() [8, 9] >>> Iterable(range(1, 10)).tail(n=3).as_list() [7, 8, 9] """ return self._call(tail, 0, n=n) def takewhile(self, predicate): """See :py:func:`itertools.takewhile`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).takewhile(lambda x: x < 4).as_list() [1, 2, 3] >>> Iterable(range(1, 10)).takewhile( ... predicate=lambda x: x < 5).as_list() [1, 2, 3, 4] .. warning:: This method converts an :py:class:`.InfiniteIterable` to a normal :py:class:`.Iterable`. 
""" return self._call_finite(takewhile, 1, predicate) def unique_everseen(self, key=_default): """See :py:func:`~iteration_utilities.unique_everseen`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(1, 10)).unique_everseen().as_list() [1, 2, 3, 4, 5, 6, 7, 8, 9] >>> Iterable(range(1, 10)).unique_everseen(lambda x: x // 3).as_list() [1, 3, 6, 9] >>> from iteration_utilities import is_even >>> Iterable(range(1, 10)).unique_everseen(key=is_even).as_list() [1, 2] """ return self._call(unique_everseen, 0, key=key) def unique_justseen(self, key=_default): """See :py:func:`~iteration_utilities.unique_justseen`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable('aaAAbbBcCcddDDDEEEee').unique_justseen().as_list() ['a', 'A', 'b', 'B', 'c', 'C', 'c', 'd', 'D', 'E', 'e'] >>> from operator import methodcaller >>> Iterable('aaAAbbBcCcddDDDEEEee').unique_justseen( ... methodcaller('upper')).as_list() ['a', 'b', 'c', 'd', 'E'] >>> Iterable('aaAAbbBcCcddDDDEEEee').unique_justseen( ... key=methodcaller('lower')).as_list() ['a', 'b', 'c', 'd', 'E'] """ return self._call(unique_justseen, 0, key=key) class Iterable(_Base): """A convenience class that allows chaining the :py:mod:`iteration_utilities` functions. Parameters ---------- iterable : iterable Any kind of `iterable`. Notes ----- .. warning:: If the `iterable` is infinite you should **not** create the :py:class:`.Iterable` instance directly (i.e. ``Iterable(count())``. You could use the ``Iterable.from_count()`` or create an :py:class:`.InfiniteIterable`: ``InfiniteIterable(count())``. Available methods: =================================================== ====================================================== Method Reference =================================================== ====================================================== :py:meth:`~.Iterable.accumulate` See :py:func:`~iteration_utilities.accumulate`. 
:py:meth:`~.Iterable.as_` Convert :py:class:`.Iterable` to other class. :py:meth:`~.Iterable.as_counter` See :py:meth:`.as_`. :py:meth:`~.Iterable.as_dict` See :py:meth:`.as_`. :py:meth:`~.Iterable.as_frozenset` See :py:meth:`.as_`. :py:meth:`~.Iterable.as_list` See :py:meth:`.as_`. :py:meth:`~.Iterable.as_ordereddict` See :py:meth:`.as_`. :py:meth:`~.Iterable.as_set` See :py:meth:`.as_`. :py:meth:`~.Iterable.as_string` Get the iterable as string. :py:meth:`~.Iterable.as_tuple` See :py:meth:`.as_`. :py:meth:`~.Iterable.clamp` See :py:func:`~iteration_utilities.clamp`. :py:meth:`~.Iterable.combinations` See :py:func:`itertools.combinations`. :py:meth:`~.Iterable.combinations_with_replacement` See :py:func:`itertools.combinations_with_replacement`. :py:meth:`~.Iterable.compress` See :py:func:`itertools.compress`. :py:meth:`~.Iterable.cycle` See :py:func:`itertools.cycle`. :py:meth:`~.Iterable.deepflatten` See :py:func:`~iteration_utilities.deepflatten`. :py:meth:`~.Iterable.dropwhile` See :py:func:`itertools.dropwhile`. :py:meth:`~.Iterable.duplicates` See :py:func:`~iteration_utilities.duplicates`. :py:meth:`~.Iterable.enumerate` See :py:func:`python:enumerate`. :py:meth:`~.Iterable.filter` See :py:func:`python:filter`. :py:meth:`~.Iterable.filterfalse` See :py:func:`itertools.filterfalse`. :py:meth:`~.Iterable.flatten` See :py:func:`~iteration_utilities.flatten`. :py:meth:`~.Iterable.from_applyfunc` See :py:func:`~iteration_utilities.applyfunc`. :py:meth:`~.Iterable.from_count` See :py:func:`itertools.count`. :py:meth:`~.Iterable.from_empty` See :py:func:`~iteration_utilities.empty`. :py:meth:`~.Iterable.from_iterfunc_exception` See :py:func:`~iteration_utilities.iter_except`. :py:meth:`~.Iterable.from_iterfunc_sentinel` See :py:func:`python:iter`. :py:meth:`~.Iterable.from_itersubclasses` See :py:func:`~iteration_utilities.itersubclasses`. :py:meth:`~.Iterable.from_repeat` See :py:func:`itertools.repeat`. 
:py:meth:`~.Iterable.from_repeatfunc` See :py:func:`~iteration_utilities.repeatfunc`. :py:meth:`~.Iterable.from_tabulate` See :py:func:`~iteration_utilities.tabulate`. :py:meth:`~.Iterable.get_all` See :py:func:`python:all`. :py:meth:`~.Iterable.get_all_distinct` See :py:func:`~iteration_utilities.all_distinct`. :py:meth:`~.Iterable.get_all_equal` See :py:func:`~iteration_utilities.all_equal`. :py:meth:`~.Iterable.get_all_monotone` See :py:func:`~iteration_utilities.all_monotone`. :py:meth:`~.Iterable.get_any` See :py:func:`python:any`. :py:meth:`~.Iterable.get_argmax` See :py:func:`~iteration_utilities.argmax`. :py:meth:`~.Iterable.get_argmin` See :py:func:`~iteration_utilities.argmin`. :py:meth:`~.Iterable.get_argsorted` See :py:func:`~iteration_utilities.argsorted`. :py:meth:`~.Iterable.get_count_items` See :py:func:`~iteration_utilities.count_items`. :py:meth:`~.Iterable.get_first` See :py:func:`~iteration_utilities.nth`. :py:meth:`~.Iterable.get_fmean` See :py:func:`statistics.fmean`. (Python >= 3.8) :py:meth:`~.Iterable.get_fsum` See :py:func:`math.fsum`. :py:meth:`~.Iterable.get_geometric_mean` See :py:func:`statistics.geometric_mean`. (Python >= 3.8) :py:meth:`~.Iterable.get_groupedby` See :py:func:`~iteration_utilities.groupedby`. :py:meth:`~.Iterable.get_harmonic_mean` See :py:func:`statistics.harmonic_mean`. (Python >= 3.6) :py:meth:`~.Iterable.get_last` See :py:func:`~iteration_utilities.nth`. :py:meth:`~.Iterable.get_max` See :py:func:`python:max`. :py:meth:`~.Iterable.get_mean` See :py:func:`statistics.mean`. :py:meth:`~.Iterable.get_median` See :py:func:`statistics.median`. :py:meth:`~.Iterable.get_median_grouped` See :py:func:`statistics.median_grouped`. :py:meth:`~.Iterable.get_median_high` See :py:func:`statistics.median_high`. :py:meth:`~.Iterable.get_median_low` See :py:func:`statistics.median_low`. :py:meth:`~.Iterable.get_min` See :py:func:`python:min`. :py:meth:`~.Iterable.get_minmax` See :py:func:`~iteration_utilities.minmax`. 
:py:meth:`~.Iterable.get_mode` See :py:func:`statistics.mode`. :py:meth:`~.Iterable.get_multimode` See :py:func:`statistics.multimode`. (Python >= 3.8) :py:meth:`~.Iterable.get_nlargest` See :py:func:`heapq.nlargest`. :py:meth:`~.Iterable.get_nsmallest` See :py:func:`heapq.nsmallest`. :py:meth:`~.Iterable.get_nth` See :py:func:`~iteration_utilities.nth`. :py:meth:`~.Iterable.get_one` See :py:func:`~iteration_utilities.one`. :py:meth:`~.Iterable.get_partition` See :py:func:`~iteration_utilities.partition`. :py:meth:`~.Iterable.get_pstdev` See :py:func:`statistics.pstdev`. :py:meth:`~.Iterable.get_pvariance` See :py:func:`statistics.pvariance`. :py:meth:`~.Iterable.get_quantiles` See :py:func:`statistics.quantiles`. (Python >= 3.8) :py:meth:`~.Iterable.get_reduce` See :py:func:`functools.reduce`. :py:meth:`~.Iterable.get_second` See :py:func:`~iteration_utilities.nth`. :py:meth:`~.Iterable.get_sorted` See :py:func:`python:sorted`. :py:meth:`~.Iterable.get_stdev` See :py:func:`statistics.stdev`. :py:meth:`~.Iterable.get_sum` See :py:func:`python:sum`. :py:meth:`~.Iterable.get_third` See :py:func:`~iteration_utilities.nth`. :py:meth:`~.Iterable.get_variance` See :py:func:`statistics.variance`. :py:meth:`~.Iterable.getitem` See :py:func:`~iteration_utilities.getitem` :py:meth:`~.Iterable.grouper` See :py:func:`~iteration_utilities.grouper`. :py:meth:`~.Iterable.insert` See :py:func:`~iteration_utilities.insert` :py:meth:`~.Iterable.intersperse` See :py:func:`~iteration_utilities.intersperse`. :py:meth:`~.Iterable.islice` See :py:func:`itertools.islice`. :py:meth:`~.Iterable.map` See :py:func:`python:map`. :py:meth:`~.Iterable.ncycles` See :py:func:`~iteration_utilities.ncycles`. :py:meth:`~.Iterable.pad` See :py:func:`~iteration_utilities.pad`. :py:meth:`~.Iterable.permutations` See :py:func:`itertools.permutations`. :py:meth:`~.Iterable.powerset` See :py:func:`~iteration_utilities.powerset`. :py:meth:`~.Iterable.remove` See :py:func:`~iteration_utilities.remove`. 
:py:meth:`~.Iterable.replace` See :py:func:`~iteration_utilities.replace`. :py:meth:`~.Iterable.replicate` See :py:func:`~iteration_utilities.replicate`. :py:meth:`~.Iterable.reversed` See :py:func:`python:reversed`. :py:meth:`~.Iterable.split` See :py:func:`~iteration_utilities.split`. :py:meth:`~.Iterable.starfilter` See :py:func:`~iteration_utilities.starfilter`. :py:meth:`~.Iterable.starmap` See :py:func:`itertools.starmap`. :py:meth:`~.Iterable.successive` See :py:func:`~iteration_utilities.successive`. :py:meth:`~.Iterable.tail` See :py:func:`~iteration_utilities.tail`. :py:meth:`~.Iterable.takewhile` See :py:func:`itertools.takewhile`. :py:meth:`~.Iterable.unique_everseen` See :py:func:`~iteration_utilities.unique_everseen`. :py:meth:`~.Iterable.unique_justseen` See :py:func:`~iteration_utilities.unique_justseen`. =================================================== ====================================================== Examples -------- You can create an instance from any object that implements the iteration protocol. For example the Python types :py:class:`list`, :py:class:`tuple`, :py:class:`set`, :py:class:`frozenset`, :py:class:`str`, :py:class:`dict`, :py:meth:`dict.values`, :py:meth:`dict.items`, :py:class:`range` just to name a few:: >>> from iteration_utilities import Iterable >>> Iterable([1,2,3,4]) <Iterable: [1, 2, 3, 4]> >>> Iterable('abcdefghijklmnopqrstuvwxyz') <Iterable: 'abcdefghijklmnopqrstuvwxyz'> :py:class:`.Iterable` is convenient because it allows chaining of several functions implemented in Python (:py:func:`map`, :py:func:`filter`, ...), :py:mod:`itertools` and :py:mod:`iteration_utilities`:: >>> Iterable([1,2,3,4]).islice(1,3).map(float).as_list() [2.0, 3.0] The methods :py:meth:`.islice` and :py:meth:`.map` are only evaluated on the iterable when :py:meth:`.as_list` is called. The class can also be used in ``for`` loops:: >>> from iteration_utilities import is_even >>> for item in Iterable(range(100, 120)).filter(is_even).accumulate(): ... 
print(item) 100 202 306 412 520 630 742 856 972 1090 Some methods (e.g. :py:meth:`.Iterable.cycle`) create an :py:class:`.InfiniteIterable`:: >>> Iterable(range(10)).cycle() # doctest: +ELLIPSIS <InfiniteIterable: <itertools.cycle object at ...>> As well as some of the staticmethods (``from_x``):: >>> Iterable.from_count() <InfiniteIterable: count(0)> >>> Iterable.from_repeat(10) <InfiniteIterable: repeat(10)> >>> Iterable.from_repeat(10, times=2) # but not always! <Iterable: repeat(10, 2)> This logic allows the class to be aware of whether the iterable is infinitely long and prevents accidental infinite loops. Some methods can also convert an :py:class:`.InfiniteIterable` to a normal :py:class:`.Iterable` again:: >>> Iterable.from_count().islice(2, 5) # doctest: +ELLIPSIS <Iterable: <itertools.islice object at ...>> >>> Iterable.from_count().takewhile(lambda x: x < 100) \ # doctest: +ELLIPSIS <Iterable: <itertools.takewhile object at ...>> :py:class:`.Iterable` implements some constructors for Python types as methods:: >>> Iterable(range(10)).as_list() [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] >>> # But also some less common ones, like OrderedDict >>> Iterable(range(6)).enumerate(4).as_ordereddict() OrderedDict({4: 0, 5: 1, 6: 2, 7: 3, 8: 4, 9: 5}) .. warning:: These latter methods are (obviously) not available for :py:class:`.InfiniteIterable`! """ __slots__ = ('_iterable',) def __length_hint__(self): return length_hint(self._iterable) def as_(self, cls): """Convert :py:class:`.Iterable` to other class. Parameters ---------- cls : :py:class:`type` Convert the content of :py:class:`.Iterable` to this class. Returns ------- iterable : cls The :py:class:`.Iterable` as `cls`. Notes ----- Be careful if you use this method because the :py:class:`.Iterable` may be infinite. """ return cls(self._iterable) def as_list(self): """See :py:meth:`.as_`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(5)).as_list() [0, 1, 2, 3, 4] """ return self.as_(list) def as_tuple(self): """See :py:meth:`.as_`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(5)).as_tuple() (0, 1, 2, 3, 4) """ return self.as_(tuple) def as_set(self): """See :py:meth:`.as_`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1]).as_set() {1} """ return self.as_(set) def as_frozenset(self): """See :py:meth:`.as_`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([5]).as_frozenset() frozenset({5}) """ return self.as_(frozenset) def as_dict(self): """See :py:meth:`.as_`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1]).enumerate().as_dict() {0: 1} """ return self.as_(dict) def as_ordereddict(self): """See :py:meth:`.as_`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(3, 6)).enumerate().as_ordereddict() OrderedDict({0: 3, 1: 4, 2: 5}) """ return self.as_(OrderedDict) def as_counter(self): """See :py:meth:`.as_`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable('supercalifragilisticexpialidocious').as_counter() Counter({'i': 7, 's': 3, 'c': 3, 'a': 3, 'l': 3, 'u': 2, 'p': 2, 'e': 2, 'r': 2, 'o': 2, 'f': 1, 'g': 1, 't': 1, 'x': 1, 'd': 1}) >>> Iterable([1, 1, 1]).as_counter() Counter({1: 3}) """ return self.as_(Counter) def as_string(self, seperator=''): """Get the :py:class:`.Iterable` as string. .. warning:: This method **does not** use :py:meth:`.as_` and differs from ``str(Iterable(sth))``; it uses :py:meth:`str.join`. Parameters ---------- seperator : :py:class:`str`, optional The separator between each item from the iterable in the output string. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(5)).as_string() '01234' >>> Iterable(range(5)).as_string(' ') '0 1 2 3 4' """ return seperator.join(map(str, self._iterable)) def reversed(self): """See :py:func:`python:reversed`. .. warning:: This method requires that the wrapped iterable is a `Sequence` or implements the `reversed` iterator protocol. Generally this does not work with generators! Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, 3]).reversed().as_list() [3, 2, 1] """ return self.__class__(reversed(self._iterable)) def _get(self, fn, pos, *args, **kwargs): args = _parse_args(args, self._iterable, pos) _parse_kwargs(kwargs, _default) return fn(*args, **kwargs) def _get_iter(self, fn, pos, *args, **kwargs): args = _parse_args(args, self._iterable, pos) _parse_kwargs(kwargs, _default) return fn(*args, **kwargs) def get_all(self): """See :py:func:`python:all`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).map(lambda x: x > 2).get_all() False >>> Iterable(range(10)).map(lambda x: x >= 0).get_all() True """ return self._get(all, 0) def get_all_distinct(self): """See :py:func:`~iteration_utilities.all_distinct`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_all_distinct() True >>> Iterable([1, 2, 3, 4, 5, 6, 7, 1]).get_all_distinct() False """ return self._get(all_distinct, 0) def get_all_equal(self): """See :py:func:`~iteration_utilities.all_equal`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_all_equal() False >>> Iterable([1]*100).get_all_equal() True """ return self._get(all_equal, 0) def get_all_monotone(self, decreasing=_default, strict=_default): """See :py:func:`~iteration_utilities.all_monotone`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_all_monotone() True >>> Iterable(range(10)).get_all_monotone(decreasing=False, \ strict=False) True >>> Iterable(range(10)).get_all_monotone(decreasing=True, strict=True) False """ return self._get(all_monotone, 0, decreasing=decreasing, strict=strict) def get_any(self): """See :py:func:`python:any`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).map(lambda x: x > 2).get_any() True >>> Iterable(range(10)).map(lambda x: x >= 10).get_any() False """ return self._get(any, 0) def get_argmax(self, key=_default, default=_default): """See :py:func:`~iteration_utilities.argmax`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, -5, 3, 4]).get_argmax() 4 >>> Iterable([1, 2, -5, 3, 4]).get_argmax(abs) 2 >>> Iterable([1, 2, -5, 3, 4]).get_argmax(key=abs) 2 >>> Iterable([]).get_argmax(key=abs, default=-1) -1 """ return self._get(argmax, 0, key=key, default=default) def get_argmin(self, key=_default, default=_default): """See :py:func:`~iteration_utilities.argmin`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, -5, 3, 4]).get_argmin() 2 >>> Iterable([1, 2, -5, 3, 4]).get_argmin(abs) 0 >>> Iterable([1, 2, -5, 3, 4]).get_argmin(key=abs) 0 >>> Iterable([]).get_argmin(key=abs, default=-1) -1 """ return self._get(argmin, 0, key=key, default=default) def get_argsorted(self, key=_default, reverse=_default): """See :py:func:`~iteration_utilities.argsorted`. 
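The index-sorting semantics behind `get_argsorted` can be approximated in pure Python; this is a hypothetical sketch (the real `argsorted` is a C function from `iteration_utilities`), returning the indices that would sort the input:

```python
def argsorted(iterable, key=None, reverse=False):
    # Indices that would sort the iterable, analogous to numpy.argsort.
    items = list(iterable)
    sort_key = (lambda i: items[i]) if key is None else (lambda i: key(items[i]))
    return sorted(range(len(items)), key=sort_key, reverse=reverse)

argsorted([1, 2, -5, 3, 4])           # [2, 0, 1, 3, 4]
argsorted([1, 2, -5, 3, 4], key=abs)  # [0, 1, 3, 4, 2]
```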
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, -5, 3, 4]).get_argsorted() [2, 0, 1, 3, 4] >>> Iterable([1, 2, -5, 3, 4]).get_argsorted(reverse=True) [4, 3, 1, 0, 2] >>> Iterable([1, 2, -5, 3, 4]).get_argsorted(key=abs) [0, 1, 3, 4, 2] >>> Iterable([1, 2, -5, 3, 4]).get_argsorted(abs, True) [2, 4, 3, 1, 0] """ return self._get(argsorted, 0, key=key, reverse=reverse) def get_count_items(self, pred=_default, eq=_default): """See :py:func:`~iteration_utilities.count_items`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable((i for i in range(10))).get_count_items() 10 >>> Iterable([1, 2, 3, 2, 1]).get_count_items(2, True) 2 >>> Iterable([1, 2, 3, 2, 1]).get_count_items(pred=2, eq=True) 2 """ return self._get(count_items, 0, pred=pred, eq=eq) def get_first(self, default=_default, pred=_default, truthy=_default, retpred=_default, retidx=_default): """See :py:func:`~iteration_utilities.nth`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_first() 0 >>> Iterable(range(1, 10, 2)).get_first(pred=lambda x: x > 5) 7 """ return self._get(first, 0, default=default, pred=pred, truthy=truthy, retpred=retpred, retidx=retidx) def get_fsum(self): """See :py:func:`math.fsum`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_fsum() 45.0 """ return self._get(fsum, 0) def get_groupedby(self, key, keep=_default, reduce=_default, reducestart=_default): """See :py:func:`~iteration_utilities.groupedby`. Examples -------- >>> from iteration_utilities import Iterable, is_even >>> grp = Iterable(range(10)).get_groupedby(is_even) >>> grp[True] [0, 2, 4, 6, 8] >>> grp[False] [1, 3, 5, 7, 9] """ return self._get(groupedby, 0, key, keep=keep, reduce=reduce, reducestart=reducestart) def get_last(self, default=_default, pred=_default, truthy=_default, retpred=_default, retidx=_default): """See :py:func:`~iteration_utilities.nth`. 
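The predicate-aware lookup that `get_first` (and the other `nth`-based getters) delegate to can be sketched with `iter`/`next` alone; a minimal stand-in, assuming only stdlib behaviour (`first` here is a hypothetical pure-Python version, not the C implementation):

```python
def first(iterable, default=None, pred=None):
    # Return the first item (or the first item satisfying `pred`),
    # falling back to `default` when nothing matches.
    it = iter(iterable) if pred is None else (x for x in iterable if pred(x))
    return next(it, default)

first(range(10))                            # 0
first(range(1, 10, 2), pred=lambda x: x > 5)  # 7
```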
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_last() 9 >>> Iterable(range(1, 10, 2)).get_last(pred=lambda x: x > 5) 9 """ return self._get(last, 0, default=default, pred=pred, truthy=truthy, retpred=retpred, retidx=retidx) def get_max(self, key=_default, default=_default): """See :py:func:`python:max`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, -5, 3, 4]).get_max() 4 >>> Iterable([1, 2, -5, 3, 4]).get_max(abs) -5 >>> Iterable([1, 2, -5, 3, 4]).get_max(key=abs) -5 >>> Iterable([]).get_max(key=abs, default=-1) -1 """ return self._get(max, 0, key=key, default=default) def get_min(self, key=_default, default=_default): """See :py:func:`python:min`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, -5, 3, 4]).get_min() -5 >>> Iterable([1, 2, -5, 3, 4]).get_min(abs) 1 >>> Iterable([1, 2, -5, 3, 4]).get_min(key=abs) 1 >>> Iterable([]).get_min(key=abs, default=-1) -1 """ return self._get(min, 0, key=key, default=default) def get_minmax(self, key=_default, default=_default): """See :py:func:`~iteration_utilities.minmax`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1, 2, -5, 3, 4]).get_minmax() (-5, 4) >>> Iterable([1, 2, -5, 3, 4]).get_minmax(abs) (1, -5) >>> Iterable([1, 2, -5, 3, 4]).get_minmax(key=abs) (1, -5) >>> Iterable([]).get_minmax(key=abs, default=-1) (-1, -1) """ return self._get(minmax, 0, key=key, default=default) def get_nth(self, n, default=_default, pred=_default, truthy=_default, retpred=_default, retidx=_default): """See :py:func:`~iteration_utilities.nth`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_nth(6) 6 >>> Iterable(range(1, 10, 2)).get_nth(0, pred=lambda x: x > 5) 7 """ return self._get(nth(n), 0, default=default, pred=pred, truthy=truthy, retpred=retpred, retidx=retidx) def get_nlargest(self, n, key=_default): """See :py:func:`heapq.nlargest`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([4,1,2,3,1,5,8,2,-10]).get_nlargest(3) [8, 5, 4] >>> Iterable([4,1,2,3,1,5,8,2,-10]).get_nlargest(3, key=abs) [-10, 8, 5] """ return self._get(nlargest, 1, n, key=key) def get_nsmallest(self, n, key=_default): """See :py:func:`heapq.nsmallest`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([4,1,2,3,1,5,8,2]).get_nsmallest(3) [1, 1, 2] """ return self._get(nsmallest, 1, n, key=key) def get_one(self): """See :py:func:`~iteration_utilities.one`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1]).get_one() 1 """ return self._get(one, 0) def get_partition(self, pred=_default): """See :py:func:`~iteration_utilities.partition`. Examples -------- >>> from iteration_utilities import Iterable, is_even >>> Iterable(range(5)).get_partition(is_even) ([1, 3], [0, 2, 4]) """ return self._get(partition, 0, pred=pred) def get_reduce(self, *args): """See :py:func:`functools.reduce`. Examples -------- >>> from iteration_utilities import Iterable >>> from operator import add >>> Iterable(range(5)).get_reduce(add) 10 """ return self._get(reduce, 1, *args) def get_second(self, default=_default, pred=_default, truthy=_default, retpred=_default, retidx=_default): """See :py:func:`~iteration_utilities.nth`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_second() 1 >>> Iterable(range(1, 10, 2)).get_second(pred=lambda x: x > 5) 9 """ return self._get(second, 0, default=default, pred=pred, truthy=truthy, retpred=retpred, retidx=retidx) def get_sorted(self, key=_default, reverse=_default): """See :py:func:`python:sorted`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([3, 1, 5, 12, 7]).get_sorted() [1, 3, 5, 7, 12] """ return self._get(sorted, 0, key=key, reverse=reverse) def get_sum(self, start=_default): """See :py:func:`python:sum`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([3, 1, 5, 12, 7]).get_sum() 28 >>> Iterable([3, 1, 5, 12, 7]).get_sum(10) 38 """ if start is _default: return self._get(sum, 0) return self._get(sum, 0, start) def get_third(self, default=_default, pred=_default, truthy=_default, retpred=_default, retidx=_default): """See :py:func:`~iteration_utilities.nth`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_third() 2 >>> Iterable(range(1, 10, 2)).get_third(default=-1, ... pred=lambda x: x > 5) -1 """ return self._get(third, 0, default=default, pred=pred, truthy=truthy, retpred=retpred, retidx=retidx) def get_mean(self): """See :py:func:`statistics.mean`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_mean() 4.5 """ return self._get_iter(statistics.mean, 0) def get_median(self): """See :py:func:`statistics.median`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(11)).get_median() 5 """ return self._get_iter(statistics.median, 0) def get_median_low(self): """See :py:func:`statistics.median_low`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_median_low() 4 """ return self._get_iter(statistics.median_low, 0) def get_median_high(self): """See :py:func:`statistics.median_high`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_median_high() 5 """ return self._get_iter(statistics.median_high, 0) def get_median_grouped(self, interval=_default): """See :py:func:`statistics.median_grouped`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable(range(10)).get_median_grouped(interval=4) 3.0 """ return self._get_iter(statistics.median_grouped, 0, interval=interval) def get_mode(self): """See :py:func:`statistics.mode`. 
Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_mode() 1 """ return self._get_iter(statistics.mode, 0) def get_pstdev(self, mu=_default): """See :py:func:`statistics.pstdev`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_pstdev() 2.635667953694125 """ return self._get_iter(statistics.pstdev, 0, mu=mu) def get_pvariance(self, mu=_default): """See :py:func:`statistics.pvariance`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_pvariance() \ # doctest: +ELLIPSIS 6.94674556... """ return self._get_iter(statistics.pvariance, 0, mu=mu) def get_stdev(self, xbar=_default): """See :py:func:`statistics.stdev`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_stdev() 2.743290182543769 """ return self._get_iter(statistics.stdev, 0, xbar=xbar) def get_variance(self, mu=_default): """See :py:func:`statistics.variance`. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_variance() 7.5256410256410255 """ return self._get_iter(statistics.variance, 0, mu=mu) def get_harmonic_mean(self): """See :py:func:`statistics.harmonic_mean`. .. note:: Python >= 3.6 is required for this function. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_harmonic_mean() # doctest: +ELLIPSIS 2.369791... """ return self._get_iter(statistics.harmonic_mean, 0) def get_fmean(self): """See :py:func:`statistics.fmean`. .. note:: Python >= 3.8 is required for this function. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_fmean() # doctest: +ELLIPSIS 4.230769... """ return self._get_iter(statistics.fmean, 0) def get_geometric_mean(self): """See :py:func:`statistics.geometric_mean`. .. 
note:: Python >= 3.8 is required for this function. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_geometric_mean() # doctest: +ELLIPSIS 3.250146... """ return self._get_iter(statistics.geometric_mean, 0) def get_multimode(self): """See :py:func:`statistics.multimode`. .. note:: Python >= 3.8 is required for this function. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,2,3,4,5,6,7,7,8,8]).get_multimode() [1, 2] """ return self._get_iter(statistics.multimode, 0) def get_quantiles(self, n=_default, method=_default): """See :py:func:`statistics.quantiles`. .. note:: Python >= 3.8 is required for this function. Examples -------- >>> from iteration_utilities import Iterable >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_quantiles() [1.5, 4.0, 7.0] >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_quantiles(n=10) [1.0, 1.0, 2.0, 2.6, 4.0, 5.4, 6.8, 7.2, 8.0] >>> Iterable([1,1,1,2,2,3,4,5,6,7,7,8,8]).get_quantiles(n=10, method='inclusive') [1.0, 1.4, 2.0, 2.8, 4.0, 5.2, 6.4, 7.0, 7.8] """ return self._get_iter(statistics.quantiles, 0, n=n, method=method) class InfiniteIterable(_Base): """Like :py:class:`.Iterable` but indicates that the wrapped iterable is infinitely long. .. warning:: The ``Iterable.as_*`` methods are not available for :py:class:`.InfiniteIterable` because it would be impossible to create these types. Use :py:meth:`.InfiniteIterable.islice` or :py:meth:`.InfiniteIterable.takewhile` to convert an infinite iterable to a finite iterable. It is still possible to iterate over the iterable with ``for item in ...`` or using the Python constructors like :py:class:`list` directly. This may fail fatally! Mostly it is not necessary to use :py:class:`.InfiniteIterable` directly because the corresponding methods on :py:class:`.Iterable` return an :py:class:`.InfiniteIterable` when appropriate. 
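The finite-conversion advice above (use `islice` or `takewhile` before materializing an infinite iterable) has a direct stdlib analogue; a minimal sketch using only `itertools`:

```python
from itertools import count, islice, takewhile

# An infinite iterator: 0, 1, 2, ... -- calling list() on it would hang.
infinite = count()

# islice() bounds it by position ...
first_five = list(islice(infinite, 5))

# ... while takewhile() bounds it by a predicate.
below_ten = list(takewhile(lambda x: x < 10, count(5)))
```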
However using ``isinstance(some_iterable, InfiniteIterable)`` could be used to determine if the :py:class:`.Iterable` is infinite! Available methods: =========================================================== ======================================================= Method Reference =========================================================== ======================================================= :py:meth:`~.InfiniteIterable.accumulate` See :py:func:`~iteration_utilities.accumulate`. :py:meth:`~.InfiniteIterable.clamp` See :py:func:`~iteration_utilities.clamp`. :py:meth:`~.InfiniteIterable.combinations` See :py:func:`itertools.combinations`. :py:meth:`~.InfiniteIterable.combinations_with_replacement` See :py:func:`itertools.combinations_with_replacement`. :py:meth:`~.InfiniteIterable.compress` See :py:func:`itertools.compress`. :py:meth:`~.InfiniteIterable.cycle` See :py:func:`itertools.cycle`. :py:meth:`~.InfiniteIterable.deepflatten` See :py:func:`~iteration_utilities.deepflatten`. :py:meth:`~.InfiniteIterable.dropwhile` See :py:func:`itertools.dropwhile`. :py:meth:`~.InfiniteIterable.duplicates` See :py:func:`~iteration_utilities.duplicates`. :py:meth:`~.InfiniteIterable.enumerate` See :py:func:`python:enumerate`. :py:meth:`~.InfiniteIterable.filter` See :py:func:`python:filter`. :py:meth:`~.InfiniteIterable.filterfalse` See :py:func:`itertools.filterfalse`. :py:meth:`~.InfiniteIterable.flatten` See :py:func:`~iteration_utilities.flatten`. :py:meth:`~.InfiniteIterable.from_applyfunc` See :py:func:`~iteration_utilities.applyfunc`. :py:meth:`~.InfiniteIterable.from_count` See :py:func:`itertools.count`. :py:meth:`~.InfiniteIterable.from_empty` See :py:func:`~iteration_utilities.empty`. :py:meth:`~.InfiniteIterable.from_iterfunc_exception` See :py:func:`~iteration_utilities.iter_except`. :py:meth:`~.InfiniteIterable.from_iterfunc_sentinel` See :py:func:`python:iter`. 
:py:meth:`~.InfiniteIterable.from_itersubclasses` See :py:func:`~iteration_utilities.itersubclasses`. :py:meth:`~.InfiniteIterable.from_repeat` See :py:func:`itertools.repeat`. :py:meth:`~.InfiniteIterable.from_repeatfunc` See :py:func:`~iteration_utilities.repeatfunc`. :py:meth:`~.InfiniteIterable.from_tabulate` See :py:func:`~iteration_utilities.tabulate`. :py:meth:`~.InfiniteIterable.getitem` See :py:func:`~iteration_utilities.getitem` :py:meth:`~.InfiniteIterable.grouper` See :py:func:`~iteration_utilities.grouper`. :py:meth:`~.InfiniteIterable.insert` See :py:func:`~iteration_utilities.insert` :py:meth:`~.InfiniteIterable.intersperse` See :py:func:`~iteration_utilities.intersperse`. :py:meth:`~.InfiniteIterable.islice` See :py:func:`itertools.islice`. :py:meth:`~.InfiniteIterable.map` See :py:func:`python:map`. :py:meth:`~.InfiniteIterable.ncycles` See :py:func:`~iteration_utilities.ncycles`. :py:meth:`~.InfiniteIterable.pad` See :py:func:`~iteration_utilities.pad`. :py:meth:`~.InfiniteIterable.permutations` See :py:func:`itertools.permutations`. :py:meth:`~.InfiniteIterable.powerset` See :py:func:`~iteration_utilities.powerset`. :py:meth:`~.InfiniteIterable.remove` See :py:func:`~iteration_utilities.remove`. :py:meth:`~.InfiniteIterable.replace` See :py:func:`~iteration_utilities.replace`. :py:meth:`~.InfiniteIterable.replicate` See :py:func:`~iteration_utilities.replicate`. :py:meth:`~.InfiniteIterable.split` See :py:func:`~iteration_utilities.split`. :py:meth:`~.InfiniteIterable.starfilter` See :py:func:`~iteration_utilities.starfilter`. :py:meth:`~.InfiniteIterable.starmap` See :py:func:`itertools.starmap`. :py:meth:`~.InfiniteIterable.successive` See :py:func:`~iteration_utilities.successive`. :py:meth:`~.InfiniteIterable.tail` See :py:func:`~iteration_utilities.tail`. :py:meth:`~.InfiniteIterable.takewhile` See :py:func:`itertools.takewhile`. :py:meth:`~.InfiniteIterable.unique_everseen` See :py:func:`~iteration_utilities.unique_everseen`. 
:py:meth:`~.InfiniteIterable.unique_justseen` See :py:func:`~iteration_utilities.unique_justseen`. =========================================================== ======================================================= """ __slots__ = ('_iterable',) class ManyIterables: __slots__ = ('_iterables',) def __init__(self, *iterables): """:py:class:`.ManyIterables` stores several `iterables` and implements methods to convert these to one :py:class:`.Iterable`. .. warning:: :py:class:`.ManyIterables` itself cannot be iterated! Parameters ---------- *iterables : any amount of iterables The `iterables` to store. Notes ----- This is just a convenience class to separate the expressions dealing with multiple iterables from those applying on one. Available methods: ===================================== =============================================== Method Reference ===================================== =============================================== :py:meth:`~ManyIterables.chain` See :py:func:`itertools.chain`. :py:meth:`~ManyIterables.map` See :py:func:`python:map`. :py:meth:`~ManyIterables.merge` See :py:func:`~iteration_utilities.merge`. :py:meth:`~ManyIterables.product` See :py:func:`itertools.product`. :py:meth:`~ManyIterables.roundrobin` See :py:func:`~iteration_utilities.roundrobin`. :py:meth:`~ManyIterables.zip` See :py:func:`python:zip`. :py:meth:`~ManyIterables.zip_longest` See :py:func:`itertools.zip_longest`. ===================================== =============================================== Examples -------- Depending on the function and the types of the `iterables` the returned class may be different. For example :py:meth:`map` returns an :py:class:`.InfiniteIterable` if **all** `iterables` are infinite:: >>> from iteration_utilities import ManyIterables >>> ManyIterables(Iterable.from_count(10), range(10)).map(pow) \ # doctest: +ELLIPSIS <Iterable: <map object at ...>> >>> ManyIterables(Iterable.from_count(10), ... 
Iterable.from_count(10)).map(pow) \ # doctest: +ELLIPSIS <InfiniteIterable: <map object at ...>> While other methods also return an :py:class:`.InfiniteIterable` if **any** of the `iterables` is infinite:: >>> ManyIterables(range(10), Iterable.from_count(10)).merge() \ # doctest: +ELLIPSIS <InfiniteIterable: <iteration_utilities.merge object at ...>> >>> ManyIterables(range(10), range(10)).merge() \ # doctest: +ELLIPSIS <Iterable: <iteration_utilities.merge object at ...>> Each method has a note explicitly stating to which of these categories it belongs. """ self._iterables = iterables def _call(self, fn, infinitecheck, *args, **kwargs): iterables = self._iterables if infinitecheck and any_isinstance(iterables, InfiniteIterable): cls = InfiniteIterable elif not infinitecheck and all_isinstance(iterables, InfiniteIterable): cls = InfiniteIterable else: cls = Iterable if args: iterables = args + iterables _parse_kwargs(kwargs, _default) return cls(fn(*iterables, **kwargs)) def chain(self): """See :py:func:`itertools.chain`. .. note:: If any of the `iterables` is infinite then this will also return an :py:class:`.InfiniteIterable`. Examples -------- >>> from iteration_utilities import ManyIterables >>> ManyIterables([1,3,5,7,9], [0,2,4,6,8]).chain().as_list() [1, 3, 5, 7, 9, 0, 2, 4, 6, 8] """ return self._call(chain, True) def map(self, function): """See :py:func:`python:map`. .. note:: If **all** of the `iterables` is infinite then this will also return an :py:class:`.InfiniteIterable`. Examples -------- >>> from iteration_utilities import ManyIterables >>> ManyIterables([1,3,5,7,9], [0,2,4,6,8]).map(pow).as_list() [1, 9, 625, 117649, 43046721] """ return self._call(map, False, function) def merge(self, key=_default, reverse=_default): """See :py:func:`~iteration_utilities.merge`. .. note:: If any of the `iterables` is infinite then this will also return an :py:class:`.InfiniteIterable`. 
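The any/all dispatch rule that `_call` applies via its `infinitecheck` flag can be sketched without the C helpers; `FiniteIt`, `InfiniteIt`, and `result_is_infinite` below are hypothetical stand-ins for `Iterable`, `InfiniteIterable`, and the real `any_isinstance`/`all_isinstance` checks:

```python
class FiniteIt:          # stand-in for Iterable
    pass

class InfiniteIt(FiniteIt):  # stand-in for InfiniteIterable
    pass

def result_is_infinite(iterables, infinitecheck):
    # infinitecheck=True  -> infinite if ANY input is infinite (chain, merge, ...)
    # infinitecheck=False -> infinite only if ALL inputs are infinite (map, zip)
    check = any if infinitecheck else all
    return check(isinstance(i, InfiniteIt) for i in iterables)
```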
Examples -------- >>> from iteration_utilities import ManyIterables >>> ManyIterables([1,3,5,7,9], [0,2,4,6,8]).merge().as_list() [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] >>> from operator import neg >>> ManyIterables([1,3,5,7,9], [0,2,4,6,8]).merge(neg, True).as_list() [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] >>> ManyIterables([1,3,5,7,9], [0,2,4,6,8]).merge( ... key=neg, reverse=True).as_list() [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] """ return self._call(merge, True, key=key, reverse=reverse) def product(self, repeat=_default): """See :py:func:`itertools.product`. .. note:: If any of the `iterables` is infinite then this will also return an :py:class:`.InfiniteIterable`. Examples -------- >>> from iteration_utilities import ManyIterables >>> ManyIterables([1,2], [10, 11, 12]).product().as_list() [(1, 10), (1, 11), (1, 12), (2, 10), (2, 11), (2, 12)] >>> ManyIterables([1], [10, 11]).product(2).as_list() [(1, 10, 1, 10), (1, 10, 1, 11), (1, 11, 1, 10), (1, 11, 1, 11)] >>> ManyIterables([1], [10, 11]).product(repeat=2).as_list() [(1, 10, 1, 10), (1, 10, 1, 11), (1, 11, 1, 10), (1, 11, 1, 11)] """ return self._call(product, True, repeat=repeat) def roundrobin(self): """See :py:func:`~iteration_utilities.roundrobin`. .. note:: If any of the `iterables` is infinite then this will also return an :py:class:`.InfiniteIterable`. Examples -------- >>> from iteration_utilities import ManyIterables >>> ManyIterables([1,2,3,4], [10, 11, 12]).roundrobin().as_list() [1, 10, 2, 11, 3, 12, 4] """ return self._call(roundrobin, True) def zip(self): """See :py:func:`python:zip`. .. note:: If **all** of the `iterables` is infinite then this will also return an :py:class:`.InfiniteIterable`. Examples -------- >>> from iteration_utilities import ManyIterables >>> ManyIterables([1,2,3,4], [2,3,4,5]).zip().as_list() [(1, 2), (2, 3), (3, 4), (4, 5)] """ return self._call(zip, False) def zip_longest(self, fillvalue=_default): """See :py:func:`itertools.zip_longest`. .. 
note::
           If any of the `iterables` is infinite then this will also return
           an :py:class:`.InfiniteIterable`.

        Examples
        --------
        >>> from iteration_utilities import ManyIterables
        >>> ManyIterables([1,2,3,4], [2,3,4]).zip_longest().as_list()
        [(1, 2), (2, 3), (3, 4), (4, None)]
        >>> ManyIterables([1,2,3,4], [2,3,4]).zip_longest('x').as_list()
        [(1, 2), (2, 3), (3, 4), (4, 'x')]
        >>> ManyIterables([1,2,3,4], [2,3,4]).zip_longest(
        ...     fillvalue='x').as_list()
        [(1, 2), (2, 3), (3, 4), (4, 'x')]
        """
        return self._call(zip_longest, True, fillvalue=fillvalue)

iteration_utilities-0.12.1/src/iteration_utilities/_convenience.py:

# Licensed under Apache License Version 2.0 - see LICENSE

from iteration_utilities import constant, nth

__all__ = ["return_True", "return_False", "return_None",
           "first", "second", "third", "last"]

return_True = constant(True)
return_False = constant(False)
return_None = constant(None)

first = nth(0)
second = nth(1)
third = nth(2)
last = nth(-1)

iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/_iteration_utilities.c:

/******************************************************************************
 * Licensed under Apache License Version 2.0 - see LICENSE
 *****************************************************************************/

#define PY_SSIZE_T_CLEAN
#include <Python.h>

#include "docsfunctions.h"
#include "helper.h"

#include "always_iterable.h"
#include "exported_helper.h"
#include "isx.h"
#include "itemidxkey.h"
#include "mathematical.h"
#include "placeholder.h"
#include "returnx.h"
#include "seen.h"

#include "chained.h"
#include "complement.h"
#include "constant.h" #include "flip.h" #include "packed.h" #include "partial.h" #include "nth.h" #include "alldistinct.h" #include "allequal.h" #include "allisinstance.h" #include "allmonotone.h" #include "anyisinstance.h" #include "argminmax.h" #include "countitems.h" #include "dotproduct.h" #include "groupedby.h" #include "minmax.h" #include "one.h" #include "partition.h" #include "accumulate.h" #include "applyfunc.h" #include "clamp.h" #include "deepflatten.h" #include "duplicates.h" #include "empty.h" #include "grouper.h" #include "intersperse.h" #include "iterexcept.h" #include "merge.h" #include "replicate.h" #include "roundrobin.h" #include "sideeffect.h" #include "split.h" #include "starfilter.h" #include "successive.h" #include "tabulate.h" #include "uniqueever.h" #include "uniquejust.h" static PyMethodDef PyIU_methods[] = { /* isx */ { "is_None", /* ml_name */ (PyCFunction)PyIU_IsNone, /* ml_meth */ METH_O, /* ml_flags */ PyIU_IsNone_doc /* ml_doc */ }, { "is_not_None", /* ml_name */ (PyCFunction)PyIU_IsNotNone, /* ml_meth */ METH_O, /* ml_flags */ PyIU_IsNotNone_doc /* ml_doc */ }, { "is_even", /* ml_name */ (PyCFunction)PyIU_IsEven, /* ml_meth */ METH_O, /* ml_flags */ PyIU_IsEven_doc /* ml_doc */ }, { "is_odd", /* ml_name */ (PyCFunction)PyIU_IsOdd, /* ml_meth */ METH_O, /* ml_flags */ PyIU_IsOdd_doc /* ml_doc */ }, { "is_iterable", /* ml_name */ (PyCFunction)PyIU_IsIterable, /* ml_meth */ METH_O, /* ml_flags */ PyIU_IsIterable_doc /* ml_doc */ }, /* Math */ { "square", /* ml_name */ (PyCFunction)PyIU_MathSquare, /* ml_meth */ METH_O, /* ml_flags */ PyIU_MathSquare_doc /* ml_doc */ }, { "double", /* ml_name */ (PyCFunction)PyIU_MathDouble, /* ml_meth */ METH_O, /* ml_flags */ PyIU_MathDouble_doc /* ml_doc */ }, { "reciprocal", /* ml_name */ (PyCFunction)PyIU_MathReciprocal, /* ml_meth */ METH_O, /* ml_flags */ PyIU_MathReciprocal_doc /* ml_doc */ }, { "radd", /* ml_name */ (PyCFunction)PyIU_MathRadd, /* ml_meth */ METH_VARARGS, /* ml_flags */ 
PyIU_MathRadd_doc /* ml_doc */ }, { "rsub", /* ml_name */ (PyCFunction)PyIU_MathRsub, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_MathRsub_doc /* ml_doc */ }, { "rmul", /* ml_name */ (PyCFunction)PyIU_MathRmul, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_MathRmul_doc /* ml_doc */ }, { "rdiv", /* ml_name */ (PyCFunction)PyIU_MathRdiv, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_MathRdiv_doc /* ml_doc */ }, { "rfdiv", /* ml_name */ (PyCFunction)PyIU_MathRfdiv, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_MathRfdiv_doc /* ml_doc */ }, { "rpow", /* ml_name */ (PyCFunction)PyIU_MathRpow, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_MathRpow_doc /* ml_doc */ }, { "rmod", /* ml_name */ (PyCFunction)PyIU_MathRmod, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_MathRmod_doc /* ml_doc */ }, /* Helper */ { "_parse_args", /* ml_name */ #if PyIU_USE_VECTORCALL (PyCFunction)(void (*)(void))PyIU_TupleToList_and_InsertItemAtIndex, /* ml_meth */ METH_FASTCALL, /* ml_flags */ #else (PyCFunction)PyIU_TupleToList_and_InsertItemAtIndex, /* ml_meth */ METH_VARARGS, /* ml_flags */ #endif PyIU_TupleToList_and_InsertItemAtIndex_doc /* ml_doc */ }, { "_parse_kwargs", /* ml_name */ #if PyIU_USE_VECTORCALL (PyCFunction)(void (*)(void))PyIU_RemoveFromDictWhereValueIs, /* ml_meth */ METH_FASTCALL, /* ml_flags */ #else (PyCFunction)PyIU_RemoveFromDictWhereValueIs, /* ml_meth */ METH_VARARGS, /* ml_flags */ #endif PyIU_RemoveFromDictWhereValueIs_doc /* ml_doc */ }, /* returnx */ { "return_identity", /* ml_name */ (PyCFunction)PyIU_ReturnIdentity, /* ml_meth */ METH_O, /* ml_flags */ PyIU_ReturnIdentity_doc /* ml_doc */ }, { "return_first_arg", /* ml_name */ (PyCFunction)PyIU_ReturnFirstArg, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_ReturnFirstArg_doc /* ml_doc */ }, { "return_called", /* ml_name */ (PyCFunction)PyIU_ReturnCalled, /* ml_meth */ METH_O, /* ml_flags */ PyIU_ReturnCalled_doc /* ml_doc */ }, { "always_iterable", /* ml_name */ 
(PyCFunction)PyIU_AlwaysIterable, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_AlwaysIterable_doc /* ml_doc */ }, /* Fold functions */ { "argmin", /* ml_name */ (PyCFunction)PyIU_Argmin, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_Argmin_doc /* ml_doc */ }, { "argmax", /* ml_name */ (PyCFunction)PyIU_Argmax, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_Argmax_doc /* ml_doc */ }, { "all_distinct", /* ml_name */ (PyCFunction)PyIU_AllDistinct, /* ml_meth */ METH_O, /* ml_flags */ PyIU_AllDistinct_doc /* ml_doc */ }, { "all_equal", /* ml_name */ (PyCFunction)PyIU_AllEqual, /* ml_meth */ METH_O, /* ml_flags */ PyIU_AllEqual_doc /* ml_doc */ }, { "all_isinstance", /* ml_name */ (PyCFunction)PyIU_AllIsinstance, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_AllIsinstance_doc /* ml_doc */ }, { "all_monotone", /* ml_name */ (PyCFunction)PyIU_Monotone, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_Monotone_doc /* ml_doc */ }, { "any_isinstance", /* ml_name */ (PyCFunction)PyIU_AnyIsinstance, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_AnyIsinstance_doc /* ml_doc */ }, { "count_items", /* ml_name */ (PyCFunction)PyIU_Count, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_Count_doc /* ml_doc */ }, { "dotproduct", /* ml_name */ (PyCFunction)PyIU_DotProduct, /* ml_meth */ METH_VARARGS, /* ml_flags */ PyIU_DotProduct_doc /* ml_doc */ }, { "groupedby", /* ml_name */ (PyCFunction)PyIU_Groupby, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_Groupby_doc /* ml_doc */ }, { "minmax", /* ml_name */ (PyCFunction)PyIU_MinMax, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ PyIU_MinMax_doc /* ml_doc */ }, { "one", /* ml_name */ (PyCFunction)PyIU_One, /* ml_meth */ METH_O, /* ml_flags */ PyIU_One_doc /* ml_doc */ }, { "partition", /* ml_name */ (PyCFunction)PyIU_Partition, /* ml_meth */ METH_VARARGS | METH_KEYWORDS, /* ml_flags */ 
PyIU_Partition_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static int _iteration_utilities_exec(PyObject *module) { /* Classes available in module. */ PyTypeObject *typelist[] = { &PyIUType_ItemIdxKey, &PyIUType_Seen, &PyIUType_Chained, &PyIUType_Complement, &PyIUType_Constant, &PyIUType_Flip, &PyIUType_Packed, &Placeholder_Type, &PyIUType_Partial, &PyIUType_Nth, &PyIUType_Accumulate, &PyIUType_Applyfunc, &PyIUType_Clamp, &PyIUType_DeepFlatten, &PyIUType_Duplicates, &PyIUType_Empty, &PyIUType_Grouper, &PyIUType_Intersperse, &PyIUType_Iterexcept, &PyIUType_Merge, &PyIUType_Replicate, &PyIUType_Roundrobin, &PyIUType_Sideeffects, &PyIUType_Split, &PyIUType_Starfilter, &PyIUType_Successive, &PyIUType_Tabulate, &PyIUType_UniqueEver, &PyIUType_UniqueJust, NULL }; size_t i; /* Add classes to the module but only use the name starting after the first occurrence of ".". */ for (i = 0; typelist[i] != NULL; i++) { #if PyIU_USE_BUILTIN_MODULE_ADDTYPE if (PyModule_AddType(module, typelist[i]) < 0) { return -1; } #else char *name; if (PyType_Ready(typelist[i]) < 0) { return -1; } name = strrchr(typelist[i]->tp_name, '.'); assert(name != NULL); Py_INCREF(typelist[i]); if (PyModule_AddObject(module, name + 1, (PyObject *)typelist[i]) < 0) { return -1; } #endif } Py_INCREF(PYIU_Placeholder); if (PyModule_AddObject(module, PyIU_Placeholder_name, PYIU_Placeholder) < 0) { return -1; } Py_INCREF(PYIU_Empty); if (PyModule_AddObject(module, PyIU_Empty_name, PYIU_Empty) < 0) { return -1; } if (PyDict_SetItemString(PyIUType_Partial.tp_dict, "_", PYIU_Placeholder) < 0) { return -1; } return 0; } static PyModuleDef_Slot _iteration_utilities_slots[] = { {Py_mod_exec, _iteration_utilities_exec}, {0, NULL} }; /* Module definition */ static struct PyModuleDef PyIU_module = { PyModuleDef_HEAD_INIT, /* m_base */ PyIU_module_name, /* m_name */ PyIU_module_doc, /* m_doc */ 0, /* m_size */ (PyMethodDef *)PyIU_methods, /* m_methods */ (struct PyModuleDef_Slot*)_iteration_utilities_slots, /* 
m_slots */ NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ }; /* Module initialization */ PyMODINIT_FUNC PyInit__iteration_utilities(void) { PyIU_InitializeConstants(); return PyModuleDef_Init(&PyIU_module); } 070701000000B7000081A400000000000000000000000165E3BCDA00002D9E000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/accumulate.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "accumulate.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( accumulate_prop_func_doc, "(callable or None) The function used for accumulation (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( accumulate_prop_current_doc, "(any type) The current accumulated total (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( accumulate_doc, "accumulate(iterable, func=None, start=None)\n" "--\n\n" "Make an iterator that returns accumulated sums, or accumulated\n" "results of other binary functions (specified via the optional `func`\n" "argument). Copied and modified from [0]_.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The `iterable` to accumulate.\n" "\n" "func : callable or None, optional\n" " The function with which to accumulate. Should be a function of two\n" " arguments.\n" " If ``None`` defaults to :py:func:`operator.add`.\n" "\n" "start : any type, optional\n" " If given (even as ``None``) this value is inserted before the `iterable`.\n" "\n" "Returns\n" "-------\n" "accumulated : generator\n" " The accumulated results as generator.\n" "\n" "Notes\n" "-----\n" "Elements of the input `iterable` may be any type that can be\n" "accepted as arguments to `func`. 
(For example, with the default\n" "operation of addition, elements may be any addable type including\n" "Decimal or Fraction.) If the input `iterable` is empty, the output\n" "iterable will also be empty.\n" "\n" "Examples\n" "--------\n" "There are a number of uses for the `func` argument. It can be set to\n" ":py:func:`min` for a running minimum, :py:func:`max` for a running\n" "maximum, or :py:func:`operator.mul` for a running product. Amortization\n" "tables can be built by accumulating interest and applying payments.\n" "First-order recurrence relations can be modeled by supplying the\n" "initial value in the `iterable` and using only the accumulated total in\n" "`func` argument::\n" "\n" " >>> from iteration_utilities import accumulate\n" " >>> from itertools import repeat\n" " >>> import operator\n" "\n" " >>> data = [3, 4, 6, 2, 1, 9, 0, 7, 5, 8]\n" " >>> list(accumulate(data)) # running sum\n" " [3, 7, 13, 15, 16, 25, 25, 32, 37, 45]\n" " >>> list(accumulate(data, operator.add)) # running sum (explicit)\n" " [3, 7, 13, 15, 16, 25, 25, 32, 37, 45]\n" " >>> list(accumulate(data, operator.mul)) # running product\n" " [3, 12, 72, 144, 144, 1296, 0, 0, 0, 0]\n" " >>> list(accumulate(data, max)) # running maximum\n" " [3, 4, 6, 6, 6, 9, 9, 9, 9, 9]\n" "\n" "Amortize a 5% loan of 1000 (start value) with 4 annual payments of 90::\n" "\n" " >>> cashflows = [-90, -90, -90, -90]\n" " >>> list(accumulate(cashflows, lambda bal, pmt: bal*1.05 + pmt, 1000))\n" " [960.0, 918.0, 873.9000000000001, 827.5950000000001]\n" "\n" "Chaotic recurrence relation [1]_::\n" "\n" " >>> logistic_map = lambda x, _: r * x * (1 - x)\n" " >>> r = 3.8\n" " >>> x0 = 0.4\n" " >>> inputs = repeat(x0, 36) # only the initial value is used\n" " >>> [format(x, '.2f') for x in accumulate(inputs, logistic_map)]\n" " ['0.40', '0.91', '0.30', '0.81', '0.60', '0.92', '0.29', '0.79', " "'0.63', '0.88', '0.39', '0.90', '0.33', '0.84', '0.52', '0.95', '0.18', " "'0.57', '0.93', '0.25', '0.71', '0.79', 
'0.63', '0.88', '0.39', '0.91', " "'0.32', '0.83', '0.54', '0.95', '0.20', '0.60', '0.91', '0.30', '0.80', '0.60']\n" "\n" "References\n" "----------\n" ".. [0] https://docs.python.org/3/library/itertools.html#itertools.accumulate\n" ".. [1] https://en.wikipedia.org/wiki/Logistic_map\n"); /****************************************************************************** * Parts are taken from the CPython package (PSF licensed). *****************************************************************************/ static PyObject * accumulate_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "func", "start", NULL}; PyIUObject_Accumulate *self; PyObject *iterable; PyObject *binop = NULL; PyObject *start = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|OO:accumulate", kwlist, &iterable, &binop, &start)) { return NULL; } self = (PyIUObject_Accumulate *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->binop = binop == Py_None ? NULL : binop; Py_XINCREF(self->binop); Py_XINCREF(start); self->total = start; return (PyObject *)self; } static void accumulate_dealloc(PyIUObject_Accumulate *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->binop); Py_XDECREF(self->total); Py_TYPE(self)->tp_free(self); } static int accumulate_traverse(PyIUObject_Accumulate *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->binop); Py_VISIT(self->total); return 0; } static int accumulate_clear(PyIUObject_Accumulate *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->binop); Py_CLEAR(self->total); return 0; } static PyObject * accumulate_next(PyIUObject_Accumulate *self) { PyObject *item; PyObject *newtotal; /* Get next item from iterator. 
*/ item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); if (item == NULL) { return NULL; } /* If it's the first element, the total is still unset and we simply return the item. */ if (self->total == NULL) { Py_INCREF(item); self->total = item; return item; } /* Apply the binop to the old total and the item, defaulting to add if the binop is not set or set to None. */ if (self->binop == NULL) { newtotal = PyNumber_Add(self->total, item); } else { newtotal = PyIU_CallWithTwoArguments(self->binop, self->total, item); } Py_DECREF(item); if (newtotal == NULL) { return NULL; } /* Update the total and return it. */ Py_INCREF(newtotal); Py_SETREF(self->total, newtotal); return newtotal; } static PyObject * accumulate_reduce(PyIUObject_Accumulate *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on total == NULL because otherwise "None" would be ambiguous. It could mean that we did not have a start or that the start was None. Better to make an "if" than to introduce another variable depending on total == NULL. */ if (self->total != NULL) { return Py_BuildValue("O(OOO)", Py_TYPE(self), self->iterator, self->binop ? self->binop : Py_None, self->total); } else { return Py_BuildValue("O(OO)", Py_TYPE(self), self->iterator, self->binop ? 
self->binop : Py_None); } } static PyObject * accumulate_lengthhint(PyIUObject_Accumulate *self, PyObject *Py_UNUSED(args)) { Py_ssize_t len = PyObject_LengthHint(self->iterator, 0); if (len == -1) { return NULL; } return PyLong_FromSsize_t(len); } static PyMethodDef accumulate_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)accumulate_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)accumulate_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef accumulate_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Accumulate, binop), /* offset */ READONLY, /* flags */ accumulate_prop_func_doc /* doc */ }, { "current", /* name */ T_OBJECT_EX, /* type */ offsetof(PyIUObject_Accumulate, total), /* offset */ READONLY, /* flags */ accumulate_prop_current_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Accumulate = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.accumulate", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Accumulate), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)accumulate_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)accumulate_doc, /* tp_doc */ (traverseproc)accumulate_traverse, /* tp_traverse */ (inquiry)accumulate_clear, /* tp_clear */ 
(richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)accumulate_next, /* tp_iternext */ accumulate_methods, /* tp_methods */ accumulate_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)accumulate_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000B8000081A400000000000000000000000165E3BCDA00000172000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/accumulate.h#ifndef PYIU_ACCUMULATE_H #define PYIU_ACCUMULATE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *binop; PyObject *total; } PyIUObject_Accumulate; extern PyTypeObject PyIUType_Accumulate; #ifdef __cplusplus } #endif #endif 070701000000B9000081A400000000000000000000000165E3BCDA00000520000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/alldistinct.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "alldistinct.h" #include "helper.h" #include "seen.h" PyObject * PyIU_AllDistinct(PyObject *Py_UNUSED(m), PyObject *iterable) { PyObject *iterator; PyObject *item; PyObject *seen; iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } seen = PyIUSeen_New(); if (seen == NULL) { Py_DECREF(iterator); return NULL; } /* Almost identical to unique_everseen so no inline comments. 
*/ while ((item = Py_TYPE(iterator)->tp_iternext(iterator))) { /* Check if the item is in seen. */ int ok = PyIUSeen_ContainsAdd(seen, item); Py_DECREF(item); if (ok != 0) { /* Found duplicate or failure. */ Py_DECREF(iterator); Py_DECREF(seen); if (ok == 1) { Py_RETURN_FALSE; } else if (ok == -1) { return NULL; } } } Py_DECREF(iterator); Py_DECREF(seen); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } Py_RETURN_TRUE; } 070701000000BA000081A400000000000000000000000165E3BCDA00000115000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/alldistinct.h#ifndef PYIU_ALLDISTINCT_H #define PYIU_ALLDISTINCT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_AllDistinct(PyObject *Py_UNUSED(m), PyObject *iterable); #ifdef __cplusplus } #endif #endif 070701000000BB000081A400000000000000000000000165E3BCDA00000475000000000000000000000000000000000000005300000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/allequal.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "allequal.h" #include "helper.h" PyObject * PyIU_AllEqual(PyObject *Py_UNUSED(m), PyObject *iterable) { PyObject *iterator; PyObject *item; PyObject *first = NULL; int ok; iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } while ((item = Py_TYPE(iterator)->tp_iternext(iterator))) { if (first == NULL) { first = item; continue; } ok = PyObject_RichCompareBool(first, item, Py_EQ); Py_DECREF(item); if (ok != 1) { Py_DECREF(iterator); Py_DECREF(first); if (ok == 0) { Py_RETURN_FALSE; } else if (ok == -1) { return NULL; } } } Py_DECREF(iterator); Py_XDECREF(first); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } Py_RETURN_TRUE; } 
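The short-circuiting pattern used by `PyIU_AllDistinct` and `PyIU_AllEqual` above can be sketched in pure Python. This is an illustrative equivalent only, not the shipped implementation (the package exposes the C versions as `iteration_utilities.all_distinct` and `all_equal`); the fallback list for unhashable items mirrors what the C `Seen` helper does internally:

```python
def all_distinct(iterable):
    """Return True if no item occurs twice (pure-Python sketch)."""
    seen = set()
    unhashable = []  # fallback storage, like the C "Seen" container
    for item in iterable:
        try:
            if item in seen:
                return False  # duplicate found: short-circuit
            seen.add(item)
        except TypeError:  # unhashable item: linear-scan fallback
            if item in unhashable:
                return False
            unhashable.append(item)
    return True


def all_equal(iterable):
    """Return True if all items compare equal (pure-Python sketch)."""
    it = iter(iterable)
    try:
        first = next(it)
    except StopIteration:
        return True  # empty iterable is vacuously all-equal
    # Compare every later item against the first, stopping early on mismatch.
    return all(first == item for item in it)
```

As in the C code, both functions bail out on the first negative answer instead of consuming the whole iterable.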
070701000000BC000081A400000000000000000000000165E3BCDA0000010C000000000000000000000000000000000000005300000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/allequal.h#ifndef PYIU_ALLEQUAL_H #define PYIU_ALLEQUAL_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_AllEqual(PyObject *Py_UNUSED(m), PyObject *iterable); #ifdef __cplusplus } #endif #endif 070701000000BD000081A400000000000000000000000165E3BCDA000004D0000000000000000000000000000000000000005800000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/allisinstance.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "allisinstance.h" #include "helper.h" PyObject * PyIU_AllIsinstance(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "types", NULL}; PyObject *iterable; PyObject *types; PyObject *iterator; PyObject *item; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO:all_isinstance", kwlist, &iterable, &types)) { return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } while ((item = Py_TYPE(iterator)->tp_iternext(iterator))) { int ok = PyObject_IsInstance(item, types); Py_DECREF(item); if (ok != 1) { Py_DECREF(iterator); if (ok == 0) { Py_RETURN_FALSE; } else { return NULL; } } } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } Py_RETURN_TRUE; } 070701000000BE000081A400000000000000000000000165E3BCDA00000129000000000000000000000000000000000000005800000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/allisinstance.h#ifndef PYIU_ALLISINSTANCE_H #define PYIU_ALLISINSTANCE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include 
"helpercompat.h" PyObject * PyIU_AllIsinstance(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000BF000081A400000000000000000000000165E3BCDA0000062E000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/allmonotone.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "allmonotone.h" #include "helper.h" PyObject * PyIU_Monotone(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "decreasing", "strict", NULL}; PyObject *iterable; PyObject *iterator; PyObject *item; PyObject *last = NULL; int decreasing = 0; int strict = 0; int op; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|pp:all_monotone", kwlist, &iterable, &decreasing, &strict)) { return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } op = decreasing ? (strict ? Py_GT : Py_GE) : (strict ? 
Py_LT : Py_LE); while ((item = Py_TYPE(iterator)->tp_iternext(iterator))) { int ok; if (last == NULL) { last = item; continue; } ok = PyObject_RichCompareBool(last, item, op); Py_DECREF(last); last = item; if (ok != 1) { Py_DECREF(iterator); Py_DECREF(last); if (ok == 0) { Py_RETURN_FALSE; } else if (ok == -1) { return NULL; } } } Py_DECREF(iterator); Py_XDECREF(last); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } Py_RETURN_TRUE; } 070701000000C0000081A400000000000000000000000165E3BCDA00000120000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/allmonotone.h#ifndef PYIU_ALLMONOTONE_H #define PYIU_ALLMONOTONE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_Monotone(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000C1000081A400000000000000000000000165E3BCDA000006EF000000000000000000000000000000000000005A00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/always_iterable.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "always_iterable.h" #include "helper.h" #include "empty.h" PyObject * PyIU_AlwaysIterable(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"obj", "excluded_types", "empty_if_none", NULL}; PyObject *object; PyObject *excluded_types = NULL; PyObject *tup; PyObject *result; int wrap_iterable = 0; int empty_if_none = 0; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|Op:always_iterable", kwlist, &object, &excluded_types, &empty_if_none)) { return NULL; } if (empty_if_none && object == Py_None) { Py_INCREF(PYIU_Empty); return PYIU_Empty; } if (excluded_types == NULL) { wrap_iterable = 
PyUnicode_CheckExact(object) || PyBytes_CheckExact(object); } else if (excluded_types != Py_None) { wrap_iterable = PyObject_IsInstance(object, excluded_types); if (wrap_iterable == -1) { return NULL; } } if (!wrap_iterable) { PyObject *it = PyObject_GetIter(object); if (it != NULL) { return it; } if (PyErr_Occurred()) { if (!PyErr_ExceptionMatches(PyExc_TypeError)) { return NULL; } else { PyErr_Clear(); } } } tup = PyTuple_New(1); if (tup == NULL) { return NULL; } Py_INCREF(object); PyTuple_SET_ITEM(tup, 0, object); result = PyObject_GetIter(tup); Py_DECREF(tup); return result; } 070701000000C2000081A400000000000000000000000165E3BCDA0000012C000000000000000000000000000000000000005A00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/always_iterable.h#ifndef PYIU_ALWAYSITERABLE_H #define PYIU_ALWAYSITERABLE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_AlwaysIterable(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000C3000081A400000000000000000000000165E3BCDA00000511000000000000000000000000000000000000005800000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/anyisinstance.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "anyisinstance.h" #include "helper.h" PyObject * PyIU_AnyIsinstance(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "types", NULL}; PyObject *iterable; PyObject *types; PyObject *iterator; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO:any_isinstance", kwlist, &iterable, &types)) { return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } for (;;) { int ok; PyObject *item = 
Py_TYPE(iterator)->tp_iternext(iterator); if (item == NULL) { break; } ok = PyObject_IsInstance(item, types); Py_DECREF(item); if (ok) { Py_DECREF(iterator); if (ok == 1) { Py_RETURN_TRUE; } else { return NULL; } } } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } Py_RETURN_FALSE; } 070701000000C4000081A400000000000000000000000165E3BCDA00000129000000000000000000000000000000000000005800000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/anyisinstance.h#ifndef PYIU_ANYISINSTANCE_H #define PYIU_ANYISINSTANCE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_AnyIsinstance(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000C5000081A400000000000000000000000165E3BCDA00001AED000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/applyfunc.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "applyfunc.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( applyfunc_prop_func_doc, "(callable) The function used (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( applyfunc_prop_current_doc, "(any type) The current value for the function (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( applyfunc_doc, "applyfunc(func, initial)\n" "--\n\n" "Successively apply `func` on `value`.\n" "\n" "Parameters\n" "----------\n" "func : callable\n" " The function to apply. 
The `value` is given as the first argument to the \n" " `func`; no other arguments will be passed during the function call.\n" "\n" "initial : any type\n" " The `initial` `value` for the function.\n" "\n" "Returns\n" "-------\n" "results : generator\n" " The result of the successively applied `func`.\n" "\n" "Examples\n" "--------\n" "The first element is the initial `value` and the next elements are\n" "the result of ``func(value)``, then ``func(func(value))``, ...::\n" "\n" " >>> from iteration_utilities import applyfunc, getitem\n" " >>> import math\n" " >>> list(getitem(applyfunc(math.sqrt, 10), stop=4))\n" " [3.1622776601683795, 1.7782794100389228, 1.333521432163324, 1.1547819846894583]\n" "\n" ".. warning::\n" " This will return an infinitely long generator, so do **not** try to do\n" " something like ``list(applyfunc(math.sqrt, 10))``!\n"); static PyObject * applyfunc_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"func", "initial", NULL}; PyIUObject_Applyfunc *self; PyObject *func; PyObject *initial; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO:applyfunc", kwlist, &func, &initial)) { return NULL; } self = (PyIUObject_Applyfunc *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(func); self->func = func; Py_INCREF(initial); self->value = initial; return (PyObject *)self; } static void applyfunc_dealloc(PyIUObject_Applyfunc *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->func); Py_XDECREF(self->value); Py_TYPE(self)->tp_free(self); } static int applyfunc_traverse(PyIUObject_Applyfunc *self, visitproc visit, void *arg) { Py_VISIT(self->func); Py_VISIT(self->value); return 0; } static int applyfunc_clear(PyIUObject_Applyfunc *self) { Py_CLEAR(self->func); Py_CLEAR(self->value); return 0; } static PyObject * applyfunc_next(PyIUObject_Applyfunc *self) { PyObject *newval; /* Call the function with the current value as argument. 
*/ newval = PyIU_CallWithOneArgument(self->func, self->value); if (newval == NULL) { return NULL; } /* Save the new value and also return it. */ Py_INCREF(newval); Py_SETREF(self->value, newval); return newval; } static PyObject * applyfunc_reduce(PyIUObject_Applyfunc *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OO)", Py_TYPE(self), self->func, self->value); } static PyMethodDef applyfunc_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)applyfunc_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef applyfunc_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Applyfunc, func), /* offset */ READONLY, /* flags */ applyfunc_prop_func_doc /* doc */ }, { "current", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Applyfunc, value), /* offset */ READONLY, /* flags */ applyfunc_prop_current_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Applyfunc = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.applyfunc", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Applyfunc), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)applyfunc_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)applyfunc_doc, /* tp_doc */ (traverseproc)applyfunc_traverse, /* tp_traverse */ (inquiry)applyfunc_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ 
(Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)applyfunc_next, /* tp_iternext */ applyfunc_methods, /* tp_methods */ applyfunc_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)applyfunc_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000C6000081A400000000000000000000000165E3BCDA00000155000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/applyfunc.h#ifndef PYIU_APPLYFUNC_H #define PYIU_APPLYFUNC_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; PyObject *value; } PyIUObject_Applyfunc; extern PyTypeObject PyIUType_Applyfunc; #ifdef __cplusplus } #endif #endif 070701000000C7000081A400000000000000000000000165E3BCDA00000E5F000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/argminmax.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "argminmax.h" #include "helper.h" static PyObject * argminmax(PyObject *args, PyObject *kwargs, int cmpop) { static char *kwlist[] = {"key", "default", NULL}; PyObject *sequence; PyObject *keyfunc = NULL; PyObject *iterator = NULL; PyObject *item = NULL; PyObject *val = NULL; PyObject *maxval = NULL; Py_ssize_t defaultitem = 0; Py_ssize_t idx = -1; Py_ssize_t maxidx = -1; int defaultisset = 0; const int positional = PyTuple_GET_SIZE(args) > 1; if (positional) { sequence = args; } else if (!PyArg_UnpackTuple(args, cmpop == Py_LT ? 
"argmin" : "argmax", 1, 1, &sequence)) { return NULL; } if (!PyArg_ParseTupleAndKeywords( PyIU_global_0tuple, kwargs, cmpop == Py_LT ? "|On:argmin" : "|On:argmax", kwlist, &keyfunc, &defaultitem)) { return NULL; } if (defaultitem != 0 || (kwargs != NULL && PyDict_CheckExact(kwargs) && PyDict_GetItemString(kwargs, "default"))) { defaultisset = 1; } if (keyfunc == Py_None) { keyfunc = NULL; } Py_XINCREF(keyfunc); if (positional && defaultisset) { PyErr_Format(PyExc_TypeError, "Cannot specify a `default` for `%s` with " "multiple positional arguments", cmpop == Py_LT ? "argmin" : "argmax"); goto Fail; } iterator = PyObject_GetIter(sequence); if (iterator == NULL) { goto Fail; } while ((item = Py_TYPE(iterator)->tp_iternext(iterator))) { idx++; /* Use the item itself or keyfunc(item). */ if (keyfunc != NULL) { val = PyIU_CallWithOneArgument(keyfunc, item); if (val == NULL) { goto Fail; } } else { val = item; Py_INCREF(val); } if (maxval == NULL) { /* maximum value and item are unset; set them. */ maxval = val; maxidx = idx; } else { /* maximum value and item are set; update them as necessary. */ int cmpres = PyObject_RichCompareBool(val, maxval, cmpop); if (cmpres > 0) { Py_DECREF(maxval); maxval = val; maxidx = idx; } else if (cmpres == 0) { Py_DECREF(val); } else { goto Fail; } } Py_DECREF(item); } Py_DECREF(iterator); Py_XDECREF(maxval); Py_XDECREF(keyfunc); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } if (maxidx == -1) { if (defaultisset) { maxidx = defaultitem; } else { PyErr_Format(PyExc_ValueError, "`%s` `iterable` is an empty sequence", cmpop == Py_LT ? 
"argmin" : "argmax"); return NULL; } } return PyLong_FromSsize_t(maxidx); Fail: Py_XDECREF(keyfunc); Py_XDECREF(item); Py_XDECREF(val); Py_XDECREF(maxval); Py_XDECREF(iterator); return NULL; } PyObject *PyIU_Argmin(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { return argminmax(args, kwargs, Py_LT); } PyObject *PyIU_Argmax(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { return argminmax(args, kwargs, Py_GT); } 070701000000C8000081A400000000000000000000000165E3BCDA0000016C000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/argminmax.h#ifndef PYIU_ARGMINMAX_H #define PYIU_ARGMINMAX_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_Argmin(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); PyObject * PyIU_Argmax(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000C9000081A400000000000000000000000165E3BCDA0000401C000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/chained.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "chained.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( chained_prop_funcs_doc, "(:py:class:`tuple`) The functions to be used (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( chained_prop_all_doc, "(:py:class:`bool`) Apply functions on each other (``False``) or " "separate (readonly).\n" "\n" ".. 
versionadded:: 0.6"); PyDoc_STRVAR( chained_doc, "chained(*funcs, reverse=False, all=False)\n" "--\n\n" "Chained function calls.\n" "\n" "Parameters\n" "----------\n" "funcs\n" " Any number of callables.\n" "\n" "reverse : :py:class:`bool`, optional\n" " If ``True`` apply the the `funcs` in reversed order.\n" " Default is ``False``.\n" "\n" "all : :py:class:`bool`, optional\n" " If ``True`` apply each of the `funcs` separately and return a tuple\n" " containing the individual results when calling the instance.\n" "\n" "Returns\n" "-------\n" "chained_func : callable\n" " The chained `funcs`.\n" "\n" "Examples\n" "--------\n" "`chained` simple calls all `funcs` on the result of the previous one::\n" "\n" " >>> from iteration_utilities import chained\n" " >>> double = lambda x: x*2\n" " >>> increment = lambda x: x+1\n" " >>> double_then_increment = chained(double, increment)\n" " >>> double_then_increment(10)\n" " 21\n" "\n" "Or apply them in reversed order::\n" "\n" " >>> increment_then_double = chained(double, increment, reverse=True)\n" " >>> increment_then_double(10)\n" " 22\n" "\n" "Or apply all of them on the input::\n" "\n" " >>> double_and_increment = chained(double, increment, all=True)\n" " >>> double_and_increment(10)\n" " (20, 11)\n"); #if PyIU_USE_VECTORCALL static PyObject *chained_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames); #endif static PyObject * chained_new(PyTypeObject *type, PyObject *funcs, PyObject *kwargs) { static char *kwlist[] = {"reverse", "all", NULL}; PyIUObject_Chained *self = NULL; int reverse = 0; int all = 0; Py_ssize_t num_funcs = PyTuple_GET_SIZE(funcs); if (num_funcs == 0) { PyErr_SetString(PyExc_TypeError, "`chained` expected at least one function."); return NULL; } if (!PyArg_ParseTupleAndKeywords(PyIU_global_0tuple, kwargs, "|pp:chained", kwlist, &reverse, &all)) { return NULL; } self = (PyIUObject_Chained *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } /* In case we want 
consecutive function calls (i.e. "all" is not set) we can unwrap other "chained" instances inside the function-tuple passed in. */ if (all == 0 && type == &PyIUType_Chained) { Py_ssize_t finalsize = 0; Py_ssize_t i; /* Index for the input "funcs". */ Py_ssize_t j; /* Index for the output "funcs". */ /* First pass over the data to get the number of functions. This is mostly unnecessary except when there are other "chained" instances inside the functions. These can be unwrapped. */ for (i = 0; i < num_funcs; i++) { PyObject *function = PyTuple_GET_ITEM(funcs, i); if (PyIU_IsTypeExact(function, &PyIUType_Chained) && ((PyIUObject_Chained *)function)->all == 0) { finalsize += PyTuple_GET_SIZE(((PyIUObject_Chained *)function)->funcs); } else { finalsize++; } } /* The second step involves creating a suitable tuple and inserting the functions while unwrapping "chained" instances that have "all == 0". We don't want to unwrap "all" chained instances because these don't participate in "chains" and would produce other results if unwrapped. One could argue that unwrapping "all"-chained inside other "all"-chained might make sense, but it would change the way the function works and, not least, that behaviour would be confusing. Special care has to be taken for "reversed" because even though all other functions are inserted in reversed order an unwrapped "chained" instance must be inserted in the original order so as not to change the order of execution compared to the case when they would not be unwrapped. */ self->funcs = PyTuple_New(finalsize); if (self->funcs == NULL) { Py_DECREF(self); return NULL; } j = reverse ?
(finalsize - 1) : 0; for (i = 0; i < num_funcs; i++) { PyObject *function = PyTuple_GET_ITEM(funcs, i); if (PyIU_IsTypeExact(function, &PyIUType_Chained) && ((PyIUObject_Chained *)function)->all == 0) { Py_ssize_t k; PyIUObject_Chained *sub = (PyIUObject_Chained *)function; Py_ssize_t sub_size = PyTuple_GET_SIZE(sub->funcs); /* Prepare the index for inserting the array in normal order even when "reversed" is given. */ j = reverse ? (j - sub_size + 1) : j; for (k = 0; k < sub_size; k++) { PyObject *subfunc = PyTuple_GET_ITEM(sub->funcs, k); Py_INCREF(subfunc); PyTuple_SET_ITEM(self->funcs, j, subfunc); j++; } /* The index needs to jump back to the original position in case a "chained" instance was unwrapped while "reverse" was given. */ j = reverse ? (j - sub_size - 1) : j; } else { /* This is the normal behaviour without unwrapping. Just change the insertion index differently depending on "reverse". */ Py_INCREF(function); PyTuple_SET_ITEM(self->funcs, j, function); j = reverse ? (j - 1) : (j + 1); } } } else { if (reverse) { self->funcs = PyIU_TupleReverse(funcs); } else { self->funcs = PyIU_TupleCopy(funcs); } } if (self->funcs == NULL) { Py_DECREF(self); return NULL; } self->all = all; #if PyIU_USE_VECTORCALL self->vectorcall = chained_vectorcall; #endif return (PyObject *)self; } static void chained_dealloc(PyIUObject_Chained *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->funcs); Py_TYPE(self)->tp_free(self); } static int chained_traverse(PyIUObject_Chained *self, visitproc visit, void *arg) { Py_VISIT(self->funcs); return 0; } static int chained_clear(PyIUObject_Chained *self) { Py_CLEAR(self->funcs); return 0; } #if PyIU_USE_VECTORCALL static PyObject * chained_vectorcall_normal(PyIUObject_Chained *self, PyObject *const *args, size_t nargsf, PyObject *kwnames) { Py_ssize_t idx; PyObject *temp = PyIU_PyObject_Vectorcall(PyTuple_GET_ITEM(self->funcs, 0), args, nargsf, kwnames); if (temp == NULL) { return NULL; } for (idx = 1; idx < 
PyTuple_GET_SIZE(self->funcs); idx++) { PyObject *func = PyTuple_GET_ITEM(self->funcs, idx); PyObject *oldtemp = temp; temp = PyIU_CallWithOneArgument(func, temp); Py_DECREF(oldtemp); if (temp == NULL) { return NULL; } } return temp; } static PyObject * chained_vectorcall_all(PyIUObject_Chained *self, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyObject *result; Py_ssize_t idx; Py_ssize_t num_funcs = PyTuple_GET_SIZE(self->funcs); /* Create a placeholder tuple for "all=True". */ result = PyTuple_New(num_funcs); if (result == NULL) { return NULL; } for (idx = 0; idx < num_funcs; idx++) { PyObject *func = PyTuple_GET_ITEM(self->funcs, idx); PyObject *temp = PyIU_PyObject_Vectorcall(func, args, nargsf, kwnames); PyTuple_SET_ITEM(result, idx, temp); if (temp == NULL) { Py_DECREF(result); return NULL; } } return result; } static PyObject * chained_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyIUObject_Chained *self = ((PyIUObject_Chained *)obj); if (self->all) { return chained_vectorcall_all(self, args, nargsf, kwnames); } else { return chained_vectorcall_normal(self, args, nargsf, kwnames); } } #else static PyObject * chained_call_normal(PyIUObject_Chained *self, PyObject *args, PyObject *kwargs) { Py_ssize_t idx; PyObject *temp = PyObject_Call(PyTuple_GET_ITEM(self->funcs, 0), args, kwargs); if (temp == NULL) { return NULL; } for (idx = 1; idx < PyTuple_GET_SIZE(self->funcs); idx++) { PyObject *func = PyTuple_GET_ITEM(self->funcs, idx); PyObject *oldtemp = temp; temp = PyIU_CallWithOneArgument(func, temp); Py_DECREF(oldtemp); if (temp == NULL) { return NULL; } } return temp; } static PyObject * chained_call_all(PyIUObject_Chained *self, PyObject *args, PyObject *kwargs) { PyObject *result; Py_ssize_t idx; Py_ssize_t num_funcs = PyTuple_GET_SIZE(self->funcs); /* Create a placeholder tuple for "all=True". 
*/ result = PyTuple_New(num_funcs); if (result == NULL) { return NULL; } for (idx = 0; idx < num_funcs; idx++) { PyObject *func = PyTuple_GET_ITEM(self->funcs, idx); PyObject *temp = PyObject_Call(func, args, kwargs); PyTuple_SET_ITEM(result, idx, temp); if (temp == NULL) { Py_DECREF(result); return NULL; } } return result; } static PyObject * chained_call(PyIUObject_Chained *self, PyObject *args, PyObject *kwargs) { if (self->all) { return chained_call_all(self, args, kwargs); } else { return chained_call_normal(self, args, kwargs); } } #endif static PyObject * chained_repr(PyIUObject_Chained *self) { PyObject *result = NULL; PyObject *arglist; Py_ssize_t i; Py_ssize_t n; int ok; ok = Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? PyUnicode_FromString("...") : NULL; } arglist = PyUnicode_FromString(""); if (arglist == NULL) { goto done; } /* Pack positional arguments */ n = PyTuple_GET_SIZE(self->funcs); for (i = 0; i < n; i++) { PyObject *tmp = PyUnicode_FromFormat("%U%R, ", arglist, PyTuple_GET_ITEM(self->funcs, i)); Py_CLEAR(arglist); arglist = tmp; if (arglist == NULL) { goto done; } } result = PyUnicode_FromFormat("%s(%Uall=%R)", Py_TYPE(self)->tp_name, arglist, self->all ? 
Py_True : Py_False); Py_DECREF(arglist); done: Py_ReprLeave((PyObject *)self); return result; } static PyObject * chained_reduce(PyIUObject_Chained *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("OO(i)", Py_TYPE(self), self->funcs, self->all); } static PyObject * chained_setstate(PyIUObject_Chained *self, PyObject *state) { int all; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "i:chained.__setstate__", &all)) { return NULL; } self->all = all; Py_RETURN_NONE; } static PyObject * chained_get_all(PyIUObject_Chained *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->all); } static PyMethodDef chained_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)chained_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)chained_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyGetSetDef chained_getsetlist[] = { { "all", /* name */ (getter)chained_get_all, /* get */ (setter)0, /* set */ chained_prop_all_doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; static PyMemberDef chained_memberlist[] = { { "funcs", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Chained, funcs), /* offset */ READONLY, /* flags */ chained_prop_funcs_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Chained = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.chained", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Chained), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)chained_dealloc, /* tp_dealloc */ #if PyIU_USE_VECTORCALL offsetof(PyIUObject_Chained, vectorcall), /* tp_vectorcall_offset */ #else (printfunc)0, /* tp_print */ #endif (getattrfunc)0, /* tp_getattr 
*/ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)chained_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ #if PyIU_USE_VECTORCALL (ternaryfunc)PyVectorcall_Call, /* tp_call */ #else (ternaryfunc)chained_call, /* tp_call */ #endif (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL | _Py_TPFLAGS_HAVE_VECTORCALL #else | Py_TPFLAGS_HAVE_VECTORCALL #endif #endif , /* tp_flags */ (const char *)chained_doc, /* tp_doc */ (traverseproc)chained_traverse, /* tp_traverse */ (inquiry)chained_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ chained_methods, /* tp_methods */ chained_memberlist, /* tp_members */ chained_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)chained_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000CA000081A400000000000000000000000165E3BCDA00000184000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/chained.h#ifndef PYIU_CHAINED_H #define PYIU_CHAINED_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *funcs; int all; #if PyIU_USE_VECTORCALL vectorcallfunc vectorcall; #endif } PyIUObject_Chained; extern PyTypeObject PyIUType_Chained; #ifdef __cplusplus } #endif #endif 
070701000000CB000081A400000000000000000000000165E3BCDA00002C61000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/clamp.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "clamp.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" PyDoc_STRVAR( clamp_prop_low_doc, "(any type) The lower bound for `clamp` (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( clamp_prop_high_doc, "(any type) The upper bound for `clamp` (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( clamp_prop_inclusive_doc, "(:py:class:`bool`) Are the bounds inclusive (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( clamp_prop_remove_doc, "(:py:class:`bool`) Remove the outliers or clamp them to nearest bound " "(readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( clamp_doc, "clamp(iterable, low=None, high=None, inclusive=False, remove=True)\n" "--\n\n" "Remove values which are not between `low` and `high`.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Clamp the values from this `iterable`.\n" "\n" "low : any type, optional\n" " The lower bound for clamp. If not given or ``None`` there is no lower \n" " bound.\n" "\n" "high : any type, optional\n" " The upper bound for clamp. If not given or ``None`` there is no upper \n" " bound.\n" "\n" "inclusive : :py:class:`bool`, optional\n" " If ``True`` also remove values that are equal to `low` and `high`.\n" " Default is ``False``.\n" "\n" "remove : :py:class:`bool`, optional\n" " If ``True`` remove the items outside the range given by ``low`` and\n" " ``high``, otherwise replace them with ``low`` if they are lower or\n" " ``high`` if they are higher.\n" " Default is ``True``.\n" "\n" " .. 
versionadded:: 0.2\n" "\n" "Returns\n" "-------\n" "clamped : generator\n" " A generator containing the values of `iterable` which are between `low`\n" " and `high`.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to a generator expression like:\n" "``(item for item in iterable if low <= item <= high)`` or\n" "``(item for item in iterable if low < item < high)`` if `inclusive=True`.\n" "Or a similar `filter`: ``filter(lambda item: low <= item <= high, iterable)``\n" "But it also allows for either ``low`` or ``high`` to be ignored and is faster.\n" "Some simple examples::\n" "\n" " >>> from iteration_utilities import clamp\n" " >>> list(clamp(range(5), low=2))\n" " [2, 3, 4]\n" " >>> list(clamp(range(5), high=2))\n" " [0, 1, 2]\n" " >>> list(clamp(range(1000), low=2, high=8, inclusive=True))\n" " [3, 4, 5, 6, 7]\n" "\n" "If ``remove=False`` the function will replace values instead::\n" "\n" " >>> list(clamp(range(10), low=4, high=8, remove=False))\n" " [4, 4, 4, 4, 4, 5, 6, 7, 8, 8]\n"); static PyObject * clamp_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "low", "high", "inclusive", "remove", NULL}; PyIUObject_Clamp *self; PyObject *iterable; PyObject *low = NULL; PyObject *high = NULL; int inclusive = 0; int remove = 1; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|OOpp:clamp", kwlist, &iterable, &low, &high, &inclusive, &remove)) { return NULL; } self = (PyIUObject_Clamp *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } /* None cannot be compared so it's unlikely we exclude use-cases by allowing low=None as equivalent to not giving any "low" argument. */ self->low = low == Py_None ? NULL : low; Py_XINCREF(self->low); self->high = high == Py_None ? 
NULL : high; Py_XINCREF(self->high); self->inclusive = inclusive; self->remove = remove; return (PyObject *)self; } static void clamp_dealloc(PyIUObject_Clamp *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->low); Py_XDECREF(self->high); Py_TYPE(self)->tp_free(self); } static int clamp_traverse(PyIUObject_Clamp *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->low); Py_VISIT(self->high); return 0; } static int clamp_clear(PyIUObject_Clamp *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->low); Py_CLEAR(self->high); return 0; } static PyObject * clamp_next(PyIUObject_Clamp *self) { PyObject *item; int res; while ((item = Py_TYPE(self->iterator)->tp_iternext(self->iterator))) { /* Check if it's smaller than the lower bound. */ if (self->low != NULL) { res = PyObject_RichCompareBool(item, self->low, self->inclusive ? Py_LE : Py_LT); if (res == 1) { Py_DECREF(item); if (!(self->remove)) { Py_INCREF(self->low); return self->low; } continue; } else if (res == -1) { Py_DECREF(item); return NULL; } } /* Check if it's bigger than the upper bound. */ if (self->high != NULL) { res = PyObject_RichCompareBool(item, self->high, self->inclusive ? Py_GE : Py_GT); if (res == 1) { Py_DECREF(item); if (!(self->remove)) { Py_INCREF(self->high); return self->high; } continue; } else if (res == -1) { Py_DECREF(item); return NULL; } } /* Still here? Return the item! */ return item; } return NULL; } static PyObject * clamp_reduce(PyIUObject_Clamp *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OOOii)", Py_TYPE(self), self->iterator, self->low ? self->low : Py_None, self->high ? self->high : Py_None, self->inclusive, self->remove); } static PyObject * clamp_lengthhint(PyIUObject_Clamp *self, PyObject *Py_UNUSED(args)) { Py_ssize_t len = 0; /* If we don't remove outliers or there are no bounds at all we can determine the length. 
*/ if (!(self->remove) || (self->low == NULL && self->high == NULL)) { len = PyObject_LengthHint(self->iterator, 0); if (len == -1) { return NULL; } } return PyLong_FromSsize_t(len); } static PyObject * clamp_get_inclusive(PyIUObject_Clamp *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->inclusive); } static PyObject * clamp_get_remove(PyIUObject_Clamp *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->remove); } static PyMethodDef clamp_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)clamp_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)clamp_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyGetSetDef clamp_getsetlist[] = { { "inclusive", /* name */ (getter)clamp_get_inclusive, /* get */ (setter)0, /* set */ clamp_prop_inclusive_doc, /* doc */ (void *)NULL /* closure */ }, { "remove", /* name */ (getter)clamp_get_remove, /* get */ (setter)0, /* set */ clamp_prop_remove_doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; static PyMemberDef clamp_memberlist[] = { { "low", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Clamp, low), /* offset */ READONLY, /* flags */ clamp_prop_low_doc /* doc */ }, { "high", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Clamp, high), /* offset */ READONLY, /* flags */ clamp_prop_high_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Clamp = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.clamp", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Clamp), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)clamp_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ 
(PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)clamp_doc, /* tp_doc */ (traverseproc)clamp_traverse, /* tp_traverse */ (inquiry)clamp_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)clamp_next, /* tp_iternext */ clamp_methods, /* tp_methods */ clamp_memberlist, /* tp_members */ clamp_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)clamp_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000CC000081A400000000000000000000000165E3BCDA0000017E000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/clamp.h#ifndef PYIU_CLAMP_H #define PYIU_CLAMP_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *low; PyObject *high; int inclusive; int remove; } PyIUObject_Clamp; extern PyTypeObject PyIUType_Clamp; #ifdef __cplusplus } #endif #endif 070701000000CD000081A400000000000000000000000165E3BCDA00001DBD000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/complement.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "complement.h" 
#include "helper.h" #include <structmember.h> #include "docs_reduce.h" PyDoc_STRVAR( complement_prop_func_doc, "(callable) The function that is complemented (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( complement_doc, "complement(func)\n" "--\n\n" "Invert a predicate function. There is a homonymous function in the `toolz` \n" "package ([0]_) but significantly modified.\n" "\n" "Parameters\n" "----------\n" "func : callable\n" " The function to complement.\n" "\n" "Returns\n" "-------\n" "complemented_func : callable\n" " The complement to `func`.\n" "\n" "Examples\n" "--------\n" "`complement` is equivalent to ``lambda x: not x()`` but significantly faster::\n" "\n" " >>> from iteration_utilities import complement\n" " >>> from iteration_utilities import is_None\n" " >>> is_not_None = complement(is_None)\n" " >>> list(filter(is_not_None, [1,2,None,3,4,None]))\n" " [1, 2, 3, 4]\n" "\n" ".. note::\n" " The example code could also be done with :py:func:`itertools.filterfalse` \n" " or :py:func:`iteration_utilities.is_not_None`.\n" "\n" "References\n" "----------\n" ".. 
[0] https://toolz.readthedocs.io/en/latest/index.html\n"); #if PyIU_USE_VECTORCALL static PyObject *complement_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames); #endif static PyObject * complement_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { PyIUObject_Complement *self; PyObject *func; if (!PyArg_UnpackTuple(args, "complement", 1, 1, &func)) { return NULL; } self = (PyIUObject_Complement *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(func); self->func = func; #if PyIU_USE_VECTORCALL self->vectorcall = complement_vectorcall; #endif return (PyObject *)self; } static void complement_dealloc(PyIUObject_Complement *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->func); Py_TYPE(self)->tp_free(self); } static int complement_traverse(PyIUObject_Complement *self, visitproc visit, void *arg) { Py_VISIT(self->func); return 0; } static int complement_clear(PyIUObject_Complement *self) { Py_CLEAR(self->func); return 0; } #if PyIU_USE_VECTORCALL static PyObject * complement_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyObject *temp; int res; /* "not func(*args, **kwargs)" */ temp = PyIU_PyObject_Vectorcall(((PyIUObject_Complement *)obj)->func, args, nargsf, kwnames); if (temp == NULL) { return NULL; } res = PyObject_Not(temp); Py_DECREF(temp); if (res == 1) { Py_RETURN_TRUE; } else if (res == 0) { Py_RETURN_FALSE; } else { return NULL; } } #else static PyObject * complement_call(PyIUObject_Complement *self, PyObject *args, PyObject *kwargs) { PyObject *temp; int res; /* "not func(*args, **kwargs)" */ temp = PyObject_Call(self->func, args, kwargs); if (temp == NULL) { return NULL; } res = PyObject_Not(temp); Py_DECREF(temp); if (res == 1) { Py_RETURN_TRUE; } else if (res == 0) { Py_RETURN_FALSE; } else { return NULL; } } #endif static PyObject * complement_repr(PyIUObject_Complement *self) { PyObject *result = NULL; int ok; ok = Py_ReprEnter((PyObject *)self); if 
(ok != 0) { return ok > 0 ? PyUnicode_FromString("...") : NULL; } result = PyUnicode_FromFormat("%s(%R)", Py_TYPE(self)->tp_name, self->func); Py_ReprLeave((PyObject *)self); return result; } static PyObject * complement_reduce(PyIUObject_Complement *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(O)", Py_TYPE(self), self->func); } static PyMethodDef complement_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)complement_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef complement_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Complement, func), /* offset */ READONLY, /* flags */ complement_prop_func_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Complement = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.complement", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Complement), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)complement_dealloc, /* tp_dealloc */ #if PyIU_USE_VECTORCALL offsetof(PyIUObject_Complement, vectorcall), /* tp_vectorcall_offset */ #else (printfunc)0, /* tp_print */ #endif (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)complement_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ #if PyIU_USE_VECTORCALL (ternaryfunc)PyVectorcall_Call, /* tp_call */ #else (ternaryfunc)complement_call, /* tp_call */ #endif (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL | _Py_TPFLAGS_HAVE_VECTORCALL #else | Py_TPFLAGS_HAVE_VECTORCALL #endif #endif , /* tp_flags */ (const 
char *)complement_doc, /* tp_doc */ (traverseproc)complement_traverse, /* tp_traverse */ (inquiry)complement_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ complement_methods, /* tp_methods */ complement_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)complement_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000CE000081A400000000000000000000000165E3BCDA00000182000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/complement.h#ifndef PYIU_COMPLEMENT_H #define PYIU_COMPLEMENT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; #if PyIU_USE_VECTORCALL vectorcallfunc vectorcall; #endif } PyIUObject_Complement; extern PyTypeObject PyIUType_Complement; #ifdef __cplusplus } #endif #endif 070701000000CF000081A400000000000000000000000165E3BCDA00001D31000000000000000000000000000000000000005300000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/constant.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "constant.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( constant_prop_item_doc, "(any type) The value that is returned each time the instance is called " "(readonly).\n" "\n" ".. 
versionadded:: 0.6"); PyDoc_STRVAR( constant_doc, "constant(item, /)\n" "--\n\n" "Class that always returns a constant value when called.\n" "\n" "Parameters\n" "----------\n" "item : any type\n" " The item that should be returned when called.\n" "\n" "Examples\n" "--------\n" "Creating :py:class:`~iteration_utilities.constant` instances::\n" "\n" " >>> from iteration_utilities import constant\n" " >>> five = constant(5)\n" " >>> five()\n" " 5\n" " >>> ten = constant(10)\n" " >>> # Any parameters are ignored\n" " >>> ten(5, give_me=100)\n" " 10\n" "\n" "There are already three predefined instances:\n" "\n" "- :py:func:`~iteration_utilities.return_True`: equivalent to ``constant(True)``.\n" "- :py:func:`~iteration_utilities.return_False`: equivalent to ``constant(False)``.\n" "- :py:func:`~iteration_utilities.return_None`: equivalent to ``constant(None)``.\n" "\n" "For example::\n" "\n" " >>> from iteration_utilities import return_True, return_False, return_None\n" " >>> return_True()\n" " True\n" " >>> return_False()\n" " False\n" " >>> return_None()\n" " >>> return_None() is None\n" " True\n"); #if PyIU_USE_VECTORCALL static PyObject *constant_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames); #endif PyObject * PyIUConstant_New(PyObject *value) { assert(value != NULL); PyIUObject_Constant *self; self = PyObject_GC_New(PyIUObject_Constant, &PyIUType_Constant); if (self == NULL) { return NULL; } Py_INCREF(value); self->item = value; #if PyIU_USE_VECTORCALL self->vectorcall = constant_vectorcall; #endif PyObject_GC_Track(self); return (PyObject *)self; } static PyObject * constant_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { PyIUObject_Constant *self; PyObject *item; if (!PyArg_UnpackTuple(args, "constant", 1, 1, &item)) { return NULL; } self = (PyIUObject_Constant *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(item); self->item = item; #if PyIU_USE_VECTORCALL self->vectorcall = 
constant_vectorcall; #endif return (PyObject *)self; } static void constant_dealloc(PyIUObject_Constant *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->item); Py_TYPE(self)->tp_free(self); } static int constant_traverse(PyIUObject_Constant *self, visitproc visit, void *arg) { Py_VISIT(self->item); return 0; } static int constant_clear(PyIUObject_Constant *self) { Py_CLEAR(self->item); return 0; } #if PyIU_USE_VECTORCALL static PyObject * constant_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyObject *tmp = ((PyIUObject_Constant *)obj)->item; Py_INCREF(tmp); return tmp; } #else static PyObject * constant_call(PyIUObject_Constant *self, PyObject *args, PyObject *kwargs) { Py_INCREF(self->item); return self->item; } #endif static PyObject * constant_repr(PyIUObject_Constant *self) { PyObject *result = NULL; int ok; ok = Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? PyUnicode_FromString("...") : NULL; } result = PyUnicode_FromFormat("%s(%R)", Py_TYPE(self)->tp_name, self->item); Py_ReprLeave((PyObject *)self); return result; } static PyObject * constant_reduce(PyIUObject_Constant *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(O)", Py_TYPE(self), self->item); } static PyMethodDef constant_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)constant_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef constant_memberlist[] = { { "item", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Constant, item), /* offset */ READONLY, /* flags */ constant_prop_item_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Constant = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.constant", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Constant), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)constant_dealloc, /* tp_dealloc */ #if PyIU_USE_VECTORCALL 
offsetof(PyIUObject_Constant, vectorcall), /* tp_vectorcall_offset */ #else (printfunc)0, /* tp_print */ #endif (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)constant_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ #if PyIU_USE_VECTORCALL (ternaryfunc)PyVectorcall_Call, /* tp_call */ #else (ternaryfunc)constant_call, /* tp_call */ #endif (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL | _Py_TPFLAGS_HAVE_VECTORCALL #else | Py_TPFLAGS_HAVE_VECTORCALL #endif #endif , /* tp_flags */ (const char *)constant_doc, /* tp_doc */ (traverseproc)constant_traverse, /* tp_traverse */ (inquiry)constant_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ constant_methods, /* tp_methods */ constant_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)constant_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000D0000081A400000000000000000000000165E3BCDA000001A9000000000000000000000000000000000000005300000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/constant.h#ifndef PYIU_CONSTANT_H #define PYIU_CONSTANT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *item; #if PyIU_USE_VECTORCALL vectorcallfunc vectorcall; #endif } 
PyIUObject_Constant; extern PyTypeObject PyIUType_Constant; PyObject * PyIUConstant_New(PyObject *value); #ifdef __cplusplus } #endif #endif 070701000000D1000081A400000000000000000000000165E3BCDA00000BB2000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/countitems.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "countitems.h" #include "helper.h" PyObject * PyIU_Count(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "pred", "eq", NULL}; PyObject *iterable; PyObject *item; PyObject *iterator; PyObject *pred = NULL; Py_ssize_t sum_int = 0; int eq = 0; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|Op:count_items", kwlist, &iterable, &pred, &eq)) { return NULL; } if (pred == Py_None) { pred = NULL; } if (eq && pred == NULL) { PyErr_SetString(PyExc_TypeError, "`pred` argument for `count_items` must be specified " "if `eq=True`."); return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } while ((item = Py_TYPE(iterator)->tp_iternext(iterator))) { int ok; if (pred == NULL) { /* No predicate given just set ok == 1 so the element is counted. */ ok = 1; } else if (eq) { /* Always check for equality if "eq=1". */ ok = PyObject_RichCompareBool(pred, item, Py_EQ); } else if (pred == (PyObject *)&PyBool_Type) { /* Predicate is bool, so we can skip the function call and just evaluate if the object is truthy. */ ok = PyObject_IsTrue(item); } else { /* Call the function and check if the returned value is truthy. 
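The predicate dispatch in `PyIU_Count` (no predicate, `eq=True` equality, `bool` fast path, general callable) can be sketched as a pure-Python equivalent — a simplified model for illustration, not the actual C implementation:

```python
def count_items(iterable, pred=None, eq=False):
    """Pure-Python sketch of count_items' predicate dispatch."""
    if eq and pred is None:
        raise TypeError("`pred` argument for `count_items` must be "
                        "specified if `eq=True`.")
    count = 0
    for item in iterable:
        if pred is None:
            ok = True                 # no predicate: count every item
        elif eq:
            ok = pred == item         # eq=True: compare for equality
        elif pred is bool:
            ok = bool(item)           # bool predicate: just test truthiness
        else:
            ok = bool(pred(item))     # general case: call the predicate
        if ok:
            count += 1
    return count
```

For example, ``count_items([0, 1, 2, 0], pred=0, eq=True)`` counts the zeros by equality rather than by calling ``0`` as a function.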
*/ PyObject *val = PyIU_CallWithOneArgument(pred, item); if (val == NULL) { Py_DECREF(item); Py_DECREF(iterator); return NULL; } ok = PyObject_IsTrue(val); Py_DECREF(val); } Py_DECREF(item); /* If we found a match increment the counter, if we encountered an Exception throw it here. */ if (ok == 1) { /* check if the sum variable is about to overflow. In this case there is no fallback because it's unlikely that we should process some iterable that's longer than the maximum py_ssize_t... */ if (sum_int == PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_TypeError, "`iterable` for `count_items` is too long to count."); Py_DECREF(iterator); return NULL; } sum_int++; } else if (ok < 0) { Py_DECREF(iterator); return NULL; } } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } return PyLong_FromSsize_t(sum_int); } 070701000000D2000081A400000000000000000000000165E3BCDA0000011B000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/countitems.h#ifndef PYIU_COUNTITEMS_H #define PYIU_COUNTITEMS_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_Count(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000D3000081A400000000000000000000000165E3BCDA00005250000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/deepflatten.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "deepflatten.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( deepflatten_prop_types_doc, "(:py:class:`type` or :py:class:`tuple` thereof) The types to flatten or " "None if `deepflatten` attempts to 
flatten every type (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( deepflatten_prop_ignore_doc, "(:py:class:`type` or :py:class:`tuple` thereof) The types that are not " "flattened (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( deepflatten_prop_depth_doc, "(:py:class:`int`) Up to this depth the iterable is flattened (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( deepflatten_prop_currentdepth_doc, "(:py:class:`int`) The current depth inside the iterable (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( deepflatten_doc, "deepflatten(iterable, depth=-1, types=None, ignore=None)\n" "--\n\n" "Flatten an `iterable` with given `depth`.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Any `iterable` to flatten.\n" "\n" "depth : :py:class:`int` or None, optional\n" " Flatten `depth` levels of nesting or all if ``depth=-1``.\n" " Default is ``-1``.\n" "\n" " .. note::\n" " If the `depth` is known this significantly speeds up the function!\n" "\n" "types : type, tuple of types, optional\n" " Which types should be flattened. If not given it flattens all items if\n" " ``iter(item)`` does not throw a ``TypeError``.\n" "\n" " .. note::\n" " If the `types` are given this significantly speeds up the function\n" " but only *if* the `depth` is unknown.\n" "\n" "ignore : type, iterable of types or None, optional\n" " The types which should not be flattened. 
If not given all `types` are\n" " flattened.\n" "\n" "Returns\n" "-------\n" "flattened_iterable : generator\n" " The `iterable` with the `depth` level of nesting flattened.\n" "\n" "Examples\n" "--------\n" "To flatten a given depth::\n" "\n" " >>> from iteration_utilities import deepflatten\n" " >>> list(deepflatten([1, [1,2], [[1,2]], [[[1,2]]]], depth=1))\n" " [1, 1, 2, [1, 2], [[1, 2]]]\n" "\n" "To completely flatten it::\n" "\n" " >>> list(deepflatten([1, [1,2], [[1,2]], [[[1,2]]]]))\n" " [1, 1, 2, 1, 2, 1, 2]\n" "\n" "To ignore for example dictionaries::\n" "\n" " >>> # Only the keys of a dictionary will be kept with deepflatten.\n" " >>> list(deepflatten([1, 2, [1,2], {1: 10, 2: 10}]))\n" " [1, 2, 1, 2, 1, 2]\n" " >>> list(deepflatten([1, 2, [1,2], {1: 10, 2: 10}], ignore=dict))\n" " [1, 2, 1, 2, {1: 10, 2: 10}]\n" "\n" "In this case we could have also chosen only to flatten the lists::\n" "\n" " >>> list(deepflatten([1, 2, [1,2], {1: 10, 2: 10}], types=list))\n" " [1, 2, 1, 2, {1: 10, 2: 10}]\n" "\n" ".. warning::\n" " If the iterable contains recursive iterable objects (e.g. `UserString`)\n" " one either needs to set ``ignore`` or a `depth` that is not ``None``.\n" " Otherwise this will raise a ``RecursionError`` (or ``RuntimeError`` on\n" " older Python versions) because each item in a ``UserString`` is itself a\n" " ``UserString``, even if it has a length of 1! The builtin strings \n" " (``str``, ``bytes``, ``unicode``) are special cased, but only the exact\n" " types because subtypes might implement custom non-recursive ``__iter__``\n" " methods.
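The recursion hazard this warning describes can be verified with the standard library alone; the snippet below is an illustrative check, independent of this package:

```python
from collections import UserString

# Iterating a UserString yields UserString instances again (each of
# length 1), which is why unbounded flattening would recurse forever.
s = UserString('abc')
first = next(iter(s))

assert type(first) is UserString   # not str!
assert first == 'a'
assert len(first) == 1
```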
This means that these won't run into the infinite recursion,\n" " but subclasses might.\n" "\n" "See for example::\n" "\n" " >>> from collections import UserString\n" " >>> list(deepflatten([1, 2, [1,2], UserString('abc')], depth=1))\n" " [1, 2, 1, 2, 'a', 'b', 'c']\n" " >>> list(deepflatten([1, 2, [1,2], UserString('abc')], ignore=UserString))\n" " [1, 2, 1, 2, 'abc']\n" "\n" "This function is roughly (it's missing some of the complicated details \n" "and performance optimizations of the actual function) equivalent to this \n" "python function:\n" "\n" ".. code::\n" "\n" " def deepflatten(iterable, depth=None, types=None, ignore=None):\n" " if depth is None:\n" " depth = float('inf')\n" " if depth == -1:\n" " yield iterable\n" " else:\n" " for x in iterable:\n" " if ignore is not None and isinstance(x, ignore):\n" " yield x\n" " elif types is None:\n" " try:\n" " iter(x)\n" " except TypeError:\n" " yield x\n" " else:\n" " yield from deepflatten(x, depth - 1, types, ignore)\n" " elif not isinstance(x, types):\n" " yield x\n" " else:\n" " yield from deepflatten(x, depth - 1, types, ignore)\n"); static PyObject * deepflatten_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "depth", "types", "ignore", NULL}; PyIUObject_DeepFlatten *self; PyObject *iterable; PyObject *iterator = NULL; PyObject *types = NULL; PyObject *ignore = NULL; Py_ssize_t depth = -1; Py_ssize_t i; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|nOO:deepflatten", kwlist, &iterable, &depth, &types, &ignore)) { return NULL; } self = (PyIUObject_DeepFlatten *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { Py_DECREF(self); return NULL; } /* Create a list of size "depth+1" or if depth was not given use 3 for a start. Fill all entries with None except for the first which should be the iterator over the iterable. */ self->iteratorlist = PyList_New(depth >= 0 ?
depth + 1 : 3); if (self->iteratorlist == NULL) { Py_DECREF(self); Py_DECREF(iterator); return NULL; } PyList_SET_ITEM(self->iteratorlist, 0, iterator); iterator = NULL; for (i = 1; i < PyList_GET_SIZE(self->iteratorlist); i++) { Py_INCREF(Py_None); PyList_SET_ITEM(self->iteratorlist, i, Py_None); } self->types = types == Py_None ? NULL : types; Py_XINCREF(self->types); self->ignore = ignore == Py_None ? NULL : ignore; Py_XINCREF(self->ignore); self->depth = depth; self->currentdepth = 0; self->isstring = 0; return (PyObject *)self; } static void deepflatten_dealloc(PyIUObject_DeepFlatten *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iteratorlist); Py_XDECREF(self->types); Py_XDECREF(self->ignore); Py_TYPE(self)->tp_free(self); } static int deepflatten_traverse(PyIUObject_DeepFlatten *self, visitproc visit, void *arg) { Py_VISIT(self->iteratorlist); Py_VISIT(self->types); Py_VISIT(self->ignore); return 0; } static int deepflatten_clear(PyIUObject_DeepFlatten *self) { Py_CLEAR(self->iteratorlist); Py_CLEAR(self->types); Py_CLEAR(self->ignore); return 0; } static PyObject * deepflatten_next(PyIUObject_DeepFlatten *self) { PyObject *activeiterator; PyObject *item; PyObject *temp; int ok; if (self->currentdepth < 0) { return NULL; } /* TODO: This is likely a problem when using copy because currentdepth might be changed by the copy. However deepcopy should work as as expected. */ activeiterator = PyList_GET_ITEM(self->iteratorlist, self->currentdepth); while (self->currentdepth >= 0) { item = Py_TYPE(activeiterator)->tp_iternext(activeiterator); /* The active iterator finished, remove it from the list and take up the iterator one level up. */ if (item == NULL) { if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } Py_INCREF(Py_None); PyList_SET_ITEM(self->iteratorlist, self->currentdepth, Py_None); self->currentdepth--; /* The iterator finished so we're not in a string anymore. 
*/ self->isstring = 0; Py_DECREF(activeiterator); if (self->currentdepth < 0) { break; } activeiterator = PyList_GET_ITEM(self->iteratorlist, self->currentdepth); continue; } if (self->depth >= 0 && self->currentdepth >= self->depth) { /* If the currentdepth exceeds the specified depth just return. */ return item; } else if (self->isstring) { /* If we're in a built-in string/bytes or unicode simply return. */ return item; } else if (self->ignore && (ok = PyObject_IsInstance(item, self->ignore))) { /* First check if the item is an instance of the ignored types, if it is, then simply return it. */ if (ok == 1) { return item; } Py_DECREF(item); return NULL; } else if (self->types) { /* If types is given then check if it's an instance thereof and if so replace activeiterator, otherwise return the item. */ if ((ok = PyObject_IsInstance(item, self->types))) { if (ok == -1) { Py_DECREF(item); return NULL; } /* Check if it's a builtin-string-type and if so set "isstring". Check for the exact type because sub types might have custom __iter__ methods, better not to interfere with these. */ if (PyBytes_CheckExact(item) || PyUnicode_CheckExact(item)) { self->isstring = 1; } self->currentdepth++; activeiterator = PyObject_GetIter(item); Py_DECREF(item); if (activeiterator == NULL) { return NULL; } } else { return item; } } else { /* If no types are given just try to convert it to an iterator and if that succeeds replaces activeiterator, otherwise return item. */ temp = PyObject_GetIter(item); if (temp == NULL) { if (PyErr_Occurred() && PyErr_ExceptionMatches(PyExc_TypeError)) { PyErr_Clear(); return item; } else { Py_DECREF(item); return NULL; } } else { /* See comment above why the exact check is (probably) better. */ if (PyBytes_CheckExact(item) || PyUnicode_CheckExact(item)) { self->isstring = 1; } self->currentdepth++; activeiterator = temp; temp = NULL; Py_DECREF(item); } } /* Still here? That means we have a new activeiterator. 
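The iterator-list bookkeeping used by `deepflatten_next` can be sketched in pure Python with an explicit stack of iterators. This is a simplified model for the full-flattening case only (no `types`/`ignore` filtering), with a hypothetical `max_depth` parameter standing in for the `Py_GetRecursionLimit` check:

```python
def deepflatten_all(iterable, max_depth=1000):
    """Explicit-stack sketch of full flattening (no types/ignore handling)."""
    # Each stack entry is (iterator, is_string_level); string levels mirror
    # the "isstring" flag: their items are yielded without further descent.
    stack = [(iter(iterable), False)]
    while stack:
        iterator, is_string = stack[-1]
        try:
            item = next(iterator)
        except StopIteration:
            stack.pop()               # level exhausted: go one level up
            continue
        if is_string:
            yield item                # items of str/bytes are yielded directly
            continue
        try:
            sub = iter(item)
        except TypeError:
            yield item                # not iterable: emit the item itself
        else:
            if len(stack) > max_depth:
                raise RecursionError("maximum flattening depth reached")
            # Exact str/bytes get a "string level" so single characters
            # do not cause endless descent.
            stack.append((sub, type(item) in (str, bytes)))
```

Using a stack of live iterators instead of recursion is what lets the real implementation survive deeply nested inputs without consuming Python stack frames.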
Make sure we can save the new iterator (if necessary increase the list size). However first make sure we are not in danger of being in an endless recursion, so we "borrow" the recursion depth built into Python as a limit for the list length. */ if ((Py_ssize_t)Py_GetRecursionLimit() < self->currentdepth) { PyErr_SetString(PyExc_RecursionError, "`deepflatten` reached maximum recursion depth."); Py_DECREF(activeiterator); return NULL; } if (self->currentdepth >= PyList_GET_SIZE(self->iteratorlist)) { int ok = PyList_Append(self->iteratorlist, activeiterator); Py_DECREF(activeiterator); if (ok == -1) { return NULL; } } else { PyObject *tmp = PyList_GET_ITEM(self->iteratorlist, self->currentdepth); PyList_SET_ITEM(self->iteratorlist, self->currentdepth, activeiterator); Py_DECREF(tmp); } } return NULL; } static PyObject * deepflatten_reduce(PyIUObject_DeepFlatten *self, PyObject *Py_UNUSED(args)) { PyObject *res; /* We need to copy the iteratorlist in case someone grabs it. This could lead to segmentation faults if the list is partially deleted, since the next call to "next" could try to access an out-of-bounds index. */ PyObject *itlist = PyList_GetSlice(self->iteratorlist, 0, PY_SSIZE_T_MAX); res = Py_BuildValue("O(OnOO)(Oni)", Py_TYPE(self), PyList_GET_ITEM(self->iteratorlist, 0), /* stub */ self->depth, self->types ? self->types : Py_None, self->ignore ?
self->ignore : Py_None, itlist, self->currentdepth, self->isstring); Py_DECREF(itlist); return res; } static PyObject * deepflatten_setstate(PyIUObject_DeepFlatten *self, PyObject *state) { PyObject *iteratorlist; Py_ssize_t currentdepth; int isstring; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "Oni:deepflatten.__setstate__", &iteratorlist, ¤tdepth, &isstring)) { return NULL; } if (!PyList_CheckExact(iteratorlist)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `list` instance as " "first argument in the `state`, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(iteratorlist)->tp_name); return NULL; } if (currentdepth < -1) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the second (%zd) " "argument in the `state` is bigger than or equal to -1.", Py_TYPE(self)->tp_name, currentdepth); return NULL; } else { Py_ssize_t i; Py_ssize_t listlength = PyList_GET_SIZE(iteratorlist); if (currentdepth >= listlength) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the second (%zd) " "argument in the `state` is smaller than the length " "of the first argument (%zd)", Py_TYPE(self)->tp_name, currentdepth, listlength); return NULL; } /* The iteratorlist requires iterators in the list so make sure no bad items could be accessed. */ for (i = 0; i <= currentdepth; i++) { if (!PyIter_Check(PyList_GET_ITEM(iteratorlist, i))) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected only iterators " "in the first argument in the `state`, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(PyList_GET_ITEM(iteratorlist, i))->tp_name); return NULL; } } } /* We need to make sure nobody can alter the iteratorlist so we need a copy. 
*/ iteratorlist = PyList_GetSlice(iteratorlist, 0, PY_SSIZE_T_MAX); if (iteratorlist == NULL) { return NULL; } /* No need to incref iteratorlist, we copied it. */ Py_XSETREF(self->iteratorlist, iteratorlist); self->currentdepth = currentdepth; self->isstring = isstring; Py_RETURN_NONE; } static PyMethodDef deepflatten_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)deepflatten_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)deepflatten_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef deepflatten_memberlist[] = { { "types", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_DeepFlatten, types), /* offset */ READONLY, /* flags */ deepflatten_prop_types_doc /* doc */ }, { "ignore", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_DeepFlatten, ignore), /* offset */ READONLY, /* flags */ deepflatten_prop_ignore_doc /* doc */ }, { "depth", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_DeepFlatten, depth), /* offset */ READONLY, /* flags */ deepflatten_prop_depth_doc /* doc */ }, { "currentdepth", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_DeepFlatten, currentdepth), /* offset */ READONLY, /* flags */ deepflatten_prop_currentdepth_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_DeepFlatten = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.deepflatten", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_DeepFlatten), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)deepflatten_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ 
(reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)deepflatten_doc, /* tp_doc */ (traverseproc)deepflatten_traverse, /* tp_traverse */ (inquiry)deepflatten_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)deepflatten_next, /* tp_iternext */ deepflatten_methods, /* tp_methods */ deepflatten_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)deepflatten_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000D4000081A400000000000000000000000165E3BCDA000001C0000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/deepflatten.h#ifndef PYIU_DEEPFLATTEN_H #define PYIU_DEEPFLATTEN_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iteratorlist; PyObject *types; PyObject *ignore; Py_ssize_t depth; Py_ssize_t currentdepth; int isstring; } PyIUObject_DeepFlatten; extern PyTypeObject PyIUType_DeepFlatten; #ifdef __cplusplus } #endif #endif 070701000000D5000081A400000000000000000000000165E3BCDA0000014B000000000000000000000000000000000000005A00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/docs_lengthhint.h#ifndef PYIU_DOCSLENGTHHINT_H #define PYIU_DOCSLENGTHHINT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> PyDoc_STRVAR( PYIU_lenhint_doc, "__length_hint__(/)\n" "--\n\n" "Return an *estimate* for the length of 
the iterator or zero.\n"); #ifdef __cplusplus } #endif #endif 070701000000D6000081A400000000000000000000000165E3BCDA00000147000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/docs_reduce.h#ifndef PYIU_DOCSREDUCE_H #define PYIU_DOCSREDUCE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> PyDoc_STRVAR( PYIU_reduce_doc, "__reduce__($self, /)\n" "--\n\n" "Return a `tuple` containing the state information for pickling.\n"); #ifdef __cplusplus } #endif #endif 070701000000D7000081A400000000000000000000000165E3BCDA00000162000000000000000000000000000000000000005800000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/docs_setstate.h#ifndef PYIU_DOCSSETSTATE_H #define PYIU_DOCSSETSTATE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> PyDoc_STRVAR( PYIU_setstate_doc, "__setstate__($self, state, /)\n" "--\n\n" "Set state for unpickling. 
" "The `state` argument must be `tuple`-like.\n"); #ifdef __cplusplus } #endif #endif 070701000000D8000081A400000000000000000000000165E3BCDA00000129000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/docs_sizeof.h#ifndef PYIU_DOCSSIZEOF_H #define PYIU_DOCSSIZEOF_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> PyDoc_STRVAR( PYIU_sizeof_doc, "__sizeof__($self, /)\n" "--\n\n" "Returns size in memory, in bytes.\n"); #ifdef __cplusplus } #endif #endif 070701000000D9000081A400000000000000000000000165E3BCDA000088EE000000000000000000000000000000000000005800000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/docsfunctions.h/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #ifndef PYIU_DOCSFUNCTIONS_H #define PYIU_DOCSFUNCTIONS_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> PyDoc_STRVAR(PyIU_Placeholder_name, "Placeholder"); PyDoc_STRVAR(PyIU_Empty_name, "empty"); PyDoc_STRVAR(PyIU_module_name, "iteration_utilities._iteration_utilities"); PyDoc_STRVAR(PyIU_module_doc, "This is the C extension module used by iteration_utilities."); PyDoc_STRVAR( PyIU_TupleToList_and_InsertItemAtIndex_doc, "_parse_args(tup, item, pos, /)\n" "--\n\n" "Converts the `tup` to a new `tuple` and inserts `item` at `pos`.\n" "\n" ".. warning::\n" " This function is especially made for internal use, **DO NOT USE THIS \n" " FUNCTION** anywhere else.\n" "\n" "Parameters\n" "----------\n" "tup : tuple\n" " The tuple to convert.\n" " \n" " .. warning::\n" " This function will encounter a segmentation fault if `tup` is not\n" " a tuple.\n" "\n" "item : any type\n" " The item to insert.\n" "\n" "pos : int\n" " The position where to insert the `item`. \n" " \n" " .. 
warning::\n" " No bounds checking - If `pos` is not carefully chosen the function \n" " will segfault!\n" "\n" "Returns\n" "-------\n" "lst : tuple\n" " The converted `tup` with `item` inserted.\n" "\n" "Notes\n" "-----\n" "This is equivalent to:\n" "\n" ".. code::\n" "\n" " def _parse_args(args, item, pos):\n" " return tuple(args[:pos]) + (item, ) + tuple(args[pos:])\n"); PyDoc_STRVAR( PyIU_RemoveFromDictWhereValueIs_doc, "_parse_kwargs(dct, item, /)\n" "--\n\n" "Removes every key from the `dct` where the ``dct[key] is item``.\n" "\n" ".. warning::\n" " This function is especially made for internal use, **DO NOT USE THIS \n" " FUNCTION** anywhere else.\n" "\n" "Parameters\n" "----------\n" "dct : dict\n" " The dictionary from which to remove the keys.\n" "\n" "item : any type\n" " The item to check for.\n" "\n" "Returns\n" "-------\n" "nothing. This function works in-place.\n" "\n" "Notes\n" "-----\n" "This is equivalent to:\n" "\n" ".. code::\n" "\n" " def _parse_kwargs(dct, item):\n" " keys_to_remove = [key for key in dct if dct[key] is item]\n" " for key in keys_to_remove:\n" " del dct[key]\n"); PyDoc_STRVAR( PyIU_IsNone_doc, "is_None(value, /)\n" "--\n\n" "Returns ``True`` if `value` is ``None``, otherwise ``False``.\n" "\n" "Parameters\n" "----------\n" "value : any type \n" " The value to test for ``None``.\n" "\n" "Returns\n" "-------\n" "is_none : :py:class:`bool`\n" " ``True`` if `value` is ``None`` otherwise it returns ``False``.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda x: x is None`` but significantly\n" "faster::\n" "\n" " >>> from iteration_utilities import is_None\n" " >>> is_None(None)\n" " True\n" " >>> is_None(False)\n" " False\n" "\n" "This can be used for example to remove all ``None`` from an iterable::\n" "\n" " >>> import sys\n" " >>> import itertools\n" " >>> filterfalse = itertools.ifilterfalse if sys.version_info.major == 2 else itertools.filterfalse\n" " >>> list(filterfalse(is_None, [1, None, 3, 4, 5, 
None, 7]))\n" " [1, 3, 4, 5, 7]\n"); PyDoc_STRVAR( PyIU_IsNotNone_doc, "is_not_None(value, /)\n" "--\n\n" "Returns ``False`` if `value` is ``None``, otherwise ``True``.\n" "\n" "Parameters\n" "----------\n" "value : any type \n" " The value to test for ``None``.\n" "\n" "Returns\n" "-------\n" "is_not_none : :py:class:`bool`\n" " ``False`` if `value` is ``None`` otherwise it returns ``True``.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda x: x is not None`` but significantly\n" "faster::\n" "\n" " >>> from iteration_utilities import is_not_None\n" " >>> is_not_None(None)\n" " False\n" " >>> is_not_None(False)\n" " True\n"); PyDoc_STRVAR( PyIU_IsEven_doc, "is_even(value, /)\n" "--\n\n" "Returns ``True`` if `value` is even, otherwise ``False``.\n" "\n" "Parameters\n" "----------\n" "value : any type \n" " The value to test if even.\n" "\n" "Returns\n" "-------\n" "is_even : :py:class:`bool`\n" " ``True`` if `value` is even otherwise it returns ``False``.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda x: not x % 2`` but significantly\n" "faster::\n" "\n" " >>> from iteration_utilities import is_even\n" " >>> is_even(0)\n" " True\n" " >>> is_even(1)\n" " False\n" " >>> is_even(2)\n" " True\n"); PyDoc_STRVAR( PyIU_IsOdd_doc, "is_odd(value, /)\n" "--\n\n" "Returns ``True`` if `value` is odd, otherwise ``False``.\n" "\n" "Parameters\n" "----------\n" "value : any type \n" " The value to test if odd.\n" "\n" "Returns\n" "-------\n" "is_odd : :py:class:`bool`\n" " ``True`` if `value` is odd otherwise it returns ``False``.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda x: bool(x % 2)`` but significantly\n" "faster::\n" "\n" " >>> from iteration_utilities import is_odd\n" " >>> is_odd(0)\n" " False\n" " >>> is_odd(1)\n" " True\n" " >>> is_odd(2)\n" " False\n"); PyDoc_STRVAR( PyIU_IsIterable_doc, "is_iterable(value, /)\n" "--\n\n" "Returns ``True`` if `value` is iterable, otherwise 
``False``.\n" "\n" "Parameters\n" "----------\n" "value : any type \n" " The value to test if iterable.\n" "\n" "Returns\n" "-------\n" "is_iterable : :py:class:`bool`\n" " ``True`` if `value` is iterable otherwise it returns ``False``.\n" "\n" "Examples\n" "--------\n" "A few simple examples::\n" "\n" " >>> from iteration_utilities import is_iterable\n" " >>> is_iterable(0)\n" " False\n" " >>> is_iterable('abc')\n" " True\n" " >>> is_iterable([1,2,3])\n" " True\n"); PyDoc_STRVAR( PyIU_MathSquare_doc, "square(value, /)\n" "--\n\n" "Returns the squared `value`.\n" "\n" "Parameters\n" "----------\n" "value : any type\n" " The value to be squared. The type of the `value` must support ``pow``.\n" "\n" "Returns\n" "-------\n" "square : any type\n" " Returns ``value**2``.\n" "\n" "Examples\n" "--------\n" "Since :py:func:`functools.partial` cannot be applied to :py:func:`pow` to \n" "get a one-argument square function, this function provides that and is \n" "significantly faster than ``lambda x: x**2``::\n" "\n" " >>> from iteration_utilities import square\n" " >>> square(1)\n" " 1\n" " >>> square(2.0)\n" " 4.0\n"); PyDoc_STRVAR( PyIU_MathDouble_doc, "double(value, /)\n" "--\n\n" "Returns the doubled `value`.\n" "\n" "Parameters\n" "----------\n" "value : any type\n" " The value to be doubled.\n" "\n" "Returns\n" "-------\n" "doubled : any type\n" " Returns ``value*2``.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda x: x*2`` and for numerical arguments\n" "to ``functools.partial(operator.mul, 2)`` but faster::\n" "\n" " >>> from iteration_utilities import double\n" " >>> double(1)\n" " 2\n" " >>> double(2.0)\n" " 4.0\n"); PyDoc_STRVAR( PyIU_MathReciprocal_doc, "reciprocal(value, /)\n" "--\n\n" "Returns ``1 / value``.\n" "\n" "Parameters\n" "----------\n" "value : any type\n" " The value for the computation.\n" "\n" "Returns\n" "-------\n" "reciprocal : any type\n" " Returns ``1 / value``.\n" "\n" "Examples\n" "--------\n" "This is equivalent to ``lambda x: 1
/ x``\n" "or ``functools.partial(operator.truediv, 1)`` but faster::\n" "\n" " >>> from iteration_utilities import reciprocal \n" " >>> reciprocal(1)\n" " 1.0\n" " >>> reciprocal(2)\n" " 0.5\n" " >>> reciprocal(4)\n" " 0.25\n"); PyDoc_STRVAR( PyIU_MathRadd_doc, "radd(op1, op2, /)\n" "--\n\n" "Returns ``op2 + op1``.\n" "\n" "Parameters\n" "----------\n" "op1, op2 : any type\n" " The values to be added.\n" "\n" "Returns\n" "-------\n" "radd : any type\n" " Returns ``op2 + op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y + x``::\n" "\n" " >>> from iteration_utilities import radd\n" " >>> radd(2, 2)\n" " 4\n"); PyDoc_STRVAR( PyIU_MathRsub_doc, "rsub(op1, op2, /)\n" "--\n\n" "Returns ``op2 - op1``.\n" "\n" "Parameters\n" "----------\n" "op1, op2 : any type\n" " The values to be subtracted.\n" "\n" "Returns\n" "-------\n" "rsub : any type\n" " Returns ``op2 - op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y - x``::\n" "\n" " >>> from iteration_utilities import rsub\n" " >>> rsub(2, 5)\n" " 3\n"); PyDoc_STRVAR( PyIU_MathRmul_doc, "rmul(op1, op2, /)\n" "--\n\n" "Returns ``op2 * op1``.\n" "\n" "Parameters\n" "----------\n" "op1, op2 : any type\n" " The values to be multiplied.\n" "\n" "Returns\n" "-------\n" "rmul : any type\n" " Returns ``op2 * op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y * x``::\n" "\n" " >>> from iteration_utilities import rmul\n" " >>> rmul(2, 2)\n" " 4\n"); PyDoc_STRVAR( PyIU_MathRdiv_doc, "rdiv(op1, op2, /)\n" "--\n\n" "Returns ``op2 / op1``.\n" "\n" "Parameters\n" "----------\n" "op1, op2 : any type\n" " The values to be divided.\n" "\n" "Returns\n" "-------\n" "rdiv : any type\n" " Returns ``op2 / op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y / x``::\n" "\n" " >>> from iteration_utilities import rdiv\n" " >>> rdiv(10, 1)\n" " 0.1\n"); PyDoc_STRVAR( PyIU_MathRfdiv_doc, "rfdiv(op1, op2, /)\n" "--\n\n" "Returns ``op2 // op1``.\n" "\n" "Parameters\n" 
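The flipped binary operators documented in this block behave like the following pure-Python equivalents (the same semantics as the documented lambdas, without the speed advantage of the C versions):

```python
# Pure-Python stand-ins: each flips its operands before applying the
# underlying operation, matching the documented ``lambda x, y: y <op> x``.
def radd(op1, op2):
    return op2 + op1

def rsub(op1, op2):
    return op2 - op1

def rmul(op1, op2):
    return op2 * op1

def rdiv(op1, op2):
    return op2 / op1

def rfdiv(op1, op2):
    return op2 // op1

def rpow(op1, op2):
    return op2 ** op1

def rmod(op1, op2):
    return op2 % op1
```

These are handy as two-argument callables (e.g. for `functools.reduce`) when the right-hand operand of the operation arrives first.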
"----------\n" "op1, op2 : any type\n" " The values to be floor divided.\n" "\n" "Returns\n" "-------\n" "rfdiv : any type\n" " Returns ``op2 // op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y // x``::\n" "\n" " >>> from iteration_utilities import rfdiv\n" " >>> rfdiv(10, 22)\n" " 2\n"); PyDoc_STRVAR( PyIU_MathRpow_doc, "rpow(op1, op2, /)\n" "--\n\n" "Returns ``op2 ** op1``.\n" "\n" "Parameters\n" "----------\n" "op1, op2 : any type\n" " The values for the operation.\n" "\n" "Returns\n" "-------\n" "rpow : any type\n" " Returns ``op2 ** op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y ** x``::\n" "\n" " >>> from iteration_utilities import rpow\n" " >>> rpow(3, 2)\n" " 8\n"); PyDoc_STRVAR( PyIU_MathRmod_doc, "rmod(op1, op2, /)\n" "--\n\n" "Returns ``op2 % op1``.\n" "\n" "Parameters\n" "----------\n" "op1, op2 : any type\n" " Get the remainder of these two operands.\n" "\n" "Returns\n" "-------\n" "rmod : any type\n" " Returns ``op2 % op1``.\n" "\n" "Examples\n" "--------\n" "Equivalent to ``lambda x, y: y % x``::\n" "\n" " >>> from iteration_utilities import rmod\n" " >>> rmod(2, 5)\n" " 1\n"); PyDoc_STRVAR( PyIU_ReturnIdentity_doc, "return_identity(obj, /)\n" "--\n\n" "Always return the argument.\n" "\n" "Parameters\n" "----------\n" "obj : any type \n" " The `obj` to return.\n" "\n" "Returns\n" "-------\n" "identity : any type\n" " The argument itself.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda x: x`` but significantly faster::\n" "\n" " >>> from iteration_utilities import return_identity\n" " >>> return_identity(1)\n" " 1\n" " >>> return_identity('abc')\n" " 'abc'\n"); PyDoc_STRVAR( PyIU_ReturnCalled_doc, "return_called(func, /)\n" "--\n\n" "Return the result of ``func()``.\n" "\n" "Parameters\n" "----------\n" "func : callable \n" " The function to be called.\n" "\n" "Returns\n" "-------\n" "result : any type\n" " The result of ``func()``.\n" "\n" "Examples\n" "--------\n" "This 
function is equivalent to ``lambda x: x()`` but significantly\n" "faster::\n" "\n" " >>> from iteration_utilities import return_called\n" " >>> return_called(int)\n" " 0\n"); PyDoc_STRVAR( PyIU_AlwaysIterable_doc, "always_iterable(obj, excluded_types=None, empty_if_none=False)\n" "--\n\n" "Make the *obj* iterable.\n" "\n" ".. versionadded:: 0.11.0\n" "\n" "Parameters\n" "----------\n" "obj : any type\n" " The object to make iterable.\n" "\n" "excluded_types : type or tuple of types, optional\n" " The types that should not be interpreted as already-iterable.\n" " If the argument is omitted, then :py:class:`str` and :py:class:`bytes` (but not subclasses) will be wrapped.\n" " If None, then no type is excluded.\n" "\n" "empty_if_none : bool, optional\n" " If this argument is True, then an empty iterable will be returned if *obj* is *None*.\n" " Default is ``False``.\n" "\n" "Returns\n" "-------\n" "iterable : any type\n" " An iterable over *obj* if it was considered iterable, otherwise an iterable that will only yield one item: the *obj*.\n" "\n" "Examples\n" "--------\n" "In case the *obj* is iterable, an iterator over the object is returned::\n" "\n" " >>> from iteration_utilities import always_iterable\n" " >>> list(always_iterable([1, 2, 3]))\n" " [1, 2, 3]\n" "\n" "If it wasn't iterable, or it was excluded explicitly, an iterator is returned which yields one item, the *obj*::\n" "\n" " >>> list(always_iterable(1))\n" " [1]\n" " >>> list(always_iterable([1, 2, 3], excluded_types=list))\n" " [[1, 2, 3]]\n" "\n" "By default strings are not considered iterable, but this can be overridden using *None* as *excluded_types*::\n" "\n" " >>> list(always_iterable('abc'))\n" " ['abc']\n" " >>> list(always_iterable('abc', excluded_types=None))\n" " ['a', 'b', 'c']\n" "\n" "If an empty iterator should be returned when *obj* is *None*, then *empty_if_none* can be used::\n" "\n" " >>> list(always_iterable(None))\n" " [None]\n" " >>> list(always_iterable(None, empty_if_none=True))\n" " 
[]\n"); PyDoc_STRVAR( PyIU_ReturnFirstArg_doc, "return_first_arg(*args, **kwargs)\n" "--\n\n" "Always return the first positional argument given to the function.\n" "\n" "Parameters\n" "----------\n" "args, kwargs\n" " Any number of positional or keyword parameters.\n" "\n" "Returns\n" "-------\n" "first_positional_argument : any type\n" " Always returns the first positional argument given to the function.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to ``lambda *args, **kwargs: args[0]`` but\n" "significantly faster::\n" "\n" " >>> from iteration_utilities import return_first_arg\n" " >>> return_first_arg(1, 2, 3, 4, a=100)\n" " 1\n"); PyDoc_STRVAR( PyIU_AllDistinct_doc, "all_distinct(iterable, /)\n" "--\n\n" "Checks if all items in the `iterable` are distinct.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " `Iterable` containing the elements.\n" "\n" "Returns\n" "-------\n" "distinct : :py:class:`bool`\n" " ``True`` if no two values are equal and ``False`` if there is at least\n" " one duplicate in `iterable`.\n" "\n" "Notes\n" "-----\n" "The items in the `iterable` should implement equality.\n" "\n" "If the items are hashable the function is much faster.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import all_distinct\n" ">>> all_distinct('AAAABBBCCDAABBB')\n" "False\n" "\n" ">>> all_distinct('abcd')\n" "True\n"); PyDoc_STRVAR( PyIU_AllEqual_doc, "all_equal(iterable, /)\n" "--\n\n" "Checks if all the elements are equal to each other.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Any `iterable` to test.\n" "\n" "Returns\n" "-------\n" "all_equal : :py:class:`bool`\n" " ``True`` if all elements in `iterable` are equal or ``False`` if not.\n" "\n" "Notes\n" "-----\n" "If the input is empty the function returns ``True``.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import all_equal\n" ">>> all_equal([1,1,1,1,1,1,1,1,1])\n" "True\n" "\n" ">>> all_equal([1,1,1,1,1,1,1,2,1])\n" 
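The hashable fast path mentioned in the ``all_distinct`` notes above can be sketched in pure Python. This is a rough illustration of the strategy (set lookup for hashable items, linear equality scan as a fallback), not the actual C implementation, and the function name is illustrative:

```python
def all_distinct_sketch(iterable):
    """Return True if no two items compare equal (illustrative sketch)."""
    seen_hashable = set()
    seen_unhashable = []
    for item in iterable:
        try:
            if item in seen_hashable:  # O(1) membership test via hashing
                return False
            seen_hashable.add(item)
        except TypeError:  # unhashable item: fall back to an O(n) equality scan
            if item in seen_unhashable:
                return False
            seen_unhashable.append(item)
    return True
```

This also shows why the docstring says hashable items make the function much faster: the fallback path degrades to quadratic time.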
"False\n"); PyDoc_STRVAR( PyIU_AllIsinstance_doc, "all_isinstance(iterable, types)\n" "--\n\n" "Like :py:func:`isinstance` but for `iterables`.\n" "\n" "Checks if all items in `iterable` are instances of `types`.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Each item of the `iterable` is tested with ``isinstance(item, types)``.\n" "\n" "types : :py:class:`type` or :py:class:`tuple` of types\n" " Test for this type if it's a single class or test if the item is of any\n" " of the types (if types is a :py:class:`tuple`).\n" "\n" "Returns\n" "-------\n" "all : :py:class:`bool`\n" " ``True`` if all elements in `iterable` are instances of `types`,\n" " ``False`` if not.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to (but faster than)\n" "``all(isinstance(item, types) for item in iterable)``::\n" "\n" " >>> from iteration_utilities import all_isinstance\n" " >>> all_isinstance(range(100), int)\n" " True\n" "\n" " >>> all_isinstance([1, 2, 3.2], (int, float))\n" " True\n" "\n" ".. warning::\n" " This function returns ``True`` if the `iterable` is empty.\n"); PyDoc_STRVAR( PyIU_Monotone_doc, "all_monotone(iterable, decreasing=False, strict=False)\n" "--\n\n" "Checks if the elements in `iterable` are (strictly) monotonic\n" "increasing or decreasing.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Any `iterable` to test.\n" "\n" "decreasing : :py:class:`bool`, optional\n" " If ``False`` check if the values are monotonic increasing, otherwise\n" " check for monotone decreasing.\n" " Default is ``False``.\n" "\n" "strict : :py:class:`bool`, optional\n" " If ``True`` check if the elements are strictly greater or smaller\n" " (``>`` or ``<``) than their predecessor. 
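The equivalence stated in the ``all_isinstance`` docstring above is directly runnable, and it also explains the warning about empty iterables: ``all`` over an empty generator is vacuously ``True``. A minimal sketch (the function name is illustrative, not the exported API):

```python
def all_isinstance_sketch(iterable, types):
    # Equivalent form from the docstring; vacuously True when `iterable` is empty.
    return all(isinstance(item, types) for item in iterable)
```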
Otherwise use ``>=`` and ``<=``.\n" "\n" "Returns\n" "-------\n" "monotonic : :py:class:`bool`\n" " ``True`` if all elements in `iterable` are monotonic or ``False`` if not.\n" "\n" "Notes\n" "-----\n" "If the input is empty the function returns ``True``.\n" "\n" "Examples\n" "--------\n" "This is roughly equivalent to\n" "``all(itertools.starmap(operator.lt, iteration_utilities.successive(iterable, 2)))``\n" "with the appropriate operator depending on `decreasing` and `strict`::\n" "\n" " >>> from iteration_utilities import all_monotone\n" " >>> all_monotone([1,1,1,1,1,1,1,1,1])\n" " True\n" " >>> all_monotone([1,1,1,1,1,1,1,1,1], strict=True)\n" " False\n" " >>> all_monotone([2,1,1,1,1,1,1,1,0], decreasing=True)\n" " True\n" " >>> all_monotone([2,1,1,1,1,1,1,1,0], decreasing=True, strict=True)\n" " False\n"); PyDoc_STRVAR( PyIU_AnyIsinstance_doc, "any_isinstance(iterable, types)\n" "--\n\n" "Like :py:func:`isinstance` but for `iterables`.\n" "\n" "Checks if any item in `iterable` is an instance of `types`.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Each item of the `iterable` is tested with ``isinstance(item, types)``.\n" "\n" "types : :py:class:`type` or :py:class:`tuple` of types\n" " Test for this type if it's a single class or test if the item is of any\n" " of the types (if types is a :py:class:`tuple`).\n" "\n" "Returns\n" "-------\n" "any : :py:class:`bool`\n" " ``True`` if any element in `iterable` is an instance of `types`,\n" " ``False`` if not.\n" "\n" "Examples\n" "--------\n" "This function is equivalent to (but faster than)\n" "``any(isinstance(item, types) for item in iterable)``::\n" "\n" " >>> from iteration_utilities import any_isinstance\n" " >>> any_isinstance(range(100), int)\n" " True\n" "\n" " >>> any_isinstance([1, 2, 3.2], float)\n" " True\n"); PyDoc_STRVAR( PyIU_Argmin_doc, "argmin(iterable, /, key=None, default=None)\n" "--\n\n" "Find index of the minimum.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " 
The `iterable` for which to calculate the index of the minimum.\n" "\n" " .. note::\n" " Instead of one `iterable` it is also possible to pass the values (at\n" " least 2) as positional arguments.\n" "\n" "key : callable, optional\n" " If not given then compare the values, otherwise compare ``key(item)``.\n" "\n" "default : :py:class:`int`, optional\n" " If given, an empty `iterable` will return `default` instead of raising a\n" " ``ValueError``.\n" "\n" "Returns\n" "-------\n" "argmin : :py:class:`int`\n" " The index of the minimum or default if the `iterable` was empty.\n" "\n" "Examples\n" "--------\n" "This is equivalent to (but faster than)\n" "``min(enumerate(iterable), key=operator.itemgetter(1))[0]``::\n" "\n" " >>> from iteration_utilities import argmin\n" " >>> argmin(3,2,1,2,3)\n" " 2\n" "\n" "It allows a `key` function::\n" "\n" " >>> argmin([3, -3, 0], key=abs)\n" " 2\n" "\n" "And a `default`::\n" "\n" " >>> argmin([], default=10)\n" " 10\n"); PyDoc_STRVAR( PyIU_Argmax_doc, "argmax(iterable, /, key=None, default=None)\n" "--\n\n" "Find index of the maximum.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The `iterable` for which to calculate the index of the maximum.\n" "\n" " .. 
note::\n" " Instead of one `iterable` it is also possible to pass the values (at\n" " least 2) as positional arguments.\n" "\n" "key : callable, optional\n" " If not given then compare the values, otherwise compare ``key(item)``.\n" "\n" "default : :py:class:`int`, optional\n" " If not given, raise ``ValueError`` if the `iterable` is empty, otherwise\n" " return ``default``.\n" "\n" "Returns\n" "-------\n" "argmax : :py:class:`int`\n" " The index of the maximum or default if the `iterable` was empty.\n" "\n" "Examples\n" "--------\n" "This is equivalent to (but faster than)\n" "``max(enumerate(iterable), key=operator.itemgetter(1))[0]``::\n" "\n" " >>> from iteration_utilities import argmax\n" " >>> argmax(3,2,1,2,3)\n" " 0\n" "\n" "It allows a `key` function::\n" "\n" " >>> argmax([0, -3, 3, 0], key=abs)\n" " 1\n" "\n" "And a `default`::\n" "\n" " >>> argmax([], default=10)\n" " 10\n"); PyDoc_STRVAR( PyIU_Count_doc, "count_items(iterable, pred=None, eq=False)\n" "--\n\n" "Count how many times the predicate is true.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Any `iterable` to count in.\n" "\n" "pred : callable, any type, None, optional\n" " Predicate to test. 
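The ``enumerate``-based equivalence quoted in the ``argmin``/``argmax`` docstrings above can be sketched like this. It covers only the single-iterable form and treats ``default=None`` as "no default", both simplifications relative to the documented behavior; the function name is illustrative:

```python
from operator import itemgetter

def argmin_sketch(iterable, key=None, default=None):
    # Pair each value with its index, compare by the value (or key(value)),
    # and return the index of the smallest one.
    keyfunc = (lambda pair: key(pair[1])) if key is not None else itemgetter(1)
    try:
        return min(enumerate(iterable), key=keyfunc)[0]
    except ValueError:  # empty iterable
        if default is not None:
            return default
        raise
```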
Depending on the `eq` parameter this parameter has\n" " different meanings:\n" " \n" " - ``eq=True`` : Each item will be counted if ``item == pred``, the `pred`\n" " must not be omitted in this case.\n" " - ``eq=False`` : If ``pred`` is not given or ``None`` then each item in\n" " the iterable is counted.\n" " If ``pred`` is given and not ``None`` then each item satisfying\n" " ``if pred(item)`` is counted.\n" "\n" " Default is ``None``.\n" "\n" "eq : :py:class:`bool`, optional\n" " If ``True`` compare each item in the `iterable` to `pred` instead of\n" " calling ``pred(item)``.\n" " Default is ``False``.\n" "\n" "Returns\n" "-------\n" "number : number\n" " The number of times the predicate is ``True``.\n" "\n" "Examples\n" "--------\n" "To count how many elements are within an `iterable`::\n" "\n" " >>> from iteration_utilities import count_items\n" " >>> count_items([0, 0, '', {}, [], 2])\n" " 6\n" "\n" "To count the number of truthy values::\n" "\n" " >>> count_items([0, 0, '', {}, [], 2], pred=bool)\n" " 1\n" "\n" "To count the number of values satisfying a condition::\n" "\n" " >>> def smaller5(val): return val < 5\n" " >>> count_items([1, 2, 3, 4, 5, 6, 6, 7], smaller5)\n" " 4\n" "\n" "To count the number of values equal to another value::\n" "\n" " >>> count_items([1, 2, 3, 4, 5, 6, 6, 7], 6, True)\n" " 2\n"); PyDoc_STRVAR( PyIU_DotProduct_doc, "dotproduct(vec1, vec2)\n" "--\n\n" "Dot product (matrix multiplication) of two vectors.\n" "\n" "Parameters\n" "----------\n" "vec1, vec2 : iterable\n" " Any `iterables` to calculate the dot product. 
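The three counting modes described for ``count_items`` above can be sketched in pure Python (an illustrative equivalent, not the C implementation):

```python
def count_items_sketch(iterable, pred=None, eq=False):
    if eq:
        # eq=True: count items that compare equal to `pred`.
        return sum(1 for item in iterable if item == pred)
    if pred is None:
        # No predicate: count every item.
        return sum(1 for _ in iterable)
    # Count items for which the predicate is truthy.
    return sum(1 for item in iterable if pred(item))
```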
Positional-only parameter.\n" "\n" "Returns\n" "-------\n" "dotproduct : number\n" " The dot product - the sum of the element-wise multiplication.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import dotproduct\n" ">>> dotproduct([1,2,3,4], [1,2,3,4])\n" "30\n"); PyDoc_STRVAR( PyIU_Groupby_doc, "groupedby(iterable, key, keep=None, reduce=None, reducestart=None)\n" "--\n\n" "Group values of `iterable` by a `key` function as dictionary.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The `iterable` to group by a `key` function.\n" "\n" "key : callable\n" " The items of the `iterable` are grouped by the ``key(item)``.\n" "\n" "keep : callable, optional\n" " If given append only the result of ``keep(item)`` instead of ``item``.\n" "\n" "reduce : callable, optional\n" " If given then instead of returning a list of all ``items`` reduce them\n" " using the binary `reduce` function. This works like the `func` parameter\n" " in :py:func:`functools.reduce`.\n" "\n" "reducestart : any type, optional\n" " If given (even as ``None``) it will be interpreted as start value for the\n" " `reduce` function.\n" " \n" " .. 
note::\n" " Can only be specified if `reduce` is given.\n" "\n" "Returns\n" "-------\n" "grouped : dict\n" " A dictionary where the `keys` represent the ``key(item)`` and the `values`\n" " are lists containing all ``items`` having the same `key`.\n" "\n" "Notes\n" "-----\n" "This function differs from :py:func:`itertools.groupby` in several ways: (1) This\n" "function is eager (consumes the `iterable` in one go) and (2) the itertools\n" "function only groups the `iterable` locally.\n" "\n" "Examples\n" "--------\n" "A simple example::\n" "\n" " >>> from iteration_utilities import groupedby\n" " >>> from operator import itemgetter, add\n" " >>> dct = groupedby(['a', 'bac', 'ba', 'ab', 'abc'], key=itemgetter(0))\n" " >>> dct['a']\n" " ['a', 'ab', 'abc']\n" " >>> dct['b']\n" " ['bac', 'ba']\n" "\n" "One could also specify a `keep` function::\n" "\n" " >>> dct = groupedby(['a', 'bac', 'ba', 'ab', 'abc'], key=itemgetter(0), keep=len)\n" " >>> dct['a']\n" " [1, 2, 3]\n" " >>> dct['b']\n" " [3, 2]\n" "\n" "Or reduce all values for one key::\n" "\n" " >>> from iteration_utilities import is_even\n" " >>> dct = groupedby([1, 2, 3, 4, 5], key=is_even, reduce=add)\n" " >>> dct[True] # 2 + 4\n" " 6\n" " >>> dct[False] # 1 + 3 + 5\n" " 9\n" "\n" "using `reduce` also allows to specify a start value::\n" "\n" " >>> dct = groupedby([1, 2, 3, 4, 5], key=is_even, reduce=add, reducestart=7)\n" " >>> dct[True] # 7 + 2 + 4\n" " 13\n" " >>> dct[False] # 7 + 1 + 3 + 5\n" " 16\n"); PyDoc_STRVAR( PyIU_MinMax_doc, "minmax(iterable, /, key=None, default=None)\n" "--\n\n" "Computes the minimum and maximum values in one-pass using only\n" "``1.5*len(iterable)`` comparisons. Recipe based on the snippet\n" "of Raymond Hettinger ([0]_) but significantly modified.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The `iterable` for which to calculate the minimum and maximum.\n" "\n" " .. 
note::\n" " Instead of one `iterable` it is also possible to pass the values (at\n" " least 2) as positional arguments.\n" "\n" "key : callable, optional\n" " If not given then compare the values, otherwise compare ``key(item)``.\n" "\n" "default : any type, optional\n" " If not given, raise ``ValueError`` if the `iterable` is empty, otherwise\n" " return ``(default, default)``.\n" "\n" "Returns\n" "-------\n" "minimum : any type\n" " The `minimum` of the `iterable`.\n" "\n" "maximum : any type\n" " The `maximum` of the `iterable`.\n" "\n" "Raises\n" "------\n" "ValueError\n" " If `iterable` is empty and no `default` is given.\n" "\n" "See also\n" "--------\n" "min : Calculate the minimum of an iterable.\n" "\n" "max : Calculate the maximum of an iterable.\n" "\n" "Examples\n" "--------\n" "This function calculates the minimum (:py:func:`min`) and maximum\n" "(:py:func:`max`) of an `iterable`::\n" "\n" " >>> from iteration_utilities import minmax\n" " >>> minmax([2,1,3,5,4])\n" " (1, 5)\n" "\n" "or pass in the values as arguments::\n" "\n" " >>> minmax(2, 1, -1, 5, 4)\n" " (-1, 5)\n" "\n" "If the iterable is empty, `default` is returned::\n" "\n" " >>> minmax([], default=0)\n" " (0, 0)\n" "\n" "Like the builtin functions it also supports a `key` argument::\n" "\n" " >>> import operator\n" " >>> seq = [(3, 2), (5, 1), (10, 3), (8, 5), (3, 4)]\n" " >>> minmax(seq, key=operator.itemgetter(1))\n" " ((5, 1), (8, 5))\n" "\n" "References\n" "----------\n" ".. 
[0] http://code.activestate.com/recipes/577916/\n"); PyDoc_STRVAR( PyIU_One_doc, "one(iterable, /)\n" "--\n\n" "Return the first value in the `iterable`, expecting that it contains only one element.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The `iterable` from which to get the one item.\n" "\n" "Returns\n" "-------\n" "one : any type\n" " The first value.\n" "\n" "Raises\n" "------\n" "ValueError :\n" " If the `iterable` contains no items or more than one item.\n" "\n" "Examples\n" "--------\n" "Some basic examples::\n" "\n" " >>> from iteration_utilities import one\n" " >>> one([0])\n" " 0\n" " >>> one('b')\n" " 'b'\n" "\n" ".. warning::\n" " :py:func:`~iteration_utilities.one` will access the first two values of\n" " the `iterable` so it should only be used if the `iterable` must only\n" " contain one item!\n"); PyDoc_STRVAR( PyIU_Partition_doc, "partition(iterable, pred=None)\n" "--\n\n" "Use a predicate to partition entries into ``False`` entries and ``True``\n" "entries.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " `Iterable` to partition.\n" "\n" "pred : callable or None, optional\n" " The predicate which determines the partition.\n" " Default is ``None``.\n" "\n" "Returns\n" "-------\n" "false_values : list\n" " A list containing the values for which the `pred` was False.\n" "\n" "true_values : list\n" " A list containing the values for which the `pred` was True.\n" "\n" "See also\n" "--------\n" ".ipartition : Generator variant of :py:func:`~iteration_utilities.partition`.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import partition\n" ">>> def is_odd(val): return val % 2\n" ">>> partition(range(10), is_odd)\n" "([0, 2, 4, 6, 8], [1, 3, 5, 7, 9])\n" "\n" ".. 
warning::\n" " In case the `pred` is expensive then\n" " :py:func:`~iteration_utilities.partition` can be noticeably\n" " faster than :py:func:`~iteration_utilities.ipartition`.\n"); #ifdef __cplusplus } #endif #endif 070701000000DA000081A400000000000000000000000165E3BCDA000007E9000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/dotproduct.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "dotproduct.h" #include "helper.h" static PyObject * dot_product(PyObject *iterator1, PyObject *iterator2) { PyObject *item1; PyObject *result = NULL; while ((item1 = Py_TYPE(iterator1)->tp_iternext(iterator1))) { PyObject *item2; PyObject *product; item2 = Py_TYPE(iterator2)->tp_iternext(iterator2); if (item2 == NULL) { Py_DECREF(item1); Py_XDECREF(result); return NULL; } product = PyNumber_Multiply(item1, item2); Py_DECREF(item1); Py_DECREF(item2); if (product == NULL) { Py_XDECREF(result); return NULL; } if (result == NULL) { result = product; } else { PyObject *tmp = result; result = PyNumber_Add(result, product); Py_DECREF(product); Py_DECREF(tmp); if (result == NULL) { return NULL; } } } if (PyIU_ErrorOccurredClearStopIteration()) { Py_XDECREF(result); return NULL; } if (result == NULL) { result = PyLong_FromLong((long)0); } return result; } PyObject * PyIU_DotProduct(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *vec1; PyObject *vec2; PyObject *iterator1; PyObject *iterator2; PyObject *result; if (!PyArg_ParseTuple(args, "OO", &vec1, &vec2)) { return NULL; } iterator1 = PyObject_GetIter(vec1); if (iterator1 == NULL) { return NULL; } iterator2 = PyObject_GetIter(vec2); if (iterator2 == NULL) { Py_DECREF(iterator1); return NULL; } result = dot_product(iterator1, iterator2); Py_DECREF(iterator1); Py_DECREF(iterator2); return 
result; } 070701000000DB000081A400000000000000000000000165E3BCDA0000010E000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/dotproduct.h#ifndef PYIU_DOTPRODUCT_H #define PYIU_DOTPRODUCT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_DotProduct(PyObject *Py_UNUSED(m), PyObject *args); #ifdef __cplusplus } #endif #endif 070701000000DC000081A400000000000000000000000165E3BCDA0000256D000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/duplicates.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "duplicates.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" #include "seen.h" PyDoc_STRVAR( duplicates_prop_seen_doc, "(:py:class:`~iteration_utilities.Seen`) Already seen values (readonly)."); PyDoc_STRVAR( duplicates_prop_key_doc, "(callable or `None`) The key function (readonly)."); PyDoc_STRVAR( duplicates_doc, "duplicates(iterable, key=None)\n" "--\n\n" "Return only duplicate entries, remembers all items ever seen.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " `Iterable` containing the elements.\n" "\n" "key : callable, optional\n" " If given it must be a callable taking one argument and this\n" " callable is applied to the value before checking if it was seen yet.\n" "\n" "Returns\n" "-------\n" "iterable : generator\n" " An iterable containing all duplicates values of the `iterable`.\n" "\n" "Notes\n" "-----\n" "The items in the `iterable` should implement equality.\n" "\n" "If the items are hashable the function is much faster.\n" "\n" "Examples\n" "--------\n" "Multiple duplicates will be kept::\n" "\n" " >>> from 
iteration_utilities import duplicates\n" " >>> list(duplicates('AABBCCDA'))\n" " ['A', 'B', 'C', 'A']\n" "\n" " >>> list(duplicates('ABBCcAD', str.lower))\n" " ['B', 'c', 'A']\n" "\n" "To get each duplicate only once this can be combined with \n" ":py:func:`~iteration_utilities.unique_everseen`::\n" "\n" " >>> from iteration_utilities import unique_everseen\n" " >>> list(unique_everseen(duplicates('AABBCCDA')))\n" " ['A', 'B', 'C']\n"); /****************************************************************************** * * IMPORTANT NOTE (Implementation): * * This function is almost identical to "unique_everseen", so any changes * or bugfixes should also be implemented there!!! * *****************************************************************************/ static PyObject * duplicates_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "key", NULL}; PyIUObject_Duplicates *self; PyObject *iterable; PyObject *key = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|O:duplicates", kwlist, &iterable, &key)) { return NULL; } self = (PyIUObject_Duplicates *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->seen = PyIUSeen_New(); if (self->seen == NULL) { Py_DECREF(self); return NULL; } self->key = key == Py_None ? 
NULL : key; Py_XINCREF(self->key); return (PyObject *)self; } static void duplicates_dealloc(PyIUObject_Duplicates *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->key); Py_XDECREF(self->seen); Py_TYPE(self)->tp_free(self); } static int duplicates_traverse(PyIUObject_Duplicates *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->key); Py_VISIT(self->seen); return 0; } static int duplicates_clear(PyIUObject_Duplicates *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->key); Py_CLEAR(self->seen); return 0; } static PyObject * duplicates_next(PyIUObject_Duplicates *self) { PyObject *item = NULL; while ((item = Py_TYPE(self->iterator)->tp_iternext(self->iterator))) { PyObject *val; int ok; /* Use the item if key is not given, otherwise apply the key. */ if (self->key == NULL) { Py_INCREF(item); val = item; } else { val = PyIU_CallWithOneArgument(self->key, item); if (val == NULL) { Py_DECREF(item); return NULL; } } /* Check if the item is in seen. */ ok = PyIUSeen_ContainsAdd(self->seen, val); Py_DECREF(val); if (ok == 1) { return item; } Py_DECREF(item); if (ok == -1) { return NULL; } } return NULL; } static PyObject * duplicates_reduce(PyIUObject_Duplicates *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OO)(O)", Py_TYPE(self), self->iterator, self->key ? self->key : Py_None, self->seen); } static PyObject * duplicates_setstate(PyIUObject_Duplicates *self, PyObject *state) { PyObject *seen; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "O:duplicates.__setstate__", &seen)) { return NULL; } /* object passed in must be an instance of Seen. Otherwise the function calls could result in an segmentation fault. 
*/ if (!PyIU_IsTypeExact(seen, &PyIUType_Seen)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `Seen` instance as " "first argument in the `state`, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(seen)->tp_name); return NULL; } Py_INCREF(seen); Py_XSETREF(self->seen, seen); Py_RETURN_NONE; } static PyMethodDef duplicates_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)duplicates_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)duplicates_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef duplicates_memberlist[] = { { "seen", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Duplicates, seen), /* offset */ READONLY, /* flags */ duplicates_prop_seen_doc /* doc */ }, { "key", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Duplicates, key), /* offset */ READONLY, /* flags */ duplicates_prop_key_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Duplicates = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.duplicates", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Duplicates), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)duplicates_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)duplicates_doc, /* tp_doc */ (traverseproc)duplicates_traverse, /* tp_traverse */ 
(inquiry)duplicates_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)duplicates_next, /* tp_iternext */ duplicates_methods, /* tp_methods */ duplicates_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)duplicates_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000DD000081A400000000000000000000000165E3BCDA0000016F000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/duplicates.h#ifndef PYIU_DUPLICATES_H #define PYIU_DUPLICATES_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *key; PyObject *seen; } PyIUObject_Duplicates; extern PyTypeObject PyIUType_Duplicates; #ifdef __cplusplus } #endif #endif 070701000000DE000081A400000000000000000000000165E3BCDA000010EE000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/empty.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "empty.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( empty_doc, "_EmptyType(/)\n" "--\n\n" "An empty iterator.\n" "\n" "Notes\n" "-------\n" "There is only one instance of this class. 
And this class cannot be subclassed.\n"); static PyObject * empty_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { if (PyTuple_GET_SIZE(args) || (kwargs != NULL && PyDict_Size(kwargs))) { PyErr_Format(PyExc_TypeError, "`%.200s.__new__` takes no arguments.", PyIUType_Empty.tp_name); return NULL; } Py_INCREF(&EmptyStruct); return &EmptyStruct; } static PyObject * empty_next(PyObject *self) { return NULL; } static PyObject * empty_lengthhint(PyObject *self, PyObject *Py_UNUSED(args)) { Py_INCREF(PyIU_global_zero); return PyIU_global_zero; } static PyObject * empty_reduce(PyObject *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O()", Py_TYPE(self)); } static PyMethodDef empty_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)empty_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)empty_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; PyTypeObject PyIUType_Empty = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities._iteration_utilities._EmptyType", /* tp_name */ (Py_ssize_t)0, /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)0, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT, /* tp_flags */ (const char *)empty_doc, /* tp_doc */ (traverseproc)0, /* tp_traverse */ (inquiry)0, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ 
(getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)empty_next, /* tp_iternext */ empty_methods, /* tp_methods */ 0, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)empty_new, /* tp_new */ (freefunc)PyObject_Del, /* tp_free */ }; PyObject EmptyStruct = PYIU_CREATE_SINGLETON_INSTANCE(PyIUType_Empty); 070701000000DF000081A400000000000000000000000165E3BCDA00000109000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/empty.h#ifndef PYIU_EMPTY_H #define PYIU_EMPTY_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> extern PyObject EmptyStruct; #define PYIU_Empty (&EmptyStruct) extern PyTypeObject PyIUType_Empty; #ifdef __cplusplus } #endif #endif 070701000000E0000081A400000000000000000000000165E3BCDA00001032000000000000000000000000000000000000005A00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/exported_helper.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "exported_helper.h" #include "helper.h" /****************************************************************************** * This file contains functions that are meant as helpers, they are especially * written to speed up parts of the Python code, they shouldn't be considered * safe to use elsewhere. 
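In Python terms, the two helpers implemented in this file behave roughly like the sketch below. The names are illustrative (not the exported API), and note that ``PyIU_parse_kwargs`` removes keys whose value *is* the sentinel (identity comparison, not equality):

```python
def insert_in_tuple(tup, item, index):
    # Build a new tuple with `item` inserted at `index`
    # (what PyIU_parse_args does for argument tuples).
    return tup[:index] + (item,) + tup[index:]

def remove_where_value_is(dct, remvalue):
    # Delete every key whose value IS `remvalue`
    # (what PyIU_parse_kwargs does for keyword dicts).
    for key in [k for k, v in dct.items() if v is remvalue]:
        del dct[key]
```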
*****************************************************************************/ static PyObject * PyIU_parse_args(PyObject *tuple, PyObject *item, Py_ssize_t index) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); assert(item != NULL); assert(index >= 0 && index <= PyTuple_GET_SIZE(tuple)); PyObject *new_tuple; Py_ssize_t i; Py_ssize_t tuple_size = PyTuple_GET_SIZE(tuple); new_tuple = PyTuple_New(tuple_size + 1); if (new_tuple == NULL) { return NULL; } Py_INCREF(item); PyTuple_SET_ITEM(new_tuple, index, item); for (i = 0; i < tuple_size + 1; i++) { PyObject *tmp; if (i < index) { tmp = PyTuple_GET_ITEM(tuple, i); Py_INCREF(tmp); PyTuple_SET_ITEM(new_tuple, i, tmp); } else if (i == index) { continue; } else { tmp = PyTuple_GET_ITEM(tuple, i - 1); Py_INCREF(tmp); PyTuple_SET_ITEM(new_tuple, i, tmp); } } return new_tuple; } static PyObject * PyIU_parse_kwargs(PyObject *dct, PyObject *remvalue) { assert(dct != NULL && PyDict_CheckExact(dct)); assert(remvalue != NULL); PyObject *key; PyObject *value; PyObject *small_stack[PyIU_SMALL_ARG_STACK_SIZE]; PyObject **stack = small_stack; Py_ssize_t pos; Py_ssize_t dict_size; Py_ssize_t i; Py_ssize_t j; dict_size = PyDict_Size(dct); if (dict_size == 0) { Py_RETURN_NONE; } if (dict_size > PyIU_SMALL_ARG_STACK_SIZE) { stack = PyIU_AllocatePyObjectArray(dict_size); if (stack == NULL) { return PyErr_NoMemory(); } } pos = 0; i = 0; while (PyDict_Next(dct, &pos, &key, &value)) { /* Compare the "value is remvalue" (this is not "value == remvalue" at least in the python-sense). */ if (value == remvalue) { stack[i] = key; i++; } } if (i == dict_size) { PyDict_Clear(dct); } else { for (j = 0; j < i; j++) { /* Error checking is intentionally omitted since we know that the items in the stack are not-NULL and hashable. 
*/ PyDict_DelItem(dct, stack[j]); } } if (stack != small_stack) { PyMem_Free(stack); } Py_RETURN_NONE; } #if PyIU_USE_VECTORCALL PyObject * PyIU_TupleToList_and_InsertItemAtIndex(PyObject *Py_UNUSED(m), PyObject *const *args, size_t nargs) { PyObject *tup; PyObject *item; Py_ssize_t index; if (!_PyArg_ParseStack(args, nargs, "OOn:_parse_args", &tup, &item, &index)) { return NULL; } return PyIU_parse_args(tup, item, index); } PyObject * PyIU_RemoveFromDictWhereValueIs(PyObject *Py_UNUSED(m), PyObject *const *args, size_t nargs) { PyObject *dct; PyObject *remvalue; if (!_PyArg_ParseStack(args, nargs, "OO:_parse_kwargs", &dct, &remvalue)) { return NULL; } return PyIU_parse_kwargs(dct, remvalue); } #else PyObject * PyIU_TupleToList_and_InsertItemAtIndex(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *tup; PyObject *item; Py_ssize_t index; if (!PyArg_ParseTuple(args, "OOn:_parse_args", &tup, &item, &index)) { return NULL; } return PyIU_parse_args(tup, item, index); } PyObject * PyIU_RemoveFromDictWhereValueIs(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *dct; PyObject *remvalue; if (!PyArg_ParseTuple(args, "OO:_parse_kwargs", &dct, &remvalue)) { return NULL; } return PyIU_parse_kwargs(dct, remvalue); } #endif 070701000000E1000081A400000000000000000000000165E3BCDA0000027F000000000000000000000000000000000000005A00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/exported_helper.h#ifndef PYIU_EXPORTEDHELPER_H #define PYIU_EXPORTEDHELPER_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" #if PyIU_USE_VECTORCALL PyObject * PyIU_TupleToList_and_InsertItemAtIndex(PyObject *Py_UNUSED(m), PyObject *const *args, size_t nargs); PyObject * PyIU_RemoveFromDictWhereValueIs(PyObject *Py_UNUSED(m), PyObject *const *args, size_t nargs); #else PyObject * PyIU_TupleToList_and_InsertItemAtIndex(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_RemoveFromDictWhereValueIs(PyObject 
*Py_UNUSED(m), PyObject *args); #endif #ifdef __cplusplus } #endif #endif 070701000000E2000081A400000000000000000000000165E3BCDA00001EAB000000000000000000000000000000000000004F00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/flip.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "flip.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( flip_prop_func_doc, "(callable) The function with flipped arguments (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( flip_doc, "flip(x, /)\n" "--\n\n" "Class that reverses the positional arguments to a `func` when called.\n" "\n" "Parameters\n" "----------\n" "func : callable\n" " The function that should be called with the flipped (reversed) arguments.\n" "\n" "Examples\n" "--------\n" "This can be used to alter the call to a function::\n" "\n" " >>> from iteration_utilities import flip\n" " >>> from functools import partial\n" " >>> flipped = flip(isinstance)\n" " >>> isfloat = partial(flipped, float)\n" "\n" " >>> isfloat(10)\n" " False\n" " >>> isfloat(11.25)\n" " True\n"); #if PyIU_USE_VECTORCALL static PyObject *flip_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames); #endif static PyObject * flip_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { PyIUObject_Flip *self; PyObject *func; if (!PyArg_UnpackTuple(args, "flip", 1, 1, &func)) { return NULL; } /* If the object is another flip we can simply return its function because two flips are equivalent to no flip. However subclasses should be excluded from this behaviour so also check that the first argument is in fact "flip" and not a subclass.
*/ if (PyIU_IsTypeExact(func, &PyIUType_Flip) && type == &PyIUType_Flip) { PyObject *ret = ((PyIUObject_Flip *)func)->func; Py_INCREF(ret); return ret; } self = (PyIUObject_Flip *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(func); self->func = func; #if PyIU_USE_VECTORCALL self->vectorcall = flip_vectorcall; #endif return (PyObject *)self; } static void flip_dealloc(PyIUObject_Flip *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->func); Py_TYPE(self)->tp_free(self); } static int flip_traverse(PyIUObject_Flip *self, visitproc visit, void *arg) { Py_VISIT(self->func); return 0; } static int flip_clear(PyIUObject_Flip *self) { Py_CLEAR(self->func); return 0; } #if PyIU_USE_VECTORCALL static PyObject * flip_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyObject *result; PyObject *small_stack[PyIU_SMALL_ARG_STACK_SIZE]; PyObject **stack = small_stack; Py_ssize_t i; Py_ssize_t j; PyIUObject_Flip *self = (PyIUObject_Flip *)obj; Py_ssize_t n_pos_args = PyVectorcall_NARGS(nargsf); Py_ssize_t n_args = n_pos_args + (kwnames == NULL ? 
0 : PyTuple_GET_SIZE(kwnames)); if (n_pos_args <= 1) { return PyIU_PyObject_Vectorcall(self->func, args, n_pos_args, kwnames); } if (n_args > PyIU_SMALL_ARG_STACK_SIZE) { stack = PyIU_AllocatePyObjectArray(n_args); if (stack == NULL) { return PyErr_NoMemory(); } } for (i = 0, j = n_pos_args - 1; i < n_pos_args; i++, j--) { stack[i] = args[j]; } memcpy(stack + n_pos_args, args + n_pos_args, (n_args - n_pos_args) * sizeof(PyObject *)); result = PyIU_PyObject_Vectorcall(self->func, stack, n_pos_args, kwnames); if (stack != small_stack) { PyMem_Free(stack); } return result; } #else static PyObject * flip_call(PyIUObject_Flip *self, PyObject *args, PyObject *kwargs) { PyObject *result; PyObject *tmpargs; if (PyTuple_GET_SIZE(args) >= 2) { tmpargs = PyIU_TupleReverse(args); result = PyObject_Call(self->func, tmpargs, kwargs); Py_DECREF(tmpargs); } else { result = PyObject_Call(self->func, args, kwargs); } return result; } #endif static PyObject * flip_repr(PyIUObject_Flip *self) { PyObject *result = NULL; int ok; ok = Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? 
PyUnicode_FromString("...") : NULL; } result = PyUnicode_FromFormat("%s(%R)", Py_TYPE(self)->tp_name, self->func); Py_ReprLeave((PyObject *)self); return result; } static PyObject * flip_reduce(PyIUObject_Flip *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(O)", Py_TYPE(self), self->func); } static PyMethodDef flip_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)flip_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef flip_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Flip, func), /* offset */ READONLY, /* flags */ flip_prop_func_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Flip = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.flip", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Flip), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)flip_dealloc, /* tp_dealloc */ #if PyIU_USE_VECTORCALL offsetof(PyIUObject_Flip, vectorcall), /* tp_vectorcall_offset */ #else (printfunc)0, /* tp_print */ #endif (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)flip_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ #if PyIU_USE_VECTORCALL (ternaryfunc)PyVectorcall_Call, /* tp_call */ #else (ternaryfunc)flip_call, /* tp_call */ #endif (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL | _Py_TPFLAGS_HAVE_VECTORCALL #else | Py_TPFLAGS_HAVE_VECTORCALL #endif #endif , /* tp_flags */ (const char *)flip_doc, /* tp_doc */ (traverseproc)flip_traverse, /* tp_traverse */ (inquiry)flip_clear, /* tp_clear 
*/ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ flip_methods, /* tp_methods */ flip_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)flip_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000E3000081A400000000000000000000000165E3BCDA0000016A000000000000000000000000000000000000004F00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/flip.h#ifndef PYIU_FLIP_H #define PYIU_FLIP_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; #if PyIU_USE_VECTORCALL vectorcallfunc vectorcall; #endif } PyIUObject_Flip; extern PyTypeObject PyIUType_Flip; #ifdef __cplusplus } #endif #endif 070701000000E4000081A400000000000000000000000165E3BCDA00001701000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/groupedby.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "groupedby.h" #include "helper.h" #define PyIU_USE_DICT_INTERNALS PYIU_CPYTHON PyObject * PyIU_Groupby(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "key", "keep", "reduce", "reducestart", NULL}; PyObject *iterable; PyObject *keyfunc; PyObject *valfunc = NULL; PyObject *iterator = NULL; PyObject *reducefunc = NULL; PyObject *reducestart = NULL; PyObject *resdict = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO|OOO:groupedby", kwlist, &iterable, &keyfunc, &valfunc, &reducefunc, 
&reducestart)) { goto Fail; } if (reducefunc == Py_None) { reducefunc = NULL; } if (valfunc == Py_None) { valfunc = NULL; } if (reducefunc == NULL && reducestart != NULL) { PyErr_SetString(PyExc_TypeError, "cannot specify `reducestart` argument for " "`groupedby` if no `reduce` is given."); goto Fail; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { goto Fail; } resdict = PyDict_New(); if (resdict == NULL) { goto Fail; } for (;;) { PyObject *item; PyObject *val; PyObject *keep; #if PyIU_USE_DICT_INTERNALS Py_hash_t hash; #endif item = Py_TYPE(iterator)->tp_iternext(iterator); if (item == NULL) { break; } /* Calculate the key for the dictionary (val). */ val = PyIU_CallWithOneArgument(keyfunc, item); if (val == NULL) { Py_DECREF(item); goto Fail; } /* Calculate the value for the dictionary (keep). */ if (valfunc == NULL) { keep = item; } else { /* We use the same item again to calculate the keep so we don't need to replace. */ keep = PyIU_CallWithOneArgument(valfunc, item); Py_DECREF(item); if (keep == NULL) { Py_DECREF(val); goto Fail; } } #if PyIU_USE_DICT_INTERNALS /* Taken from dictobject.c CPython 3.5 */ if (!PyUnicode_CheckExact(val) || (hash = ((PyASCIIObject *)val)->hash) == -1) { hash = PyObject_Hash(val); if (hash == -1) { Py_DECREF(keep); Py_DECREF(val); goto Fail; } } #endif if (reducefunc == NULL) { /* Keep all values as list. 
*/ PyObject *lst; #if PyIU_USE_DICT_INTERNALS lst = _PyDict_GetItem_KnownHash(resdict, val, hash); #else lst = PyDict_GetItem(resdict, val); #endif if (lst == NULL) { int ok; lst = PyList_New(1); if (lst == NULL) { Py_DECREF(keep); Py_DECREF(val); goto Fail; } PyList_SET_ITEM(lst, 0, keep); #if PyIU_USE_DICT_INTERNALS ok = _PyDict_SetItem_KnownHash(resdict, val, lst, hash); #else ok = PyDict_SetItem(resdict, val, lst); #endif Py_DECREF(lst); Py_DECREF(val); if (ok == -1) { goto Fail; } } else { int ok; Py_DECREF(val); ok = PyList_Append(lst, keep); Py_DECREF(keep); if (ok < 0) { goto Fail; } } } else { /* Reduce the values with a binary operation. */ PyObject *current; #if PyIU_USE_DICT_INTERNALS current = _PyDict_GetItem_KnownHash(resdict, val, hash); #else current = PyDict_GetItem(resdict, val); #endif Py_XINCREF(current); if (current == NULL && reducestart == NULL) { /* No item yet and no starting value given: Keep the "keep". */ int ok; #if PyIU_USE_DICT_INTERNALS ok = _PyDict_SetItem_KnownHash(resdict, val, keep, hash); #else ok = PyDict_SetItem(resdict, val, keep); #endif Py_DECREF(val); Py_DECREF(keep); if (ok == -1) { goto Fail; } } else { /* Already an item present so use the binary operation. 
*/ PyObject *reducetmp; int ok; if (current == NULL) { reducetmp = PyIU_CallWithTwoArguments(reducefunc, reducestart, keep); } else { reducetmp = PyIU_CallWithTwoArguments(reducefunc, current, keep); Py_DECREF(current); } Py_DECREF(keep); if (reducetmp == NULL) { Py_DECREF(val); goto Fail; } #if PyIU_USE_DICT_INTERNALS ok = _PyDict_SetItem_KnownHash(resdict, val, reducetmp, hash); #else ok = PyDict_SetItem(resdict, val, reducetmp); #endif Py_DECREF(val); Py_DECREF(reducetmp); if (ok == -1) { goto Fail; } } } } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(resdict); return NULL; } return resdict; Fail: Py_XDECREF(iterator); Py_XDECREF(resdict); return NULL; } 070701000000E5000081A400000000000000000000000165E3BCDA0000011B000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/groupedby.h#ifndef PYIU_GROUPEDBY_H #define PYIU_GROUPEDBY_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_Groupby(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000E6000081A400000000000000000000000165E3BCDA0000381E000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/grouper.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "grouper.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( grouper_prop_fillvalue_doc, "(any type) The fillvalue if the last group does not contain enough " "items (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( grouper_prop_times_doc, "(:py:class:`int`) The size of each group (readonly).\n" "\n" ".. 
versionadded:: 0.6"); PyDoc_STRVAR( grouper_prop_truncate_doc, "(:py:class:`int`) ``True`` if an incomplete last group is discarded " "(readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( grouper_doc, "grouper(iterable, n, fillvalue=None, truncate=False)\n" "--\n\n" "Collect data into fixed-length chunks or blocks.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Any `iterable` to group.\n" "\n" "n : :py:class:`int`\n" " The number of elements in each chunk.\n" "\n" "fillvalue : any type, optional\n" " The `fillvalue` if the `iterable` is consumed and the last yielded group\n" " should be filled. If not given the last yielded group may be shorter\n" " than the group before. Using ``fillvalue=None`` is different from not \n" " giving a `fillvalue` in that the last group will be filled with ``None``.\n" "\n" "truncate : :py:class:`bool`, optional\n" " As alternative to `fillvalue` the last group is discarded if it is\n" " shorter than `n` and `truncate` is ``True``.\n" " Default is ``False``.\n" "\n" "Raises\n" "------\n" "TypeError\n" " If `truncate` is ``True`` and a `fillvalue` is given.\n" "\n" "Returns\n" "-------\n" "groups : generator\n" " An `iterable` containing the groups/chunks as ``tuple``.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import grouper\n" "\n" ">>> list(grouper('ABCDEFG', 3))\n" "[('A', 'B', 'C'), ('D', 'E', 'F'), ('G',)]\n" "\n" ">>> list(grouper('ABCDEFG', 3, fillvalue='x'))\n" "[('A', 'B', 'C'), ('D', 'E', 'F'), ('G', 'x', 'x')]\n" "\n" ">>> list(grouper('ABCDEFG', 3, truncate=True))\n" "[('A', 'B', 'C'), ('D', 'E', 'F')]\n"); static PyObject * grouper_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "n", "fillvalue", "truncate", NULL}; PyIUObject_Grouper *self; PyObject *iterable; PyObject *fillvalue = NULL; PyObject *result = NULL; Py_ssize_t times; int truncate = 0; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "On|Op:grouper", kwlist, &iterable, 
&times, &fillvalue, &truncate)) { return NULL; } if (fillvalue != NULL && truncate != 0) { PyErr_SetString(PyExc_TypeError, "cannot specify both the `truncate` and the " "`fillvalue` argument for `grouper`."); return NULL; } if (times <= 0) { PyErr_SetString(PyExc_ValueError, "`n` argument for `grouper` must be greater than 0."); return NULL; } self = (PyIUObject_Grouper *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->times = times; Py_XINCREF(fillvalue); self->fillvalue = fillvalue; self->truncate = truncate; self->result = result; return (PyObject *)self; } static void grouper_dealloc(PyIUObject_Grouper *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->fillvalue); Py_XDECREF(self->result); Py_TYPE(self)->tp_free(self); } static int grouper_traverse(PyIUObject_Grouper *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->fillvalue); Py_VISIT(self->result); return 0; } static int grouper_clear(PyIUObject_Grouper *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->fillvalue); Py_CLEAR(self->result); return 0; } static PyObject * grouper_next_last(PyIUObject_Grouper *self, PyObject *result, Py_ssize_t idx, int recycle) { assert(self != NULL); assert(result != NULL && PyTuple_CheckExact(result)); assert(idx >= 0 && idx < PyTuple_GET_SIZE(result)); /* No need to keep the result in the instance anymore. */ Py_CLEAR(self->result); if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(result); return NULL; } if (idx == 0 || self->truncate != 0) { /* In case it would be the first element of a new tuple or we truncate the iterator we stop here. */ Py_DECREF(result); return NULL; } else if (self->fillvalue != NULL) { /* If we want to fill the last group just proceed but use the fillvalue as item.
*/ while (idx < self->times) { Py_INCREF(self->fillvalue); if (recycle) { PyObject *tmp = PyTuple_GET_ITEM(result, idx); PyTuple_SET_ITEM(result, idx, self->fillvalue); assert(tmp != NULL); Py_DECREF(tmp); } else { PyTuple_SET_ITEM(result, idx, self->fillvalue); } idx++; } return result; } else { /* Otherwise we need to return just the last idx items. Because idx is by definition smaller than self->times we need a new tuple to hold the result. */ PyObject *last_result = PyIU_TupleGetSlice(result, idx); Py_DECREF(result); return last_result; } } static PyObject * grouper_next(PyIUObject_Grouper *self) { PyObject *result; Py_ssize_t idx; int recycle = 0; /* First call needs to create a tuple for the result. */ if (self->result == NULL) { result = PyTuple_New(self->times); if (result == NULL) { return NULL; } Py_INCREF(result); self->result = result; } else { /* Recycle old result if the instance is the only one holding a reference, otherwise create a new tuple. */ recycle = PYIU_CPYTHON && (Py_REFCNT(self->result) == 1); if (recycle) { result = self->result; Py_INCREF(result); } else { result = PyTuple_New(self->times); if (result == NULL) { return NULL; } } } /* Take the next self->times elements from the iterator. */ for (idx = 0; idx < self->times; idx++) { PyObject *item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); if (item != NULL) { if (recycle) { /* If we recycle we need to decref the old results before replacing them. May be insecure because deleting elements might have consequences for the sequence. A better way would be to keep all of them until the tuple elements are replaced and then to delete them.
*/ PyObject *tmp = PyTuple_GET_ITEM(result, idx); PyTuple_SET_ITEM(result, idx, item); assert(tmp != NULL); Py_DECREF(tmp); } else { PyTuple_SET_ITEM(result, idx, item); } } else { return grouper_next_last(self, result, idx, recycle); } } return result; } static PyObject * grouper_reduce(PyIUObject_Grouper *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on fillvalue == NULL because otherwise "None" would be ambiguous. It could mean that we did not have a fillvalue or that the next item was None. Better to make an "if" than to introduce another variable depending on fillvalue == NULL. */ if (self->fillvalue == NULL) { return Py_BuildValue("O(On)(i)", Py_TYPE(self), self->iterator, self->times, self->truncate); } else { return Py_BuildValue("O(OnO)(i)", Py_TYPE(self), self->iterator, self->times, self->fillvalue, self->truncate); } } static PyObject * grouper_setstate(PyIUObject_Grouper *self, PyObject *state) { int truncate; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "i:grouper.__setstate__", &truncate)) { return NULL; } /* truncate is just a boolean-like flag so there isn't anything that could be checked here. */ self->truncate = truncate; Py_RETURN_NONE; } static PyObject * grouper_lengthhint(PyIUObject_Grouper *self, PyObject *Py_UNUSED(args)) { Py_ssize_t groups, rem; Py_ssize_t len = PyObject_LengthHint(self->iterator, 0); if (len == -1) { return NULL; } groups = len / self->times; rem = len % self->times; if (self->truncate || rem == 0) { return PyLong_FromSsize_t(groups); } else { /* groups + 1 cannot overflow because that could only happen if "times" is 1 and in that case "rem==0". So it would always enter the first branch which does not contain addition.
*/ return PyLong_FromSsize_t(groups + 1); } } static PyObject * grouper_get_truncate(PyIUObject_Grouper *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->truncate); } static PyMethodDef grouper_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)grouper_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)grouper_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)grouper_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyGetSetDef grouper_getsetlist[] = { { "truncate", /* name */ (getter)grouper_get_truncate, /* get */ (setter)0, /* set */ grouper_prop_truncate_doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; static PyMemberDef grouper_memberlist[] = { { "fillvalue", /* name */ T_OBJECT_EX, /* type */ offsetof(PyIUObject_Grouper, fillvalue), /* offset */ READONLY, /* flags */ grouper_prop_fillvalue_doc /* doc */ }, { "times", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Grouper, times), /* offset */ READONLY, /* flags */ grouper_prop_times_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Grouper = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.grouper", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Grouper), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)grouper_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ 
(PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)grouper_doc, /* tp_doc */ (traverseproc)grouper_traverse, /* tp_traverse */ (inquiry)grouper_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)grouper_next, /* tp_iternext */ grouper_methods, /* tp_methods */ grouper_memberlist, /* tp_members */ grouper_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)grouper_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000E7000081A400000000000000000000000165E3BCDA00000193000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/grouper.h#ifndef PYIU_GROUPER_H #define PYIU_GROUPER_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *fillvalue; PyObject *result; Py_ssize_t times; int truncate; } PyIUObject_Grouper; extern PyTypeObject PyIUType_Grouper; #ifdef __cplusplus } #endif #endif 070701000000E8000081A400000000000000000000000165E3BCDA00001B17000000000000000000000000000000000000005100000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/helper.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "helper.h" /****************************************************************************** * Global constants. * * Python objects that are created only once and stay in memory. 
*****************************************************************************/ PyObject *PyIU_global_zero = NULL; PyObject *PyIU_global_one = NULL; PyObject *PyIU_global_two = NULL; PyObject *PyIU_global_0tuple = NULL; void PyIU_InitializeConstants(void) { if (PyIU_global_zero == NULL) { PyIU_global_zero = PyLong_FromLong(0L); PyIU_global_one = PyLong_FromLong(1L); PyIU_global_two = PyLong_FromLong(2L); PyIU_global_0tuple = PyTuple_New(0); } } /****************************************************************************** * Create a new tuple containing iterators for the input-tuple. *****************************************************************************/ PyObject * PyIU_CreateIteratorTuple(PyObject *tuple) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); PyObject *newtuple; Py_ssize_t i; Py_ssize_t tuplesize = PyTuple_GET_SIZE(tuple); newtuple = PyTuple_New(tuplesize); if (newtuple == NULL) { return NULL; } for (i = 0; i < tuplesize; i++) { PyObject *iterator = PyObject_GetIter(PyTuple_GET_ITEM(tuple, i)); if (iterator == NULL) { Py_DECREF(newtuple); return NULL; } PyTuple_SET_ITEM(newtuple, i, iterator); } return newtuple; } /****************************************************************************** * Create a new reversed tuple from another tuple. *****************************************************************************/ PyObject * PyIU_TupleReverse(PyObject *tuple) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); PyObject *newtuple; Py_ssize_t i; Py_ssize_t j; Py_ssize_t tuplesize = PyTuple_GET_SIZE(tuple); newtuple = PyTuple_New(tuplesize); if (newtuple == NULL) { return NULL; } for (i = 0, j = tuplesize - 1; i < tuplesize; i++, j--) { PyObject *item = PyTuple_GET_ITEM(tuple, i); Py_INCREF(item); PyTuple_SET_ITEM(newtuple, j, item); } return newtuple; } /****************************************************************************** * Copy a tuple. 
This is necessary because PyTuple_GetSlice doesn't return a * copy when the range is identical to (or bigger than) the original tuple. * * tuple : Tuple to copy. *****************************************************************************/ PyObject * PyIU_TupleCopy(PyObject *tuple) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); PyObject *newtuple; Py_ssize_t i; Py_ssize_t tuplesize = PyTuple_GET_SIZE(tuple); newtuple = PyTuple_New(tuplesize); if (newtuple == NULL) { return NULL; } for (i = 0; i < tuplesize; i++) { PyObject *tmp = PyTuple_GET_ITEM(tuple, i); Py_INCREF(tmp); PyTuple_SET_ITEM(newtuple, i, tmp); } return newtuple; } /****************************************************************************** * Insert a value in a Tuple by moving all items at or above this index one to * the right. * * WARNING: The last item of the Tuple mustn't be a PyObject or the caller must * have a reference to it - because this would leave a dangling reference! * * tuple : Tuple where the value should be inserted. * where : index to insert the value * v : Value to insert * num : Move items up to this index. I.e. if 10 then item 9 is moved to * index 10 but item 10 isn't moved. (In fact item 10 mustn't be a * PyObject, see Warning.) *****************************************************************************/ void PyIU_TupleInsert(PyObject *tuple, Py_ssize_t where, PyObject *v, Py_ssize_t num) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); assert(where >= 0 && where < PyTuple_GET_SIZE(tuple)); assert(v != NULL); assert(num >= 0 && num <= PyTuple_GET_SIZE(tuple)); Py_ssize_t i; /* Move each of them to the next place, starting at the next-to-last element going left until where. */ for (i = num - 2; i >= where; i--) { PyObject *temp = PyTuple_GET_ITEM(tuple, i); PyTuple_SET_ITEM(tuple, i + 1, temp); } /* Insert the new element.
*/ PyTuple_SET_ITEM(tuple, where, v); } /****************************************************************************** * Remove a value from a Tuple and move every successive element one to the * left. * * WARNING: The value that is to be removed is not DECREF'd so the caller must * ensure that they DECREF the removed item afterwards, otherwise this will * create a memory leak! * * tuple : Tuple where the value should be removed. * where : index where to remove the value * num : Move items up to this index. I.e. if num=10 then the item at pos * 10 is moved to 9 (and 10 is set to NULL), ... until where+1 which is * moved to "where". *****************************************************************************/ void PyIU_TupleRemove(PyObject *tuple, Py_ssize_t where, Py_ssize_t num) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); assert(where >= 0 && where < PyTuple_GET_SIZE(tuple)); assert(num >= 0 && num <= PyTuple_GET_SIZE(tuple)); Py_ssize_t i; /* Move each item to the left from the after-where index until the end of the array. */ for (i = where + 1; i < num; i++) { PyObject *temp = PyTuple_GET_ITEM(tuple, i); PyTuple_SET_ITEM(tuple, i - 1, temp); } /* Insert NULL at the last position. */ PyTuple_SET_ITEM(tuple, num - 1, NULL); } /****************************************************************************** * Get the first 'n' values of a tuple. * * PyPy does not allow slicing tuples with NULL in it with PyTuple_GetSlice() * even if the NULL would not be copied. So this is put in a separate helper. * * tuple : Tuple to slice * num : The number of items to copy from the tuple.
*****************************************************************************/ PyObject * PyIU_TupleGetSlice(PyObject *tuple, Py_ssize_t num) { assert(tuple != NULL && PyTuple_CheckExact(tuple)); assert(num >= 0 && num < PyTuple_GET_SIZE(tuple)); Py_ssize_t i; PyObject *result = PyTuple_New(num); if (result == NULL) { return NULL; } for (i = 0; i < num; i++) { PyObject *tmp = PyTuple_GET_ITEM(tuple, i); assert(tmp != NULL); Py_INCREF(tmp); PyTuple_SET_ITEM(result, i, tmp); } return result; } 070701000000E9000081A400000000000000000000000165E3BCDA00001476000000000000000000000000000000000000005100000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/helper.h#ifndef PYIU_HELPER_H #define PYIU_HELPER_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" #define PyIU_SMALL_ARG_STACK_SIZE 5 extern PyObject *PyIU_global_zero; extern PyObject *PyIU_global_one; extern PyObject *PyIU_global_two; extern PyObject *PyIU_global_0tuple; PyObject *PyIU_CreateIteratorTuple(PyObject *tuple); PyObject *PyIU_TupleReverse(PyObject *tuple); PyObject *PyIU_TupleCopy(PyObject *tuple); void PyIU_TupleInsert(PyObject *tuple, Py_ssize_t where, PyObject *v, Py_ssize_t num); void PyIU_TupleRemove(PyObject *tuple, Py_ssize_t where, Py_ssize_t num); PyObject *PyIU_TupleGetSlice(PyObject *tuple, Py_ssize_t num); void PyIU_InitializeConstants(void); static inline int PyIU_IsTypeExact(PyObject *obj, PyTypeObject *type) { #if PyIU_USE_BUILTIN_IS_TYPE return Py_IS_TYPE(obj, type); #else return Py_TYPE(obj) == type; #endif } static inline PyObject** PyIU_AllocatePyObjectArray(Py_ssize_t num) { assert(num >= 0); return PyMem_Malloc((size_t)num * sizeof(PyObject *)); } static inline int PyIU_ErrorOccurredClearStopIteration() { if (PyErr_Occurred()) { if (PyErr_ExceptionMatches(PyExc_StopIteration)) { PyErr_Clear(); } else { return 1; } } return 0; } 
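The tuple helpers above (`PyIU_TupleInsert` / `PyIU_TupleRemove`) work by shifting a prefix of the item array one slot right or left. A pure-Python sketch of the same shifting scheme may make the index arithmetic easier to follow; a list stands in for the C-level item array, `None` stands in for a C `NULL` slot, and the function names here are illustrative, not part of the library:

```python
# Pure-Python illustration of the shifting scheme used by PyIU_TupleInsert
# and PyIU_TupleRemove. A list models the C tuple's item array; None models
# an empty (NULL) slot.

def tuple_insert(items, where, value, num):
    # Shift items[where:num-1] one slot to the right. Slot num-1 is
    # overwritten, so (as the C warning states) it must not hold a live
    # object reference.
    for i in range(num - 2, where - 1, -1):
        items[i + 1] = items[i]
    items[where] = value

def tuple_remove(items, where, num):
    # Shift items[where+1:num] one slot to the left and clear the last slot.
    # The removed item is simply overwritten, mirroring the C code where the
    # caller is responsible for DECREF'ing it.
    for i in range(where + 1, num):
        items[i - 1] = items[i]
    items[num - 1] = None
```

For example, inserting `99` at index 1 of `[1, 2, 3, None]` with `num=4` yields `[1, 99, 2, 3]`, and removing it again restores the original list.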
/****************************************************************************** * Function call abstractions * * To support the different calling conventions across Python versions *****************************************************************************/ #if PyIU_USE_VECTORCALL static inline PyObject* PyIU_PyObject_Vectorcall(PyObject *callable, PyObject *const *args, size_t nargsf, PyObject *kwnames) { #if PyIU_USE_UNDERSCORE_VECTORCALL return _PyObject_Vectorcall(callable, args, nargsf, kwnames); #else return PyObject_Vectorcall(callable, args, nargsf, kwnames); #endif } #endif static inline PyObject* PyIU_CallWithNoArgument(PyObject *callable) { assert(callable != NULL); #if PyIU_USE_VECTORCALL && !PyIU_USE_UNDERSCORE_VECTORCALL return PyObject_CallNoArgs(callable); #else /* Or maybe PyObject_CallObject ... not sure*/ return PyObject_CallFunctionObjArgs(callable, NULL); #endif } static inline PyObject* PyIU_CallWithOneArgument(PyObject *callable, PyObject *arg1) { assert(callable != NULL); assert(arg1 != NULL); #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL PyObject *args[1]; args[0] = arg1; return _PyObject_Vectorcall(callable, args, 1, NULL); #else return PyObject_CallOneArg(callable, arg1); #endif #elif PyIU_USE_FASTCALL PyObject *args[1]; args[0] = arg1; return _PyObject_FastCall(callable, args, 1); #else PyObject *result; PyObject *args = PyTuple_New(1); if (args == NULL) { return NULL; } Py_INCREF(arg1); PyTuple_SET_ITEM(args, 0, arg1); result = PyObject_Call(callable, args, NULL); Py_DECREF(args); return result; #endif } static inline PyObject* PyIU_CallWithTwoArguments(PyObject *callable, PyObject *arg1, PyObject *arg2) { assert(callable != NULL); assert(arg1 != NULL); assert(arg2 != NULL); #if PyIU_USE_VECTORCALL PyObject *args[2]; args[0] = arg1; args[1] = arg2; return PyIU_PyObject_Vectorcall(callable, args, 2, NULL); #elif PyIU_USE_FASTCALL PyObject *args[2]; args[0] = arg1; args[1] = arg2; return _PyObject_FastCall(callable, args, 
2);
#else
    PyObject *result;
    PyObject *args = PyTuple_New(2);
    if (args == NULL) {
        return NULL;
    }
    Py_INCREF(arg1);
    Py_INCREF(arg2);
    PyTuple_SET_ITEM(args, 0, arg1);
    PyTuple_SET_ITEM(args, 1, arg2);
    result = PyObject_Call(callable, args, NULL);
    Py_DECREF(args);
    return result;
#endif
}

#define PyIU_USE_CPYTHON_INTERNALS PYIU_CPYTHON

static inline void
PyIU_CopyTupleToArray(PyObject *tuple, PyObject **array, size_t n_objects) {
    assert(tuple != NULL && PyTuple_CheckExact(tuple));
    assert(array != NULL);
    assert(PyTuple_GET_SIZE(tuple) >= (Py_ssize_t)n_objects);
#if PyIU_USE_CPYTHON_INTERNALS
    memcpy(array, ((PyTupleObject *)tuple)->ob_item, n_objects * sizeof(PyObject *));
#else
    size_t i;
    for (i = 0; i < n_objects; i++) {
        array[i] = PyTuple_GET_ITEM(tuple, (Py_ssize_t)i);
    }
#endif
}

static inline void
PyIU_CopyListToArray(PyObject *list, PyObject **array, size_t n_objects) {
    assert(list != NULL && PyList_CheckExact(list));
    assert(array != NULL);
    assert(PyList_GET_SIZE(list) >= (Py_ssize_t)n_objects);
#if PyIU_USE_CPYTHON_INTERNALS
    memcpy(array, ((PyListObject *)list)->ob_item, n_objects * sizeof(PyObject *));
#else
    size_t i;
    for (i = 0; i < n_objects; i++) {
        array[i] = PyList_GET_ITEM(list, (Py_ssize_t)i);
    }
#endif
}

#undef PyIU_USE_CPYTHON_INTERNALS

#ifdef __cplusplus
}
#endif

#endif
070701000000EA000081A400000000000000000000000165E3BCDA0000075F000000000000000000000000000000000000005700000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/helpercompat.h
#ifndef PYIU_HELPERCOMPAT_H
#define PYIU_HELPERCOMPAT_H

#ifdef __cplusplus
extern "C" {
#endif

#ifdef PYPY_VERSION
#define PYIU_PYPY 1
#define PYIU_CPYTHON 0
#else
#define PYIU_PYPY 0
#define PYIU_CPYTHON 1
#endif

#if PYIU_PYPY
// Both were added in PyPy3.6 7.2.0.
#ifndef Py_RETURN_NOTIMPLEMENTED
#define Py_RETURN_NOTIMPLEMENTED return Py_INCREF(Py_NotImplemented), Py_NotImplemented
#endif
#ifndef Py_UNUSED
#define Py_UNUSED(name) _unused_ ## name
#endif
#endif

#if PYIU_PYPY
// Taken from the PyObject_HEAD_INIT implementation (it's a bit hacky...)
#define PYIU_CREATE_SINGLETON_INSTANCE(type) { 1, 0, &type }
#else
#if PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION < 12
#define PYIU_CREATE_SINGLETON_INSTANCE(type) { _PyObject_EXTRA_INIT 1, &type }
#elif PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION == 12
#define PYIU_CREATE_SINGLETON_INSTANCE(type) { _PyObject_EXTRA_INIT { _Py_IMMORTAL_REFCNT }, &type }
#else
// It seems _PyObject_EXTRA_INIT will be removed (it is no longer present on CPython's master branch).
// This could still change; compare the definitions of _Py_NotImplementedStruct or _Py_NoneStruct.
#define PYIU_CREATE_SINGLETON_INSTANCE(type) { { _Py_IMMORTAL_REFCNT }, &type }
#endif
#endif

#define PyIU_USE_FASTCALL (PYIU_CPYTHON && PY_MAJOR_VERSION == 3 && (PY_MINOR_VERSION == 6 || PY_MINOR_VERSION == 7))
#define PyIU_USE_UNDERSCORE_VECTORCALL (PYIU_CPYTHON && PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION == 8)
#define PyIU_USE_VECTORCALL (PYIU_CPYTHON && ((PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION >= 8) || PY_MAJOR_VERSION >= 4))
#define PyIU_USE_BUILTIN_MODULE_ADDTYPE ((PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION >= 9) || PY_MAJOR_VERSION >= 4)

#ifdef Py_IS_TYPE
#define PyIU_USE_BUILTIN_IS_TYPE 1
#else
#define PyIU_USE_BUILTIN_IS_TYPE 0
#endif

#ifdef __cplusplus
}
#endif

#endif
070701000000EB000081A400000000000000000000000165E3BCDA00002A4D000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/intersperse.c
/******************************************************************************
 * Licensed under Apache License Version 2.0 - see LICENSE
 *****************************************************************************/
#include "intersperse.h"
#include
<structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" PyDoc_STRVAR( intersperse_prop_fillvalue_doc, "(any type) The interspersed fillvalue (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( intersperse_doc, "intersperse(iterable, e)\n" "--\n\n" "Alternately yield an item from the `iterable` and `e`. Recipe based on the\n" "homonymous function in the `more-itertools` package ([0]_) but significantly\n" "modified.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The iterable to intersperse.\n" "\n" "e : any type\n" " The value with which to intersperse the `iterable`.\n" "\n" "Returns\n" "-------\n" "interspersed : generator\n" " Interspersed `iterable` as generator.\n" "\n" "Notes\n" "-----\n" "This is similar to\n" "``itertools.chain.from_iterable(zip(iterable, itertools.repeat(e)))`` except\n" "that `intersperse` does not yield `e` as last item.\n" "\n" "Examples\n" "--------\n" "A few simple examples::\n" "\n" " >>> from iteration_utilities import intersperse\n" " >>> list(intersperse([1,2,3], 0))\n" " [1, 0, 2, 0, 3]\n" "\n" " >>> list(intersperse('abc', 'x'))\n" " ['a', 'x', 'b', 'x', 'c']\n" "\n" "References\n" "----------\n" ".. 
[0] https://github.com/erikrose/more-itertools\n"); static PyObject * intersperse_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "e", NULL}; PyIUObject_Intersperse *self; PyObject *iterable; PyObject *filler; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO:intersperse", kwlist, &iterable, &filler)) { return NULL; } self = (PyIUObject_Intersperse *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } Py_INCREF(filler); self->filler = filler; self->nextitem = NULL; self->started = 0; return (PyObject *)self; } static void intersperse_dealloc(PyIUObject_Intersperse *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->filler); Py_XDECREF(self->nextitem); Py_TYPE(self)->tp_free(self); } static int intersperse_traverse(PyIUObject_Intersperse *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->filler); Py_VISIT(self->nextitem); return 0; } static int intersperse_clear(PyIUObject_Intersperse *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->filler); Py_CLEAR(self->nextitem); return 0; } static PyObject * intersperse_next(PyIUObject_Intersperse *self) { if (self->nextitem == NULL) { PyObject *item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); if (item == NULL) { return NULL; } /* If we haven't started we return the first item, otherwise we set the nextitem but return the filler. */ if (self->started == 0) { self->started = 1; return item; } self->nextitem = item; Py_INCREF(self->filler); return self->filler; } else { /* There was a next item, return it and reset nextitem. */ PyObject *item = self->nextitem; self->nextitem = NULL; return item; } } static PyObject * intersperse_reduce(PyIUObject_Intersperse *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on nextitem == NULL because otherwise "None" would be ambiguous. 
It could mean that we did not have a next item or that the next
       item was None. Better to use an "if" than to introduce another
       variable depending on nextitem == NULL. */
    PyObject *value;
    if (self->nextitem == NULL) {
        value = Py_BuildValue("O(OO)(i)", Py_TYPE(self),
                              self->iterator, self->filler,
                              self->started);
    } else {
        value = Py_BuildValue("O(OO)(iO)", Py_TYPE(self),
                              self->iterator, self->filler,
                              self->started, self->nextitem);
    }
    return value;
}

static PyObject *
intersperse_setstate(PyIUObject_Intersperse *self, PyObject *state) {
    int started;
    PyObject *nextitem = NULL;
    if (!PyTuple_Check(state)) {
        PyErr_Format(PyExc_TypeError,
                     "`%.200s.__setstate__` expected a `tuple`-like argument"
                     ", got `%.200s` instead.",
                     Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name);
        return NULL;
    }
    if (!PyArg_ParseTuple(state, "i|O:intersperse.__setstate__",
                          &started, &nextitem)) {
        return NULL;
    }
    /* No need to check the type of "nextitem" because any Python object is
       valid. However we can make sure that "nextitem == NULL" if
       "started == 0", because otherwise this would produce an invalid
       "intersperse" instance. Not a segfault, and this comparison isn't
       really costly. */
    if (started == 0 && nextitem != NULL) {
        PyErr_Format(PyExc_ValueError,
                     "`%.200s.__setstate__` expected that the second argument "
                     "in the `state` is not given when the first argument is "
                     "0, got %.200s.",
                     Py_TYPE(self)->tp_name, Py_TYPE(nextitem)->tp_name);
        return NULL;
    }
    Py_XINCREF(nextitem);
    Py_XSETREF(self->nextitem, nextitem);
    self->started = started;
    Py_RETURN_NONE;
}

static PyObject *
intersperse_lengthhint(PyIUObject_Intersperse *self, PyObject *Py_UNUSED(args)) {
    Py_ssize_t len = PyObject_LengthHint(self->iterator, 0);
    if (len == -1) {
        return NULL;
    }
    /* The length hint has to be multiplied by 2. To make sure this cannot
       trigger undefined behaviour, it is converted to "size_t", which can
       always hold the result (its maximum value is 2 * max(Py_ssize_t) + 1).
       Also "PyObject_LengthHint" always returns >= -1 and the -1 case was
       already caught above, so "len" is non-negative (which could otherwise
       be a problem in the signed -> unsigned conversion). */
    if (self->started == 0) {
        if (len == 0) {
            return PyLong_FromLong(0);
        }
        return PyLong_FromSize_t((size_t)len * 2 - 1);
    } else if (self->nextitem == NULL) {
        return PyLong_FromSize_t((size_t)len * 2);
    } else {
        /* The iterator is always one step advanced!
*/ return PyLong_FromSize_t((size_t)len * 2 + 1); } } static PyMethodDef intersperse_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)intersperse_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)intersperse_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)intersperse_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef intersperse_memberlist[] = { { "fillvalue", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Intersperse, filler), /* offset */ READONLY, /* flags */ intersperse_prop_fillvalue_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Intersperse = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.intersperse", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Intersperse), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)intersperse_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)intersperse_doc, /* tp_doc */ (traverseproc)intersperse_traverse, /* tp_traverse */ (inquiry)intersperse_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)intersperse_next, /* tp_iternext */ intersperse_methods, /* tp_methods */ 
intersperse_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)intersperse_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000EC000081A400000000000000000000000165E3BCDA0000018B000000000000000000000000000000000000005600000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/intersperse.h#ifndef PYIU_INTERSPERSE_H #define PYIU_INTERSPERSE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *filler; PyObject *nextitem; int started; } PyIUObject_Intersperse; extern PyTypeObject PyIUType_Intersperse; #ifdef __cplusplus } #endif #endif 070701000000ED000081A400000000000000000000000165E3BCDA00000BB8000000000000000000000000000000000000004E00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/isx.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "isx.h" #include "helper.h" /****************************************************************************** * is_None * * equivalent to: * * lambda value: value is None *****************************************************************************/ PyObject * PyIU_IsNone(PyObject *Py_UNUSED(m), PyObject *o) { if (o == Py_None) { Py_RETURN_TRUE; } else { Py_RETURN_FALSE; } } /****************************************************************************** * is_not_None * * equivalent to: * * lambda value: value is not None *****************************************************************************/ PyObject * PyIU_IsNotNone(PyObject *Py_UNUSED(m), PyObject *o) { if (o != 
Py_None) { Py_RETURN_TRUE; } else { Py_RETURN_FALSE; } } /****************************************************************************** * is_even * * equivalent to: * * lambda value: value % 2 == 0 *****************************************************************************/ PyObject * PyIU_IsEven(PyObject *Py_UNUSED(m), PyObject *o) { PyObject *remainder; int res; remainder = PyNumber_Remainder(o, PyIU_global_two); if (remainder == NULL) { return NULL; } res = PyObject_IsTrue(remainder); Py_DECREF(remainder); if (res > 0) { Py_RETURN_FALSE; } else if (res == 0) { Py_RETURN_TRUE; } else { return NULL; } } /****************************************************************************** * is_odd * * equivalent to: * * lambda value: value % 2 != 0 *****************************************************************************/ PyObject * PyIU_IsOdd(PyObject *Py_UNUSED(m), PyObject *o) { PyObject *remainder; int res; remainder = PyNumber_Remainder(o, PyIU_global_two); if (remainder == NULL) { return NULL; } res = PyObject_IsTrue(remainder); Py_DECREF(remainder); if (res > 0) { Py_RETURN_TRUE; } else if (res == 0) { Py_RETURN_FALSE; } else { return NULL; } } /****************************************************************************** * is_iterable * * equivalent to: * * try: * iter(value) * except TypeError: * return False * else: * return True *****************************************************************************/ PyObject * PyIU_IsIterable(PyObject *Py_UNUSED(m), PyObject *o) { PyObject *it = PyObject_GetIter(o); if (it == NULL) { if (PyErr_Occurred() && PyErr_ExceptionMatches(PyExc_TypeError)) { PyErr_Clear(); Py_RETURN_FALSE; } else { return NULL; } } else { Py_DECREF(it); Py_RETURN_TRUE; } } 070701000000EE000081A400000000000000000000000165E3BCDA000001F3000000000000000000000000000000000000004E00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/isx.h#ifndef PYIU_ISX_H #define PYIU_ISX_H #ifdef __cplusplus extern "C" { #endif 
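The doc comments in `isx.c` above describe each C predicate as the equivalent of a small Python lambda. Collected as a runnable pure-Python sketch (these are the documented equivalents, not the library's actual implementation):

```python
# Pure-Python equivalents of the predicates defined in isx.c, following the
# "equivalent to" lambdas stated in the C doc comments.

def is_none(value):
    return value is None

def is_not_none(value):
    return value is not None

def is_even(value):
    return value % 2 == 0

def is_odd(value):
    return value % 2 != 0

def is_iterable(value):
    # Mirrors PyIU_IsIterable: only a TypeError raised by iter() means
    # "not iterable"; any other exception propagates to the caller.
    try:
        iter(value)
    except TypeError:
        return False
    return True
```

Note that, as in the C code, `is_even`/`is_odd` rely on the `%` operator, so they work for any type implementing `__mod__`, not just integers.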
#define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_IsNone(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_IsNotNone(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_IsEven(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_IsOdd(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_IsIterable(PyObject *Py_UNUSED(m), PyObject *o); #ifdef __cplusplus } #endif #endif 070701000000EF000081A400000000000000000000000165E3BCDA00004517000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/itemidxkey.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "itemidxkey.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( itemidxkey_prop_item_doc, "(any type) The `item` to sort."); PyDoc_STRVAR( itemidxkey_prop_idx_doc, "(:py:class:`int`) The original position of the `item`."); PyDoc_STRVAR( itemidxkey_prop_key_doc, "(any type) The result of a key function applied to the `item`."); PyDoc_STRVAR( itemidxkey_doc, "ItemIdxKey(item, idx, /, key)\n" "--\n\n" "Helper class that makes it easier and faster to compare two values for\n" "*stable* sorting algorithms supporting key functions.\n" "\n" "Parameters\n" "----------\n" "item : any type\n" " The original `item`.\n" "\n" "idx : :py:class:`int`\n" " The position (index) of the `item`.\n" "\n" "key : any type, optional\n" " If given (even as ``None``) this should be the `item` processed by the \n" " `key` function. 
If it is set then comparisons will compare the `key` \n" " instead of the `item`.\n" "\n" "Notes\n" "-----\n" "Comparisons involving :py:class:`~iteration_utilities.ItemIdxKey` have some \n" "limitations:\n" "\n" "- Both have to be :py:class:`~iteration_utilities.ItemIdxKey` instances.\n" "- If the first operand has no :py:attr:`.key` then the :py:attr:`.item` are \n" " compared.\n" "- The :py:attr:`.idx` must be different.\n" "- only :py:meth:`< <.__lt__>` and :py:meth:`> <.__gt__>` are supported!\n" "\n" "The implementation is roughly like:\n" "\n" ".. code::\n" "\n" " _notgiven = object()\n" " \n" " class ItemIdxKey:\n" " def __init__(self, item, idx, key=_notgiven):\n" " self.item = item\n" " self.idx = idx\n" " self.key = key\n" " \n" " def __lt__(self, other):\n" " if type(other) != ItemIdxKey:\n" " raise TypeError()\n" " if self.key is _notgiven:\n" " item1, item2 = self.item, other.item\n" " else:\n" " item1, item2 = self.key, other.key\n" " if self.idx < other.idx:\n" " return item1 <= item2\n" " else:\n" " return item1 < item2\n" " \n" " def __gt__(self, other):\n" " if type(other) != ItemIdxKey:\n" " raise TypeError()\n" " if self.key is _notgiven:\n" " item1, item2 = self.item, other.item\n" " else:\n" " item1, item2 = self.key, other.key\n" " if self.idx < other.idx:\n" " return item1 >= item2\n" " else:\n" " return item1 > item2\n" "\n" ".. note::\n" " The actual C makes the initialization and comparisons several times faster\n" " than the above illustrated Python class! But it's only slightly faster\n" " than comparing :py:class:`tuple` or :py:class:`list`. If you do not plan \n" " to support `reverse` or `key` then there is no need to use this class!\n" "\n" ".. warning::\n" " You should **never** insert a :py:class:`~iteration_utilities.ItemIdxKey` \n" " instance as :py:attr:`.item` or :py:attr:`.key` in another\n" " :py:class:`~iteration_utilities.ItemIdxKey` instance. This would yield \n" " wrong results and breaks your computer! 
(the latter might not be true.)\n" "\n" "Examples\n" "--------\n" "Stability is one of the distinct features of sorting algorithms. This class\n" "aids in supporting those algorithms which allow `reverse` and `key`.\n" "This means that comparisons require absolute lesser (or greater if `reverse`)\n" "if the :py:attr:`.idx` is bigger but only require lesser or equal (or greater or equal)\n" "if the :py:attr:`.idx` is smaller. This class implements exactly these conditions::\n" "\n" " >>> # Use < for normal sorting.\n" " >>> ItemIdxKey(10, 2) < ItemIdxKey(10, 3)\n" " True\n" " >>> # and > for reverse sorting.\n" " >>> ItemIdxKey(10, 2) > ItemIdxKey(10, 3)\n" " True\n" "\n" "The result may seem surprising but if the :py:attr:`.item` (or :py:attr:`.key`) is equal then\n" "in either normal or `reverse` sorting the one with the smaller :py:attr:`.idx` should\n" "come first! If the :py:attr:`.item` (or :py:attr:`.key`) differ they take precedence.\n" "\n" " >>> ItemIdxKey(10, 2) < ItemIdxKey(11, 3)\n" " True\n" " >>> ItemIdxKey(10, 2) > ItemIdxKey(11, 3)\n" " False\n" "\n" "But it compares the :py:attr:`.key` instead of the :py:attr:`.item` if it's given::\n" "\n" " >>> ItemIdxKey(0, 2, 20) < ItemIdxKey(10, 3, 19)\n" " False\n" " >>> ItemIdxKey(0, 2, 20) > ItemIdxKey(10, 3, 19)\n" " True\n" "\n" "This allows to sort based on :py:attr:`.item` or :py:attr:`.key` but always \n" "to access the :py:attr:`.item` for the value that should be sorted.\n"); /****************************************************************************** * * Helper class that mimics a 2-tuple when compared but dynamically decides * which item to compare (item or key) and assumes that the idx is always * different. * * It also has a constructor function that bypasses the args/kwargs unpacking * to allow faster creation from within C code. 
*****************************************************************************/ /****************************************************************************** * New (only from C code) * * This bypasses the argument unpacking! *****************************************************************************/ PyObject * PyIU_ItemIdxKey_FromC(PyObject *item, Py_ssize_t idx, PyObject *key) { /* STEALS REFERENCES!!! */ assert(item != NULL); assert(!PyIU_ItemIdxKey_Check(item)); assert(key == NULL || !PyIU_ItemIdxKey_Check(key)); PyIUObject_ItemIdxKey *self; /* Create and fill new ItemIdxKey. */ self = PyObject_GC_New(PyIUObject_ItemIdxKey, &PyIUType_ItemIdxKey); if (self == NULL) { // So that the function always steals references even if allocation failed. Py_DECREF(item); Py_XDECREF(key); return NULL; } self->item = item; self->idx = idx; self->key = key; PyObject_GC_Track(self); return (PyObject *)self; } PyObject * PyIU_ItemIdxKey_Copy(PyObject *iik) { assert(iik != NULL && PyIU_IsTypeExact(iik, &PyIUType_ItemIdxKey)); PyIUObject_ItemIdxKey *o = (PyIUObject_ItemIdxKey *)iik; Py_INCREF(o->item); Py_XINCREF(o->key); return PyIU_ItemIdxKey_FromC(o->item, o->idx, o->key); } static PyObject * itemidxkey_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"item", "idx", "key", NULL}; PyIUObject_ItemIdxKey *self; PyObject *item; PyObject *key = NULL; Py_ssize_t idx; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "On|O:ItemIdxKey", kwlist, &item, &idx, &key)) { return NULL; } if (PyIU_ItemIdxKey_Check(item)) { PyErr_SetString(PyExc_TypeError, "`item` argument for `ItemIdxKey` must not be a " "`ItemIdxKey` instance."); return NULL; } if (key != NULL && PyIU_ItemIdxKey_Check(key)) { PyErr_SetString(PyExc_TypeError, "`key` argument for `ItemIdxKey` must not be a " "`ItemIdxKey` instance."); return NULL; } self = (PyIUObject_ItemIdxKey *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(item); Py_XINCREF(key); self->item = item; 
self->idx = idx; self->key = key; return (PyObject *)self; } static void itemidxkey_dealloc(PyIUObject_ItemIdxKey *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->item); Py_XDECREF(self->key); Py_TYPE(self)->tp_free((PyObject *)self); } static int itemidxkey_traverse(PyIUObject_ItemIdxKey *self, visitproc visit, void *arg) { Py_VISIT(self->item); Py_VISIT(self->key); return 0; } static int itemidxkey_clear(PyIUObject_ItemIdxKey *self) { Py_CLEAR(self->item); Py_CLEAR(self->key); return 0; } static PyObject * itemidxkey_repr(PyIUObject_ItemIdxKey *self) { PyObject *repr; int ok; ok = Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? PyUnicode_FromString("...") : NULL; } if (self->key == NULL) { repr = PyUnicode_FromFormat("%s(item=%R, idx=%zd)", Py_TYPE(self)->tp_name, self->item, self->idx); } else { /* The representation of the item could modify/delete the key and then the representation of the key could segfault. Better to make the key undeletable as long as PyUnicode_FromFormat runs. */ PyObject *tmpkey = self->key; Py_INCREF(tmpkey); repr = PyUnicode_FromFormat("%s(item=%R, idx=%zd, key=%R)", Py_TYPE(self)->tp_name, self->item, self->idx, tmpkey); Py_DECREF(tmpkey); } Py_ReprLeave((PyObject *)self); return repr; } int PyIU_ItemIdxKey_Compare(PyObject *v, PyObject *w, int op) { assert(v != NULL && PyIU_ItemIdxKey_Check(v)); assert(w != NULL && PyIU_ItemIdxKey_Check(w)); assert(op == Py_GT || op == Py_LT); PyObject *item1; PyObject *item2; PyIUObject_ItemIdxKey *l; PyIUObject_ItemIdxKey *r; l = (PyIUObject_ItemIdxKey *)v; r = (PyIUObject_ItemIdxKey *)w; /* Compare items if key is NULL otherwise compare keys. 
*/ if (l->key == NULL) { item1 = l->item; item2 = r->item; } else { item1 = l->key; item2 = r->key; } /* The order to check for equality and lt makes a huge performance difference: - lots of duplicates: first eq then "op" - no/few duplicates: first "op" then eq - first compare idx and if it's smaller check le/ge otherwise lt/gt --> I chose eq then "op" */ if (l->idx < r->idx) { op = (op == Py_LT) ? Py_LE : Py_GE; } return PyObject_RichCompareBool(item1, item2, op); } static PyObject * itemidxkey_richcompare(PyObject *v, PyObject *w, int op) { int ok; /* Only allow < and > for now */ switch (op) { case Py_LT: case Py_GT: break; default: Py_RETURN_NOTIMPLEMENTED; } /* only allow ItemIdxKey to be compared. */ if (!PyIU_ItemIdxKey_Check(v) || !PyIU_ItemIdxKey_Check(w)) Py_RETURN_NOTIMPLEMENTED; ok = PyIU_ItemIdxKey_Compare(v, w, op); if (ok == 1) { Py_RETURN_TRUE; } else if (ok == 0) { Py_RETURN_FALSE; } else { return NULL; } } static PyObject * itemidxkey_getitem(PyIUObject_ItemIdxKey *self, void *Py_UNUSED(closure)) { Py_INCREF(self->item); return self->item; } static int itemidxkey_setitem(PyIUObject_ItemIdxKey *self, PyObject *o, void *Py_UNUSED(closure)) { if (o == NULL) { PyErr_SetString(PyExc_TypeError, "cannot delete `item` attribute of `ItemIdxKey`."); return -1; } else if (PyIU_ItemIdxKey_Check(o)) { PyErr_SetString(PyExc_TypeError, "cannot use `ItemIdxKey` instance as `item` of " "`ItemIdxKey`."); return -1; } Py_INCREF(o); Py_SETREF(self->item, o); return 0; } static PyObject * itemidxkey_getidx(PyIUObject_ItemIdxKey *self, void *Py_UNUSED(closure)) { return PyLong_FromSsize_t(self->idx); } static int itemidxkey_setidx(PyIUObject_ItemIdxKey *self, PyObject *o, void *Py_UNUSED(closure)) { Py_ssize_t idx; if (o == NULL) { PyErr_SetString(PyExc_TypeError, "cannot delete `idx` attribute of `ItemIdxKey`."); return -1; } if (PyLong_Check(o)) { idx = PyLong_AsSsize_t(o); } else { PyErr_SetString(PyExc_TypeError, "an integer is required as `idx` attribute of " 
"`ItemIdxKey`."); return -1; } if (PyErr_Occurred()) { return -1; } self->idx = idx; return 0; } static PyObject * itemidxkey_getkey(PyIUObject_ItemIdxKey *self, void *Py_UNUSED(closure)) { if (self->key == NULL) { PyErr_SetString(PyExc_AttributeError, "the `key` attribute of `ItemIdxKey` instance is not " "set."); return NULL; } Py_INCREF(self->key); return self->key; } static int itemidxkey_setkey(PyIUObject_ItemIdxKey *self, PyObject *o, void *Py_UNUSED(closure)) { if (o != NULL && PyIU_ItemIdxKey_Check(o)) { PyErr_SetString(PyExc_TypeError, "cannot use `ItemIdxKey` instance as `key` attribute " "of `ItemIdxKey`."); return -1; } /* Cannot delete an non-existing attribute... */ if (o == NULL && self->key == NULL) { PyErr_SetString(PyExc_AttributeError, "the `key` attribute of `ItemIdxKey` instance is not " "set and cannot be deleted."); return -1; } Py_XINCREF(o); Py_XSETREF(self->key, o); return 0; } static PyObject * itemidxkey_reduce(PyIUObject_ItemIdxKey *self, PyObject *Py_UNUSED(args)) { if (self->key == NULL) { return Py_BuildValue("O(On)", Py_TYPE(self), self->item, self->idx); } else { return Py_BuildValue("O(OnO)", Py_TYPE(self), self->item, self->idx, self->key); } } static PyMethodDef itemidxkey_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)itemidxkey_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyGetSetDef itemidxkey_getsetlist[] = { { "item", /* name */ (getter)itemidxkey_getitem, /* get */ (setter)itemidxkey_setitem, /* set */ itemidxkey_prop_item_doc, /* doc */ (void *)NULL /* closure */ }, { "idx", /* name */ (getter)itemidxkey_getidx, /* get */ (setter)itemidxkey_setidx, /* set */ itemidxkey_prop_idx_doc, /* doc */ (void *)NULL /* closure */ }, { "key", /* name */ (getter)itemidxkey_getkey, /* get */ (setter)itemidxkey_setkey, /* set */ itemidxkey_prop_key_doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_ItemIdxKey = { 
PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.ItemIdxKey", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_ItemIdxKey), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)itemidxkey_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)itemidxkey_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)0, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)itemidxkey_doc, /* tp_doc */ (traverseproc)itemidxkey_traverse, /* tp_traverse */ (inquiry)itemidxkey_clear, /* tp_clear */ (richcmpfunc)itemidxkey_richcompare, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ itemidxkey_methods, /* tp_methods */ 0, /* tp_members */ itemidxkey_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)itemidxkey_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000F0000081A400000000000000000000000165E3BCDA0000027A000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/itemidxkey.h#ifndef PYIU_ITEMIDXKEY_H #define PYIU_ITEMIDXKEY_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *item; PyObject *key; Py_ssize_t idx; } PyIUObject_ItemIdxKey; extern PyTypeObject PyIUType_ItemIdxKey; #define 
PyIU_ItemIdxKey_Check(o) PyObject_TypeCheck(o, &PyIUType_ItemIdxKey) PyObject * PyIU_ItemIdxKey_FromC(PyObject *item, Py_ssize_t idx, PyObject *key); PyObject * PyIU_ItemIdxKey_Copy(PyObject *iik); int PyIU_ItemIdxKey_Compare(PyObject *v, PyObject *w, int op); #ifdef __cplusplus } #endif #endif 070701000000F1000081A400000000000000000000000165E3BCDA000022DB000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/iterexcept.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "iterexcept.h" #include "helper.h" #include <structmember.h> #include "docs_reduce.h" PyDoc_STRVAR( iterexcept_prop_func_doc, "(any type) The function that is called by `iter_except` (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( iterexcept_prop_exception_doc, "(any type) The exception that ends `iter_except` (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( iterexcept_prop_first_doc, "(any type) The function that is called once (as setup) by `iter_except` " "(readonly).\n" "\n" ".. 
versionadded:: 0.6"); PyDoc_STRVAR( iterexcept_doc, "iter_except(func, exception, first=None)\n" "--\n\n" "Call a function repeatedly until an `exception` is raised.\n" "\n" "Converts a call-until-exception interface to an iterator interface.\n" "Like ``iter(func, sentinel)`` but uses an `exception` instead of a sentinel\n" "to end the loop.\n" "\n" "Parameters\n" "----------\n" "func : callable\n" " The function that is called until `exception` is raised.\n" "\n" "exception : Exception\n" " The `exception` which terminates the iteration.\n" "\n" "first : callable or None, optional\n" " If given (and not ``None``) this function is called once before the \n" " `func` is executed.\n" "\n" "Returns\n" "-------\n" "result : generator\n" " The result of the `func` calls as generator.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import iter_except\n" ">>> from collections import OrderedDict\n" "\n" ">>> d = OrderedDict([('a', 1), ('b', 2)])\n" ">>> list(iter_except(d.popitem, KeyError))\n" "[('b', 2), ('a', 1)]\n" "\n" ".. note::\n" " ``d.items()`` would yield the same result.\n" "\n" ">>> from math import sqrt\n" "\n" ">>> g = (sqrt(i) for i in [5, 4, 3, 2, 1, 0, -1, -2, -3])\n" ">>> def say_go():\n" "... 
return 'go'\n" ">>> list(iter_except(g.__next__, ValueError, say_go))\n" "['go', 2.23606797749979, 2.0, 1.7320508075688772, 1.4142135623730951, 1.0, 0.0]\n" "\n" "Notes\n" "-----\n" "Further examples:\n" "\n" "- ``bsd_db_iter = iter_except(db.next, bsddb.error, db.first)``\n" "- ``heap_iter = iter_except(functools.partial(heappop, h), IndexError)``\n" "- ``dict_iter = iter_except(d.popitem, KeyError)``\n" "- ``deque_iter = iter_except(d.popleft, IndexError)``\n" "- ``queue_iter = iter_except(q.get_nowait, Queue.Empty)``\n" "- ``set_iter = iter_except(s.pop, KeyError)``\n"); static PyObject * iterexcept_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"func", "exception", "first", NULL}; PyIUObject_Iterexcept *self; PyObject *func; PyObject *except; PyObject *first = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO|O:iter_except", kwlist, &func, &except, &first)) { return NULL; } self = (PyIUObject_Iterexcept *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(func); self->func = func; Py_INCREF(except); self->except = except; self->first = first == Py_None ? NULL : first; Py_XINCREF(self->first); return (PyObject *)self; } static void iterexcept_dealloc(PyIUObject_Iterexcept *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->func); Py_XDECREF(self->except); Py_XDECREF(self->first); Py_TYPE(self)->tp_free(self); } static int iterexcept_traverse(PyIUObject_Iterexcept *self, visitproc visit, void *arg) { Py_VISIT(self->func); Py_VISIT(self->except); Py_VISIT(self->first); return 0; } static int iterexcept_clear(PyIUObject_Iterexcept *self) { Py_CLEAR(self->func); Py_CLEAR(self->except); Py_CLEAR(self->first); return 0; } static PyObject * iterexcept_next(PyIUObject_Iterexcept *self) { PyObject *result; /* Call the first if it's set (nulling it thereafter) or the func if not. 
*/ if (self->first == NULL) { result = PyIU_CallWithNoArgument(self->func); } else { result = PyIU_CallWithNoArgument(self->first); Py_CLEAR(self->first); } /* Stop if the result is NULL but only clear the exception if the expected exception happened otherwise just return the result (thereby bubbling up other exceptions). */ if (result == NULL && PyErr_Occurred() && PyErr_ExceptionMatches(self->except)) { PyErr_Clear(); return NULL; } else { return result; } } static PyObject * iterexcept_reduce(PyIUObject_Iterexcept *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OOO)", Py_TYPE(self), self->func, self->except, self->first ? self->first : Py_None); } static PyMethodDef iterexcept_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)iterexcept_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef iterexcept_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Iterexcept, func), /* offset */ READONLY, /* flags */ iterexcept_prop_func_doc /* doc */ }, { "exception", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Iterexcept, except), /* offset */ READONLY, /* flags */ iterexcept_prop_exception_doc /* doc */ }, { "first", /* name */ T_OBJECT_EX, /* type */ offsetof(PyIUObject_Iterexcept, first), /* offset */ READONLY, /* flags */ iterexcept_prop_first_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Iterexcept = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.iter_except", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Iterexcept), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)iterexcept_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ 
(hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)iterexcept_doc, /* tp_doc */ (traverseproc)iterexcept_traverse, /* tp_traverse */ (inquiry)iterexcept_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)iterexcept_next, /* tp_iternext */ iterexcept_methods, /* tp_methods */ iterexcept_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)iterexcept_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000F2000081A400000000000000000000000165E3BCDA0000016F000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/iterexcept.h#ifndef PYIU_ITEREXCEPT_H #define PYIU_ITEREXCEPT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; PyObject *except; PyObject *first; } PyIUObject_Iterexcept; extern PyTypeObject PyIUType_Iterexcept; #ifdef __cplusplus } #endif #endif 070701000000F3000081A400000000000000000000000165E3BCDA00000C5C000000000000000000000000000000000000005700000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/mathematical.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "mathematical.h" #include "helper.h" 
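The `iter_except` iterator defined above calls `first` once (clearing it afterwards), then calls `func` repeatedly, and swallows only the expected exception to end the iteration. A minimal pure-Python sketch of the same semantics (a hypothetical equivalent, not the actual C implementation):

```python
def iter_except(func, exception, first=None):
    # Hypothetical pure-Python equivalent of the C iterator above.
    try:
        if first is not None:
            yield first()  # the setup call happens exactly once
        while True:
            yield func()   # call until `exception` is raised
    except exception:
        pass  # only the expected exception ends the iteration silently


# Example: drain a list from the end until pop() raises IndexError.
stack = [1, 2, 3]
print(list(iter_except(stack.pop, IndexError)))  # [3, 2, 1]
```

Any other exception raised by `first` or `func` propagates unchanged, mirroring the C `iterexcept_next`, which only calls `PyErr_Clear` when `PyErr_ExceptionMatches(self->except)` is true.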
/****************************************************************************** * partial-like functions: * * square : lambda value: value ** 2 * double : lambda value: value * 2 * reciprocal : lambda value: 1 / value *****************************************************************************/ PyObject * PyIU_MathSquare(PyObject *Py_UNUSED(m), PyObject *o) { return PyNumber_Power(o, PyIU_global_two, Py_None); } PyObject * PyIU_MathDouble(PyObject *Py_UNUSED(m), PyObject *o) { return PyNumber_Multiply(o, PyIU_global_two); } PyObject * PyIU_MathReciprocal(PyObject *Py_UNUSED(m), PyObject *o) { return PyNumber_TrueDivide(PyIU_global_one, o); } /****************************************************************************** * Reverse arithmetic operators: * * radd : lambda o1, o2: o2 + o1 * rsub : lambda o1, o2: o2 - o1 * rmul : lambda o1, o2: o2 * o1 * rdiv : lambda o1, o2: o2 / o1 * rfdiv : lambda o1, o2: o2 // o1 * rpow : lambda o1, o2: o2 ** o1 * rmod : lambda o1, o2: o2 % o1 *****************************************************************************/ PyObject * PyIU_MathRadd(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "radd", 2, 2, &op1, &op2)) { return PyNumber_Add(op2, op1); } else { return NULL; } } PyObject * PyIU_MathRsub(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "rsub", 2, 2, &op1, &op2)) { return PyNumber_Subtract(op2, op1); } else { return NULL; } } PyObject * PyIU_MathRmul(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "rmul", 2, 2, &op1, &op2)) { return PyNumber_Multiply(op2, op1); } else { return NULL; } } PyObject * PyIU_MathRdiv(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "rdiv", 2, 2, &op1, &op2)) { return PyNumber_TrueDivide(op2, op1); } else { return NULL; } } PyObject * PyIU_MathRfdiv(PyObject *Py_UNUSED(m), PyObject *args) { 
PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "rfdiv", 2, 2, &op1, &op2)) { return PyNumber_FloorDivide(op2, op1); } else { return NULL; } } PyObject * PyIU_MathRpow(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "rpow", 2, 2, &op1, &op2)) { return PyNumber_Power(op2, op1, Py_None); } else { return NULL; } } PyObject * PyIU_MathRmod(PyObject *Py_UNUSED(m), PyObject *args) { PyObject *op1; PyObject *op2; if (PyArg_UnpackTuple(args, "rmod", 2, 2, &op1, &op2)) { return PyNumber_Remainder(op2, op1); } else { return NULL; } } 070701000000F4000081A400000000000000000000000165E3BCDA00000364000000000000000000000000000000000000005700000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/mathematical.h#ifndef PYIU_MATHEMATICAL_H #define PYIU_MATHEMATICAL_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_MathSquare(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_MathDouble(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_MathReciprocal(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_MathRadd(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_MathRsub(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_MathRmul(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_MathRdiv(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_MathRfdiv(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_MathRpow(PyObject *Py_UNUSED(m), PyObject *args); PyObject * PyIU_MathRmod(PyObject *Py_UNUSED(m), PyObject *args); #ifdef __cplusplus } #endif #endif 070701000000F5000081A400000000000000000000000165E3BCDA00006E1C000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/merge.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE 
*****************************************************************************/ #include "merge.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" #include "itemidxkey.h" PyDoc_STRVAR( merge_prop_key_doc, "(callable or None) The key function used by merge (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( merge_prop_reverse_doc, "(:py:class:`bool`) Indicates if merged by ``>`` instead of ``<`` " "(readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( merge_doc, "merge(*iterables, key=None, reverse=False)\n" "--\n\n" "Merge several sorted `iterables` into one.\n" "\n" "Parameters\n" "----------\n" "iterables : iterable\n" " Any number of already sorted `iterable` objects.\n" "\n" "key : callable or None, optional\n" " If not given compare the items themselves, otherwise compare the\n" " result of ``key(item)``, like the `key` parameter for\n" " :py:func:`sorted`.\n" "\n" "reverse : :py:class:`bool`, optional\n" " If ``True`` then merge in decreasing order instead of increasing order.\n" " Default is ``False``.\n" "\n" "Returns\n" "-------\n" "merged : generator\n" " The merged iterables as generator.\n" "\n" "See also\n" "--------\n" "heapq.merge : Equivalent since Python 3.5 but in most cases slower!\n" " Earlier Python versions did not support the `key` or `reverse` argument.\n" "\n" "sorted : ``sorted(itertools.chain(*iterables))`` supports the same options\n" " and *can* be faster.\n" "\n" "Examples\n" "--------\n" "To merge multiple sorted `iterables`::\n" "\n" " >>> from iteration_utilities import merge\n" " >>> list(merge([1, 3, 5, 7, 9], [2, 4, 6, 8, 10]))\n" " [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]\n" "\n" "It's stable and allows a `key` function::\n" "\n" " >>> seq1 = [(1, 3), (3, 3)]\n" " >>> seq2 = [(-1, 3), (-3, 3)]\n" " >>> list(merge(seq1, seq2, key=lambda x: abs(x[0])))\n" " [(1, 3), (-1, 3), (3, 3), (-3, 3)]\n" "\n" "Also possible to `reverse` (biggest to smallest 
order) the merge::\n" "\n" " >>> list(merge([5,1,-8], [10, 2, 1, 0], reverse=True))\n" " [10, 5, 2, 1, 1, 0, -8]\n" "\n" "But also more than two `iterables`::\n" "\n" " >>> list(merge([1, 10, 11], [2, 9], [3, 8], [4, 7], [5, 6], range(10)))\n" " [0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 11]\n" "\n" "However if the `iterables` are not sorted the result will be unsorted\n" "(partially sorted)::\n" "\n" " >>> list(merge(range(10), [6,1,3,2,6,1,6]))\n" " [0, 1, 2, 3, 4, 5, 6, 6, 1, 3, 2, 6, 1, 6, 7, 8, 9]\n"); /****************************************************************************** * * IMPORTANT NOTE: * * The function does the same as "heapq.merge(*iterables)" or * "sorted(itertools.chain(*iterables))". It is included because heapq.merge * did not have the key and reverse parameters before Python 3.5, and it is * kept for compatibility reasons. * * That this is (much) faster than heapq.merge for most inputs is a nice but * worrying side effect. :-( * *****************************************************************************/ /****************************************************************************** * ------------------------------- HELPER ------------------------------------- * * Find the position to insert a value in an already sorted tuple. Assumes that * the sorting should be stable and searches the rightmost place where the * tuple is still sorted. * * Function will compare first to the "hi-1"-th element and then start * bisecting. (See inline code for explanation). * * tuple : Sorted tuple to inspect * item : Value to search the position for. * hi : Upper index to search for. * cmpop : The comparison operator to use. For example Py_LT for a tuple sorted * from low to high or Py_GT for a tuple sorted from high to low. * * Returns -1 on failure otherwise a non-negative Py_ssize_t value. * * Copied and modified from the python bisect module. 
*****************************************************************************/ static Py_ssize_t PyIU_TupleBisectRight_LastFirst(PyObject *tuple, PyObject *item, Py_ssize_t hi, int cmpop) { PyObject *litem; int res; /* Indices for the left end and mid of the current part of the array. The right end (hi) is given as input. */ Py_ssize_t mid, lo = 0; /* Bisection has two worst cases: If it should be inserted in the first or last place. The list is reverse-ordered so it's likely that the bisection could return the last place (for bisect_left it would be the first) in the "merge_sorted" function. Checking the number of comparisons in "merge" shows that merge now uses slightly less comparisons than "sorted" in the average case, slightly more in the worst case and much less in the best case! */ /* So let's check the last item first! */ if (hi <= 0) { return 0; } litem = PyTuple_GET_ITEM(tuple, hi - 1); res = PyIU_ItemIdxKey_Compare(item, litem, cmpop); if (res == 1) { return hi; } else if (res == 0) { hi = hi - 1; } else { return -1; } /* Start the normal bisection algorithm from bisect.c */ while (lo < hi) { mid = ((size_t)lo + hi) / 2; litem = PyTuple_GET_ITEM(tuple, mid); res = PyIU_ItemIdxKey_Compare(item, litem, cmpop); if (res == 1) { lo = mid + 1; } else if (res == 0) { hi = mid; } else { return -1; } } return lo; } static PyObject * merge_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"key", "reverse", NULL}; PyIUObject_Merge *self; PyObject *keyfunc = NULL; int reverse = 0; if (!PyArg_ParseTupleAndKeywords(PyIU_global_0tuple, kwargs, "|Op:merge", kwlist, &keyfunc, &reverse)) { return NULL; } self = (PyIUObject_Merge *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iteratortuple = PyIU_CreateIteratorTuple(args); if (self->iteratortuple == NULL) { Py_DECREF(self); return NULL; } self->keyfunc = keyfunc == Py_None ? NULL : keyfunc; Py_XINCREF(self->keyfunc); self->reverse = reverse ? 
Py_GT : Py_LT; self->current = NULL; self->numactive = PyTuple_GET_SIZE(args); return (PyObject *)self; } static void merge_dealloc(PyIUObject_Merge *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iteratortuple); Py_XDECREF(self->keyfunc); Py_XDECREF(self->current); Py_TYPE(self)->tp_free(self); } static int merge_traverse(PyIUObject_Merge *self, visitproc visit, void *arg) { Py_VISIT(self->iteratortuple); Py_VISIT(self->keyfunc); Py_VISIT(self->current); return 0; } static int merge_clear(PyIUObject_Merge *self) { Py_CLEAR(self->iteratortuple); Py_CLEAR(self->keyfunc); Py_CLEAR(self->current); return 0; } static int merge_init_current(PyIUObject_Merge *self) { PyObject *current; Py_ssize_t i; Py_ssize_t tuplelength; current = PyTuple_New(self->numactive); if (current == NULL) { return -1; } tuplelength = 0; for (i = 0; i < self->numactive; i++) { PyObject *item; PyObject *iterator = PyTuple_GET_ITEM(self->iteratortuple, i); item = Py_TYPE(iterator)->tp_iternext(iterator); if (item != NULL) { PyObject *newitem; PyObject *keyval = NULL; /* The idea here is that we can keep stability by also remembering the index of the iterable (which is also useful to remember from which iterable to get the next item if it is yielded). */ if (self->keyfunc != NULL) { keyval = PyIU_CallWithOneArgument(self->keyfunc, item); if (keyval == NULL) { Py_DECREF(item); Py_DECREF(current); return -1; } } newitem = PyIU_ItemIdxKey_FromC(item, i, keyval); if (newitem == NULL) { Py_DECREF(current); return -1; } /* Insert the tuple into the current tuple. 
*/ if (tuplelength == 0) { PyTuple_SET_ITEM(current, 0, newitem); } else { Py_ssize_t insert = PyIU_TupleBisectRight_LastFirst( current, newitem, tuplelength, self->reverse); if (insert < 0) { Py_DECREF(newitem); Py_DECREF(current); return -1; } PyIU_TupleInsert(current, insert, newitem, tuplelength + 1); } tuplelength++; } else { if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(current); return -1; } } } self->numactive = tuplelength; self->current = current; return 0; } static PyObject * merge_next(PyIUObject_Merge *self) { PyObject *iterator; PyObject *item; PyObject *val; Py_ssize_t insert = 0; Py_ssize_t active; PyIUObject_ItemIdxKey *next; /* No current then we create one. */ if (self->current == NULL) { if (merge_init_current(self) < 0) { return NULL; } } /* Finished as soon as there are no more active iterators. */ if (self->numactive == 0) { return NULL; } active = self->numactive - 1; /* Tuple containing the next value. */ next = (PyIUObject_ItemIdxKey *)PyTuple_GET_ITEM(self->current, active); Py_INCREF(next); /* Value to be returned. */ val = next->item; Py_INCREF(val); /* Get the next value from the iterable where the value was from. */ iterator = PyTuple_GET_ITEM(self->iteratortuple, next->idx); item = Py_TYPE(iterator)->tp_iternext(iterator); if (item == NULL) { /* No need to keep the extra reference for the ItemIdxKey because there is no successive value and we replace the item in the current tuple with NULL. */ PyTuple_SET_ITEM(self->current, active, NULL); Py_DECREF(next); // This really deletes the reference in self->current. 
Py_DECREF(next); next = NULL; if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(val); return NULL; } self->numactive = active; } else { if (self->keyfunc != NULL) { PyObject *keyval; keyval = PyIU_CallWithOneArgument(self->keyfunc, item); if (keyval == NULL) { Py_DECREF(item); Py_DECREF(val); Py_DECREF(next); return NULL; } Py_SETREF(next->key, keyval); keyval = NULL; } Py_SETREF(next->item, item); item = NULL; /* Insert the new value into the sorted current tuple. */ insert = PyIU_TupleBisectRight_LastFirst( self->current, (PyObject *)next, active, self->reverse); if (insert == -1) { Py_DECREF(val); Py_DECREF(next); return NULL; } PyIU_TupleInsert(self->current, insert, (PyObject *)next, self->numactive); Py_DECREF(next); next = NULL; } return val; } static PyObject * merge_reduce(PyIUObject_Merge *self, PyObject *Py_UNUSED(args)) { PyObject *res; PyObject *current; /* We need to expose the "current" tuple. However this tuple is modified when calling next so we need a copy, otherwise people would have a mutable tuple. That must NOT happen! In case the number of elements in the tuple differs from the "numactive" attribute we can simply slice the trailing NULLs away. The "ItemIdxKey" instances inside the "current" tuple are mutable so we need to make sure these cannot be altered from outside. So we need to make more than a shallow copy... The "iteratortuple" isn't changed in the "next" call so we can simply expose it as-is. */ if (self->current == NULL) { current = Py_None; Py_INCREF(current); } else { Py_ssize_t i; current = PyTuple_New(self->numactive); if (current == NULL) { return NULL; } for (i = 0; i < self->numactive; i++) { PyObject *iik1 = PyTuple_GET_ITEM(self->current, i); PyObject *iik2 = PyIU_ItemIdxKey_Copy(iik1); if (iik2 == NULL) { return NULL; } PyTuple_SET_ITEM(current, i, iik2); } } /* No need to copy the iteratortuple because we don't modify it anywhere so we can easily get away by having more than one reference for it. 
*/ res = Py_BuildValue("OO(OiOn)", Py_TYPE(self), self->iteratortuple, self->keyfunc ? self->keyfunc : Py_None, self->reverse, current, self->numactive); Py_DECREF(current); return res; } static PyObject * merge_setstate(PyIUObject_Merge *self, PyObject *state) { PyObject *current; PyObject *keyfunc; Py_ssize_t numactive; int reverse; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "OiOn:merge.__setstate__", &keyfunc, &reverse, &current, &numactive)) { return NULL; } if (current == Py_None) { current = NULL; } if (keyfunc == Py_None) { keyfunc = NULL; } /* If it's from a "reduce" call then it should work fine, but if someone tries to feed anything in here we need to check the conditions the next is based on: - 0 <= numactive <= len(iteratortuple) == len(current) (except when no current is initialized) - current may only contain ItemIdxKey instances - These must have NO key-attribute when keyfunc == NULL - These must have A key-attribute when keyfunc != NULL - These must not have an idx that is out of range for the iteratortuple These tests only make sure the function does not crash, the inputs may still produce useless results! */ /* "numactive" must be non-negative and <= len(self->iteratortuple) otherwise item access might segfault. */ if (numactive < 0 || numactive > PyTuple_GET_SIZE(self->iteratortuple)) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the fourth (%zd) " "argument in the `state` is not negative and is smaller " "or equal to the number of iterators (%zd).", Py_TYPE(self)->tp_name, numactive, PyTuple_GET_SIZE(self->iteratortuple)); return NULL; } if (current != NULL) { Py_ssize_t i; Py_ssize_t currentsize; /* current must be a tuple, otherwise the PyTuple_GET_ITEM and PyTuple_SET_ITEM operations in "next" will segfault. 
*/ if (!PyTuple_CheckExact(current)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple` instance as " "third argument in the `state`, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(current)->tp_name); return NULL; } /* The length of the current tuple and the "numactive" value must be identical, otherwise this might lose items (numactive smaller) or segfault (numactive bigger) because it is used to index this tuple. */ currentsize = PyTuple_GET_SIZE(current); if (currentsize != numactive) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the fourth (%zd) " "argument in the `state` is equal to the length of " "the third argument (%zd).", Py_TYPE(self)->tp_name, numactive, currentsize); return NULL; } /* Unfortunately we have to check each item in the "current" tuple to make sure the "next" function doesn't segfault. - Each item must be an "ItemIdxKey" instance. - Each ItemIdxKey must have a key attribute if we have a key function or mustn't have a key if we have no key function. - Each ItemIdxKey idx attribute must have a value that isn't out of bounds for the iteratortuple. There are some additional checks that could be done but aren't because they might have side-effects and could slow down this function unnecessarily: - The "idx" attribute of the ItemIdxKey instances should be unique within the "current" tuple. Meaning there shouldn't be more than one pointing to the same iterator. - The "current" tuple is supposed to be sorted (either decreasing if "reverse=False" or increasing otherwise) and using an unsorted "current" will break the function. However this requirement isn't enforced for the iterators when they are passed in so there is actually already a way to "break" the function. 
*/ for (i = 0; i < currentsize; i++) { Py_ssize_t idx; PyObject *iik = PyTuple_GET_ITEM(current, i); if (!PyIU_IsTypeExact(iik, &PyIUType_ItemIdxKey)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected that the third " "argument in the `state` contains only " "`ItemIdxKey` instances, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(iik)->tp_name); return NULL; } if (keyfunc == NULL) { if (((PyIUObject_ItemIdxKey *)iik)->key != NULL) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected that `ItemIdxKey` " "instances in the third argument in the `state` " "have no `key` attribute when the first argument " "is `None`.", Py_TYPE(self)->tp_name); return NULL; } } else { if (((PyIUObject_ItemIdxKey *)iik)->key == NULL) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected that `ItemIdxKey` " "instances in the third argument in the `state` " "have a `key` attribute when the first argument " "is not `None`.", Py_TYPE(self)->tp_name); return NULL; } } idx = ((PyIUObject_ItemIdxKey *)iik)->idx; if (idx < 0 || idx >= PyTuple_GET_SIZE(self->iteratortuple)) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that `ItemIdxKey` " "instances in the third argument in the `state` " "have an `idx` attribute (%zd) that is smaller than " "the length of the `iteratortuple` (%zd).", Py_TYPE(self)->tp_name, idx, PyTuple_GET_SIZE(self->iteratortuple)); return NULL; } } } /* We need to make sure to copy the "current" because we will alter this tuple inside the "next" calls. If someone held a reference their tuple would change. This should never happen! Also the ItemIdxKey instances are mutable from outside so these have to be copied as well. 
*/ if (current != NULL) { Py_ssize_t i; PyObject *new_current = PyTuple_New(numactive); if (new_current == NULL) { return NULL; } for (i = 0; i < numactive; i++) { PyObject *iik1 = PyTuple_GET_ITEM(current, i); PyObject *iik2 = PyIU_ItemIdxKey_Copy(iik1); if (iik2 == NULL) { return NULL; } PyTuple_SET_ITEM(new_current, i, iik2); } current = new_current; } Py_XINCREF(keyfunc); Py_XSETREF(self->keyfunc, keyfunc); /* No need to incref the "current" because we copied it already! */ Py_XSETREF(self->current, current); self->numactive = numactive; self->reverse = reverse; Py_RETURN_NONE; } static PyObject * merge_lengthhint(PyIUObject_Merge *self, PyObject *Py_UNUSED(args)) { Py_ssize_t i; size_t len = 0; /* TODO: The following cases share a lot of code, maybe the shared lines could be refactored into a helper function.... */ if (self->current == NULL) { /* If we have no current we simply sum the lengths of the iterators. */ for (i = 0; i < PyTuple_GET_SIZE(self->iteratortuple); i++) { PyObject *it = PyTuple_GET_ITEM(self->iteratortuple, i); Py_ssize_t len_tmp = PyObject_LengthHint(it, 0); if (len_tmp == -1) { return NULL; } len += (size_t)len_tmp; /* adding two py_ssize_t values (even when they are MAX) cannot overflow "size_t" so we can simply check if the new len is above "PY_SSIZE_T_MAX" to find out if we have overflow. */ if (len > (size_t)PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_OverflowError, "cannot fit 'int' into an index-sized " "integer"); return NULL; } } } else { /* Add the number of items in "current" to the length, because these were taken from the iterables already. */ len += (size_t)self->numactive; for (i = 0; i < self->numactive; i++) { Py_ssize_t len_tmp; /* We need to avoid the iterators that are already exhausted so just iterate over the "current" and only sum the iterators that are still in the "current". 
*/ PyObject *iik = PyTuple_GET_ITEM(self->current, i); Py_ssize_t idx = ((PyIUObject_ItemIdxKey *)iik)->idx; PyObject *it = PyTuple_GET_ITEM(self->iteratortuple, idx); len_tmp = PyObject_LengthHint(it, 0); if (len_tmp == -1) { return NULL; } /* See above for explanation. */ len += (size_t)len_tmp; if (len > (size_t)PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_OverflowError, "cannot fit 'int' into an index-sized " "integer"); return NULL; } } } return PyLong_FromSize_t(len); } static PyObject * merge_get_reverse(PyIUObject_Merge *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->reverse); } static PyMethodDef merge_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)merge_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)merge_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)merge_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyGetSetDef merge_getsetlist[] = { { "reverse", /* name */ (getter)merge_get_reverse, /* get */ (setter)0, /* set */ merge_prop_reverse_doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; static PyMemberDef merge_memberlist[] = { { "key", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Merge, keyfunc), /* offset */ READONLY, /* flags */ merge_prop_key_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Merge = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.merge", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Merge), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)merge_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence 
*/ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)merge_doc, /* tp_doc */ (traverseproc)merge_traverse, /* tp_traverse */ (inquiry)merge_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)merge_next, /* tp_iternext */ merge_methods, /* tp_methods */ merge_memberlist, /* tp_members */ merge_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)merge_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000F6000081A400000000000000000000000165E3BCDA00000192000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/merge.h#ifndef PYIU_MERGE_H #define PYIU_MERGE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iteratortuple; PyObject *keyfunc; PyObject *current; Py_ssize_t numactive; int reverse; } PyIUObject_Merge; extern PyTypeObject PyIUType_Merge; #ifdef __cplusplus } #endif #endif 070701000000F7000081A400000000000000000000000165E3BCDA0000178D000000000000000000000000000000000000005100000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/minmax.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include 
"minmax.h" #include "helper.h" #define SWAP(x, y) \ do { \ PyObject *tmp = y; \ y = x; \ x = tmp; \ } while (0) PyObject * PyIU_MinMax(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"key", "default", NULL}; PyObject *sequence; PyObject *iterator = NULL; PyObject *defaultitem = NULL; PyObject *keyfunc = NULL; PyObject *item1 = NULL; PyObject *maxitem = NULL; PyObject *maxval = NULL; PyObject *minitem = NULL; PyObject *minval = NULL; PyObject *resulttuple = NULL; int positional = PyTuple_GET_SIZE(args) > 1; if (positional) { sequence = args; } else if (!PyArg_UnpackTuple(args, "minmax", 1, 1, &sequence)) { return NULL; } if (!PyArg_ParseTupleAndKeywords(PyIU_global_0tuple, kwargs, "|OO:minmax", kwlist, &keyfunc, &defaultitem)) { return NULL; } if (keyfunc == Py_None) { keyfunc = NULL; } if (positional && defaultitem != NULL) { PyErr_SetString(PyExc_TypeError, "Cannot specify a default for `minmax` with multiple " "positional arguments"); goto Fail; } iterator = PyObject_GetIter(sequence); if (iterator == NULL) { goto Fail; } while ((item1 = Py_TYPE(iterator)->tp_iternext(iterator))) { PyObject *val1; PyObject *val2; PyObject *item2; int cmp; if (keyfunc != NULL) { val1 = PyIU_CallWithOneArgument(keyfunc, item1); if (val1 == NULL) { Py_DECREF(item1); goto Fail; } } else { Py_INCREF(item1); val1 = item1; } item2 = Py_TYPE(iterator)->tp_iternext(iterator); /* item2 could be NULL (end of sequence) clear a StopIteration but immediately fail if it's another exception. It will check for exceptions in the end (again) but make sure it does not process an iterable when the iterator threw an exception! 
*/ if (item2 == NULL) { if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(item1); Py_DECREF(val1); goto Fail; } Py_INCREF(item1); item2 = item1; Py_INCREF(val1); val2 = val1; } else { if (keyfunc != NULL) { val2 = PyIU_CallWithOneArgument(keyfunc, item2); if (val2 == NULL) { Py_DECREF(item1); Py_DECREF(val1); Py_DECREF(item2); goto Fail; } } else { val2 = item2; Py_INCREF(item2); } } /* maximum value and item are unset; set them. */ if (minval == NULL) { Py_INCREF(item1); Py_INCREF(val1); Py_INCREF(item2); Py_INCREF(val2); minitem = item1; minval = val1; maxitem = item1; maxval = val1; item1 = item2; val1 = val2; } if (val1 != val2) { /* If both are set swap them if val2 is smaller than val1. */ cmp = PyObject_RichCompareBool(val2, val1, Py_LT); if (cmp > 0) { SWAP(val1, val2); SWAP(item1, item2); } else if (cmp < 0) { Py_DECREF(item1); Py_DECREF(item2); Py_DECREF(val1); Py_DECREF(val2); goto Fail; } } /* val1 is smaller or equal to val2 so we compare only val1 with the current minimum. */ cmp = PyObject_RichCompareBool(val1, minval, Py_LT); if (cmp > 0) { SWAP(minval, val1); SWAP(minitem, item1); } else if (cmp < 0) { Py_DECREF(item1); Py_DECREF(item2); Py_DECREF(val1); Py_DECREF(val2); goto Fail; } Py_DECREF(item1); Py_DECREF(val1); val1 = NULL; item1 = NULL; /* Same for maximum. */ cmp = PyObject_RichCompareBool(val2, maxval, Py_GT); if (cmp > 0) { SWAP(maxval, val2); SWAP(maxitem, item2); } else if (cmp < 0) { Py_DECREF(item2); Py_DECREF(val2); goto Fail; } Py_DECREF(item2); Py_DECREF(val2); } Py_DECREF(iterator); iterator = NULL; if (PyIU_ErrorOccurredClearStopIteration()) { goto Fail; } if (minval == NULL) { if (maxval != NULL || minitem != NULL || maxitem != NULL) { /* This should be impossible to reach but better check. 
*/ goto Fail; } if (defaultitem != NULL) { minitem = defaultitem; maxitem = defaultitem; Py_INCREF(defaultitem); Py_INCREF(defaultitem); } else { PyErr_SetString(PyExc_ValueError, "`minmax` `iterable` is an empty sequence"); goto Fail; } } else { Py_DECREF(minval); Py_DECREF(maxval); } resulttuple = PyTuple_Pack(2, minitem, maxitem); Py_DECREF(minitem); Py_DECREF(maxitem); if (resulttuple == NULL) { return NULL; } return resulttuple; Fail: Py_XDECREF(minval); Py_XDECREF(minitem); Py_XDECREF(maxval); Py_XDECREF(maxitem); Py_XDECREF(iterator); return NULL; } 070701000000F8000081A400000000000000000000000165E3BCDA00000114000000000000000000000000000000000000005100000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/minmax.h#ifndef PYIU_MINMAX_H #define PYIU_MINMAX_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_MinMax(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif 070701000000F9000081A400000000000000000000000165E3BCDA000033DB000000000000000000000000000000000000004E00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/nth.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "nth.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( nth_prop_n_doc, "(:py:class:`int`) The index to get (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( nth_doc, "nth(x)\n" "--\n\n" "Class that returns the `n`-th found value.\n" "\n" "Parameters\n" "----------\n" "n : :py:class:`int`\n" " The index of the wanted item. If negative the last item is searched.\n" " \n" " .. note::\n" " This is the only parameter for ``__init__``. 
The following parameters\n" "    have to be specified when calling the instance.\n" "\n" "iterable : iterable\n" "    The `iterable` for which to determine the nth value.\n" "\n" "default : any type, optional\n" "    If no nth value is found and `default` is given the `default` is \n" "    returned.\n" "\n" "pred : callable, optional\n" "    If given return the nth item for which ``pred(item)`` is ``True``.\n" "    \n" "    .. note::\n" "       ``pred=None`` is equivalent to ``pred=bool``.\n" "\n" "truthy : :py:class:`bool`, optional\n" "    If ``False`` search for the nth item for which ``pred(item)`` is ``False``.\n" "    Default is ``True``.\n" "\n" "    .. note::\n" "       Parameter is ignored if `pred` is not given.\n" "\n" "retpred : :py:class:`bool`, optional\n" "    If given return ``pred(item)`` instead of ``item``.\n" "    Default is ``False``.\n" "\n" "    .. note::\n" "       Parameter is ignored if `pred` is not given.\n" "\n" "retidx : :py:class:`bool`, optional\n" "    If given return the index of the `n`-th element instead of the value.\n" "    Default is ``False``.\n" "\n" "Returns\n" "-------\n" "nth : any type\n" "    The last value or the nth value for which `pred` is ``True``.\n" "    If there is no such value then `default` is returned.\n" "\n" "Raises\n" "------\n" "IndexError :\n" "    If there is no nth element and no `default` is given.\n" "\n" "Examples\n" "--------\n" "Some basic examples including the use of ``pred``::\n" "\n" "    >>> from iteration_utilities import nth\n" "    >>> # First item\n" "    >>> nth(0)([0, 1, 2])\n" "    0\n" "    >>> # Second item\n" "    >>> nth(1)([0, 1, 2])\n" "    1\n" "    >>> # Last item\n" "    >>> nth(-1)([0, 1, 2])\n" "    2\n" "    \n" "    >>> nth(1)([0, 10, '', tuple(), 20], pred=bool)\n" "    20\n" "    \n" "    >>> # second odd number\n" "    >>> nth(1)([0, 2, 3, 5, 8, 9, 10], pred=lambda x: x%2)\n" "    5\n" "    \n" "    >>> # default value if empty or no true value\n" "    >>> nth(0)([], default=100)\n" "    100\n" "    >>> nth(-1)([0, 10, 0, 0], pred=bool, default=100)\n" "    10\n" "\n" "Given a `pred` it is also possible to
look for the nth ``False`` value and \n" "return the result of ``pred(item)``::\n" "\n" "    >>> nth(1)([1,2,0], pred=bool)\n" "    2\n" "    >>> nth(-1)([1,0,2,0], pred=bool, truthy=False)\n" "    0\n" "    >>> import operator\n" "    >>> nth(-1)([[0,3], [0,1], [0,2]], pred=operator.itemgetter(1))\n" "    [0, 2]\n" "    >>> nth(-1)([[0,3], [0,1], [0,2]], pred=operator.itemgetter(1), retpred=True)\n" "    2\n" "\n" "There are already four predefined instances:\n" "\n" "- :py:func:`~iteration_utilities.first`: equivalent to ``nth(0)``.\n" "- :py:func:`~iteration_utilities.second`: equivalent to ``nth(1)``.\n" "- :py:func:`~iteration_utilities.third`: equivalent to ``nth(2)``.\n" "- :py:func:`~iteration_utilities.last`: equivalent to ``nth(-1)``.\n"); PyObject * PyIUNth_New(Py_ssize_t index) { PyIUObject_Nth *self; self = PyObject_GC_New(PyIUObject_Nth, &PyIUType_Nth); if (self == NULL) { return NULL; } self->index = index; PyObject_GC_Track(self); return (PyObject *)self; } static PyObject * nth_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { PyIUObject_Nth *self; Py_ssize_t index; if (!PyArg_ParseTuple(args, "n:nth", &index)) { return NULL; } self = (PyIUObject_Nth *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->index = index; return (PyObject *)self; } static void nth_dealloc(PyIUObject_Nth *self) { PyObject_GC_UnTrack(self); Py_TYPE(self)->tp_free(self); } static int nth_traverse(PyIUObject_Nth *self, visitproc visit, void *arg) { return 0; } static int nth_clear(PyIUObject_Nth *self) { return 0; } static PyObject * nth_call(PyIUObject_Nth *self, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "default", "pred", "truthy", "retpred", "retidx", NULL}; PyObject *iterable; PyObject *iterator; PyObject *item; PyObject *defaultitem = NULL; PyObject *func = NULL; PyObject *last = NULL; PyObject *val = NULL; int ok = 0; int truthy = 1; int retpred = 0; int retidx = 0; Py_ssize_t idx; Py_ssize_t nfound = -1; if
(!PyArg_ParseTupleAndKeywords(args, kwargs, "O|OOppp:nth.__call__", kwlist, &iterable, &defaultitem, &func, &truthy, &retpred, &retidx)) { return NULL; } if (func == (PyObject *)&PyBool_Type) { func = Py_None; } if (retpred && retidx) { PyErr_SetString(PyExc_ValueError, "can only specify `retpred` or `retidx` argument " "for `nth`."); return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } /* The loop variable "idx" is only incremented if a suitable item was found. */ for (idx = 0; idx <= self->index || self->index < 0;) { item = Py_TYPE(iterator)->tp_iternext(iterator); /* If the iterator terminates also terminate the loop and remove the last found item (except one looks for the last one "self->index == -1"). */ if (item == NULL) { if (self->index >= 0) { Py_XDECREF(last); last = NULL; } break; } /* In case the index of the found element should be returned we need to increment the "nfound" counter. */ if (retidx) { nfound++; } if (func == NULL) { /* If no function is given we can skip the remainder of the loop and just use the new item.*/ if (last != NULL) { Py_DECREF(last); } last = item; idx++; continue; } else if (func == Py_None) { /* If "None" (or "bool") is given as predicate we don't need to call the function explicitly. */ ok = PyObject_IsTrue(item); } else { /* Otherwise call the function. */ val = PyIU_CallWithOneArgument(func, item); if (val == NULL) { Py_DECREF(iterator); Py_DECREF(item); Py_XDECREF(last); return NULL; } ok = PyObject_IsTrue(val); } /* Compare if the "ok" variable matches the required "truthyness" and replace the last found item with the new found one. */ if (ok == truthy) { if (retpred) { /* If the predicate should be returned we don't need the original item but only keep the result of the function call. */ Py_DECREF(item); if (val == NULL) { /* Predicate was None or bool and no "val" was created. 
*/ val = PyBool_FromLong(ok); } Py_XDECREF(last); last = val; /* Set val to NULL otherwise the next iteration might decref it inadvertently. */ val = NULL; } else { /* Otherwise discard the value from the function call and keep the item from the iterator. */ Py_XDECREF(val); if (last != NULL) { Py_DECREF(last); } last = item; } idx++; } else if (ok < 0) { /* Error happened when calling the function or when comparing to True. */ Py_DECREF(iterator); Py_DECREF(item); Py_XDECREF(val); return NULL; } else { /* The object is not considered suitable and it will be discarded. */ Py_DECREF(item); Py_XDECREF(val); } } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { Py_XDECREF(last); return NULL; } if (last != NULL) { /* We still have a last element (so the loop did not terminate without finding the indexed element). */ if (retidx) { Py_DECREF(last); return PyLong_FromSsize_t(nfound); } return last; } else if (defaultitem != NULL) { /* No last element but a default was given. */ Py_INCREF(defaultitem); return defaultitem; } else { /* No item, no default raises an IndexError. 
*/ PyErr_SetString(PyExc_IndexError, "`iterable` for `nth` does not contain enough values."); return NULL; } } static PyObject * nth_repr(PyIUObject_Nth *self) { return PyUnicode_FromFormat("%s(%zd)", Py_TYPE(self)->tp_name, self->index); } static PyObject * nth_reduce(PyIUObject_Nth *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(n)", Py_TYPE(self), self->index); } static PyMethodDef nth_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)nth_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef nth_memberlist[] = { { "n", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Nth, index), /* offset */ READONLY, /* flags */ nth_prop_n_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Nth = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.nth", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Nth), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)nth_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)nth_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)nth_call, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)nth_doc, /* tp_doc */ (traverseproc)nth_traverse, /* tp_traverse */ (inquiry)nth_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ nth_methods, /* tp_methods */ nth_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ 
(descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)nth_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000FA000081A400000000000000000000000165E3BCDA00000155000000000000000000000000000000000000004E00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/nth.h#ifndef PYIU_NTH_H #define PYIU_NTH_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD Py_ssize_t index; } PyIUObject_Nth; extern PyTypeObject PyIUType_Nth; PyObject * PyIUNth_New(Py_ssize_t index); #ifdef __cplusplus } #endif #endif 070701000000FB000081A400000000000000000000000165E3BCDA00000544000000000000000000000000000000000000004E00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/one.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "one.h" #include "helper.h" PyObject * PyIU_One(PyObject *Py_UNUSED(m), PyObject *iterable) { PyObject *iterator; PyObject *item1; PyObject *item2; iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } item1 = Py_TYPE(iterator)->tp_iternext(iterator); if (item1 == NULL) { Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } PyErr_SetString(PyExc_ValueError, "not enough values to unpack in `one` (expected 1, got 0)"); return NULL; } item2 = Py_TYPE(iterator)->tp_iternext(iterator); if (item2 != NULL) { Py_DECREF(iterator); PyErr_Format(PyExc_ValueError, "too many values to unpack in `one` (expected 1, got '%R, %R[, ...]').", item1, item2); Py_DECREF(item1); Py_DECREF(item2); return NULL; } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(item1); 
return NULL; } return item1; } 070701000000FC000081A400000000000000000000000165E3BCDA000000FD000000000000000000000000000000000000004E00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/one.h#ifndef PYIU_ONE_H #define PYIU_ONE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_One(PyObject *Py_UNUSED(m), PyObject *iterable); #ifdef __cplusplus } #endif #endif 070701000000FD000081A400000000000000000000000165E3BCDA000024AB000000000000000000000000000000000000005100000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/packed.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "packed.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( packed_prop_func_doc, "(callable) The function with packed arguments (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( packed_doc, "packed(func, /)\n" "--\n\n" "Class that always returns ``func(*x)`` when called with ``packed(func)(x)``.\n" "\n" ".. 
versionadded:: 0.3\n" "\n" "Parameters\n" "----------\n" "func : callable\n" "    The function that should be called when the packed-instance is called.\n" "\n" "Examples\n" "--------\n" "Creating :py:class:`~iteration_utilities.packed` instances::\n" "\n" "    >>> from iteration_utilities import packed\n" "    >>> from operator import eq\n" "    >>> eq_packed = packed(eq)\n" "    >>> eq_packed((2, 2))\n" "    True\n" "\n" "This is a convenience class that emulates the behaviour of \n" ":py:func:`itertools.starmap` (compared to :py:func:`map`)::\n" "\n" "    >>> from itertools import starmap\n" "    >>> list(map(packed(eq), [(2, 2), (3, 3), (2, 3)]))\n" "    [True, True, False]\n" "    >>> list(starmap(eq, [(2, 2), (3, 3), (2, 3)]))\n" "    [True, True, False]\n" "\n" "and :py:func:`~iteration_utilities.starfilter` (compared to \n" ":py:func:`filter`)::\n" "\n" "    >>> from iteration_utilities import starfilter\n" "    >>> list(filter(packed(eq), [(2, 2), (3, 3), (2, 3)]))\n" "    [(2, 2), (3, 3)]\n" "    >>> list(starfilter(eq, [(2, 2), (3, 3), (2, 3)]))\n" "    [(2, 2), (3, 3)]\n" "\n" "Of course in these cases the appropriate `star`-function can be used but \n" "in case a function does not have such a convenience function already \n" ":py:class:`~iteration_utilities.packed` can be used.\n"); #if PyIU_USE_VECTORCALL static PyObject *packed_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames); #endif static PyObject * packed_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { PyIUObject_Packed *self; PyObject *func; if (!PyArg_UnpackTuple(args, "packed", 1, 1, &func)) { return NULL; } self = (PyIUObject_Packed *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(func); self->func = func; #if PyIU_USE_VECTORCALL self->vectorcall = packed_vectorcall; #endif return (PyObject *)self; } static void packed_dealloc(PyIUObject_Packed *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->func); Py_TYPE(self)->tp_free(self); } static int packed_traverse(PyIUObject_Packed
*self, visitproc visit, void *arg) { Py_VISIT(self->func); return 0; } static int packed_clear(PyIUObject_Packed *self) { Py_CLEAR(self->func); return 0; } #if PyIU_USE_VECTORCALL static PyObject * packed_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyIUObject_Packed *self; PyObject *packed; PyObject *result; PyObject *small_stack[PyIU_SMALL_ARG_STACK_SIZE]; PyObject **stack = small_stack; int is_tuple; Py_ssize_t num_packed_args; Py_ssize_t num_new_args; Py_ssize_t num_keyword_args = kwnames == NULL ? 0 : PyTuple_Size(kwnames); if (PyVectorcall_NARGS(nargsf) != 1) { PyErr_Format(PyExc_TypeError, "expected one argument."); return NULL; } packed = args[0]; is_tuple = PyTuple_CheckExact(packed); self = (PyIUObject_Packed *)obj; if (!is_tuple && !PyList_CheckExact(packed)) { packed = PySequence_Tuple(packed); if (packed == NULL) { return NULL; } is_tuple = 1; } else { Py_INCREF(packed); } /* From this point on the "packed" is either a list or a tuple. 
*/ if (is_tuple) { num_packed_args = PyTuple_GET_SIZE(packed); } else { num_packed_args = PyList_GET_SIZE(packed); } num_new_args = num_packed_args + num_keyword_args; if (num_new_args > PyIU_SMALL_ARG_STACK_SIZE) { stack = PyIU_AllocatePyObjectArray(num_new_args); if (stack == NULL) { Py_DECREF(packed); return PyErr_NoMemory(); } } // Positional arguments if (is_tuple) { PyIU_CopyTupleToArray(packed, stack, num_packed_args); } else { // list PyIU_CopyListToArray(packed, stack, num_packed_args); } // Keyword arguments memcpy(stack + num_packed_args, args + 1, (num_new_args - num_packed_args) * sizeof(PyObject *)); result = PyIU_PyObject_Vectorcall(self->func, stack, num_packed_args, kwnames); Py_DECREF(packed); if (stack != small_stack) { PyMem_Free(stack); } return result; } #else static PyObject * packed_call(PyIUObject_Packed *self, PyObject *args, PyObject *kwargs) { PyObject *packed; PyObject *newpacked; PyObject *res; if (!PyArg_UnpackTuple(args, "packed.__call__", 1, 1, &packed)) { return NULL; } Py_INCREF(packed); if (!PyTuple_CheckExact(packed)) { newpacked = PySequence_Tuple(packed); Py_DECREF(packed); if (newpacked == NULL) { return NULL; } packed = newpacked; } res = PyObject_Call(self->func, packed, kwargs); Py_DECREF(packed); return res; } #endif static PyObject * packed_repr(PyIUObject_Packed *self) { PyObject *result = NULL; int ok; ok = Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? 
PyUnicode_FromString("...") : NULL; } result = PyUnicode_FromFormat("%s(%R)", Py_TYPE(self)->tp_name, self->func); Py_ReprLeave((PyObject *)self); return result; } static PyObject * packed_reduce(PyIUObject_Packed *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(O)", Py_TYPE(self), self->func); } static PyMethodDef packed_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)packed_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef packed_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Packed, func), /* offset */ READONLY, /* flags */ packed_prop_func_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Packed = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.packed", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Packed), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)packed_dealloc, /* tp_dealloc */ #if PyIU_USE_VECTORCALL offsetof(PyIUObject_Packed, vectorcall), /* tp_vectorcall_offset */ #else (printfunc)0, /* tp_print */ #endif (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)packed_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ #if PyIU_USE_VECTORCALL (ternaryfunc)PyVectorcall_Call, /* tp_call */ #else (ternaryfunc)packed_call, /* tp_call */ #endif (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL | _Py_TPFLAGS_HAVE_VECTORCALL #else | Py_TPFLAGS_HAVE_VECTORCALL #endif #endif , /* tp_flags */ (const char *)packed_doc, /* tp_doc */ (traverseproc)packed_traverse, /* tp_traverse */ 
(inquiry)packed_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ packed_methods, /* tp_methods */ packed_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)packed_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 070701000000FE000081A400000000000000000000000165E3BCDA00000172000000000000000000000000000000000000005100000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/packed.h#ifndef PYIU_PACKED_H #define PYIU_PACKED_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; #if PyIU_USE_VECTORCALL vectorcallfunc vectorcall; #endif } PyIUObject_Packed; extern PyTypeObject PyIUType_Packed; #ifdef __cplusplus } #endif #endif 070701000000FF000081A400000000000000000000000165E3BCDA000084BA000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/partial.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "partial.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "docs_sizeof.h" #include "helper.h" #include "placeholder.h" PyDoc_STRVAR( partial_prop_func_doc, "(callable) Function object to use in future partial calls (readonly)."); PyDoc_STRVAR( partial_prop_args_doc, "(:py:class:`tuple`) arguments for future partial calls (readonly)."); PyDoc_STRVAR( partial_prop_keywords_doc, "(:py:class:`dict`) keyword arguments for future partial calls (readonly)."); 
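The placeholder behaviour that `partial_doc` describes below can be sketched in pure Python. This is an illustrative model only, not the library's API: the names `PLACEHOLDER` and `make_placeholder_partial` are invented here, and the real C type additionally handles pickling, introspection attributes, and unwrapping of nested partials.

```python
PLACEHOLDER = object()  # stands in for the C-level Placeholder singleton


def make_placeholder_partial(func, *args, **kwargs):
    """Illustrative sketch: fix positional arguments, where PLACEHOLDER
    slots are filled from the left by arguments given at call time."""
    positions = [i for i, a in enumerate(args) if a is PLACEHOLDER]

    def wrapper(*call_args, **call_kwargs):
        if len(call_args) < len(positions):
            raise TypeError("not enough values to fill the placeholders")
        merged = list(args)
        # Fill each placeholder slot from the call-time arguments.
        for pos, value in zip(positions, call_args):
            merged[pos] = value
        # Any remaining call-time arguments are appended at the end.
        merged.extend(call_args[len(positions):])
        return func(*merged, **{**kwargs, **call_kwargs})

    return wrapper
```

With this sketch, `make_placeholder_partial(isinstance, PLACEHOLDER, int)` mirrors the `isint = partial(isinstance, partial._, int)` example from the docstring.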
PyDoc_STRVAR( partial_prop_nplaceholders_doc, "(:py:class:`int`) Number of placeholders in the args (readonly)."); PyDoc_STRVAR(partial_prop___dict___doc, ""); PyDoc_STRVAR( partial_doc, "partial(func, *args, **kwargs)\n" "--\n\n" "Like :py:func:`functools.partial` but supporting placeholders.\n" "\n" ".. versionadded:: 0.4.0\n" "\n" "Parameters\n" "----------\n" "\n" "func : callable\n" "    The function to partially wrap.\n" "\n" "args : any type\n" "    The positional arguments for `func`.\n" "    \n" "    .. note::\n" "       Using :py:attr:`.partial._` as one or multiple positional arguments \n" "       will be interpreted as placeholders that need to be filled when the \n" "       :py:class:`~iteration_utilities.partial` instance is called.\n" "\n" "kwargs : any type\n" "    The keyword arguments for `func`.\n" "\n" "Returns\n" "-------\n" "\n" "partial : callable\n" "    The `func` where the given positional arguments are fixed (or represented\n" "    as placeholders) and with optional keyword arguments.\n" "\n" "Notes\n" "-----\n" "While placeholders can be used for the :py:attr:`args` they can't be used \n" "for the :py:attr:`keywords`.\n" "\n" "Examples\n" "--------\n" "The :py:class:`iteration_utilities.partial` can be used as a slightly slower\n" "drop-in replacement for :py:func:`functools.partial`. However, it offers the\n" "possibility to pass in placeholders as positional arguments.
This can be\n" "especially useful if a function does not allow keyword arguments::\n" "\n" " >>> from iteration_utilities import partial\n" " >>> isint = partial(isinstance, partial._, int)\n" " >>> isint(10)\n" " True\n" " >>> isint(11.11)\n" " False\n" "\n" "In this case the `isint` function is equivalent but faster than\n" "``lambda x: isinstance(x, int)``.\n" "The :py:attr:`.partial._` attribute or the \n" ":py:const:`~iteration_utilities.Placeholder` can be used as placeholders \n" "for the positional arguments.\n" "\n" "For example most iterators in :py:mod:`iteration_utilities` take the `iterable` \n" "as the first argument so other arguments can be easily added::\n" "\n" " >>> from iteration_utilities import accumulate, Placeholder\n" " >>> from operator import mul\n" " >>> cumprod = partial(accumulate, Placeholder, mul)\n" " >>> list(cumprod([1,2,3,4,5]))\n" " [1, 2, 6, 24, 120]\n"); #if PyIU_USE_VECTORCALL static PyObject *partial_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames); #endif /****************************************************************************** * Helper to get the amount and positions of Placeholders in a tuple. *****************************************************************************/ static Py_ssize_t PyIUPlaceholder_NumInTuple(PyObject *tup) { Py_ssize_t cnts = 0; Py_ssize_t i; /* Find the placeholders (if any) in the tuple. */ for (i = 0; i < PyTuple_GET_SIZE(tup); i++) { if (PyTuple_GET_ITEM(tup, i) == PYIU_Placeholder) { cnts++; } } return cnts; } static Py_ssize_t * PyIUPlaceholder_PosInTuple(PyObject *tup, Py_ssize_t cnts) { assert(cnts >= 0); Py_ssize_t j = 0; Py_ssize_t i; Py_ssize_t *pos = PyMem_Malloc((size_t)cnts * sizeof(Py_ssize_t)); if (pos == NULL) { PyErr_SetNone(PyExc_MemoryError); return NULL; } /* Find the placeholders (if any) in the tuple. 
*/ for (i = 0; i < PyTuple_GET_SIZE(tup); i++) { if (PyTuple_GET_ITEM(tup, i) == PYIU_Placeholder) { pos[j] = i; j++; } } if (j != cnts) { PyErr_SetString(PyExc_TypeError, "Something went wrong... totally wrong!"); PyMem_Free(pos); return NULL; } return pos; } /****************************************************************************** * Parts are taken from the CPython package (PSF licensed). *****************************************************************************/ static void partial_dealloc(PyIUObject_Partial *self) { PyObject_GC_UnTrack(self); if (self->weakreflist != NULL) { PyObject_ClearWeakRefs((PyObject *)self); } Py_XDECREF(self->fn); Py_XDECREF(self->args); Py_XDECREF(self->kw); Py_XDECREF(self->dict); if (self->posph != NULL) { PyMem_Free(self->posph); } Py_TYPE(self)->tp_free(self); } static int partial_traverse(PyIUObject_Partial *self, visitproc visit, void *arg) { Py_VISIT(self->fn); Py_VISIT(self->args); Py_VISIT(self->kw); Py_VISIT(self->dict); return 0; } static int partial_clear(PyIUObject_Partial *self) { Py_CLEAR(self->fn); Py_CLEAR(self->args); Py_CLEAR(self->kw); Py_CLEAR(self->dict); /* TODO: Is it necessary to clear the self->posph array here? Probably not because it doesn't contain PyObjects ... */ return 0; } static PyObject * partial_new(PyTypeObject *type, PyObject *args, PyObject *kw) { PyObject *func; PyObject *nargs; PyObject *pargs = NULL; PyObject *pkw = NULL; PyIUObject_Partial *self = NULL; Py_ssize_t startslice = 1; if (PyTuple_GET_SIZE(args) < 1) { PyErr_SetString(PyExc_TypeError, "`partial` takes at least one argument"); goto Fail; } /* create PyIUObject_Partial structure */ self = (PyIUObject_Partial *)type->tp_alloc(type, 0); if (self == NULL) { goto Fail; } func = PyTuple_GET_ITEM(args, 0); /* Unwrap the function; if it's another partial and we're not in a subclass and (that's important) there is no custom attribute set (__dict__ = NULL). That means even if the dict was only accessed but empty! 
*/ if (PyIU_IsTypeExact(func, &PyIUType_Partial) && type == &PyIUType_Partial && ((PyIUObject_Partial *)func)->dict == NULL) { Py_ssize_t tuplesize = PyTuple_GET_SIZE(args) - 1; PyIUObject_Partial *part = (PyIUObject_Partial *)func; if (part->numph && tuplesize) { /* Creating a partial from another partial which had placeholders needs special treatment: if positional arguments are given, these will replace the placeholders! */ Py_ssize_t i, stop; pargs = PyIU_TupleCopy(part->args); if (pargs == NULL) { return NULL; } /* Only replace min(part->numph, tuplesize) placeholders, otherwise this will make out-of-bounds memory accesses (besides doing something undefined). */ stop = part->numph > tuplesize ? tuplesize : part->numph; for (i = 0; i < stop; i++) { PyObject *tmp = PyTuple_GET_ITEM(args, i + 1); PyObject *ph = PyTuple_GET_ITEM(pargs, part->posph[i]); Py_INCREF(tmp); PyTuple_SET_ITEM(pargs, part->posph[i], tmp); Py_DECREF(ph); } /* Just alter the startslice so the arguments will be sliced correctly later. It is also a good indicator of whether the pargs need to be decremented later. */ startslice = startslice + stop; } else { pargs = part->args; } pkw = part->kw; func = part->fn; } if (!PyCallable_Check(func)) { PyErr_SetString(PyExc_TypeError, "the first argument for `partial` must be callable"); goto Fail; } self->posph = NULL; self->fn = func; Py_INCREF(func); nargs = PyTuple_GetSlice(args, startslice, PY_SSIZE_T_MAX); if (nargs == NULL) { goto Fail; } if (pargs == NULL || PyTuple_GET_SIZE(pargs) == 0) { /* Save the arguments. */ self->args = nargs; Py_INCREF(nargs); } else if (PyTuple_GET_SIZE(nargs) == 0) { self->args = pargs; Py_INCREF(pargs); } else { self->args = PySequence_Concat(pargs, nargs); if (self->args == NULL) { Py_DECREF(nargs); goto Fail; } } /* Check how many placeholders exist and at which positions.
*/ self->numph = PyIUPlaceholder_NumInTuple(self->args); if (self->numph) { self->posph = PyIUPlaceholder_PosInTuple(self->args, self->numph); if (self->posph == NULL) { goto Fail; } } Py_DECREF(nargs); /* If we already exchanged placeholders we already got a reference to pargs so we need to decrement them once. */ if (startslice != 1) { Py_DECREF(pargs); startslice = 1; /* So the "Fail" won't decrement them again. */ } if (pkw == NULL || PyDict_Size(pkw) == 0) { if (kw == NULL) { self->kw = PyDict_New(); } else if (PYIU_CPYTHON && (Py_REFCNT(kw) == 1)) { Py_INCREF(kw); self->kw = kw; } else { self->kw = PyDict_Copy(kw); } } else { self->kw = PyDict_Copy(pkw); if (kw != NULL && self->kw != NULL) { if (PyDict_Merge(self->kw, kw, 1) != 0) { goto Fail; } } } if (self->kw == NULL) { goto Fail; } #if PyIU_USE_VECTORCALL self->vectorcall = partial_vectorcall; #endif return (PyObject *)self; Fail: if (startslice != 1) { Py_DECREF(pargs); } Py_XDECREF(self); return NULL; } #if PyIU_USE_VECTORCALL static PyObject * partial_vectorcall(PyObject *obj, PyObject *const *args, size_t nargsf, PyObject *kwnames) { PyObject *small_stack[PyIU_SMALL_ARG_STACK_SIZE]; PyObject **stack = small_stack; PyIUObject_Partial *self = (PyIUObject_Partial *)obj; Py_ssize_t n_args = PyVectorcall_NARGS(nargsf); Py_ssize_t n_kwargs = kwnames == NULL ? 
0 : PyTuple_GET_SIZE(kwnames); Py_ssize_t n_self_args = PyTuple_GET_SIZE(self->args); Py_ssize_t n_self_kwargs = PyDict_Size(self->kw); Py_ssize_t n_duplicate_kwargs = 0; PyObject *kwnames_lookup = kwnames; PyObject *final_kwnames = NULL; PyObject *result = NULL; if (n_args < self->numph) { PyErr_SetString(PyExc_TypeError, "not enough values to fill the placeholders in " "`partial`."); return NULL; } Py_ssize_t n_final_args = n_self_args + n_args - self->numph; Py_ssize_t n_final_kwargs = n_self_kwargs + n_kwargs; Py_ssize_t n_final = n_final_args + n_final_kwargs; /* Since n_final doesn't account for duplicate keyword arguments in self->kw and kwnames this will be an overestimate. But I think an overestimate is good enough in most cases. */ if (n_final > PyIU_SMALL_ARG_STACK_SIZE) { stack = PyIU_AllocatePyObjectArray(n_final); if (stack == NULL) { return PyErr_NoMemory(); } } // Fill args PyIU_CopyTupleToArray(self->args, stack, (size_t)n_self_args); // Fill placeholders Py_ssize_t idx; for (idx = 0; idx < self->numph; idx++) { stack[self->posph[idx]] = args[idx]; } // Fill remaining additional args memcpy(stack + n_self_args, args + self->numph, (n_args - self->numph) * sizeof(PyObject *)); Py_ssize_t current_idx = n_final_args; /* In the following we need to ensure that we don't let arbitrary Python code run which might alter the instance. Since we're potentially using the __hash__ and __eq__ of the keyword names the keywords must be unicodes (not subclasses)! The self->kw should always be a real dictionary, so there's (probably) no way this could trigger Python code while iterating over it. 
*/ if (kwnames != NULL) { Py_ssize_t kwname_idx; for (kwname_idx = 0; kwname_idx < n_kwargs; kwname_idx++) { PyObject *kwname = PyTuple_GET_ITEM(kwnames, kwname_idx); if (!PyUnicode_CheckExact(kwname)) { PyErr_SetString(PyExc_TypeError, "keyword names must be strings."); goto CleanUp; } } } /* The check that all keyword names are strings is done in one of the following loops. */ /* In case we have both kwargs in the partial and in the partial call we have to check for duplicates. To ensure that the call doesn't suffer from quadratic lookup behavior when checking if each kwarg in the instance is also present in the kwnames tuple, this creates a set for the kwnames in case we have more than X values in the instance kwargs and more than Y values in the passed kwargs. The values chosen here aren't based on any empirical testing; they are just educated guesses. This is probably unnecessary because it will likely be very rare that so many kwargs are in the instance and in the actual call. */ if (n_self_kwargs > 5 && n_kwargs >= 10) { kwnames_lookup = PyFrozenSet_New(kwnames); if (kwnames_lookup == NULL) { goto CleanUp; } } // Fill in the keywords stored in the instance. if (n_self_kwargs != 0) { PyObject *key; PyObject *value; Py_ssize_t pos = 0; if (kwnames == NULL) { /* No keyword arguments when the partial is called, we can simply use the values. */ while (PyDict_Next(self->kw, &pos, &key, &value)) { if (PyUnicode_CheckExact(key)) { stack[current_idx] = value; current_idx++; } else { PyErr_SetString(PyExc_TypeError, "keyword names must be strings."); goto CleanUp; } } } else { while (PyDict_Next(self->kw, &pos, &key, &value)) { if (PyUnicode_CheckExact(key)) { int ok; ok = PySequence_Contains(kwnames_lookup, key); if (ok == 1) { /* The keyword is also present in the call. Skip it. */ n_duplicate_kwargs++; } else if (ok == 0) { /* The keyword is not present in the call. */ stack[current_idx] = value; current_idx++; } else { /* It's very unlikely that there will be a lookup failure since the kwargs for the instance and the kwargs for the call are already validated by Python's function call infrastructure. However, better to have this in place in case it ever happens... */ goto CleanUp; } } else { PyErr_SetString(PyExc_TypeError, "keyword names must be strings."); goto CleanUp; } } } } /* No special treatment for the keyword arguments passed to the call of the partial. We can simply add them to the stack. */ if (n_kwargs != 0) { Py_ssize_t arg_idx = n_args; for (arg_idx = n_args; arg_idx < n_args + n_kwargs; arg_idx++) { stack[current_idx] = args[arg_idx]; current_idx++; } } /* 4 cases: - self->kwargs && kwargs - self->kwargs && not kwargs - not self->kwargs && kwargs - not self->kwargs && not kwargs The last two cases are easily treated because we can simply use the keyword names that are passed in this call. The first two require more special treatment because the instance kwargs are a dict and need to be converted to a tuple of kwnames. In case the call also has kwargs we also need to account for the duplicate keywords. */ if (n_self_kwargs == 0) { final_kwnames = kwnames; } else { PyObject *key; PyObject *value; Py_ssize_t self_kwargs_pos = 0; Py_ssize_t final_kwnames_idx = 0; /* At this point we know the exact number of keyword arguments without duplicates. So we can create the tuple holding the keyword names. */ final_kwnames = PyTuple_New(n_final_kwargs - n_duplicate_kwargs); if (final_kwnames == NULL) { goto CleanUp; } /* Fill in the keywords stored in the instance. This relies on the fact that the self->kw dictionary hasn't changed between the previous step where we added the values and this step where we use the keys.
*/ if (n_duplicate_kwargs == 0) { while (PyDict_Next(self->kw, &self_kwargs_pos, &key, &value)) { Py_INCREF(key); PyTuple_SET_ITEM(final_kwnames, final_kwnames_idx, key); final_kwnames_idx++; } } else { while (PyDict_Next(self->kw, &self_kwargs_pos, &key, &value)) { int ok; ok = PySequence_Contains(kwnames_lookup, key); if (ok == 0) { Py_INCREF(key); PyTuple_SET_ITEM(final_kwnames, final_kwnames_idx, key); final_kwnames_idx++; } else if (ok == 1) { // Keyword is skipped. } else { goto CleanUp; } } } /* Fill in the keyword argument names from the call. */ if (kwnames != NULL) { Py_ssize_t kwnames_idx; for (kwnames_idx = 0; kwnames_idx < n_kwargs; kwnames_idx++) { PyObject *kwname = PyTuple_GET_ITEM(kwnames, kwnames_idx); Py_INCREF(kwname); PyTuple_SET_ITEM(final_kwnames, final_kwnames_idx, kwname); final_kwnames_idx++; } } } result = PyIU_PyObject_Vectorcall(self->fn, stack, n_final_args, final_kwnames); CleanUp: if (stack != small_stack) { PyMem_Free(stack); } if (kwnames_lookup != kwnames) { Py_DECREF(kwnames_lookup); } if (final_kwnames != kwnames) { Py_XDECREF(final_kwnames); } return result; } #else static PyObject * partial_call(PyIUObject_Partial *self, PyObject *args, PyObject *kw) { PyObject *ret = NULL; PyObject *finalargs = NULL; PyObject *finalkw = NULL; Py_ssize_t num_placeholders = self->numph; Py_ssize_t selfargsize = PyTuple_GET_SIZE(self->args); Py_ssize_t passargsize = PyTuple_GET_SIZE(args); if (selfargsize == 0) { /* No own args, so we can simply use these passed to the function. */ finalargs = args; Py_INCREF(args); } else if (passargsize == 0) { /* No passed arguments, we can simply reuse the own arguments except when these contain placeholders. 
*/ if (num_placeholders) { PyErr_SetString(PyExc_TypeError, "not enough values to fill the placeholders in " "`partial`."); goto Fail; } finalargs = self->args; Py_INCREF(self->args); } else { /* In case both the own arguments and the passed arguments contain at least one item we need to create a new tuple that contains them all (filling potential placeholders). */ if (num_placeholders > passargsize) { PyErr_SetString(PyExc_TypeError, "not enough values to fill the placeholders in " "`partial`."); goto Fail; } /* In theory it would be possible to not create a new tuple for the call but only if the function doesn't keep that tuple (some functions could!). So probably best to always create a new tuple. */ finalargs = PyTuple_New(selfargsize + passargsize - num_placeholders); if (finalargs == NULL) { return NULL; } else { Py_ssize_t i, j; /* Copy the elements from the self->args into the new tuple including the placeholders. */ for (i = 0; i < selfargsize; i++) { PyObject *tmp = PyTuple_GET_ITEM(self->args, i); Py_INCREF(tmp); PyTuple_SET_ITEM(finalargs, i, tmp); } /* Replace the placeholders with the first items of the passed arguments. This doesn't decrement the reference count for the placeholders yet. */ for (i = 0; i < num_placeholders; i++) { PyObject *tmp = PyTuple_GET_ITEM(args, i); Py_INCREF(tmp); PyTuple_SET_ITEM(finalargs, self->posph[i], tmp); } /* Now decrement the placeholders. */ for (i = 0; i < num_placeholders; i++) { Py_DECREF(PYIU_Placeholder); } /* Now insert the remaining items of the passed arguments into the final tuple. */ for (i = num_placeholders, j = selfargsize; i < passargsize; i++, j++) { PyObject *tmp = PyTuple_GET_ITEM(args, i); Py_INCREF(tmp); PyTuple_SET_ITEM(finalargs, j, tmp); } } } if (PyDict_Size(self->kw) == 0) { finalkw = kw; Py_XINCREF(finalkw); } else { finalkw = PyDict_Copy(self->kw); if (finalkw == NULL) { goto Fail; } if (kw != NULL) { if (PyDict_Merge(finalkw, kw, 1) != 0) { goto Fail; } } } /* Actually call the function. 
*/ ret = PyObject_Call(self->fn, finalargs, finalkw); Fail: Py_XDECREF(finalargs); Py_XDECREF(finalkw); return ret; } #endif #if PYIU_PYPY /****************************************************************************** * __dict__ getter and setter * * only needed for python2 or python3 < 3.3 because later there are generic * options available *****************************************************************************/ static PyObject * partial_get_dict(PyIUObject_Partial *self, void *Py_UNUSED(closure)) { if (self->dict == NULL) { self->dict = PyDict_New(); if (self->dict == NULL) { return NULL; } } Py_INCREF(self->dict); return self->dict; } static int partial_set_dict(PyIUObject_Partial *self, PyObject *value, void *Py_UNUSED(closure)) { PyObject *tmp; /* It is illegal to del p.__dict__ */ if (value == NULL) { PyErr_SetString(PyExc_TypeError, "a `partial` object's dictionary may not be deleted"); return -1; } /* Can only set __dict__ to a dictionary */ if (!PyDict_Check(value)) { PyErr_SetString(PyExc_TypeError, "setting `partial` object's dictionary to a non-dict"); return -1; } tmp = self->dict; Py_INCREF(value); self->dict = value; Py_XDECREF(tmp); return 0; } #endif static PyObject * partial_repr(PyIUObject_Partial *self) { PyObject *result = NULL; PyObject *arglist; PyObject *key; PyObject *value; Py_ssize_t n; Py_ssize_t i; int ok; ok = Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? 
PyUnicode_FromString("...") : NULL; } arglist = PyUnicode_FromString(""); if (arglist == NULL) { goto done; } /* Pack positional arguments */ n = PyTuple_GET_SIZE(self->args); for (i = 0; i < n; i++) { PyObject *tmp = PyUnicode_FromFormat("%U, %R", arglist, PyTuple_GET_ITEM(self->args, i)); Py_XSETREF(arglist, tmp); if (arglist == NULL) { goto done; } } /* Pack keyword arguments */ i = 0; while (PyDict_Next(self->kw, &i, &key, &value)) { PyObject *tmp; /* This is mostly a special case because of Python 2, which segfaults for normal strings when used as "%U" in "PyUnicode_FromFormat". However, setstate also allows passing in arbitrary dictionaries with non-string keys. To prevent segfaults in that case this branch is also important for Python 3. */ PyObject *othertmp = PyUnicode_FromObject(key); if (othertmp == NULL) { Py_DECREF(arglist); goto done; } tmp = PyUnicode_FromFormat("%U, %U=%R", arglist, othertmp, value); Py_DECREF(othertmp); Py_XSETREF(arglist, tmp); if (arglist == NULL) { goto done; } } result = PyUnicode_FromFormat("%s(%R%U)", Py_TYPE(self)->tp_name, self->fn, arglist); Py_DECREF(arglist); done: Py_ReprLeave((PyObject *)self); return result; } static PyObject * partial_reduce(PyIUObject_Partial *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(O)(OOOO)", Py_TYPE(self), self->fn, self->fn, self->args, self->kw, self->dict ?
self->dict : Py_None); } static PyObject * partial_setstate(PyIUObject_Partial *self, PyObject *state) { PyObject *fn; PyObject *fnargs; PyObject *kw; PyObject *dict; if (!PyTuple_Check(state) || !PyArg_ParseTuple(state, "OOOO", &fn, &fnargs, &kw, &dict) || !PyCallable_Check(fn) || !PyTuple_Check(fnargs) || (kw != Py_None && !PyDict_Check(kw))) { PyErr_SetString(PyExc_TypeError, "invalid `partial` state"); return NULL; } if (!PyTuple_CheckExact(fnargs)) { fnargs = PySequence_Tuple(fnargs); } else { Py_INCREF(fnargs); } if (fnargs == NULL) { return NULL; } if (kw == Py_None) { kw = PyDict_New(); } else if (!PyDict_CheckExact(kw)) { kw = PyDict_Copy(kw); } else { Py_INCREF(kw); } if (kw == NULL) { Py_DECREF(fnargs); return NULL; } Py_INCREF(fn); if (dict == Py_None) { dict = NULL; } Py_XINCREF(dict); Py_XSETREF(self->fn, fn); Py_XSETREF(self->args, fnargs); Py_XSETREF(self->kw, kw); Py_XSETREF(self->dict, dict); /* Free potentially existing array of positions and recreate it. */ if (self->posph != NULL) { PyMem_Free(self->posph); } self->numph = PyIUPlaceholder_NumInTuple(self->args); if (self->numph) { self->posph = PyIUPlaceholder_PosInTuple(self->args, self->numph); if (self->posph == NULL) { return NULL; } } else { self->posph = NULL; } Py_RETURN_NONE; } static PyObject * partial_sizeof(PyIUObject_Partial *self, PyObject *Py_UNUSED(args)) { Py_ssize_t res; res = sizeof(PyIUObject_Partial); /* Include the size of the posph array. 
*/ res += self->numph * sizeof(Py_ssize_t); return PyLong_FromSsize_t(res); } static PyMethodDef partial_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)partial_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)partial_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, { "__sizeof__", /* ml_name */ (PyCFunction)partial_sizeof, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_sizeof_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; #if PYIU_PYPY static PyGetSetDef partial_getsetlist[] = { { "__dict__", /* name */ (getter)partial_get_dict, /* get */ (setter)partial_set_dict, /* set */ partial_prop___dict___doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; #else static PyGetSetDef partial_getsetlist[] = { { "__dict__", /* name */ PyObject_GenericGetDict, /* get */ PyObject_GenericSetDict, /* set */ partial_prop___dict___doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; #endif static PyMemberDef partial_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Partial, fn), /* offset */ READONLY, /* flags */ partial_prop_func_doc /* doc */ }, { "args", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Partial, args), /* offset */ READONLY, /* flags */ partial_prop_args_doc /* doc */ }, { "keywords", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Partial, kw), /* offset */ READONLY, /* flags */ partial_prop_keywords_doc /* doc */ }, { "num_placeholders", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Partial, numph), /* offset */ READONLY, /* flags */ partial_prop_nplaceholders_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Partial = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.partial", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Partial), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)partial_dealloc, /* 
tp_dealloc */ #if PyIU_USE_VECTORCALL offsetof(PyIUObject_Partial, vectorcall), /* tp_vectorcall_offset */ #else (printfunc)0, /* tp_print */ #endif (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)partial_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ #if PyIU_USE_VECTORCALL (ternaryfunc)PyVectorcall_Call, /* tp_call */ #else (ternaryfunc)partial_call, /* tp_call */ #endif (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)PyObject_GenericSetAttr, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE #if PyIU_USE_VECTORCALL #if PyIU_USE_UNDERSCORE_VECTORCALL | _Py_TPFLAGS_HAVE_VECTORCALL #else | Py_TPFLAGS_HAVE_VECTORCALL #endif #endif , /* tp_flags */ (const char *)partial_doc, /* tp_doc */ (traverseproc)partial_traverse, /* tp_traverse */ (inquiry)partial_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)offsetof(PyIUObject_Partial, weakreflist), /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ partial_methods, /* tp_methods */ partial_memberlist, /* tp_members */ partial_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)offsetof(PyIUObject_Partial, dict), /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)partial_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ };
/* ===== File: iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/partial.h ===== */
#ifndef PYIU_PARTIAL_H #define PYIU_PARTIAL_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h>
#include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *fn; PyObject *args; PyObject *kw; PyObject *dict; PyObject *weakreflist; /* List of weak references */ Py_ssize_t numph; Py_ssize_t *posph; #if PyIU_USE_VECTORCALL vectorcallfunc vectorcall; #endif } PyIUObject_Partial; extern PyTypeObject PyIUType_Partial; #ifdef __cplusplus } #endif #endif
/* ===== File: iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/partition.c ===== */
/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "partition.h" #include "helper.h" PyObject * PyIU_Partition(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "pred", NULL}; PyObject *iterable = NULL; PyObject *func = NULL; PyObject *iterator = NULL; PyObject *result1 = NULL; PyObject *result2 = NULL; PyObject *result = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|O:partition", kwlist, &iterable, &func)) { return NULL; } iterator = PyObject_GetIter(iterable); if (iterator == NULL) { return NULL; } result1 = PyList_New(0); if (result1 == NULL) { goto Fail; } result2 = PyList_New(0); if (result2 == NULL) { goto Fail; } if (func == Py_None || func == (PyObject *)&PyBool_Type) { func = NULL; } for (;;) { PyObject *item; PyObject *temp; int ok; item = Py_TYPE(iterator)->tp_iternext(iterator); if (item == NULL) { break; } if (func == NULL) { temp = item; Py_INCREF(temp); } else { temp = PyIU_CallWithOneArgument(func, item); if (temp == NULL) { Py_DECREF(item); goto Fail; } } ok = PyObject_IsTrue(temp); Py_DECREF(temp); temp = NULL; if (ok == 1) { ok = PyList_Append(result2, item); } else if (ok == 0) { ok = PyList_Append(result1, item); } /* No need to check if the "IsTrue" failed here.
The "ok" variable is reused and the case where "IsTrue" failed and the case where "PyList_Append" failed can be handled in one go after decrementing the item! else { Py_DECREF(item); goto Fail; } */ Py_DECREF(item); item = NULL; if (ok == -1) { goto Fail; } } Py_DECREF(iterator); if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(result1); Py_DECREF(result2); return NULL; } result = PyTuple_Pack(2, result1, result2); Py_DECREF(result1); Py_DECREF(result2); return result; Fail: Py_XDECREF(result1); Py_XDECREF(result2); Py_XDECREF(iterator); return NULL; }
/* ===== File: iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/partition.h ===== */
#ifndef PYIU_PARTITION_H #define PYIU_PARTITION_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_Partition(PyObject *Py_UNUSED(m), PyObject *args, PyObject *kwargs); #ifdef __cplusplus } #endif #endif
/* ===== File: iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/placeholder.c ===== */
/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "placeholder.h" #include <structmember.h> #include "docs_reduce.h" PyDoc_STRVAR( placeholder_doc, "_PlaceholderType(/)\n" "--\n\n" "A placeholder for :py:func:`iteration_utilities.partial`. It defines the\n" "class for :attr:`iteration_utilities.partial._` and \n" ":py:const:`iteration_utilities.Placeholder`.\n" "\n" "Notes\n" "-----\n" "There is only one instance of this class.
And this class cannot be subclassed.\n"); static PyObject * placeholder_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { if (PyTuple_GET_SIZE(args) || (kwargs != NULL && PyDict_Size(kwargs))) { PyErr_Format(PyExc_TypeError, "`%.200s.__new__` takes no arguments.", Placeholder_Type.tp_name); return NULL; } Py_INCREF(PYIU_Placeholder); return PYIU_Placeholder; } static PyObject * placeholder_repr(PyObject *self) { return PyUnicode_FromString("_"); } static PyObject * placeholder_reduce(PyObject *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O()", Py_TYPE(self)); } static PyMethodDef placeholder_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)placeholder_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; PyTypeObject Placeholder_Type = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities._iteration_utilities._PlaceholderType", /* tp_name */ (Py_ssize_t)0, /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)0, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)placeholder_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)0, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT, /* tp_flags */ (const char *)placeholder_doc, /* tp_doc */ (traverseproc)0, /* tp_traverse */ (inquiry)0, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ placeholder_methods, /* tp_methods */ 0, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* 
tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)placeholder_new, /* tp_new */ }; PyObject PlaceholderStruct = PYIU_CREATE_SINGLETON_INSTANCE(Placeholder_Type);
/* ===== File: iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/placeholder.h ===== */
#ifndef PYIU_PLACEHOLDER_H #define PYIU_PLACEHOLDER_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" extern PyObject PlaceholderStruct; extern PyTypeObject Placeholder_Type; #define PYIU_Placeholder (&PlaceholderStruct) #ifdef __cplusplus } #endif #endif
/* ===== File: iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/replicate.c ===== */
/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "replicate.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" PyDoc_STRVAR( replicate_prop_times_doc, "(:py:class:`int`) The number of times each item is replicated (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( replicate_prop_timescurrent_doc, "(:py:class:`int`) A counter indicating how often the current item was " "already replicated (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( replicate_prop_current_doc, "(any type) The item that is currently replicated (readonly).\n" "\n" "Only available if an item has been replicated.\n" "\n" "..
versionadded:: 0.6"); PyDoc_STRVAR( replicate_doc, "replicate(iterable, times)\n" "--\n\n" "Replicates each item in the `iterable` for `times` times.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The iterable which contains the elements to be replicated.\n" "\n" "times : positive :py:class:`int`\n" " The number of `times` each element is replicated.\n" "\n" "Returns\n" "-------\n" "repeated_iterable : generator\n" " A generator containing the replicated items from `iterable`.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import replicate\n" ">>> ''.join(replicate('abc', 3))\n" "'aaabbbccc'\n" "\n" ">>> list(replicate(range(3), 5))\n" "[0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2]\n"); static PyObject * replicate_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "times", NULL}; PyIUObject_Replicate *self; PyObject *iterable; Py_ssize_t times; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "On:replicate", kwlist, &iterable, &times)) { return NULL; } if (times <= 1) { PyErr_Format(PyExc_ValueError, "`times` argument for `replicate` must be greater " "than 1, not `%zd`", times); return NULL; } self = (PyIUObject_Replicate *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->current = NULL; self->repeattotal = times; self->repeatcurrent = 0; return (PyObject *)self; } static void replicate_dealloc(PyIUObject_Replicate *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->current); Py_TYPE(self)->tp_free(self); } static int replicate_traverse(PyIUObject_Replicate *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->current); return 0; } static int replicate_clear(PyIUObject_Replicate *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->current); return 0; } static PyObject * replicate_next(PyIUObject_Replicate *self) { if
(self->current == NULL) { /* First time around we need to get the first element of the iterator to fill the current. */ self->current = Py_TYPE(self->iterator)->tp_iternext(self->iterator); } else if (self->repeatcurrent == self->repeattotal) { /* If we had x repeats then we also need to get the next element, and dereference the old one. */ PyObject *item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); Py_SETREF(self->current, item); self->repeatcurrent = 0; } if (self->current == NULL) { /* In case something unexpected happened or the iterator finished we can stop now. */ if (PyErr_Occurred() && PyErr_ExceptionMatches(PyExc_StopIteration)) { PyErr_Clear(); } return NULL; } /* Otherwise just return the current item. */ self->repeatcurrent++; Py_INCREF(self->current); return self->current; } static PyObject * replicate_reduce(PyIUObject_Replicate *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on current == NULL because otherwise "None" would be ambiguous. It could mean that we did not have a current item or that the current item was None. Better to make an "if" than to introduce another variable depending on current == NULL.
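The repeat-then-advance logic of `replicate_next` above corresponds to this pure-Python sketch (an illustrative equivalent written for clarity, not the shipped C implementation; note the `times` check fires lazily here because it lives in a generator, whereas the C constructor raises eagerly):

```python
def replicate(iterable, times):
    """Yield each item of `iterable` `times` times in a row (sketch)."""
    if times <= 1:
        raise ValueError("`times` argument for `replicate` must be greater than 1")
    for item in iterable:
        # Mirrors the C counter: emit the same item until the repeat
        # count reaches `times`, then advance the underlying iterator.
        for _ in range(times):
            yield item

# ''.join(replicate('abc', 3)) -> 'aaabbbccc'
```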
*/ if (self->current == NULL) { return Py_BuildValue("O(On)", Py_TYPE(self), self->iterator, self->repeattotal); } else { return Py_BuildValue("O(On)(On)", Py_TYPE(self), self->iterator, self->repeattotal, self->current, self->repeatcurrent); } } static PyObject * replicate_setstate(PyIUObject_Replicate *self, PyObject *state) { PyObject *current; Py_ssize_t repeatcurrent; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "On:replicate.__setstate__", &current, &repeatcurrent)) { return NULL; } if (repeatcurrent < 0 || repeatcurrent > self->repeattotal) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the second item " "in the `state` is greater than or equal to zero and below " "the `times` (%zd), not `%zd`.", Py_TYPE(self)->tp_name, self->repeattotal, repeatcurrent); return NULL; } Py_INCREF(current); Py_XSETREF(self->current, current); self->repeatcurrent = repeatcurrent; Py_RETURN_NONE; } static PyObject * replicate_lengthhint(PyIUObject_Replicate *self, PyObject *Py_UNUSED(args)) { Py_ssize_t len = PyObject_LengthHint(self->iterator, 0); if (len == -1) { return NULL; } /* Check if it is safe (no overflow) to multiply it. */ if (len > PY_SSIZE_T_MAX / self->repeattotal) { PyErr_SetString(PyExc_OverflowError, "cannot fit 'int' into an index-sized " "integer"); return NULL; } len *= self->repeattotal; if (self->current != NULL) { /* We need to avoid signed integer overflow so do the operation on size_t instead. "repeattotal" >= "repeatcurrent" so we only deal with positive values here and we're overflow safe when doing the operation on (size_t) because 2*PY_SSIZE_T_MAX is still below SIZE_T_MAX. We also don't need to check for overflow because we can simply return "PyLong_FromSize_t", which will fail when someone else wants it as Py_ssize_t (later).
*/ size_t ulen = (size_t)len; ulen += (size_t)(self->repeattotal - self->repeatcurrent); return PyLong_FromSize_t(ulen); } return PyLong_FromSsize_t(len); } static PyMethodDef replicate_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)replicate_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)replicate_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)replicate_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef replicate_memberlist[] = { { "times", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Replicate, repeattotal), /* offset */ READONLY, /* flags */ replicate_prop_times_doc /* doc */ }, { "timescurrent", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Replicate, repeatcurrent), /* offset */ READONLY, /* flags */ replicate_prop_timescurrent_doc /* doc */ }, { "current", /* name */ T_OBJECT_EX, /* type */ offsetof(PyIUObject_Replicate, current), /* offset */ READONLY, /* flags */ replicate_prop_current_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Replicate = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.replicate", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Replicate), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)replicate_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ 
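The overflow guard in `__length_hint__` above (`len > PY_SSIZE_T_MAX / times` before multiplying) can be emulated in Python, taking `sys.maxsize` as the stand-in for `PY_SSIZE_T_MAX`. This is only an illustration of the guard pattern; Python integers never overflow, and the helper name is hypothetical:

```python
import sys

def guarded_length_hint(hint, times, pending=0):
    """Emulate the C check: refuse results that would not fit Py_ssize_t.

    `pending` stands for the extra `repeattotal - repeatcurrent` items
    of the partially replicated current element.
    """
    # Same division trick as the C code: if hint > MAX // times, then
    # hint * times would exceed the index-sized integer range.
    if hint > sys.maxsize // times:
        raise OverflowError("cannot fit 'int' into an index-sized integer")
    return hint * times + pending
```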
Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)replicate_doc, /* tp_doc */ (traverseproc)replicate_traverse, /* tp_traverse */ (inquiry)replicate_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)replicate_next, /* tp_iternext */ replicate_methods, /* tp_methods */ replicate_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)replicate_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 07070100000106000081A400000000000000000000000165E3BCDA00000195000000000000000000000000000000000000005400000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/replicate.h#ifndef PYIU_REPLICATE_H #define PYIU_REPLICATE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *current; Py_ssize_t repeattotal; Py_ssize_t repeatcurrent; } PyIUObject_Replicate; extern PyTypeObject PyIUType_Replicate; #ifdef __cplusplus } #endif #endif 07070100000107000081A400000000000000000000000165E3BCDA000004CB000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/returnx.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "returnx.h" #include "helper.h" /****************************************************************************** * return_identity : lambda o: o * return_called : lambda o: o() * return_first_arg : (roughly) lambda *args, **kwargs: 
args[0] *****************************************************************************/ PyObject * PyIU_ReturnIdentity(PyObject *Py_UNUSED(m), PyObject *o) { Py_INCREF(o); return o; } PyObject * PyIU_ReturnCalled(PyObject *Py_UNUSED(m), PyObject *o) { return PyIU_CallWithNoArgument(o); } PyObject * PyIU_ReturnFirstArg(PyObject *Py_UNUSED(m), PyObject *args, PyObject *Py_UNUSED(kwargs)) { PyObject *first; if (!PyTuple_CheckExact(args) || PyTuple_GET_SIZE(args) == 0) { PyErr_SetString(PyExc_TypeError, "`return_first_arg` expected at least one positional " "argument."); return NULL; } first = PyTuple_GET_ITEM(args, 0); Py_INCREF(first); return first; } 07070100000108000081A400000000000000000000000165E3BCDA000001B1000000000000000000000000000000000000005200000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/returnx.h#ifndef PYIU_RETURNX_H #define PYIU_RETURNX_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" PyObject * PyIU_ReturnIdentity(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_ReturnCalled(PyObject *Py_UNUSED(m), PyObject *o); PyObject * PyIU_ReturnFirstArg(PyObject *Py_UNUSED(m), PyObject *args, PyObject *Py_UNUSED(kwargs)); #ifdef __cplusplus } #endif #endif 07070100000109000081A400000000000000000000000165E3BCDA00002F35000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/roundrobin.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "roundrobin.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( roundrobin_doc, "roundrobin(*iterables)\n" "--\n\n" "Round-Robin implementation ([0]_).\n" "\n" "Parameters\n" "----------\n" "iterables : 
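The three helpers above have straightforward pure-Python equivalents, matching the lambda descriptions in the comment (a behavioral sketch, not the C code):

```python
def return_identity(o):
    """lambda o: o"""
    return o

def return_called(o):
    """lambda o: o()"""
    return o()

def return_first_arg(*args, **kwargs):
    """(roughly) lambda *args, **kwargs: args[0]"""
    if not args:
        raise TypeError("`return_first_arg` expected at least one "
                        "positional argument.")
    return args[0]
```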
iterable\n" " `Iterables` to combine using the round-robin. Any number of iterables\n" " is supported.\n" "\n" "Returns\n" "-------\n" "roundrobin : generator\n" " Iterable filled with the values of the `iterables`.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import roundrobin\n" ">>> list(roundrobin('ABC', 'D', 'EF'))\n" "['A', 'D', 'E', 'B', 'F', 'C']\n" "\n" "References\n" "----------\n" ".. [0] https://en.wikipedia.org/wiki/Round-robin_scheduling\n"); static PyObject * roundrobin_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { PyIUObject_Roundrobin *self; self = (PyIUObject_Roundrobin *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iteratortuple = PyIU_CreateIteratorTuple(args); if (self->iteratortuple == NULL) { Py_DECREF(self); return NULL; } self->numactive = PyTuple_GET_SIZE(args); self->active = 0; return (PyObject *)self; } static void roundrobin_dealloc(PyIUObject_Roundrobin *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iteratortuple); Py_TYPE(self)->tp_free(self); } static int roundrobin_traverse(PyIUObject_Roundrobin *self, visitproc visit, void *arg) { Py_VISIT(self->iteratortuple); return 0; } static int roundrobin_clear(PyIUObject_Roundrobin *self) { Py_CLEAR(self->iteratortuple); return 0; } static PyObject * roundrobin_next(PyIUObject_Roundrobin *self) { PyObject *iterator; PyObject *item; /* Stop if there is no active iterator left. */ if (self->numactive == 0) { return NULL; } iterator = PyTuple_GET_ITEM(self->iteratortuple, self->active); while ((item = Py_TYPE(iterator)->tp_iternext(iterator)) == NULL) { if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } if (self->active == self->numactive - 1) { /* If the last iterable in the iteratortuple is empty simply set it to NULL and reset the active pointer to 0.
*/ PyTuple_SET_ITEM(self->iteratortuple, self->active, NULL); self->active = 0; } else { /* Otherwise move each item in the tuple (after the empty iterator) one to the left. */ PyIU_TupleRemove(self->iteratortuple, self->active, self->numactive); } self->numactive--; Py_DECREF(iterator); /* End the loop as soon as no active iterators are available or use the next iterator. */ if (self->numactive == 0) { break; } else { iterator = PyTuple_GET_ITEM(self->iteratortuple, self->active); } } if (self->numactive == 0) { return NULL; } /* Increment the active pointer (potentially wrapping it around to 0). */ self->active = (self->active + 1) % (self->numactive); return item; } static PyObject * roundrobin_reduce(PyIUObject_Roundrobin *self, PyObject *Py_UNUSED(args)) { PyObject *ittuple; PyObject *res; /* The "next" method modifies the "iteratortuple" so when someone uses "reduce" they would see how it changes. So we need to return a copy of that tuple here. An additional side-effect of "next" is that exhausted iterators are removed from the tuple, decremented and finally replaced by NULL at the end of the tuple. Python interpreters shouldn't see these NULLs (otherwise that might segfault) so we can simply "copy by slicing" when there are already exhausted iterators inside. */ if (PyTuple_GET_SIZE(self->iteratortuple) != self->numactive) { ittuple = PyIU_TupleGetSlice(self->iteratortuple, self->numactive); } else { ittuple = PyIU_TupleCopy(self->iteratortuple); } /* The error handling for both branches. 
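The active-pointer algorithm of `roundrobin_next` above — advance cyclically, drop exhausted iterators, wrap the pointer when the last one empties — maps onto this pure-Python sketch (an illustrative equivalent, not the C implementation):

```python
def roundrobin(*iterables):
    """Yield items from the given iterables alternately (sketch)."""
    iterators = [iter(it) for it in iterables]
    active = 0
    while iterators:
        try:
            item = next(iterators[active])
        except StopIteration:
            # Drop the exhausted iterator; if it was the last slot,
            # wrap the active pointer back to the front.
            del iterators[active]
            if active >= len(iterators):
                active = 0
            continue
        yield item
        # Increment the active pointer, potentially wrapping to 0.
        active = (active + 1) % len(iterators)

# list(roundrobin('ABC', 'D', 'EF')) -> ['A', 'D', 'E', 'B', 'F', 'C']
```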
*/ if (ittuple == NULL) { return NULL; } res = Py_BuildValue("OO(nn)", Py_TYPE(self), ittuple, self->numactive, self->active); Py_DECREF(ittuple); return res; } static PyObject * roundrobin_setstate(PyIUObject_Roundrobin *self, PyObject *state) { Py_ssize_t numactive, active; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "nn:roundrobin.__setstate__", &numactive, &active)) { return NULL; } /* active and numactive must not be negative, otherwise the "next" method could access out-of-bounds indices for the iteratortuple. */ if (active < 0 || numactive < 0) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the first (%zd) and " "second (%zd) arguments in the `state` are not negative.", Py_TYPE(self)->tp_name, numactive, active); return NULL; } /* If numactive is not zero then the active must be strictly smaller than numactive, otherwise the next "next" call would access an out of bounds index of the iteratortuple (or NULL). If "numactive" is zero then "active" must be zero as well (in this case it must not be greater but equal). */ if (numactive != 0 && active >= numactive) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the first (%zd) " "argument in the `state` is strictly greater than the " "second (%zd) argument, if the first argument isn't zero.", Py_TYPE(self)->tp_name, numactive, active); return NULL; } else if (numactive == 0 && active != 0) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the second (%zd) " "argument in the `state` is zero if the first " "(%zd) argument is zero.", Py_TYPE(self)->tp_name, active, numactive); return NULL; } /* The "numactive" argument must match the number of non-NULL arguments in the iteratortuple. Luckily the NULLs are at the end of the "iteratortuple".
*/ if (1) { Py_ssize_t tupsize = PyTuple_GET_SIZE(self->iteratortuple); /* decrement the tuple size as long as the last item is NULL. */ while (tupsize > 0 && PyTuple_GET_ITEM(self->iteratortuple, tupsize - 1) == NULL) { tupsize--; } if (numactive != tupsize) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the first " "argument in the `state` (%zd) is equal to the number " "of not exhausted iterators (%zd) in the instance.", Py_TYPE(self)->tp_name, numactive, tupsize); return NULL; } } self->numactive = numactive; self->active = active; Py_RETURN_NONE; } static PyObject * roundrobin_lengthhint(PyIUObject_Roundrobin *self, PyObject *Py_UNUSED(args)) { Py_ssize_t i; size_t len = 0; for (i = 0; i < self->numactive; i++) { PyObject *it = PyTuple_GET_ITEM(self->iteratortuple, i); Py_ssize_t len_tmp = PyObject_LengthHint(it, 0); if (len_tmp == -1) { return NULL; } /* The logic to avoid overflow is the same as in merge. Basically adding the current length + next iterator length cannot lead to overflow for size_t because we check after each addition if the current length goes above py_ssize_t maximum. 
*/ len += (size_t)len_tmp; if (len > (size_t)PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_OverflowError, "cannot fit 'int' into an index-sized " "integer"); return NULL; } } return PyLong_FromSsize_t(len); } static PyMethodDef roundrobin_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)roundrobin_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)roundrobin_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)roundrobin_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; PyTypeObject PyIUType_Roundrobin = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.roundrobin", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Roundrobin), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)roundrobin_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)roundrobin_doc, /* tp_doc */ (traverseproc)roundrobin_traverse, /* tp_traverse */ (inquiry)roundrobin_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)roundrobin_next, /* tp_iternext */ roundrobin_methods, /* tp_methods */ 0, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* 
tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)roundrobin_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 0707010000010A000081A400000000000000000000000165E3BCDA0000017E000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/roundrobin.h#ifndef PYIU_ROUNDROBIN_H #define PYIU_ROUNDROBIN_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iteratortuple; Py_ssize_t numactive; Py_ssize_t active; } PyIUObject_Roundrobin; extern PyTypeObject PyIUType_Roundrobin; #ifdef __cplusplus } #endif #endif 0707010000010B000081A400000000000000000000000165E3BCDA000042A8000000000000000000000000000000000000004F00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/seen.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "seen.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( seen_prop_seenset_doc, "(:py:class:`set`) The (hashable) seen values (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( seen_prop_seenlist_doc, "(:py:class:`list` or None) The (unhashable) seen values (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( seen_doc, "Seen(seenset=None, seenlist=None)\n" "--\n\n" "Helper class which adds the items after each :py:meth:`.contains_add` check.\n" "\n" "Parameters\n" "----------\n" "seenset : :py:class:`set` or None, optional\n" " A :py:class:`set` containing initial values.\n" "\n" "seenlist : :py:class:`list` or None, optional\n" " A :py:class:`list` containing only unhashable initial values.\n" " \n" " .. 
note::\n" " The `seenlist` should not contain hashable values (these will \n" " be ignored for all practical purposes)!\n" "\n" "Examples\n" "--------\n" "This class adds each item after :py:meth:`.contains_add` call but also \n" "supports normal :py:meth:`in <.__contains__>` operations::\n" "\n" " >>> from iteration_utilities import Seen\n" " >>> x = Seen()\n" " >>> # normal \"in\" operations do not add the element to the instance\n" " >>> 1 in x\n" " False\n" " >>> 1 in x\n" " False\n" " \n" " >>> # \"contains_add\" checks if the item is contained but also adds it\n" " >>> x.contains_add(2)\n" " False\n" " >>> x.contains_add(2)\n" " True\n" " >>> x\n" " iteration_utilities.Seen({2})\n" " \n" " >>> x.contains_add([1, 2])\n" " False\n" " >>> [1, 2] in x\n" " True\n" " >>> x\n" " iteration_utilities.Seen({2}, seenlist=[[1, 2]])\n" "\n" "This class only supports :py:meth:`in <.__contains__>`, \n" ":py:meth:`== <.__eq__>`, :py:meth:`\\!= <.__ne__>` and \n" ":py:meth:`len <.__len__>`.\n" "It is mostly included because it unifies the code in \n" ":py:func:`~iteration_utilities.duplicates`,\n" ":py:func:`~iteration_utilities.unique_everseen`, and \n" ":py:func:`~iteration_utilities.all_distinct` and might be useful in other \n" "applications.\n"); PyDoc_STRVAR( seen_containsadd_doc, "contains_add($self, o, /)\n" "--\n\n" "Check if `o` is already contained in `self` and return the result.\n" "But also adds `o` to `self` if it's not contained.\n" "\n" "Parameters\n" "----------\n" "o : any type\n" " The object to check if it's contained in `self` and added to\n" " `self` if not.\n" "\n" "Returns\n" "-------\n" "contained : :py:class:`bool`\n" " ``True`` if `o` is contained in `self` otherwise ``False``.\n" "\n" "Examples\n" "--------\n" "A simple example::\n" "\n" " >>> from iteration_utilities import Seen\n" " >>> x = Seen()\n" " >>> 10 in x\n" " False\n" " >>> x.contains_add(10)\n" " False\n" " >>> 10 in x\n" " True\n" " >>> x.contains_add(10)\n" " True\n" " >>> x\n"
" iteration_utilities.Seen({10})\n"); /****************************************************************************** * * Helper class that wraps a set and list. This class is simply for convenience * so "contains and add if not contained"-operations are separated from the * logic of "unique_everseen" and "all_distinct" doesn't need to contain it. * * TODO: This refactoring slowed down the code a bit (not-negligible in my * opinion) but it makes it much more concise. Need to check for * possibilities to improve performance. * * Public macros: * - PyIUSeen_Check(PyObject*) * * Public functions: * - PyIUSeen_New(void) -> PyObject* * - PyIUSeen_Size(PyIUObject_Seen*) -> Py_ssize_t * - PyIUSeen_Contains(PyIUObject_Seen*, PyObject*) -> int * (-1 failure, 0 not contained, 1 contained) *****************************************************************************/ /****************************************************************************** * Creates a new PyIUSeen object with empty seenset and seenlist. * Returns ``NULL`` on failure with the appropriate exception. *****************************************************************************/ PyObject * PyIUSeen_New(void) { /* Create and fill new object.
*/ PyIUObject_Seen *self; PyObject *seenset; seenset = PySet_New(NULL); if (seenset == NULL) { return NULL; } self = PyObject_GC_New(PyIUObject_Seen, &PyIUType_Seen); if (self == NULL) { Py_DECREF(seenset); return NULL; } self->seenset = seenset; self->seenlist = NULL; PyObject_GC_Track(self); return (PyObject *)self; } static PyObject * seen_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"seenset", "seenlist", NULL}; PyIUObject_Seen *self; PyObject *seenset = NULL; PyObject *seenlist = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "|OO:Seen", kwlist, &seenset, &seenlist)) { return NULL; } if (seenset == Py_None) { seenset = NULL; } if (seenlist == Py_None) { seenlist = NULL; } if (seenset == NULL) { seenset = PySet_New(NULL); if (seenset == NULL) { return NULL; } } else { if (!PyIU_IsTypeExact(seenset, &PySet_Type)) { PyErr_Format(PyExc_TypeError, "`seenset` argument for `Seen` must be a set or " "None, not `%.200s`.", Py_TYPE(seenset)->tp_name); return NULL; } Py_INCREF(seenset); } if (seenlist != NULL && !PyList_CheckExact(seenlist)) { PyErr_Format(PyExc_TypeError, "`seenlist` argument for `Seen` must be a list or None, " "not `%.200s`.", Py_TYPE(seenlist)->tp_name); Py_DECREF(seenset); return NULL; } self = (PyIUObject_Seen *)type->tp_alloc(type, 0); if (self == NULL) { Py_DECREF(seenset); return NULL; } Py_XINCREF(seenlist); self->seenset = seenset; self->seenlist = seenlist; return (PyObject *)self; } static void seen_dealloc(PyIUObject_Seen *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->seenset); Py_XDECREF(self->seenlist); Py_TYPE(self)->tp_free((PyObject *)self); } static int seen_traverse(PyIUObject_Seen *self, visitproc visit, void *arg) { Py_VISIT(self->seenset); Py_VISIT(self->seenlist); return 0; } static int seen_clear(PyIUObject_Seen *self) { Py_CLEAR(self->seenset); Py_CLEAR(self->seenlist); return 0; } static PyObject * seen_repr(PyIUObject_Seen *self) { PyObject *repr; int ok; ok = 
Py_ReprEnter((PyObject *)self); if (ok != 0) { return ok > 0 ? PyUnicode_FromString("...") : NULL; } if (self->seenlist != NULL && PyList_GET_SIZE(self->seenlist) > 0) { repr = PyUnicode_FromFormat("%s(%R, seenlist=%R)", Py_TYPE(self)->tp_name, self->seenset, self->seenlist); } else { repr = PyUnicode_FromFormat("%s(%R)", Py_TYPE(self)->tp_name, self->seenset); } Py_ReprLeave((PyObject *)self); return repr; } static PyObject * seen_richcompare(PyObject *v, PyObject *w, int op) { PyIUObject_Seen *l; PyIUObject_Seen *r; int ok; /* Only allow == and != for now. */ switch (op) { case Py_EQ: case Py_NE: break; default: Py_RETURN_NOTIMPLEMENTED; } if (!PyIUSeen_Check(v) || !(PyIUSeen_Check(w))) { PyErr_SetString(PyExc_TypeError, "`Seen` instances can only be compared to other `Seen` " "instances."); return NULL; } l = (PyIUObject_Seen *)v; r = (PyIUObject_Seen *)w; /* Check if either both have seenlists or none. */ if ((l->seenlist == NULL && r->seenlist != NULL && PyList_GET_SIZE(r->seenlist)) || (r->seenlist == NULL && l->seenlist != NULL && PyList_GET_SIZE(l->seenlist))) { if (op == Py_NE) { Py_RETURN_TRUE; } else { Py_RETURN_FALSE; } /* If both have seenlists then compare them. */ } else if (l->seenlist != NULL && r->seenlist != NULL) { ok = PyObject_RichCompareBool(l->seenlist, r->seenlist, op); if (op == Py_EQ && ok == 0) { Py_RETURN_FALSE; } else if (op == Py_NE && ok == 1) { Py_RETURN_TRUE; } else if (ok == -1) { return NULL; } } ok = PyObject_RichCompareBool(l->seenset, r->seenset, op); if (ok == 1) { Py_RETURN_TRUE; } else if (ok == 0) { Py_RETURN_FALSE; } else { return NULL; } } static PyObject * seen_reduce(PyIUObject_Seen *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OO)", Py_TYPE(self), self->seenset, self->seenlist ? self->seenlist : Py_None); } /****************************************************************************** * Len * * May not be overflow safe ...
*****************************************************************************/ Py_ssize_t PyIUSeen_Size(PyIUObject_Seen *self) { assert(self != NULL); if (self->seenlist != NULL) { return PySet_Size(self->seenset) + PyList_GET_SIZE(self->seenlist); } else { return PySet_Size(self->seenset); } } static Py_ssize_t seen_len(PyObject *self) { return PyIUSeen_Size((PyIUObject_Seen *)self); } /****************************************************************************** * ContainsAdd * * Checks if the object is contained in seenset or seenlist and returns * 1 - if the item was found * 0 - if the item was not found * -1 - if some exception happened. *****************************************************************************/ static int seen_containsadd_direct(PyIUObject_Seen *self, PyObject *o) { int ok; Py_ssize_t oldsize = PySet_GET_SIZE(self->seenset); ok = PySet_Add(self->seenset, o); if (ok == 0) { /* No error: If the size of the set hasn't changed then the item was contained in the set already. */ return PySet_GET_SIZE(self->seenset) == oldsize ? 1 : 0; } else { /* Clear TypeErrors because they are thrown if the object is unhashable. */ if (PyErr_Occurred()) { if (PyErr_ExceptionMatches(PyExc_TypeError)) { PyErr_Clear(); } else { return -1; } } if (self->seenlist == NULL && !(self->seenlist = PyList_New(0))) { return -1; } ok = PySequence_Contains(self->seenlist, o); if (ok == 1) { /* Unhashable, found */ return 1; } else if (ok == 0) { /* Unhashable, not found */ if (PyList_Append(self->seenlist, o) == -1) { return -1; } return 0; } else { /* Unhashable and exception when looking it up in the list. */ return -1; } } } static int seen_containsnoadd_direct(PyIUObject_Seen *self, PyObject *o) { int ok = PySet_Contains(self->seenset, o); if (ok != -1) { return ok; } else { /* Clear TypeErrors because they are thrown if the object is unhashable. 
*/ if (PyErr_Occurred()) { if (PyErr_ExceptionMatches(PyExc_TypeError)) { PyErr_Clear(); } else { return -1; } } if (self->seenlist == NULL) { return 0; } return PySequence_Contains(self->seenlist, o); } } int PyIUSeen_ContainsAdd(PyObject *self, PyObject *o) { assert(self != NULL && PyIU_IsTypeExact(self, &PyIUType_Seen)); return seen_containsadd_direct((PyIUObject_Seen *)self, o); } static PyObject * seen_containsadd(PyObject *self, PyObject *o) { int ok; ok = seen_containsadd_direct((PyIUObject_Seen *)self, o); if (ok == 0) { Py_RETURN_FALSE; } else if (ok == 1) { Py_RETURN_TRUE; } else { return NULL; } } static PySequenceMethods seen_as_sequence = { (lenfunc)seen_len, /* sq_length */ (binaryfunc)0, /* sq_concat */ (ssizeargfunc)0, /* sq_repeat */ (ssizeargfunc)0, /* sq_item */ (void *)0, /* unused */ (ssizeobjargproc)0, /* sq_ass_item */ (void *)0, /* unused */ (objobjproc)seen_containsnoadd_direct, /* sq_contains */ (binaryfunc)0, /* sq_inplace_concat */ (ssizeargfunc)0, /* sq_inplace_repeat */ }; static PyMethodDef seen_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)seen_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "contains_add", /* ml_name */ (PyCFunction)seen_containsadd, /* ml_meth */ METH_O, /* ml_flags */ seen_containsadd_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef seen_memberlist[] = { { "seenset", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Seen, seenset), /* offset */ READONLY, /* flags */ seen_prop_seenset_doc /* doc */ }, { "seenlist", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Seen, seenlist), /* offset */ READONLY, /* flags */ seen_prop_seenlist_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Seen = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.Seen", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Seen), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)seen_dealloc, /* tp_dealloc */ 
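The `contains_add` strategy above — try the hash-based set first, and on a `TypeError` (unhashable value) fall back to a linear scan of the list — can be summarized in this pure-Python sketch (an illustrative equivalent of the behavior, not the C implementation):

```python
class Seen:
    """Sketch of the Seen helper: a set plus a list fallback for
    unhashable values."""

    def __init__(self):
        self._seenset = set()
        self._seenlist = []

    def __contains__(self, o):
        # Membership check without adding (like seen_containsnoadd_direct).
        try:
            return o in self._seenset
        except TypeError:  # unhashable -> linear scan of the list
            return o in self._seenlist

    def contains_add(self, o):
        # Return whether `o` was already seen, and remember it if not.
        try:
            if o in self._seenset:
                return True
            self._seenset.add(o)
            return False
        except TypeError:  # unhashable -> use the list fallback
            if o in self._seenlist:
                return True
            self._seenlist.append(o)
            return False
```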
(printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)seen_repr, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)&seen_as_sequence, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)0, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)seen_doc, /* tp_doc */ (traverseproc)seen_traverse, /* tp_traverse */ (inquiry)seen_clear, /* tp_clear */ (richcmpfunc)seen_richcompare, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)0, /* tp_iter */ (iternextfunc)0, /* tp_iternext */ seen_methods, /* tp_methods */ seen_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)seen_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 0707010000010C000081A400000000000000000000000165E3BCDA00000212000000000000000000000000000000000000004F00000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/seen.h#ifndef PYIU_SEEN_H #define PYIU_SEEN_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *seenset; PyObject *seenlist; } PyIUObject_Seen; extern PyTypeObject PyIUType_Seen; #define PyIUSeen_Check(o) (PyObject_TypeCheck(o, &PyIUType_Seen)) PyObject * PyIUSeen_New(void); Py_ssize_t PyIUSeen_Size(PyIUObject_Seen *self); int PyIUSeen_ContainsAdd(PyObject *self, PyObject *o); #ifdef __cplusplus } #endif #endif 
0707010000010D000081A400000000000000000000000165E3BCDA0000467E000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/sideeffect.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "sideeffect.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( sideeffects_prop_func_doc, "(callable) The function that is called by `sideeffects` (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( sideeffects_prop_times_doc, "(:py:class:`int`) A counter indicating after how many items the `func` " "is called (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( sideeffects_prop_count_doc, "(:py:class:`int`) The current count for the next `func` call (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( sideeffects_doc, "sideeffects(iterable, func, times=0)\n" "--\n\n" "Does a normal iteration over `iterable` and only calls `func` every `times` \n" "items for its side effects.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " `Iterable` containing the elements.\n" "\n" "func : callable\n" " Function that is called for the side effects.\n" "\n" "times : :py:class:`int`, optional\n" " Call the function every `times` items with the last `times` items. \n" " If ``0`` the argument for `func` will be the item itself.
For any \n" " number greater than zero the argument will be a tuple.\n" " Default is ``0``.\n" "\n" "Returns\n" "-------\n" "iterator : generator\n" " A normal iterator over `iterable`.\n" "\n" "Examples\n" "--------\n" "A simple example::\n" "\n" " >>> from iteration_utilities import sideeffects\n" " >>> list(sideeffects([1,2,3,4], print))\n" " 1\n" " 2\n" " 3\n" " 4\n" " [1, 2, 3, 4]\n" " >>> list(sideeffects([1,2,3,4,5], print, 2))\n" " (1, 2)\n" " (3, 4)\n" " (5,)\n" " [1, 2, 3, 4, 5]\n"); static PyObject * sideeffects_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "func", "times", NULL}; PyIUObject_Sideeffects *self; PyObject *iterable; PyObject *func; Py_ssize_t times = 0; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO|n:sideeffects", kwlist, &iterable, &func, &times)) { return NULL; } self = (PyIUObject_Sideeffects *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->times = times <= 0 ? 0 : times; if (times <= 0) { /* negative values will be interpreted as zero...
*/ self->collected = NULL; } else { self->collected = PyTuple_New(self->times); if (self->collected == NULL) { Py_DECREF(self); return NULL; } } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } Py_INCREF(func); self->func = func; self->count = 0; return (PyObject *)self; } static void sideeffects_dealloc(PyIUObject_Sideeffects *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->func); Py_XDECREF(self->collected); Py_TYPE(self)->tp_free(self); } static int sideeffects_traverse(PyIUObject_Sideeffects *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->func); Py_VISIT(self->collected); return 0; } static int sideeffects_clear(PyIUObject_Sideeffects *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->func); Py_CLEAR(self->collected); return 0; } static PyObject * sideeffects_next(PyIUObject_Sideeffects *self) { PyObject *item; PyObject *temp = NULL; PyObject *tmptuple = NULL; Py_ssize_t i; item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); if (item == NULL) { /* We don't expect that the sideeffect function is called when an exception other than StopIteration is raised by the iterator so exit early in that case. */ if (PyIU_ErrorOccurredClearStopIteration()) { return NULL; } if (self->count != 0) { /* Call function with the remaining items. */ tmptuple = PyIU_TupleGetSlice(self->collected, self->count); if (tmptuple == NULL) { return NULL; } temp = PyIU_CallWithOneArgument(self->func, tmptuple); Py_DECREF(tmptuple); if (temp != NULL) { Py_DECREF(temp); } /* The case where temp == NULL is handled by the following "return NULL" anyway so it does not need to be a special case here.
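The batching behaviour implemented by this `next` logic can be modelled in pure Python; the generator below is a hypothetical sketch (not the library implementation) that mirrors when `func` is called, including the final call with the leftover items on exhaustion:

```python
def sideeffects_sketch(iterable, func, times=0):
    """Pure-Python model of the sideeffects iterator logic."""
    if times <= 0:
        for item in iterable:
            func(item)            # times == 0: func gets each item itself
            yield item
        return
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == times:   # batch full: call func with a tuple
            func(tuple(batch))
            batch = []
        yield item
    if batch:                     # iterator exhausted: flush the remainder
        func(tuple(batch))
```

For example, `list(sideeffects_sketch([1, 2, 3, 4, 5], calls.append, 2))` records `(1, 2)`, `(3, 4)` and finally the partial batch `(5,)`, matching the docstring example above.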
*/ } return NULL; } if (self->times == 0) { /* Always call the function if times == 0 */ temp = PyIU_CallWithOneArgument(self->func, item); if (temp == NULL) { goto Fail; } else { Py_DECREF(temp); } } else { Py_INCREF(item); /* Add the item to the collected tuple and call the function if count == times after incrementing the count. */ PyTuple_SET_ITEM(self->collected, self->count, item); self->count++; if (self->count == self->times) { self->count = 0; temp = PyIU_CallWithOneArgument(self->func, self->collected); if (temp == NULL) { goto Fail; } else { Py_DECREF(temp); } /* Try to reuse collected if possible. In this case the "funcargs" and the class own a reference to collected so we can only reuse the collected tuple IF nobody except the instance owns the "funcargs". This can be up to 40-50% faster for small "times" values. Even for relatively bigger ones this is still 10% faster. To avoid needing to decrement the values in the tuple while iterating these are simply set to NULL. */ if (PYIU_CPYTHON && (Py_REFCNT(self->collected) == 1)) { for (i = 0; i < self->times; i++) { temp = PyTuple_GET_ITEM(self->collected, i); PyTuple_SET_ITEM(self->collected, i, NULL); Py_DECREF(temp); } } else { PyObject *new_collected = PyTuple_New(self->times); if (new_collected == NULL) { goto Fail; } Py_SETREF(self->collected, new_collected); } } } return item; Fail: Py_XDECREF(item); return NULL; } static PyObject * sideeffects_reduce(PyIUObject_Sideeffects *self, PyObject *Py_UNUSED(args)) { PyObject *collected; PyObject *res; /* There are several issues that prevent from simply wrapping the attributes. */ if (self->collected == NULL) { /* When "collected" is NULL we wrap it as None, and no further processing is needed. */ Py_INCREF(Py_None); collected = Py_None; } else { /* When we have "collected" then it's a tuple that may contain NULLs. The Python interpreter does not like NULLs so these must be replaced by some fillvalue (in this case None). 
However we modify the tuple inside the "next" method so if someone called "reduce" that person could **see** the "collected" tuple change. That must be avoided so we MUST return a copy of the "collected" tuple. */ Py_ssize_t i; Py_ssize_t collected_size = PyTuple_GET_SIZE(self->collected); collected = PyTuple_New(collected_size); if (collected == NULL) { return NULL; } for (i = 0; i < collected_size; i++) { PyObject *tmp = PyTuple_GET_ITEM(self->collected, i); if (tmp == NULL) { tmp = Py_None; } Py_INCREF(tmp); PyTuple_SET_ITEM(collected, i, tmp); } } res = Py_BuildValue("O(OOn)(nO)", Py_TYPE(self), self->iterator, self->func, self->times, self->count, collected); Py_DECREF(collected); return res; } static PyObject * sideeffects_setstate(PyIUObject_Sideeffects *self, PyObject *state) { Py_ssize_t count; PyObject *collected; PyObject *newcollected = NULL; Py_ssize_t collected_size = 0; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "nO:sideeffects.__setstate__", &count, &collected)) { return NULL; } /* The "collected" argument should be a tuple (because we use PyTuple_GET_ITEM and PyTuple_SET_ITEM and thus would risk segmentation faults if we don't check that it's a tuple) or None. */ if (PyTuple_CheckExact(collected)) { /* The class itself has a "times" attribute, if that attribute is zero we do not need a "collected" tuple, it should have been "None". */ if (self->times == 0) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected `None` as second " "argument in the `state` when `self->times == 0`, " "got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(collected)->tp_name); return NULL; } /* The "count" must not be negative or bigger/equal to the size of the "collected" tuple. Otherwise we would access indices that are out of bounds for the tuple in "next". 
*/ collected_size = PyTuple_GET_SIZE(collected); if (count < 0 || count >= collected_size) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the first " "argument in the `state` (%zd) is not negative and " "smaller than the length of the second argument " "(%zd).", Py_TYPE(self)->tp_name, count, collected_size); return NULL; } /* The length of the "collected" tuple must also be equal to the "self->times" attribute. */ if (self->times != collected_size) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the second " "argument in the `state` has a length (%zd) " "equal to the `self->times` (%zd) attribute.", Py_TYPE(self)->tp_name, collected_size, self->times); return NULL; } } else if (collected == Py_None) { /* We only expect None if self->times and count is zero. */ if (count != 0 || self->times != 0) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple` as second " "argument in the `state` when `self->times != 0` or " "the first argument in the `state` is not zero, " "got None", Py_TYPE(self)->tp_name); return NULL; } } else { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple` or `None` as " "second argument in the `state`, got %.200s", Py_TYPE(self)->tp_name, Py_TYPE(collected)->tp_name); return NULL; } /* In any case we need to process the "collected" value. In case it is "None" we simply set it to NULL. However if it's not None then it's a tuple. We process the tuple in the "next" function but it's possible that someone still holds a reference to the tuple he passed in. So to make sure that we don't mutate tuples that are in use elsewhere we create a new tuple here. That also has the additional advantage that we can leave the values with index below "count" as NULL. The "next" method assumes that it doesn't have to decrement items that it sets so this makes sure we don't create a memory leak there. 
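Taken together, the checks described above form a small validation table; the following pure-Python sketch (a hypothetical helper that only models the error cases, not the actual C function) summarizes which `(times, count, collected)` combinations `__setstate__` accepts:

```python
def validate_sideeffects_state(times, count, collected):
    """Model of the argument checks in sideeffects.__setstate__."""
    if collected is None:
        # None is only acceptable when times == 0 and count == 0.
        if count != 0 or times != 0:
            raise TypeError("expected a tuple when times != 0 or count != 0")
    elif isinstance(collected, tuple):
        if times == 0:
            raise TypeError("expected None as collected when times == 0")
        if count < 0 or count >= len(collected):
            raise ValueError("count must be >= 0 and < len(collected)")
        if len(collected) != times:
            raise ValueError("len(collected) must equal times")
    else:
        raise TypeError("expected a tuple or None")
```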
*/ newcollected = NULL; if (collected != Py_None) { Py_ssize_t i; newcollected = PyTuple_New(collected_size); if (newcollected == NULL) { return NULL; } for (i = 0; i < count; i++) { PyObject *tmp = PyTuple_GET_ITEM(collected, i); Py_INCREF(tmp); PyTuple_SET_ITEM(newcollected, i, tmp); } } self->count = count; /* We already created a new tuple for "collected" or it's None so no need to increment the reference count here. */ Py_XSETREF(self->collected, newcollected); Py_RETURN_NONE; } static PyObject * sideeffects_lengthhint(PyIUObject_Sideeffects *self, PyObject *Py_UNUSED(args)) { Py_ssize_t len = PyObject_LengthHint(self->iterator, 0); if (len == -1) { return NULL; } return PyLong_FromSsize_t(len); } static PyMethodDef sideeffects_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)sideeffects_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)sideeffects_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)sideeffects_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef sideeffects_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Sideeffects, func), /* offset */ READONLY, /* flags */ sideeffects_prop_func_doc /* doc */ }, { "times", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Sideeffects, times), /* offset */ READONLY, /* flags */ sideeffects_prop_times_doc /* doc */ }, { "count", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Sideeffects, count), /* offset */ READONLY, /* flags */ sideeffects_prop_count_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Sideeffects = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.sideeffects", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Sideeffects), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods 
*/ (destructor)sideeffects_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)sideeffects_doc, /* tp_doc */ (traverseproc)sideeffects_traverse, /* tp_traverse */ (inquiry)sideeffects_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)sideeffects_next, /* tp_iternext */ sideeffects_methods, /* tp_methods */ sideeffects_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)sideeffects_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 0707010000010E000081A400000000000000000000000165E3BCDA00000258000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/sideeffect.h#ifndef PYIU_SIDEEFFECT_H #define PYIU_SIDEEFFECT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; /* iterator over data */ PyObject *func; /* Function to call */ Py_ssize_t times; /* Call side effects each x items */ Py_ssize_t count; /* Current counter when to call func */ PyObject *collected; /* Collect items to pass to side-effects */ } 
PyIUObject_Sideeffects; extern PyTypeObject PyIUType_Sideeffects; #ifdef __cplusplus } #endif #endif 0707010000010F000081A400000000000000000000000165E3BCDA00004023000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/split.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "split.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( split_prop_key_doc, "(callable or any type) The function or value by which to split (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( split_prop_maxsplit_doc, "(:py:class:`int`) The maximum number of splits (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( split_prop_keep_doc, "(:py:class:`bool`) Keep the delimiter (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( split_prop_keepbefore_doc, "(:py:class:`bool`) Keep the delimiter as the last item of the last group " "(readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( split_prop_keepafter_doc, "(:py:class:`bool`) Keep the delimiter as the first item of the next group " "(readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( split_prop_eq_doc, "(:py:class:`bool`) Instead of calling :py:attr:`key` compare the items " "with it (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( split_doc, "split(iterable, key, maxsplit=-1, keep=False, keep_before=False, keep_after=False, eq=False)\n" "--\n\n" "Splits an `iterable` by a `key` function or delimiter.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " The `iterable` to split.\n" "\n" "key : callable\n" " The function by which to split the `iterable` (split where\n" " ``key(item) == True``).\n" "\n" "maxsplit : :py:class:`int`, optional\n" " The maximum number of splits.
If ``maxsplit=-1`` then there is no limit.\n" " Default is ``-1``.\n" "\n" "keep : :py:class:`bool`\n" " If ``True`` also include the items where ``key(item)=True`` as a separate list.\n" " Default is ``False``.\n" "\n" "keep_before : :py:class:`bool`\n" " If ``True`` also include the items where ``key(item)=True`` in the \n" " list before splitting.\n" " Default is ``False``.\n" "\n" "keep_after : :py:class:`bool`\n" " If ``True`` also include the items where ``key(item)=True`` as the first \n" " item in the list after splitting.\n" " Default is ``False``.\n" "\n" "eq : :py:class:`bool`\n" " If ``True`` split the `iterable` where ``key == item`` instead of\n" " ``key(item) == True``. This can significantly speed up the function if a\n" " single delimiter is used.\n" " Default is ``False``.\n" "\n" "Returns\n" "-------\n" "split_iterable : generator\n" " Generator containing the split `iterable` (lists).\n" "\n" "Raises\n" "------\n" "ValueError\n" " If ``maxsplit`` is smaller than ``-1``.
If more than one of the ``keep``\n" " arguments is ``True``.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import split\n" ">>> list(split(range(1, 10), lambda x: x%3==0))\n" "[[1, 2], [4, 5], [7, 8]]\n" "\n" ">>> list(split(range(1, 10), lambda x: x%3==0, keep=True))\n" "[[1, 2], [3], [4, 5], [6], [7, 8], [9]]\n" "\n" ">>> list(split(range(1, 10), lambda x: x%3==0, keep_before=True))\n" "[[1, 2, 3], [4, 5, 6], [7, 8, 9]]\n" "\n" ">>> list(split(range(1, 10), lambda x: x%3==0, keep_after=True))\n" "[[1, 2], [3, 4, 5], [6, 7, 8], [9]]\n" "\n" ">>> list(split(range(1, 10), lambda x: x%3==0, maxsplit=1))\n" "[[1, 2], [4, 5, 6, 7, 8, 9]]\n" "\n" ">>> list(split([1,2,3,4,5,3,7,8,3], 3, eq=True))\n" "[[1, 2], [4, 5], [7, 8]]\n"); static PyObject * split_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "key", "maxsplit", "keep", "keep_before", "keep_after", "eq", NULL}; PyIUObject_Split *self; PyObject *iterable; PyObject *delimiter; Py_ssize_t maxsplit = -1; /* -1 means no maxsplit! */ int keep_delimiter = 0; int keep_before = 0; int keep_after = 0; int cmp = 0; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO|npppp:split", kwlist, &iterable, &delimiter, &maxsplit, &keep_delimiter, &keep_before, &keep_after, &cmp)) { return NULL; } if (maxsplit <= -2) { PyErr_SetString(PyExc_ValueError, "`maxsplit` argument for `split` must be -1 or greater."); return NULL; } if ((keep_delimiter ? 1 : 0) + (keep_before ? 1 : 0) + (keep_after ? 
1 : 0) > 1) { PyErr_SetString(PyExc_ValueError, "only one or none of `keep`, `keep_before`, " "`keep_after` arguments for `split` may be set."); return NULL; } self = (PyIUObject_Split *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } Py_INCREF(delimiter); self->delimiter = delimiter; self->maxsplit = maxsplit; if (keep_delimiter) { self->keep = PyIU_Split_Keep; } else if (keep_before) { self->keep = PyIU_Split_KeepBefore; } else if (keep_after) { self->keep = PyIU_Split_KeepAfter; } else { self->keep = PyIU_Split_KeepNone; } self->cmp = cmp; self->next = NULL; return (PyObject *)self; } static void split_dealloc(PyIUObject_Split *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->delimiter); Py_XDECREF(self->next); Py_TYPE(self)->tp_free(self); } static int split_traverse(PyIUObject_Split *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->delimiter); Py_VISIT(self->next); return 0; } static int split_clear(PyIUObject_Split *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->delimiter); Py_CLEAR(self->next); return 0; } static PyObject * split_next(PyIUObject_Split *self) { PyObject *result; PyObject *item; int ok; /* Create a list to hold the result. */ result = PyList_New(0); if (result == NULL) { return NULL; } if (self->next != NULL) { /* If there was already a value saved as next just append it and return it. This case happens if someone wants to keep the delimiter. */ ok = PyList_Append(result, self->next); Py_CLEAR(self->next); if (ok == 0) { if (self->keep != PyIU_Split_KeepAfter) { return result; } } else { Py_DECREF(result); return NULL; } } while ((item = Py_TYPE(self->iterator)->tp_iternext(self->iterator))) { int should_split; /* Compare the value to the delimiter or call the delimiter function on it to determine if we should split here. 
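The whole split loop, including the three mutually exclusive `keep` variants and the `maxsplit` countdown, can be sketched as a pure-Python generator (an illustrative model, not the C implementation):

```python
def split_sketch(iterable, key, maxsplit=-1, keep=False, keep_before=False,
                 keep_after=False, eq=False):
    """Pure-Python model of the split iterator's main loop."""
    group = []
    for item in iterable:
        if maxsplit == 0:
            should = False          # split budget used up
        elif eq:
            should = key == item    # eq=True: compare instead of call
        else:
            should = bool(key(item))
        if not should:
            group.append(item)
            continue
        if maxsplit > 0:
            maxsplit -= 1
        if keep:                    # delimiter becomes its own group
            yield group
            yield [item]
        elif keep_before:           # delimiter ends the current group
            group.append(item)
            yield group
        elif keep_after:            # delimiter starts the next group
            yield group
            group = [item]
            continue
        else:
            yield group
        group = []
    if group:                       # trailing group, only if non-empty
        yield group
```

Running the docstring examples through this sketch reproduces the documented outputs, e.g. `split_sketch(range(1, 10), lambda x: x % 3 == 0, keep_after=True)` yields `[1, 2]`, `[3, 4, 5]`, `[6, 7, 8]`, `[9]`.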
*/ if (self->maxsplit == 0) { should_split = 0; } else { if (self->cmp) { should_split = PyObject_RichCompareBool(self->delimiter, item, Py_EQ); } else { PyObject *val = PyIU_CallWithOneArgument(self->delimiter, item); if (val == NULL) { Py_DECREF(item); Py_DECREF(result); return NULL; } should_split = PyObject_IsTrue(val); Py_DECREF(val); } } if (should_split == 0) { /* Value is not delimiter or we already used up the maxsplit splittings. */ ok = PyList_Append(result, item); Py_DECREF(item); if (ok != 0) { Py_DECREF(result); return NULL; } } else if (should_split == 1) { /* Split here. */ if (self->maxsplit != -1) { self->maxsplit--; } /* Keep the delimiter (if requested) as next item. */ if (self->keep == PyIU_Split_Keep || self->keep == PyIU_Split_KeepAfter) { self->next = item; } else if (self->keep == PyIU_Split_KeepBefore) { ok = PyList_Append(result, item); Py_DECREF(item); if (ok != 0) { Py_DECREF(result); return NULL; } } else { Py_DECREF(item); } return result; } else { Py_DECREF(item); Py_DECREF(result); return NULL; } } if (PyIU_ErrorOccurredClearStopIteration()) { Py_DECREF(result); return NULL; } /* Only return the last result if there is something in it. */ if (PyList_GET_SIZE(result) == 0) { Py_DECREF(result); return NULL; } else { return result; } } static PyObject * split_reduce(PyIUObject_Split *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on next == NULL because otherwise "None" would be ambiguous. It could mean that we did not have a next item or that the next item was None. Better to make an "if" than to introduce another variable depending on next == NULL.
*/ if (self->next == NULL) { return Py_BuildValue("O(OOniiii)", Py_TYPE(self), self->iterator, self->delimiter, self->maxsplit, self->keep == PyIU_Split_Keep, self->keep == PyIU_Split_KeepBefore, self->keep == PyIU_Split_KeepAfter, self->cmp); } else { return Py_BuildValue("O(OOniiii)(O)", Py_TYPE(self), self->iterator, self->delimiter, self->maxsplit, self->keep == PyIU_Split_Keep, self->keep == PyIU_Split_KeepBefore, self->keep == PyIU_Split_KeepAfter, self->cmp, self->next); } } static PyObject * split_setstate(PyIUObject_Split *self, PyObject *state) { PyObject *next; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "O:split.__setstate__", &next)) { return NULL; } /* No need to check the type of "next" because any python object is valid. */ Py_INCREF(next); Py_XSETREF(self->next, next); Py_RETURN_NONE; } static PyObject * split_getkeep(PyIUObject_Split *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->keep == PyIU_Split_Keep); } static PyObject * split_getkeepbefore(PyIUObject_Split *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->keep == PyIU_Split_KeepBefore); } static PyObject * split_getkeepafter(PyIUObject_Split *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->keep == PyIU_Split_KeepAfter); } static PyObject * split_get_eq(PyIUObject_Split *self, void *Py_UNUSED(closure)) { return PyBool_FromLong(self->cmp); } static PyMethodDef split_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)split_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)split_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyGetSetDef split_getsetlist[] = { { "keep", /* name */ (getter)split_getkeep, /* get 
*/ (setter)0, /* set */ split_prop_keep_doc, /* doc */ (void *)NULL /* closure */ }, { "keep_before", /* name */ (getter)split_getkeepbefore, /* get */ (setter)0, /* set */ split_prop_keepbefore_doc, /* doc */ (void *)NULL /* closure */ }, { "keep_after", /* name */ (getter)split_getkeepafter, /* get */ (setter)0, /* set */ split_prop_keepafter_doc, /* doc */ (void *)NULL /* closure */ }, { "eq", /* name */ (getter)split_get_eq, /* get */ (setter)0, /* set */ split_prop_eq_doc, /* doc */ (void *)NULL /* closure */ }, {NULL} /* sentinel */ }; static PyMemberDef split_memberlist[] = { { "key", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Split, delimiter), /* offset */ READONLY, /* flags */ split_prop_key_doc /* doc */ }, { "maxsplit", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Split, maxsplit), /* offset */ READONLY, /* flags */ split_prop_maxsplit_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Split = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.split", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Split), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)split_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)split_doc, /* tp_doc */ (traverseproc)split_traverse, /* tp_traverse */ (inquiry)split_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ 
(iternextfunc)split_next, /* tp_iternext */ split_methods, /* tp_methods */ split_memberlist, /* tp_members */ split_getsetlist, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)split_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 07070100000110000081A400000000000000000000000165E3BCDA0000022D000000000000000000000000000000000000005000000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/split.h#ifndef PYIU_SPLIT_H #define PYIU_SPLIT_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" enum PyIU_SplitKeepOption { PyIU_Split_KeepNone, PyIU_Split_Keep, PyIU_Split_KeepAfter, PyIU_Split_KeepBefore }; typedef struct { PyObject_HEAD PyObject *iterator; PyObject *delimiter; Py_ssize_t maxsplit; enum PyIU_SplitKeepOption keep; int cmp; PyObject *next; } PyIUObject_Split; extern PyTypeObject PyIUType_Split; #ifdef __cplusplus } #endif #endif 07070100000111000081A400000000000000000000000165E3BCDA00001C08000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/starfilter.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "starfilter.h" #include <structmember.h> #include "docs_reduce.h" PyDoc_STRVAR( starfilter_prop_pred_doc, "(callable) The function by which to filter (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( starfilter_doc, "starfilter(pred, iterable)\n" "--\n\n" "Like :py:func:`filter` but unpacks the current item in `iterable` when \n" "calling `pred`. 
This is similar to the difference between :py:func:`map` and \n" ":py:func:`itertools.starmap`.\n" "\n" ".. versionadded:: 0.3\n" "\n" "Parameters\n" "----------\n" "pred : callable\n" " The predicate function that is called to determine if the items should\n" " be kept.\n" "\n" " .. note::\n" " Unlike :py:func:`filter` the `pred` cannot be ``None``.\n" "\n" "iterable : iterable\n" " `Iterable` containing the elements.\n" "\n" "Returns\n" "-------\n" "iterator : generator\n" " A normal iterator over `iterable` containing only the items where \n" " ``pred(*item)`` is ``True``.\n" "\n" "Notes\n" "-----\n" "This is identical to ``filter(lambda x: pred(*x), iterable)`` but faster.\n" "\n" "Examples\n" "--------\n" "A simple example::\n" "\n" " >>> from iteration_utilities import starfilter\n" " >>> from operator import eq\n" " >>> list(starfilter(eq, zip([1,2,3], [2,2,2])))\n" " [(2, 2)]\n" "\n" "See also\n" "--------\n" "filter\n" "iteration_utilities.packed\n"); static PyObject * starfilter_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"pred", "iterable", NULL}; PyIUObject_Starfilter *self; PyObject *iterable; PyObject *func; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO:starfilter", kwlist, &func, &iterable)) { return NULL; } self = (PyIUObject_Starfilter *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } Py_INCREF(func); self->func = func; return (PyObject *)self; } static void starfilter_dealloc(PyIUObject_Starfilter *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->func); Py_TYPE(self)->tp_free(self); } static int starfilter_traverse(PyIUObject_Starfilter *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->func); return 0; } static int starfilter_clear(PyIUObject_Starfilter *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->func); return 0; } static 
PyObject * starfilter_next(PyIUObject_Starfilter *self) { PyObject *item; while ((item = Py_TYPE(self->iterator)->tp_iternext(self->iterator))) { PyObject *newargs; PyObject *val; int ok; if (!PyTuple_CheckExact(item)) { newargs = PySequence_Tuple(item); if (newargs == NULL) { Py_DECREF(item); return NULL; } } else { Py_INCREF(item); newargs = item; } val = PyObject_Call(self->func, newargs, NULL); Py_DECREF(newargs); if (val == NULL) { Py_DECREF(item); return NULL; } ok = PyObject_IsTrue(val); Py_DECREF(val); if (ok > 0) { return item; } Py_DECREF(item); if (ok < 0) { return NULL; } } return NULL; } static PyObject * starfilter_reduce(PyIUObject_Starfilter *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OO)", Py_TYPE(self), self->func, self->iterator); } static PyMethodDef starfilter_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)starfilter_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef starfilter_memberlist[] = { { "pred", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Starfilter, func), /* offset */ READONLY, /* flags */ starfilter_prop_pred_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Starfilter = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.starfilter", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Starfilter), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)starfilter_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ 
Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)starfilter_doc, /* tp_doc */ (traverseproc)starfilter_traverse, /* tp_traverse */ (inquiry)starfilter_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)starfilter_next, /* tp_iternext */ starfilter_methods, /* tp_methods */ starfilter_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)starfilter_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 07070100000112000081A400000000000000000000000165E3BCDA0000015C000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/starfilter.h#ifndef PYIU_STARFILTER_H #define PYIU_STARFILTER_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; PyObject *iterator; } PyIUObject_Starfilter; extern PyTypeObject PyIUType_Starfilter; #ifdef __cplusplus } #endif #endif 07070100000113000081A400000000000000000000000165E3BCDA00002FEB000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/successive.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "successive.h" #include <structmember.h> #include "docs_lengthhint.h" #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( successive_prop_times_doc, "(:py:class:`int`) The number of successive items (readonly).\n" "\n" ".. 
versionadded:: 0.6"); PyDoc_STRVAR( successive_doc, "successive(iterable, times=2)\n" "--\n\n" "Like the recipe for pairwise but allows getting an arbitrary number\n" "of successive elements.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " Get the successive elements from this `iterable`.\n" "\n" "times : :py:class:`int`, optional\n" " The number of successive elements.\n" " Default is ``2``.\n" "\n" "Returns\n" "-------\n" "successive_elements : generator\n" " The successive elements as a generator. Each element of the generator\n" " is a tuple containing `times` successive elements.\n" "\n" "Examples\n" "--------\n" "Each item of the `iterable` is returned as ``tuple`` with `times` successive\n" "items::\n" "\n" " >>> from iteration_utilities import successive\n" " >>> list(successive(range(5)))\n" " [(0, 1), (1, 2), (2, 3), (3, 4)]\n" "\n" "Varying `times` can also give you 3 successive elements::\n" "\n" " >>> list(successive(range(5), times=3))\n" " [(0, 1, 2), (1, 2, 3), (2, 3, 4)]\n" " >>> list(successive('Hello!', times=2))\n" " [('H', 'e'), ('e', 'l'), ('l', 'l'), ('l', 'o'), ('o', '!')]\n"); static PyObject * successive_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "times", NULL}; PyIUObject_Successive *self; PyObject *iterable; Py_ssize_t times = 2; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|n:successive", kwlist, &iterable, &times)) { return NULL; } if (times <= 0) { PyErr_Format(PyExc_ValueError, "`times` argument for `successive` must be greater than 0."); return NULL; } self = (PyIUObject_Successive *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->times = times; self->result = NULL; return (PyObject *)self; } static void successive_dealloc(PyIUObject_Successive *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->result);
Py_TYPE(self)->tp_free(self); } static int successive_traverse(PyIUObject_Successive *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->result); return 0; } static int successive_clear(PyIUObject_Successive *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->result); return 0; } static PyObject * successive_next(PyIUObject_Successive *self) { PyObject *result = self->result; PyObject *item; Py_ssize_t i; /* First call needs to create a tuple for the result. */ if (result == NULL) { result = PyTuple_New(self->times); if (result == NULL) { return NULL; } for (i = 0; i < self->times; i++) { item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); if (item == NULL) { Py_DECREF(result); return NULL; } PyTuple_SET_ITEM(result, i, item); } Py_INCREF(result); self->result = result; return result; } /* After the first element we can use the normal procedure. */ item = Py_TYPE(self->iterator)->tp_iternext(self->iterator); if (item == NULL) { return NULL; } /* Recycle old tuple or create a new one. */ if (PYIU_CPYTHON && (Py_REFCNT(result) == 1)) { /* Remove the first item of the result. */ PyObject *temp = PyTuple_GET_ITEM(result, 0); PyIU_TupleRemove(result, 0, self->times); Py_XDECREF(temp); /* Insert the new item (at the end) and return it. */ PyTuple_SET_ITEM(result, self->times - 1, item); Py_INCREF(result); return result; } else { PyObject *newresult = PyTuple_New(self->times); if (newresult == NULL) { Py_DECREF(item); return NULL; } /* Shift all earlier items one index to the left. */ for (i = 1; i < self->times; i++) { PyObject *olditem = PyTuple_GET_ITEM(result, i); Py_INCREF(olditem); PyTuple_SET_ITEM(newresult, i - 1, olditem); } /* Insert the new item (at the end), then replace the saved result. 
*/ PyTuple_SET_ITEM(newresult, self->times - 1, item); Py_INCREF(newresult); Py_SETREF(self->result, newresult); return newresult; } } static PyObject * successive_reduce(PyIUObject_Successive *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on the status of "result". We use and modify it in next. It is copied in next when the refcount isn't 1, so we don't need to copy it for reduce. However using "reduce" a lot will definitely slow the function down. But it does not matter if the slowdown is in "next" or "reduce". :) */ if (self->result == NULL) { return Py_BuildValue("O(On)", Py_TYPE(self), self->iterator, self->times); } else { return Py_BuildValue("O(On)(O)", Py_TYPE(self), self->iterator, self->times, self->result); } } static PyObject * successive_setstate(PyIUObject_Successive *self, PyObject *state) { PyObject *result; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "O:successive.__setstate__", &result)) { return NULL; } /* The result must be a tuple, otherwise we could risk segfaults (because "next" use PyTuple_GET_ITEM). It also needs to have the same size as "self->times" otherwise the for-loop in "next" could go beyond the tuple-size (again risking undefined behaviour). */ if (!PyTuple_CheckExact(result)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple` instance as " "first argument in the `state`, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(result)->tp_name); return NULL; } if (PyTuple_GET_SIZE(result) != self->times) { PyErr_Format(PyExc_ValueError, "`%.200s.__setstate__` expected that the first argument " "in the `state`, satisfies `len(firstarg) == self->times`. " "But `%zd != %zd`.", Py_TYPE(self)->tp_name, PyTuple_GET_SIZE(result), self->times); return NULL; } /* No need to copy the "result". 
If it has a refcount different from 1 it will be copied in "next" before it is mutated. */ Py_XINCREF(result); Py_XSETREF(self->result, result); Py_RETURN_NONE; } static PyObject * successive_lengthhint(PyIUObject_Successive *self, PyObject *Py_UNUSED(args)) { Py_ssize_t len = PyObject_LengthHint(self->iterator, 0); if (len == -1) { return NULL; } /* If we are already started we will have one result for every remaining item in the iterator. However if we haven't started we have less than that. We need "self->times" objects to fill the first return value, so we need to subtract "self->times - 1" from the length. In case the "times > len" the function won't return anything so we can set the length simply to 0. */ if (self->result == NULL) { if (self->times > len) { len = 0; } else { len -= (self->times - 1); } } return PyLong_FromSsize_t(len); } static PyMethodDef successive_methods[] = { { "__length_hint__", /* ml_name */ (PyCFunction)successive_lengthhint, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_lenhint_doc /* ml_doc */ }, { "__reduce__", /* ml_name */ (PyCFunction)successive_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)successive_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef successive_memberlist[] = { { "times", /* name */ T_PYSSIZET, /* type */ offsetof(PyIUObject_Successive, times), /* offset */ READONLY, /* flags */ successive_prop_times_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Successive = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.successive", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Successive), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)successive_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, 
/* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)successive_doc, /* tp_doc */ (traverseproc)successive_traverse, /* tp_traverse */ (inquiry)successive_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)successive_next, /* tp_iternext */ successive_methods, /* tp_methods */ successive_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)PyType_GenericAlloc, /* tp_alloc */ (newfunc)successive_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 07070100000114000081A400000000000000000000000165E3BCDA00000174000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/successive.h#ifndef PYIU_SUCCESSIVE_H #define PYIU_SUCCESSIVE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; Py_ssize_t times; PyObject *result; } PyIUObject_Successive; extern PyTypeObject PyIUType_Successive; #ifdef __cplusplus } #endif #endif 07070100000115000081A400000000000000000000000165E3BCDA00001C8E000000000000000000000000000000000000005300000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/tabulate.c/****************************************************************************** * Licensed under Apache License Version 2.0 - 
see LICENSE *****************************************************************************/ #include "tabulate.h" #include <structmember.h> #include "docs_reduce.h" #include "helper.h" PyDoc_STRVAR( tabulate_prop_func_doc, "(callable) The function to tabulate (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( tabulate_prop_current_doc, "(any type) The current value to tabulate (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( tabulate_doc, "tabulate(func, start=0)\n" "--\n\n" "Return ``function(0)``, ``function(1)``, ...\n" "\n" "Parameters\n" "----------\n" "func : callable\n" " The `function` to apply.\n" "\n" "start : any type, optional\n" " The starting value to apply the `function` on. Each time `tabulate` is\n" " called this value will be incremented by one.\n" " Default is ``0``.\n" "\n" "Returns\n" "-------\n" "tabulated : generator\n" " An infinite generator containing the results of the `function` applied\n" " on the values beginning by `start`.\n" "\n" "Examples\n" "--------\n" "Since the return is an infinite generator you need some other function\n" "to extract only the needed values. For example\n" ":py:func:`~iteration_utilities.getitem`::\n" "\n" " >>> from iteration_utilities import tabulate, getitem\n" " >>> from math import sqrt\n" " >>> t = tabulate(sqrt, 0)\n" " >>> list(getitem(t, stop=3))\n" " [0.0, 1.0, 1.4142135623730951]\n" "\n" ".. warning::\n" " This will return an infinitely long generator so do **not** try to do\n" " something like ``list(tabulate())``!\n" "\n" "This is equivalent to:\n" "\n" ".. 
code::\n" "\n" " import itertools\n" " \n" " def tabulate(function, start=0):\n" " return map(function, itertools.count(start))\n"); static PyObject * tabulate_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"func", "start", NULL}; PyIUObject_Tabulate *self; PyObject *func; PyObject *cnt = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|O:tabulate", kwlist, &func, &cnt)) { return NULL; } self = (PyIUObject_Tabulate *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } Py_INCREF(func); self->func = func; self->cnt = cnt == NULL ? PyIU_global_zero : cnt; Py_XINCREF(self->cnt); return (PyObject *)self; } static void tabulate_dealloc(PyIUObject_Tabulate *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->func); Py_XDECREF(self->cnt); Py_TYPE(self)->tp_free(self); } static int tabulate_traverse(PyIUObject_Tabulate *self, visitproc visit, void *arg) { Py_VISIT(self->func); Py_VISIT(self->cnt); return 0; } static int tabulate_clear(PyIUObject_Tabulate *self) { Py_CLEAR(self->func); Py_CLEAR(self->cnt); return 0; } static PyObject * tabulate_next(PyIUObject_Tabulate *self) { PyObject *result = NULL; PyObject *new_count; if (self->cnt == NULL) { return NULL; } /* Call the function with the current value as argument. */ result = PyIU_CallWithOneArgument(self->func, self->cnt); if (result == NULL) { Py_CLEAR(self->cnt); return NULL; } /* Increment the counter.
*/ new_count = PyNumber_Add(self->cnt, PyIU_global_one); Py_SETREF(self->cnt, new_count); if (self->cnt == NULL) { Py_DECREF(result); return NULL; } return result; } static PyObject * tabulate_reduce(PyIUObject_Tabulate *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OO)", Py_TYPE(self), self->func, self->cnt); } static PyMethodDef tabulate_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)tabulate_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef tabulate_memberlist[] = { { "func", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Tabulate, func), /* offset */ READONLY, /* flags */ tabulate_prop_func_doc /* doc */ }, { "current", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_Tabulate, cnt), /* offset */ READONLY, /* flags */ tabulate_prop_current_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_Tabulate = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.tabulate", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_Tabulate), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)tabulate_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)tabulate_doc, /* tp_doc */ (traverseproc)tabulate_traverse, /* tp_traverse */ (inquiry)tabulate_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* 
tp_iter */ (iternextfunc)tabulate_next, /* tp_iternext */ tabulate_methods, /* tp_methods */ tabulate_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)tabulate_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 07070100000116000081A400000000000000000000000165E3BCDA0000014F000000000000000000000000000000000000005300000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/tabulate.h#ifndef PYIU_TABULATE_H #define PYIU_TABULATE_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *func; PyObject *cnt; } PyIUObject_Tabulate; extern PyTypeObject PyIUType_Tabulate; #ifdef __cplusplus } #endif #endif 07070100000117000081A400000000000000000000000165E3BCDA00002635000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/uniqueever.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "uniqueever.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" #include "seen.h" PyDoc_STRVAR( uniqueever_prop_seen_doc, "(:py:class:`~iteration_utilities.Seen`) Already seen values (readonly)."); PyDoc_STRVAR( uniqueever_prop_key_doc, "(callable or None) The key function (readonly)."); PyDoc_STRVAR( uniqueever_doc, "unique_everseen(iterable, key=None)\n" "--\n\n" "Find unique elements, preserving their order. 
Remembers all elements ever seen.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " `Iterable` containing the elements.\n" "\n" "key : callable, optional\n" " If given it must be a callable taking one argument and this\n" " callable is applied to the value before checking if it was seen yet.\n" "\n" "Returns\n" "-------\n" "iterable : generator\n" " An iterable containing all unique values ever seen in the `iterable`.\n" "\n" "Notes\n" "-----\n" "The items in the `iterable` should implement equality.\n" "\n" "If the items are hashable the function is much faster.\n" "\n" "Examples\n" "--------\n" "Some simple examples::\n" "\n" " >>> from iteration_utilities import unique_everseen\n" " >>> list(unique_everseen('AAAABBBCCDAABBB'))\n" " ['A', 'B', 'C', 'D']\n" " \n" " >>> list(unique_everseen('ABBCcAD', str.lower))\n" " ['A', 'B', 'C', 'D']\n" " \n" "Even unhashable values can be processed, like `list`::\n" "\n" " >>> list(unique_everseen([[1, 2], [1, 1], [1, 2]]))\n" " [[1, 2], [1, 1]]\n" " \n" "However using ``key=tuple`` (to make them hashable) will be faster::\n" "\n" " >>> list(unique_everseen([[1, 2], [1, 1], [1, 2]], key=tuple))\n" " [[1, 2], [1, 1]]\n" " \n" "One can access the already seen values by accessing the `seen` attribute.\n"); /****************************************************************************** * * IMPORTANT NOTE (Implementation): * * This function is almost identical to "duplicates", so any changes * or bugfixes should also be implemented there!!! 
* *****************************************************************************/ static PyObject * uniqueever_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "key", NULL}; PyIUObject_UniqueEver *self; PyObject *iterable; PyObject *key = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|O:unique_everseen", kwlist, &iterable, &key)) { return NULL; } self = (PyIUObject_UniqueEver *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->seen = PyIUSeen_New(); if (self->seen == NULL) { Py_DECREF(self); return NULL; } self->key = key == Py_None ? NULL : key; Py_XINCREF(self->key); return (PyObject *)self; } static void uniqueever_dealloc(PyIUObject_UniqueEver *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->key); Py_XDECREF(self->seen); Py_TYPE(self)->tp_free(self); } static int uniqueever_traverse(PyIUObject_UniqueEver *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->key); Py_VISIT(self->seen); return 0; } static int uniqueever_clear(PyIUObject_UniqueEver *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->key); Py_CLEAR(self->seen); return 0; } static PyObject * uniqueever_next(PyIUObject_UniqueEver *self) { PyObject *item; while ((item = Py_TYPE(self->iterator)->tp_iternext(self->iterator))) { PyObject *temp; int ok; /* Use the item if key is not given, otherwise apply the key. 
*/ if (self->key == NULL) { Py_INCREF(item); temp = item; } else { temp = PyIU_CallWithOneArgument(self->key, item); if (temp == NULL) { Py_DECREF(item); return NULL; } } ok = PyIUSeen_ContainsAdd(self->seen, temp); Py_DECREF(temp); if (ok == 0) { return item; } Py_DECREF(item); if (ok == -1) { return NULL; } } return NULL; } static PyObject * uniqueever_reduce(PyIUObject_UniqueEver *self, PyObject *Py_UNUSED(args)) { return Py_BuildValue("O(OO)(O)", Py_TYPE(self), self->iterator, self->key ? self->key : Py_None, self->seen); } static PyObject * uniqueever_setstate(PyIUObject_UniqueEver *self, PyObject *state) { PyObject *seen; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "O:unique_everseen.__setstate__", &seen)) { return NULL; } /* object passed in must be an instance of Seen. Otherwise the function calls could result in an segmentation fault. 
*/ if (!PyIU_IsTypeExact(seen, &PyIUType_Seen)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `Seen` instance as " "first argument in the `state`, got %.200s.", Py_TYPE(self)->tp_name, Py_TYPE(seen)->tp_name); return NULL; } Py_INCREF(seen); Py_XSETREF(self->seen, seen); Py_RETURN_NONE; } static PyMethodDef uniqueever_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)uniqueever_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)uniqueever_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef uniqueever_memberlist[] = { { "seen", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_UniqueEver, seen), /* offset */ READONLY, /* flags */ uniqueever_prop_seen_doc /* doc */ }, { "key", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_UniqueEver, key), /* offset */ READONLY, /* flags */ uniqueever_prop_key_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_UniqueEver = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.unique_everseen", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_UniqueEver), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)uniqueever_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ (PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)uniqueever_doc, /* tp_doc */ (traverseproc)uniqueever_traverse, /* tp_traverse */ 
(inquiry)uniqueever_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)uniqueever_next, /* tp_iternext */ uniqueever_methods, /* tp_methods */ uniqueever_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)uniqueever_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 07070100000118000081A400000000000000000000000165E3BCDA0000016F000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/uniqueever.h#ifndef PYIU_UNIQUEEVER_H #define PYIU_UNIQUEEVER_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *key; PyObject *seen; } PyIUObject_UniqueEver; extern PyTypeObject PyIUType_UniqueEver; #ifdef __cplusplus } #endif #endif 07070100000119000081A400000000000000000000000165E3BCDA00002473000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/uniquejust.c/****************************************************************************** * Licensed under Apache License Version 2.0 - see LICENSE *****************************************************************************/ #include "uniquejust.h" #include <structmember.h> #include "docs_reduce.h" #include "docs_setstate.h" #include "helper.h" PyDoc_STRVAR( uniquejust_prop_key_doc, "(callable or None) The key function (readonly).\n" "\n" ".. versionadded:: 0.6"); PyDoc_STRVAR( uniquejust_prop_lastseen_doc, "(any type) The last seen item (readonly).\n" "\n" ".. 
versionadded:: 0.6"); PyDoc_STRVAR( uniquejust_doc, "unique_justseen(iterable, key=None)\n" "--\n\n" "List unique elements, preserving order. Remember only the element just seen.\n" "\n" "Parameters\n" "----------\n" "iterable : iterable\n" " `Iterable` to check.\n" "\n" "key : callable or None, optional\n" " If ``None`` the values are taken as they are. If it's a callable the\n" " callable is applied to the value before comparing it.\n" " Default is ``None``.\n" "\n" "Returns\n" "-------\n" "iterable : generator\n" " An iterable containing all unique values just seen in the `iterable`.\n" "\n" "Examples\n" "--------\n" ">>> from iteration_utilities import unique_justseen\n" ">>> list(unique_justseen('AAAABBBCCDAABBB'))\n" "['A', 'B', 'C', 'D', 'A', 'B']\n" "\n" ">>> list(unique_justseen('ABBCcAD', str.lower))\n" "['A', 'B', 'C', 'A', 'D']\n"); static PyObject * uniquejust_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { static char *kwlist[] = {"iterable", "key", NULL}; PyIUObject_UniqueJust *self; PyObject *iterable; PyObject *keyfunc = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "O|O:unique_justseen", kwlist, &iterable, &keyfunc)) { return NULL; } self = (PyIUObject_UniqueJust *)type->tp_alloc(type, 0); if (self == NULL) { return NULL; } self->iterator = PyObject_GetIter(iterable); if (self->iterator == NULL) { Py_DECREF(self); return NULL; } self->keyfunc = keyfunc == Py_None ? 
NULL : keyfunc; Py_XINCREF(self->keyfunc); self->lastitem = NULL; return (PyObject *)self; } static void uniquejust_dealloc(PyIUObject_UniqueJust *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->iterator); Py_XDECREF(self->keyfunc); Py_XDECREF(self->lastitem); Py_TYPE(self)->tp_free(self); } static int uniquejust_traverse(PyIUObject_UniqueJust *self, visitproc visit, void *arg) { Py_VISIT(self->iterator); Py_VISIT(self->keyfunc); Py_VISIT(self->lastitem); return 0; } static int uniquejust_clear(PyIUObject_UniqueJust *self) { Py_CLEAR(self->iterator); Py_CLEAR(self->keyfunc); Py_CLEAR(self->lastitem); return 0; } static PyObject * uniquejust_next(PyIUObject_UniqueJust *self) { PyObject *item; while ((item = Py_TYPE(self->iterator)->tp_iternext(self->iterator))) { PyObject *val; int ok; /* Apply keyfunc or use the original */ if (self->keyfunc == NULL) { Py_INCREF(item); val = item; } else { val = PyIU_CallWithOneArgument(self->keyfunc, item); if (val == NULL) { Py_DECREF(item); return NULL; } } /* If no lastitem set it to the current and simply return the item. */ if (self->lastitem == NULL) { self->lastitem = val; return item; } /* Otherwise compare it with the last item and only return it if it differs. */ ok = PyObject_RichCompareBool(val, self->lastitem, Py_EQ); if (ok == 0) { Py_SETREF(self->lastitem, val); return item; } Py_DECREF(val); Py_DECREF(item); if (ok < 0) { return NULL; } } return NULL; } static PyObject * uniquejust_reduce(PyIUObject_UniqueJust *self, PyObject *Py_UNUSED(args)) { /* Separate cases depending on lastitem == NULL because otherwise "None" would be ambiguous. It could mean that we did not had a last item or that the last item was None. Better to make an "if" than to introduce another variable depending on lastitem == NULL. */ if (self->lastitem != NULL) { return Py_BuildValue("O(OO)(O)", Py_TYPE(self), self->iterator, self->keyfunc ? 
self->keyfunc : Py_None, self->lastitem); } else { return Py_BuildValue("O(OO)", Py_TYPE(self), self->iterator, self->keyfunc ? self->keyfunc : Py_None); } } static PyObject * uniquejust_setstate(PyIUObject_UniqueJust *self, PyObject *state) { PyObject *lastitem; if (!PyTuple_Check(state)) { PyErr_Format(PyExc_TypeError, "`%.200s.__setstate__` expected a `tuple`-like argument" ", got `%.200s` instead.", Py_TYPE(self)->tp_name, Py_TYPE(state)->tp_name); return NULL; } if (!PyArg_ParseTuple(state, "O:unique_justseen.__setstate__", &lastitem)) { return NULL; } /* No need to check the type of "lastitem" because any python object is valid. */ Py_INCREF(lastitem); Py_XSETREF(self->lastitem, lastitem); Py_RETURN_NONE; } static PyMethodDef uniquejust_methods[] = { { "__reduce__", /* ml_name */ (PyCFunction)uniquejust_reduce, /* ml_meth */ METH_NOARGS, /* ml_flags */ PYIU_reduce_doc /* ml_doc */ }, { "__setstate__", /* ml_name */ (PyCFunction)uniquejust_setstate, /* ml_meth */ METH_O, /* ml_flags */ PYIU_setstate_doc /* ml_doc */ }, {NULL, NULL} /* sentinel */ }; static PyMemberDef uniquejust_memberlist[] = { { "key", /* name */ T_OBJECT, /* type */ offsetof(PyIUObject_UniqueJust, keyfunc), /* offset */ READONLY, /* flags */ uniquejust_prop_key_doc /* doc */ }, { "lastseen", /* name */ T_OBJECT_EX, /* type */ offsetof(PyIUObject_UniqueJust, lastitem), /* offset */ READONLY, /* flags */ uniquejust_prop_lastseen_doc /* doc */ }, {NULL} /* sentinel */ }; PyTypeObject PyIUType_UniqueJust = { PyVarObject_HEAD_INIT(NULL, 0)(const char *) "iteration_utilities.unique_justseen", /* tp_name */ (Py_ssize_t)sizeof(PyIUObject_UniqueJust), /* tp_basicsize */ (Py_ssize_t)0, /* tp_itemsize */ /* methods */ (destructor)uniquejust_dealloc, /* tp_dealloc */ (printfunc)0, /* tp_print */ (getattrfunc)0, /* tp_getattr */ (setattrfunc)0, /* tp_setattr */ 0, /* tp_reserved */ (reprfunc)0, /* tp_repr */ (PyNumberMethods *)0, /* tp_as_number */ (PySequenceMethods *)0, /* tp_as_sequence */ 
(PyMappingMethods *)0, /* tp_as_mapping */ (hashfunc)0, /* tp_hash */ (ternaryfunc)0, /* tp_call */ (reprfunc)0, /* tp_str */ (getattrofunc)PyObject_GenericGetAttr, /* tp_getattro */ (setattrofunc)0, /* tp_setattro */ (PyBufferProcs *)0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, /* tp_flags */ (const char *)uniquejust_doc, /* tp_doc */ (traverseproc)uniquejust_traverse, /* tp_traverse */ (inquiry)uniquejust_clear, /* tp_clear */ (richcmpfunc)0, /* tp_richcompare */ (Py_ssize_t)0, /* tp_weaklistoffset */ (getiterfunc)PyObject_SelfIter, /* tp_iter */ (iternextfunc)uniquejust_next, /* tp_iternext */ uniquejust_methods, /* tp_methods */ uniquejust_memberlist, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ 0, /* tp_dict */ (descrgetfunc)0, /* tp_descr_get */ (descrsetfunc)0, /* tp_descr_set */ (Py_ssize_t)0, /* tp_dictoffset */ (initproc)0, /* tp_init */ (allocfunc)0, /* tp_alloc */ (newfunc)uniquejust_new, /* tp_new */ (freefunc)PyObject_GC_Del, /* tp_free */ }; 0707010000011A000081A400000000000000000000000165E3BCDA00000177000000000000000000000000000000000000005500000000iteration_utilities-0.12.1/src/iteration_utilities/_iteration_utilities/uniquejust.h#ifndef PYIU_UNIQUEJUST_H #define PYIU_UNIQUEJUST_H #ifdef __cplusplus extern "C" { #endif #define PY_SSIZE_T_CLEAN #include <Python.h> #include "helpercompat.h" typedef struct { PyObject_HEAD PyObject *iterator; PyObject *keyfunc; PyObject *lastitem; } PyIUObject_UniqueJust; extern PyTypeObject PyIUType_UniqueJust; #ifdef __cplusplus } #endif #endif 0707010000011B000081A400000000000000000000000165E3BCDA00002D11000000000000000000000000000000000000003F00000000iteration_utilities-0.12.1/src/iteration_utilities/_recipes.py# Licensed under Apache License Version 2.0 - see LICENSE # Parts are taken from the CPython package (PSF licensed). 
""" API: Official recipes --------------------- """ # Built-ins from collections import deque from copy import copy from itertools import islice, chain, repeat, starmap, tee, combinations, filterfalse from random import choice, sample, randrange __all__ = ['consume', 'flatten', 'ipartition', 'ncycles', 'nth_combination', 'powerset', 'random_combination', 'random_product', 'random_permutation', 'repeatfunc', 'tail', 'tee_lookahead'] def tail(iterable, n): """Return an iterator over the last `n` items. Parameters ---------- iterable : iterable The `iterable` from which to take the last items. n : :py:class:`int` How many elements. Returns ------- iterator : iterator The last `n` items of `iterable` as iterator. Examples -------- >>> from iteration_utilities import tail >>> list(tail('ABCDEFG', 3)) ['E', 'F', 'G'] """ # tail(3, 'ABCDEFG') --> E F G return iter(deque(iterable, maxlen=n)) def consume(iterator, n): """Advance the `iterator` `n`-steps ahead. If `n` is ``None``, consume \ entirely. Parameters ---------- iterator : iterator Any `iterator` from which to consume the items. n : :py:class:`int` or None Number of items to consume from the `iterator`. If ``None`` consume it entirely. Examples -------- >>> from iteration_utilities import consume >>> g = (x**2 for x in range(10)) >>> consume(g, 2) >>> list(g) [4, 9, 16, 25, 36, 49, 64, 81] >>> g = (x**2 for x in range(10)) >>> consume(g, None) >>> list(g) [] """ # Use functions that consume iterators at C speed. if n is None: # feed the entire iterator into a zero-length deque deque(iterator, maxlen=0) else: # advance to the empty slice starting at position n next(islice(iterator, n, n), None) def ncycles(iterable, n): """Returns the sequence elements n times. Parameters ---------- iterable : iterable Any `iterable` to repeat. n : :py:class:`int` Number of repetitions. Returns ------- repeated_iterable : generator The `iterable` repeated `n` times. 
Examples -------- >>> from iteration_utilities import ncycles >>> list(ncycles([1,2,3], 3)) [1, 2, 3, 1, 2, 3, 1, 2, 3] """ return chain.from_iterable(repeat(tuple(iterable), n)) def flatten(iterable): """Flatten one level of nesting. Parameters ---------- iterable : iterable Any `iterable` to flatten. Returns ------- flattened_iterable : generator The `iterable` with the first level of nesting flattened. Examples -------- >>> from iteration_utilities import flatten >>> list(flatten([[1,2,3,4], [4,3,2,1]])) [1, 2, 3, 4, 4, 3, 2, 1] """ return chain.from_iterable(iterable) def repeatfunc(func, *args, times=None): """Repeat calls to `func` with specified arguments. Parameters ---------- func : callable The function that will be called. args : optional arguments for the `func`. times : :py:class:`int`, None, optional The number of `times` the function is called. If ``None`` there will be no limit. Default is ``None``. Returns ------- iterable : generator The result of the repeatedly called function. Examples -------- >>> from iteration_utilities import repeatfunc, getitem >>> import random >>> random.seed(5) >>> list(getitem(repeatfunc(random.random), stop=5)) [0.6229016948897019, 0.7417869892607294, 0.7951935655656966, 0.9424502837770503, 0.7398985747399307] >>> random.seed(2) >>> list(repeatfunc(random.random, times=3)) [0.9560342718892494, 0.9478274870593494, 0.05655136772680869] >>> random.seed(None) .. warning:: This will return an infinitely long generator if you don't specify ``times``. """ if times is None: return starmap(func, repeat(args)) return starmap(func, repeat(args, times)) def ipartition(iterable, pred): """Use a predicate to partition entries into ``False`` entries and ``True`` entries. Parameters ---------- iterable : iterable `Iterable` to partition. pred : callable or None The predicate which determines the group in which the value of the `iterable` belongs. If ``None`` it will use ``bool`` to determine the truth-value of the items. 
    Returns
    -------
    false_values : generator
        An iterable containing the values for which the predicate was False.

    true_values : generator
        An iterable containing the values for which the predicate was True.

    Examples
    --------
    >>> from iteration_utilities import ipartition
    >>> def is_odd(val): return val % 2
    >>> [list(i) for i in ipartition(range(10), is_odd)]
    [[0, 2, 4, 6, 8], [1, 3, 5, 7, 9]]
    """
    if pred is None:
        pred = bool
    evaluated = ((pred(item), item) for item in iterable)
    t1, t2 = tee(evaluated)
    return ((item for p, item in t1 if not p),
            (item for p, item in t2 if p))


def nth_combination(iterable, r, index):
    """Equivalent to ``list(itertools.combinations(iterable, r))[index]``.

    .. versionadded:: 0.9.0

    Parameters
    ----------
    iterable : iterable
        The `iterable` to combine with :py:func:`itertools.combinations`.

    r : :py:class:`int`
        The number of elements to combine.

    index : :py:class:`int`
        The index of the combination.

    Returns
    -------
    nth_combination : tuple
        The `index`-th combination.

    Examples
    --------
    >>> from iteration_utilities import nth_combination
    >>> nth_combination([1,2,3,4,5,6], r=4, index=2)
    (1, 2, 3, 6)
    """
    pool = tuple(iterable)
    n = len(pool)
    if r < 0 or r > n:
        raise ValueError
    c = 1
    k = min(r, n - r)
    for i in range(1, k + 1):
        c = c * (n - k + i) // i
    if index < 0:
        index += c
    if index < 0 or index >= c:
        raise IndexError
    result = []
    while r:
        c, n, r = c * r // n, n - 1, r - 1
        while index >= c:
            index -= c
            c, n = c * (n - r) // n, n - 1
        result.append(pool[-1 - n])
    return tuple(result)


def powerset(iterable):
    """Create all possible sets of values from an `iterable`.

    Parameters
    ----------
    iterable : iterable
        `Iterable` for which to create a powerset.

    Returns
    -------
    powerset : generator
        An iterable containing all subsets of the `iterable` as tuples.
    Examples
    --------
    >>> from iteration_utilities import powerset
    >>> list(powerset([1,2,3]))
    [(), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)]
    """
    s = list(iterable)
    return chain.from_iterable(combinations(s, r) for r in range(len(s)+1))


def random_product(*iterables, repeat=1):
    """Random selection from :py:func:`itertools.product`.

    Parameters
    ----------
    iterables : iterable
        Any number of `iterables` to pass to :py:func:`itertools.product`.

    repeat : :py:class:`int`, optional
        The number of random samples.
        Default is ``1``.

    Returns
    -------
    sample : tuple
        A tuple containing the random samples.

    Raises
    ------
    IndexError
        If any `iterable` is empty.

    Examples
    --------
    Take one random sample::

        >>> from iteration_utilities import random_product
        >>> import random
        >>> random.seed(70)
        >>> random_product(['a', 'b'], [1, 2], [0.5, 0.25])
        ('a', 2, 0.25)

    Or take multiple samples::

        >>> random.seed(10)
        >>> random_product(['a', 'b'], [1, 2], [0.5, 0.25], repeat=5)
        ('a', 2, 0.25, 'a', 1, 0.25, 'b', 2, 0.5, 'a', 2, 0.25, 'a', 1, 0.25)

        >>> random.seed(None)
    """
    pools = [tuple(pool) for pool in iterables] * repeat
    return tuple(choice(pool) for pool in pools)


def random_permutation(iterable, r=None):
    """Random selection from :py:func:`itertools.permutations`.

    Parameters
    ----------
    iterable : iterable
        The `iterable` to permute with :py:func:`itertools.permutations`.

    r : :py:class:`int` or None, optional
        The number of elements to permute. If ``None`` use all elements from
        the iterable.
        Default is ``None``.

    Returns
    -------
    random_permutation : tuple
        The randomly chosen permutation.
Examples -------- One random permutation:: >>> from iteration_utilities import random_permutation >>> import random >>> random.seed(20) >>> random_permutation([1,2,3,4,5,6]) (6, 2, 3, 4, 1, 5) One random permutation using a subset of the `iterable` (here 3 elements):: >>> random.seed(5) >>> random_permutation([1,2,3,4,5,6], r=3) (5, 3, 6) >>> random.seed(None) """ pool = tuple(iterable) r = len(pool) if r is None else r return tuple(sample(pool, r)) def random_combination(iterable, r, replacement=False): """Random selection from :py:func:`itertools.combinations`. Parameters ---------- iterable : iterable The `iterable` to combine with :py:func:`itertools.combinations`. r : :py:class:`int` The number of elements to combine. replacement : :py:class:`bool`, optional If ``True`` then replace already included values (uses :py:func:`itertools.combinations_with_replacement`). Default is ``False``. Returns ------- random_combination : tuple The randomly chosen combination. Examples -------- >>> from iteration_utilities import random_combination >>> import random >>> random.seed(5) >>> random_combination([1,2,3,4,5,6], r=4) (3, 4, 5, 6) >>> random.seed(100) >>> random_combination([1,2,3,4,5,6], r=4, replacement=True) (2, 2, 4, 4) >>> random.seed(None) """ pool = tuple(iterable) n = len(pool) if replacement: indices = sorted(randrange(n) for _ in range(r)) else: indices = sorted(sample(range(n), r)) return tuple(pool[i] for i in indices) def tee_lookahead(tee, i): """Inspect the `i`-th upcoming value from a :py:func:`~itertools.tee` object while leaving the :py:func:`~itertools.tee` object at its current position. Parameters ---------- tee : :py:func:`itertools.tee` The tee object in which to look ahead. i : :py:class:`int` The index counting from the current position which should be peeked. Returns ------- peek : any type The element at the `i`-th upcoming index in the `tee` object. Raises ------ IndexError If the underlying iterator doesn't have enough values. 
Examples -------- >>> from iteration_utilities import tee_lookahead >>> from itertools import tee >>> t1, t2 = tee([1,2,3,4,5,6]) >>> tee_lookahead(t1, 0) 1 >>> tee_lookahead(t1, 1) 2 >>> tee_lookahead(t1, 0) 1 """ for value in islice(copy(tee), i, None): return value raise IndexError(i) 0707010000011C000081A400000000000000000000000165E3BCDA00000370000000000000000000000000000000000000003D00000000iteration_utilities-0.12.1/src/iteration_utilities/_utils.py# Licensed under Apache License Version 2.0 - see LICENSE # Built-ins import platform import sys __all__ = ['IS_CPYTHON', 'IS_CPYTHON_PY_3_12', 'IS_PYPY', 'USES_VECTORCALL', '_default'] _GE_PY38 = sys.version_info.major > 3 or (sys.version_info.major == 3 and sys.version_info.minor >= 8) IS_CPYTHON_PY_3_12 = sys.version_info.major == 3 and sys.version_info.minor == 12 IS_PYPY = platform.python_implementation() == 'PyPy' IS_CPYTHON = platform.python_implementation() == 'CPython' USES_VECTORCALL = IS_CPYTHON and _GE_PY38 class _SentinelFactory: __slots__ = ('_name', ) def __init__(self, name): self._name = name # Representation and casting to strings def __repr__(self): return str(self._name) def __str__(self): return str(self._name) _default = _SentinelFactory('<default>') 0707010000011D000041ED00000000000000000000000265E3BCDA00000000000000000000000000000000000000000000002100000000iteration_utilities-0.12.1/tests0707010000011E000081A400000000000000000000000165E3BCDA00000110000000000000000000000000000000000000002D00000000iteration_utilities-0.12.1/tests/conftest.pyimport pickle import pytest @pytest.fixture(scope="module", params=range(pickle.HIGHEST_PROTOCOL + 1)) def protocol(request): """Returns all available pickle protocols. 
This avoids needing to parametrize all test functions manually.""" yield request.param 0707010000011F000081A400000000000000000000000165E3BCDA0000076A000000000000000000000000000000000000002F00000000iteration_utilities-0.12.1/tests/helper_cls.py# Licensed under Apache License Version 2.0 - see LICENSE class T: def __init__(self, value): self.value = value def _cmp_cls_and_value(self, other): if (type(self) != type(other) or type(self.value) != type(other.value)): raise TypeError('simulated failure.') # Misc def __hash__(self): return hash(self.value) def __bool__(self): return bool(self.value) def __len__(self): return len(self.value) def __repr__(self): return '{0.__class__.__name__}({0.value})'.format(self) # Mathematical def __add__(self, other): self._cmp_cls_and_value(other) return self.__class__(self.value + other.value) def __mul__(self, other): self._cmp_cls_and_value(other) return self.__class__(self.value * other.value) def __rtruediv__(self, other): return self.__class__(other / self.value) def __pow__(self, other): if isinstance(other, T): return self.__class__(self.value**other.value) else: return self.__class__(self.value**other) def __abs__(self): return self.__class__(abs(self.value)) # Comparisons def __eq__(self, other): self._cmp_cls_and_value(other) return self.value == other.value def __lt__(self, other): self._cmp_cls_and_value(other) return self.value < other.value def __le__(self, other): self._cmp_cls_and_value(other) return self.value <= other.value def __gt__(self, other): self._cmp_cls_and_value(other) return self.value > other.value def __ge__(self, other): self._cmp_cls_and_value(other) return self.value >= other.value def toT(iterable): """Convenience to create a normal list to a list of `T` instances.""" return list(map(T, iterable)) 07070100000120000081A400000000000000000000000165E3BCDA00001991000000000000000000000000000000000000003100000000iteration_utilities-0.12.1/tests/helper_funcs.py# Licensed under Apache License Version 2.0 - 
see LICENSE

"""
This module contains callable test cases.
"""
import abc
import copy
import operator
import pickle

import pytest

import iteration_utilities
from iteration_utilities._utils import IS_CPYTHON_PY_3_12, IS_PYPY, USES_VECTORCALL

from helper_cls import T


def _skipif_wrapper(func, condition, reason):
    return pytest.mark.skipif(condition, reason=reason)(func)


def skip_on_pypy_because_cache_next_works_differently(func):
    """On PyPy the ``__next__`` cache behaves differently than on CPython;
    the exact reason has not been investigated."""
    return _skipif_wrapper(func, IS_PYPY,
                           reason='PyPy works differently with __next__ cache.')


def skip_on_pypy_because_sizeof_makes_no_sense_there(func):
    """PyPy doesn't support sys.getsizeof()."""
    return _skipif_wrapper(func, IS_PYPY,
                           reason='PyPy doesn\'t support sys.getsizeof().')


def skip_on_pypy_not_investigated_why(func):
    """Fails on PyPy - not investigated why."""
    return _skipif_wrapper(func, IS_PYPY, reason='PyPy fails here.')


def skip_on_pypy_not_investigated_why_it_segfaults(func):
    """Segfaults on PyPy - not investigated why."""
    return _skipif_wrapper(func, IS_PYPY, reason='PyPy segfaults here.')


def skip_if_vectorcall_is_not_used(func):
    """The vectorcall implementation imposes some additional restrictions
    that haven't been there before."""
    return _skipif_wrapper(func, not USES_VECTORCALL,
                           reason='pickle does not work with vectorcall')


def skip_if_not_latest_python(func):
    """Skip tests that specifically target the latest Python version."""
    return _skipif_wrapper(func, not IS_CPYTHON_PY_3_12,
                           reason='requires the latest Python version')


def iterator_copy(thing):
    """Normal copies are not officially supported, but ``itertools.tee``
    uses ``__copy__`` if it is implemented, so either both have to be
    forbidden or neither. Given that ``itertools.tee`` is a very useful
    function, ``copy.copy`` is allowed, but no guarantees are made.
    This function just makes sure the object can be copied and that the
    copy yields at least one item (``next`` is called on it)."""
    # Even though normal copies are discouraged they should be possible.
    # Cannot do "list" because it may be infinite :-)
    next(copy.copy(thing))


def iterator_setstate_list_fail(thing):
    with pytest.raises(TypeError) as exc:
        thing.__setstate__([])
    assert 'tuple' in str(exc.value) and 'list' in str(exc.value)


def iterator_setstate_empty_fail(thing):
    with pytest.raises(TypeError, match='0 given'):
        thing.__setstate__(())


def check_lengthhint_iteration(iterator, expected_start_lengthhint):
    for length in range(expected_start_lengthhint, 0, -1):
        assert operator.length_hint(iterator) == length
        next(iterator)
    assert operator.length_hint(iterator) == 0
    with pytest.raises(StopIteration):
        next(iterator)


def round_trip_pickle(obj, protocol):
    tmp = pickle.dumps(obj, protocol=protocol)
    return pickle.loads(tmp)


# Helper classes for certain fail conditions. Bundled here so the tests don't
# need to re-implement them.


def CacheNext(item):
    """Iterator that modifies its "next" method when iterated over."""
    def subiter():
        def newnext(self):
            raise CacheNext.EXC_TYP(CacheNext.EXC_MSG)
        Iterator.__next__ = newnext
        yield item

    # Need to subclass a C iterator because only the "tp_iternext" slot is
    # cached, the "__next__" method itself always behaves as expected.
    class Iterator(filter):
        pass

    return Iterator(iteration_utilities.return_True, subiter())


CacheNext.EXC_MSG = 'next call failed, because it was modified'
CacheNext.EXC_TYP = ValueError


class FailIter:
    """A class that fails when "iter" is called on it.

    This class is currently not interchangeable with a real "iter(x)"
    failure because it raises another exception.
""" EXC_MSG = 'iter call failed' EXC_TYP = ValueError def __iter__(self): raise self.EXC_TYP(self.EXC_MSG) class FailEqNoHash: """A class that fails when "==" is called on it.""" EXC_MSG = 'eq call failed' EXC_TYP = ValueError __hash__ = None def __eq__(self, other): raise self.EXC_TYP(self.EXC_MSG) class FailEqWithHash(FailEqNoHash): """A class that fails when "==" is called on it.""" def __hash__(self): return 1 class FailHash: """A class that fails when "hash" is called on it.""" EXC_MSG = 'hash call failed' EXC_TYP = ValueError def __hash__(self): raise self.EXC_TYP(self.EXC_MSG) class FailBool: """A class that fails when "bool" is called on it.""" EXC_MSG = 'bool call failed' EXC_TYP = ValueError def __bool__(self): raise self.EXC_TYP(self.EXC_MSG) class FailNext: """An iterator that fails when calling "next" on it. The parameter "offset" can be used to set the number of times "next" works before it raises an exception. """ EXC_MSG = 'next call failed' EXC_TYP = ValueError def __init__(self, offset=0, repeats=1): self.offset = offset self.repeats = repeats def __iter__(self): return self def __next__(self): if self.offset: self.offset -= 1 return T(1) else: raise self.EXC_TYP(self.EXC_MSG) class FailLengthHint: """Simple iterator that fails when length_hint is called on it.""" EXC_MSG = "length_hint call failed" EXC_TYP = ValueError def __init__(self, it): self.it = iter(it) def __iter__(self): return self def __next__(self): return next(self.it) def __length_hint__(self): raise self.EXC_TYP(self.EXC_MSG) class OverflowLengthHint: """Simple iterator that allows to set a length_hint so that one can test overflow in PyObject_LengthHint. Should be used together with "sys.maxsize" so it works on 32bit and 64bit builds. 
""" def __init__(self, it, length_hint): self.it = iter(it) self.lh = length_hint def __iter__(self): return self def __next__(self): return next(self.it) def __length_hint__(self): return self.lh class FailingIsinstanceClass(metaclass=abc.ABCMeta): EXC_MSG = 'isinstance call failed' EXC_TYP = TypeError @classmethod def __subclasshook__(cls, C): raise cls.EXC_TYP(cls.EXC_MSG) 07070100000121000081A400000000000000000000000165E3BCDA000015AE000000000000000000000000000000000000003200000000iteration_utilities-0.12.1/tests/test__c_funcs.py# Licensed under Apache License Version 2.0 - see LICENSE import gc import pytest import iteration_utilities from iteration_utilities import _iteration_utilities import helper_funcs as _hf def test_other_c_funcs(): assert iteration_utilities.return_True() assert not iteration_utilities.return_False() assert iteration_utilities.return_None() is None assert iteration_utilities.return_identity(1) == 1 assert iteration_utilities.return_first_arg(1, 2, 3) == 1 assert iteration_utilities.return_called(int) == 0 assert iteration_utilities.square(2) == 4 assert iteration_utilities.reciprocal(2) == 0.5 assert iteration_utilities.is_None(None) assert not iteration_utilities.is_None(False) assert not iteration_utilities.is_not_None(None) assert iteration_utilities.is_not_None(False) assert iteration_utilities.is_even(2) assert not iteration_utilities.is_even(1) assert iteration_utilities.is_odd(1) assert not iteration_utilities.is_odd(2) assert not iteration_utilities.is_iterable(1) assert iteration_utilities.is_iterable([1]) def test_other_c_funcs_failures(): with pytest.raises(TypeError): # no argument given. iteration_utilities.return_first_arg() with pytest.raises(TypeError): # no positional argument given. 
iteration_utilities.return_first_arg(test=10) x = object() with pytest.raises(TypeError): iteration_utilities.is_even(x) with pytest.raises(TypeError): iteration_utilities.is_odd(x) class NoBoolWithMod(_hf.FailBool): def __mod__(self, other): return self with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG): iteration_utilities.is_even(NoBoolWithMod()) with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG): iteration_utilities.is_odd(NoBoolWithMod()) with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG): iteration_utilities.is_iterable(_hf.FailIter()) def test_reverse_math_ops(): assert iteration_utilities.radd(1, 2) == 3 assert iteration_utilities.rsub(1, 2) == 1 assert iteration_utilities.rmul(1, 2) == 2 assert iteration_utilities.rdiv(1, 2) == 2 assert iteration_utilities.rfdiv(1, 2) == 2 assert iteration_utilities.rpow(1, 2) == 2 assert iteration_utilities.rmod(1, 2) == 0 for rfunc in [iteration_utilities.radd, iteration_utilities.rsub, iteration_utilities.rmul, iteration_utilities.rdiv, iteration_utilities.rfdiv, iteration_utilities.rpow, iteration_utilities.rmod]: with pytest.raises(TypeError): # Too few arguments rfunc(1) with pytest.raises(TypeError): # Too many arguments rfunc(1, 2, 3) def test_traverse(): """To test the traverse implementation we call gc.collect() while instances of all the C objects are still valid.""" acc = iteration_utilities.accumulate([]) app = iteration_utilities.applyfunc(lambda x: x, 1) cha = iteration_utilities.chained(int, float) cla = iteration_utilities.clamp([], 0, 1) com = iteration_utilities.complement(int) con = iteration_utilities.constant(1) dee = iteration_utilities.deepflatten([]) dup = iteration_utilities.duplicates([]) fli = iteration_utilities.flip(int) gro = iteration_utilities.grouper([], 2) ine = iteration_utilities.intersperse([], 1) iik = iteration_utilities.ItemIdxKey(10, 2) ite = iteration_utilities.iter_except(int, TypeError) mer = iteration_utilities.merge([]) nth 
= iteration_utilities.nth(1) pac = iteration_utilities.packed(int) par = iteration_utilities.partial(int, 10) rep = iteration_utilities.replicate([], 3) rou = iteration_utilities.roundrobin([]) see = iteration_utilities.Seen() sid = iteration_utilities.sideeffects([], lambda x: x) spl = iteration_utilities.split([], lambda x: True) sta = iteration_utilities.starfilter(lambda x: True, []) suc = iteration_utilities.successive([]) tab = iteration_utilities.tabulate(int) une = iteration_utilities.unique_everseen([]) unj = iteration_utilities.unique_justseen([]) gc.collect() @_hf.skip_on_pypy_not_investigated_why def test_c_funcs_signatures(): # Makes sure every user-facing C function has a valid signature. from iteration_utilities import Iterable, chained from itertools import chain from operator import itemgetter from inspect import Signature # Get all C functions it = (Iterable(chain(_iteration_utilities.__dict__.items())) # only include those that do not start with an underscore, # we only need user-facing functions/classes .filterfalse(lambda x: x[0].startswith(('_'))) # only include those that have a __module__, to exclude things # like "return_None", "first" which do not have a signature .filter(lambda x: hasattr(x[1], '__module__')) # only include those that are really part of the package .filter(lambda x: x[1].__module__.startswith('iteration_utilities')) # remove duplicates .unique_everseen(itemgetter(0)) # get the signature, fails if it can't .map(lambda x: (x[0], x[1], Signature.from_callable(x[1])))) # Just need to trigger evaluation, use sorted because it's nice for manual # debugging! 
    it.get_sorted(key=chained(itemgetter(0), str.lower))


07070100000122000081A400000000000000000000000165E3BCDA000002B1000000000000000000000000000000000000002F00000000iteration_utilities-0.12.1/tests/test__docs.py
# Licensed under Apache License Version 2.0 - see LICENSE
import doctest

import pytest

import iteration_utilities
from iteration_utilities import _iteration_utilities

import helper_funcs as _hf


@_hf.skip_if_not_latest_python
@pytest.mark.parametrize("mod", [iteration_utilities,
                                 _iteration_utilities,
                                 iteration_utilities._recipes,
                                 iteration_utilities._additional_recipes,
                                 iteration_utilities._classes,
                                 ])
def test_doctests(mod):
    # classes are added to the main module code. :-)
    doctest.testmod(mod, raise_on_error=True)
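The signature test above relies on `inspect.Signature.from_callable`, which raises when a C callable does not expose a parseable signature; that is what makes the test fail loudly for an offending function. A small sketch of the same check (using `sample_function` as a hypothetical stand-in for a user-facing callable):

```python
from inspect import Signature

def sample_function(iterable, key=None):
    """Hypothetical stand-in for a user-facing function."""
    return sorted(iterable, key=key)

# Signature.from_callable raises ValueError if no signature can be
# determined, so merely constructing it is already the check.
sig = Signature.from_callable(sample_function)
assert list(sig.parameters) == ['iterable', 'key']
```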
All removed dct = {'a': default, 'b': default} _parse_kwargs(dct, default) assert dct == {} def test_parsekwargs_normal4(): # Tests an implementation detail: For more than 5 elements it allocates an # array on the heap (for less elements in the dict it uses an array on the # stack). dct = { 'a': default, 'b': default, 'c': default, 'd': default, 'e': 1, 'f': 2 } _parse_kwargs(dct, default) assert dct == {'e': 1, 'f': 2} 07070100000124000081A400000000000000000000000165E3BCDA00000567000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/tests/test__python_cls.py# Licensed under Apache License Version 2.0 - see LICENSE import operator import pytest from iteration_utilities import Iterable from iteration_utilities._utils import _default def test_sentinelfactory(): as_str = str(_default) as_repr = repr(_default) assert as_str == as_repr assert as_str == "<default>" def test_cls_length_hint(): assert operator.length_hint(Iterable([1, 2, 3])) == 3 assert operator.length_hint(Iterable([1, 2, 3]).accumulate()) == 3 def test_islice_no_arguments(): # This previously raised an IndexError, this makes sure the correct # exception is raised (#250). with pytest.raises(TypeError): Iterable([1]).islice() def test_cls_exception(): with pytest.raises(TypeError): Iterable.from_count().pad(ntail=None) # __getitem__ : negative idx with pytest.raises(ValueError): Iterable(range(10))[-2] # __getitem__ : negative step with pytest.raises(ValueError): Iterable(range(10))[::-2] # __getitem__ : positive start with negative stop with pytest.raises(ValueError): Iterable(range(10))[2:-1] # __getitem__ : negative start/stop with infinite iterable. 
with pytest.raises(TypeError): Iterable.from_count()[-5:-3] # __getitem__ : not int, not slice with pytest.raises(TypeError): Iterable.from_count()['bad'] 07070100000125000081A400000000000000000000000165E3BCDA00000BFC000000000000000000000000000000000000003700000000iteration_utilities-0.12.1/tests/test__python_funcs.py# Licensed under Apache License Version 2.0 - see LICENSE from itertools import tee import pytest import iteration_utilities def test_ipartition_with_none_predicate(): f, t = iteration_utilities.ipartition([0, 1, 0, 1], pred=None) assert list(f) == [0, 0] assert list(t) == [1, 1] def test_exceptions(): # Random product doesn't work with empty iterables with pytest.raises(IndexError): iteration_utilities.random_product([]) # There is no element 10 in the tee object so this will raise the # Exception. t1, t2 = tee([1, 2, 3, 4, 5]) with pytest.raises(IndexError): iteration_utilities.tee_lookahead(t1, 10) # Missing idx or start/stop in replace/remove/getitem with pytest.raises(TypeError): iteration_utilities.replace([1, 2, 3], 5) with pytest.raises(TypeError): iteration_utilities.remove([1, 2, 3]) with pytest.raises(TypeError): iteration_utilities.getitem([1, 2, 3]) # Stop smaller than start in replace/remove with pytest.raises(ValueError): iteration_utilities.replace(range(10), 5, start=7, stop=5) with pytest.raises(ValueError): iteration_utilities.remove(range(10), start=7, stop=5) # idx smaller than -1 in getitem with pytest.raises(ValueError): iteration_utilities.getitem(range(10), (4, 2, -3, 9)) with pytest.raises(ValueError): iteration_utilities.nth_combination([1], r=-1, index=0) with pytest.raises(ValueError): iteration_utilities.nth_combination([1], r=2, index=0) with pytest.raises(IndexError): iteration_utilities.nth_combination([1, 2, 3, 4], r=2, index=-20) with pytest.raises(IndexError): iteration_utilities.nth_combination([1, 2, 3, 4], r=2, index=20) def test_empty_input(): empty = [] assert list(iteration_utilities 
.combinations_from_relations({}, 1)) == [] assert list(iteration_utilities .combinations_from_relations({'a': [1, 2, 3]}, 2)) == [] assert iteration_utilities.consume(empty, 2) is None assert list(iteration_utilities.flatten(empty)) == [] assert list(iteration_utilities.getitem( range(10), empty)) == [] x, y = iteration_utilities.ipartition(empty, lambda x: x) assert list(x) == [] and list(y) == [] # no need to test iter_subclasses here assert list(iteration_utilities.ncycles(empty, 10)) == [] assert list(iteration_utilities.powerset(empty)) == [()] assert iteration_utilities.random_combination(empty, 0) == () assert iteration_utilities.random_combination(empty, 0, True) == () assert iteration_utilities.random_permutation(empty, 0) == () assert list(iteration_utilities.remove( range(10), empty)) == list(range(10)) assert list(iteration_utilities.replace( range(10), 20, empty)) == list(range(10)) # no need to test repeatfunc here # no need to test tabulate here assert list(iteration_utilities.tail(empty, 2)) == [] # no need to test tee_lookahead here 07070100000126000081A400000000000000000000000165E3BCDA00000FD7000000000000000000000000000000000000003400000000iteration_utilities-0.12.1/tests/test_accumulate.py# Licensed under Apache License Version 2.0 - see LICENSE import operator import pickle import sys import pytest from iteration_utilities import accumulate import helper_funcs as _hf from helper_cls import T, toT def test_accumulate_empty1(): assert list(accumulate([])) == [] def test_accumulate_normal1(): assert list(accumulate([T(1), T(2), T(3)])) == [T(1), T(3), T(6)] def test_accumulate_normal2(): # binop=None is identical to no binop assert list(accumulate([], None)) == [] def test_accumulate_normal3(): # binop=None is identical to no binop assert list(accumulate([T(1), T(2), T(3)], None)) == [T(1), T(3), T(6)] def test_accumulate_binop1(): assert list(accumulate([T(1), T(2), T(3), T(4)], operator.add)) == [T(1), T(3), T(6), T(10)] def 
test_accumulate_binop2():
    assert list(accumulate([T(1), T(2), T(3), T(4)], operator.mul)) == [T(1), T(2), T(6), T(24)]

def test_accumulate_initial1():
    assert list(accumulate([T(1), T(2), T(3)], None, T(10))) == [T(11), T(13), T(16)]

def test_accumulate_failure1():
    with pytest.raises(TypeError):
        list(accumulate([T(1), T(2), T(3)], None, T('a')))

def test_accumulate_failure2():
    with pytest.raises(TypeError):
        list(accumulate([T(1), T(2), T(3)], operator.add, T('a')))

def test_accumulate_failure3():
    with pytest.raises(TypeError):
        list(accumulate([T('a'), T(2), T(3)]))

def test_accumulate_failure4():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(accumulate(_hf.FailNext()))

def test_accumulate_failure5():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        accumulate(_hf.FailIter())

def test_accumulate_failure6():
    # Too few arguments
    with pytest.raises(TypeError):
        accumulate()

def test_accumulate_copy1():
    _hf.iterator_copy(accumulate(toT([1, 2, 3])))

def test_accumulate_pickle1(protocol):
    acc = accumulate([T(1), T(2), T(3), T(4)])
    assert next(acc) == T(1)
    x = pickle.dumps(acc, protocol=protocol)
    assert list(pickle.loads(x)) == [T(3), T(6), T(10)]

def test_accumulate_pickle2(protocol):
    acc = accumulate([T(1), T(2), T(3), T(4)])
    x = pickle.dumps(acc, protocol=protocol)
    assert list(pickle.loads(x)) == [T(1), T(3), T(6), T(10)]

def test_accumulate_pickle3(protocol):
    acc = accumulate([T(1), T(2), T(3), T(4)], operator.mul)
    assert next(acc) == T(1)
    x = pickle.dumps(acc, protocol=protocol)
    assert list(pickle.loads(x)) == [T(2), T(6), T(24)]

def test_accumulate_pickle4(protocol):
    acc = accumulate([T(1), T(2), T(3), T(4)], None, T(4))
    x = pickle.dumps(acc, protocol=protocol)
    assert list(pickle.loads(x)) == [T(5), T(7), T(10), T(14)]

def test_accumulate_attributes1():
    it = accumulate(toT([1, 2, 3]))
    assert it.func is None
    with pytest.raises(AttributeError):
        it.current
    for item in it:
        assert item == it.current
    assert it.func is None

def test_accumulate_attributes2():
    it = accumulate(toT([1, 2, 3]), operator.add)
    for item in it:
        assert item == it.current
    assert it.func is operator.add

def test_accumulate_lengthhint1():
    it = accumulate([1, 2, 3, 4])
    _hf.check_lengthhint_iteration(it, 4)

def test_accumulate_lengthhint_failure1():
    f_it = _hf.FailLengthHint(toT([1, 2, 3]))
    acc = accumulate(f_it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG):
        operator.length_hint(acc)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG):
        list(acc)

def test_accumulate_lengthhint_failure2():
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1)
    acc = accumulate(of_it)
    with pytest.raises(OverflowError):
        operator.length_hint(acc)
    with pytest.raises(OverflowError):
        list(acc)

# File: iteration_utilities-0.12.1/tests/test_alldistinct.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import all_distinct

import helper_funcs as _hf
from helper_cls import T

def test_alldistinct_empty1():
    assert all_distinct([])

def test_alldistinct_normal1():
    assert all_distinct([T(1), T(2), T(3)])

def test_alldistinct_normal2():
    assert not all_distinct([T(1), T(1), T(1)])

def test_alldistinct_normal3():
    # generator
    assert all_distinct((i for i in [T(1), T(2), T(3)]))

def test_alldistinct_unhashable1():
    assert all_distinct([{T('a'): T(1)}, {T('a'): T(2)}])

def test_alldistinct_unhashable2():
    assert not all_distinct([{T('a'): T(1)}, {T('a'): T(1)}])

def test_alldistinct_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        all_distinct(_hf.FailIter())

def test_alldistinct_failure2():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        all_distinct(_hf.FailNext())

def test_alldistinct_failure3():
    # Failure when comparing the object to the objects in the list
    with pytest.raises(_hf.FailEqNoHash.EXC_TYP, match=_hf.FailEqNoHash.EXC_MSG):
        all_distinct([[T(1)], _hf.FailEqNoHash()])

def test_alldistinct_failure4():
    # Failure (no TypeError) when trying to hash the value
    with pytest.raises(_hf.FailHash.EXC_TYP, match=_hf.FailHash.EXC_MSG):
        all_distinct([T(1), _hf.FailHash()])

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_alldistinct_failure5():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        all_distinct(_hf.CacheNext(1))

# File: iteration_utilities-0.12.1/tests/test_allequal.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import all_equal

import helper_funcs as _hf
from helper_cls import T

def test_all_equal_empty1():
    assert all_equal([])

def test_all_equal_normal1():
    assert all_equal([T(1), T(1), T(1)])

def test_all_equal_normal2():
    assert not all_equal([T(1), T(1), T(2)])

def test_all_equal_normal3():
    # generator
    assert all_equal(i for i in [T(1), T(1), T(1)])

def test_all_equal_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        all_equal(_hf.FailIter())

def test_all_equal_failure2():
    # comparison fail
    with pytest.raises(TypeError):
        all_equal([T(1), T('a')])

def test_all_equal_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        all_equal(_hf.FailNext())

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_all_equal_failure4():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        all_equal(_hf.CacheNext(1))
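# The semantics exercised by the `all_distinct` and `all_equal` tests above have
# straightforward pure-Python equivalents. The sketch below (function names
# `all_distinct_py`/`all_equal_py` are my own, not part of the package) mirrors the
# tested behavior, including the fall back to equality comparison for unhashable
# items such as dicts; it is not the package's C implementation.

```python
def all_distinct_py(iterable):
    # Track hashable items in a set for O(1) lookups; fall back to a
    # linear scan over a list for unhashable items (e.g. dicts).
    seen_hashable = set()
    seen_unhashable = []
    for item in iterable:
        try:
            if item in seen_hashable:
                return False
            seen_hashable.add(item)
        except TypeError:  # item is unhashable
            if item in seen_unhashable:
                return False
            seen_unhashable.append(item)
    return True

def all_equal_py(iterable):
    # Every item must compare equal to the first; an empty iterable
    # is vacuously all-equal, matching test_all_equal_empty1.
    it = iter(iterable)
    try:
        first = next(it)
    except StopIteration:
        return True
    return all(item == first for item in it)
```

# Like the tested functions, both sketches short-circuit on the first
# counterexample rather than consuming the whole iterable.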
# File: iteration_utilities-0.12.1/tests/test_allisinstance.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import all_isinstance

import helper_funcs as _hf
from helper_cls import T, toT

def test_allisinstance_empty1():
    assert all_isinstance([], T)

def test_allisinstance_normal1():
    assert all_isinstance(toT([1, 2, 3]), T)

def test_allisinstance_normal2():
    assert not all_isinstance(toT([1, 2, 3]) + [10], T)

def test_allisinstance_normal3():
    # using a generator (raises a StopIteration)
    assert all_isinstance((i for i in toT([1, 2, 3])), T)

def test_allisinstance_one_type():
    assert all_isinstance([1, 2, 3], int)

def test_allisinstance_multiple_types():
    assert all_isinstance([1, 2, 3, 4.5], (int, float))

def test_allisinstance_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        all_isinstance(_hf.FailIter(), T)

def test_allisinstance_failure2():
    # not enough arguments
    with pytest.raises(TypeError):
        all_isinstance([T(1)])

def test_allisinstance_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        all_isinstance(_hf.FailNext(), T)

def test_allisinstance_failure4():
    # Test failing isinstance operation
    with pytest.raises(_hf.FailingIsinstanceClass.EXC_TYP, match=_hf.FailingIsinstanceClass.EXC_MSG):
        all_isinstance(toT([1, 2, 3]), _hf.FailingIsinstanceClass)

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_allisinstance_failure5():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        all_isinstance(_hf.CacheNext(1), int)

# File: iteration_utilities-0.12.1/tests/test_allmonotone.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import all_monotone

import helper_funcs as _hf
from helper_cls import T

def test_all_monotone_empty1():
    assert all_monotone([])

def test_all_monotone_normal1():
    assert all_monotone([T(1), T(1), T(1)])

def test_all_monotone_normal2():
    assert not all_monotone([T(1), T(1), T(1)], strict=True)

def test_all_monotone_normal3():
    assert all_monotone([T(1), T(2), T(3)])

def test_all_monotone_normal4():
    assert all_monotone([T(1), T(2), T(3)], strict=True)

def test_all_monotone_normal5():
    assert all_monotone([T(1), T(1), T(1)], decreasing=True)

def test_all_monotone_normal6():
    assert not all_monotone([T(1), T(1), T(1)], decreasing=True, strict=True)

def test_all_monotone_normal7():
    assert all_monotone([T(3), T(2), T(1)], decreasing=True)

def test_all_monotone_normal8():
    assert all_monotone([T(3), T(2), T(1)], decreasing=True, strict=True)

def test_all_monotone_normal9():
    # generator
    assert all_monotone(i for i in [T(1), T(1), T(1)])

def test_all_monotone_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        all_monotone(_hf.FailIter())

def test_all_monotone_failure2():
    # comparison fail
    with pytest.raises(TypeError):
        all_monotone([T(1), T('a')])

def test_all_monotone_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        all_monotone(_hf.FailNext())

def test_all_monotone_failure4():
    # too few arguments
    with pytest.raises(TypeError):
        all_monotone()

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_all_monotone_failure5():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        all_monotone(_hf.CacheNext(1))

# File: iteration_utilities-0.12.1/tests/test_always_iterable.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import always_iterable

import helper_funcs as _hf

def test_always_iterable_with_iterable():
    assert tuple(always_iterable((1, 2, 3))) == (1, 2, 3)
    assert tuple(always_iterable((1, 2, 3))) == (1, 2, 3)

def test_always_iterable_with_string():
    assert list(always_iterable("abc")) == ["abc"]
    assert list(always_iterable(b"abc")) == [b"abc"]

def test_always_iterable_excluding():
    assert list(always_iterable("abc", excluded_types=None)) == ["a", "b", "c"]
    assert list(always_iterable([1, 2, 3], excluded_types=list)) == [[1, 2, 3]]
    assert list(always_iterable([1, 2, 3], excluded_types=tuple)) == [1, 2, 3]

def test_always_iterable_str_subclass():
    """The default is that only plain `str` or `bytes` are wrapped, but if
    given explicitly also subclasses should work.
    """
    class StringSubClass(str):
        ...

    assert list(always_iterable(StringSubClass("abc"))) == ["a", "b", "c"]
    assert list(always_iterable(StringSubClass("abc"), excluded_types=str)) == ["abc"]

def test_always_iterable_empty_when_none():
    assert list(always_iterable(None)) == [None]
    assert list(always_iterable(None, empty_if_none=True)) == []
    assert list(always_iterable(1, empty_if_none=True)) == [1]

def test_always_iterable_invalid_argument():
    with pytest.raises(TypeError):
        always_iterable()
    with pytest.raises(TypeError):
        always_iterable([1, 2, 3], excluded=None)

def test_always_iterable_not_iterable():
    assert list(always_iterable(1)) == [1]

def test_always_iterable_fails_with_non_typeerror_when_iter():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        always_iterable(_hf.FailIter())

def test_always_iterable_fails_isinstance():
    with pytest.raises(_hf.FailingIsinstanceClass.EXC_TYP, match=_hf.FailingIsinstanceClass.EXC_MSG):
        always_iterable(1, _hf.FailingIsinstanceClass)

# File: iteration_utilities-0.12.1/tests/test_anyisinstance.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import any_isinstance

import helper_funcs as _hf
from helper_cls import T, toT

def test_anyisinstance_empty1():
    assert not any_isinstance([], T)

def test_anyisinstance_normal1():
    assert not any_isinstance(toT([1, 2, 3]), int)

def test_anyisinstance_normal2():
    assert any_isinstance(toT([1, 2, 3]), T)

def test_anyisinstance_normal3():
    assert any_isinstance(toT([1, 2, 3]) + [10], int)

def test_anyisinstance_normal4():
    # using a generator (raises a StopIteration)
    assert not any_isinstance((i for i in toT([1, 2, 3])), int)

def test_anyisinstance_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        any_isinstance(_hf.FailIter(), T)

def test_anyisinstance_failure2():
    # not enough arguments
    with pytest.raises(TypeError):
        any_isinstance([T(1)])

def test_anyisinstance_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        any_isinstance(_hf.FailNext(), T)

def test_anyisinstance_failure4():
    # Test failing isinstance operation
    with pytest.raises(_hf.FailingIsinstanceClass.EXC_TYP, match=_hf.FailingIsinstanceClass.EXC_MSG):
        any_isinstance(toT([1, 2, 3]), _hf.FailingIsinstanceClass)

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_anyisinstance_failure5():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        any_isinstance(_hf.CacheNext(1), float)

# File: iteration_utilities-0.12.1/tests/test_applyfunc.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pickle

import pytest

import iteration_utilities
from iteration_utilities import applyfunc, getitem

from helper_cls import T
from helper_funcs import iterator_copy

def test_applyfunc_normal1():
    assert list(getitem(applyfunc(lambda x: x**T(2), T(2)), stop=3)) == [T(4), T(16), T(256)]

def test_applyfunc_normal2():
    assert list(getitem(applyfunc(lambda x: x, T(2)), stop=3)) == [T(2), T(2), T(2)]

def test_applyfunc_failure1():
    with pytest.raises(TypeError):
        list(getitem(applyfunc(lambda x: x**T(2), T('a')), stop=3))

def test_applyfunc_attributes1():
    it = applyfunc(iteration_utilities.square, 2)
    assert it.func is iteration_utilities.square
    assert it.current == 2

def test_applyfunc_failure2():
    # Too few arguments
    with pytest.raises(TypeError):
        applyfunc(bool)

def test_applyfunc_copy1():
    iterator_copy(applyfunc(lambda x: x**T(2), T(2)))

def test_applyfunc_pickle1(protocol):
    apf = applyfunc(iteration_utilities.square, T(2))
    assert next(apf) == T(4)
    x = pickle.dumps(apf, protocol=protocol)
    assert next(pickle.loads(x)) == T(16)

# File: iteration_utilities-0.12.1/tests/test_argminmax.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

from iteration_utilities import argmin, argmax

import helper_funcs as _hf
from helper_cls import T

def test_argmax_normal1():
    # Just one test for argmax because the internals are identical to argmin
    assert argmax(T(0), T(1), T(2)) == 2

def test_argmin_positional1():
    assert argmin(T(0), T(1), T(2)) == 0

def test_argmin_positional2():
    assert argmin(T(3), T(0), T(1)) == 1

def test_argmin_positional3():
    # key=None is identical to no key
    assert argmin(T(3), T(1), T(2), key=None) == 1

def test_argmin_sequence1():
    assert argmin([T(3), T(0), T(1)]) == 1

def test_argmin_generator1():
    assert argmin((T(i) for i in [5, 4, 3, 2])) == 3

def test_argmin_key1():
    assert argmin([T(3), T(-2), T(1)], key=abs) == 2

def test_argmin_default1():
    assert argmin([], default=2) == 2

def test_argmin_failure1():
    # default not possible if given multiple positional arguments
    with pytest.raises(TypeError):
        argmin(T(3), T(0), T(1), default=1)

def test_argmin_failure2():
    # not integer default value
    with pytest.raises(TypeError):
        argmin([T(3), T(0), T(1)], default='1.5', key=lambda x: x + 1)

def test_argmin_failure3():
    # unwanted kwarg
    with pytest.raises(TypeError):
        argmin([T(3), T(0), T(1)], default=1, key=lambda x: x + 1, blub=10)

def test_argmin_failure4():
    # no args
    with pytest.raises(TypeError):
        argmin(key=lambda x: x + 1)

def test_argmin_failure5():
    # empty sequence
    with pytest.raises(ValueError):
        argmin([], key=lambda x: x + 1)

def test_argmin_failure6():
    # cmp failed
    with pytest.raises(TypeError):
        argmin([T(1), T(2), T('a')], key=lambda x: x)

def test_argmin_failure7():
    # key failed
    with pytest.raises(TypeError):
        argmin([T(1), T(2), T('a')], key=lambda x: x + 1)

def test_argmin_failure8():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        argmin(_hf.FailNext())

def test_argmin_failure9():
    # Test that a failing iterator doesn't raise a SystemError
    # with default
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        argmin(_hf.FailNext(), default=1)

def test_argmin_failure10():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        argmin(_hf.FailIter())

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_argmin_failure11():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        argmin(_hf.CacheNext(1))

# File: iteration_utilities-0.12.1/tests/test_chained.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pickle

import pytest

import iteration_utilities
from iteration_utilities import chained

import helper_funcs
from helper_cls import T as Original_T

class T(Original_T):
    def __add__(self, other):
        return self.__class__(self.value + other)

    def __mul__(self, other):
        return self.__class__(self.value * other)

    def __rtruediv__(self, other):
        return self.__class__(other / self.value)

    def __pow__(self, other):
        return self.__class__(self.value ** other)

class ChainedSubclass(chained):
    pass

def test_chained_repr1():
    x = chained(int, float)
    r = repr(x)
    assert 'chained' in r
    assert 'int' in r
    assert 'float' in r
    assert 'all=False' in r

def test_chained_repr2():
    x = chained(int, float, complex, str, reverse=True, all=True)
    r = repr(x)
    assert 'chained' in r
    assert 'int' in r
    assert 'float' in r
    assert 'complex' in r
    assert 'str' in r
    assert 'all=True' in r

def test_chained_from_chained1():
    # Verify that unwrapping works
    x = chained(int, chained(float, complex), str)
    funcs = x.__reduce__()[1]
    assert funcs[0] is int
    assert funcs[1] is float
    assert funcs[2] is complex
    assert funcs[3] is str

def test_chained_from_chained2():
    # Verify that unwrapping works (chained from chained)
    x = chained(chained(float, complex))
    funcs = x.__reduce__()[1]
    assert funcs[0] is float
    assert funcs[1] is complex

def test_chained_from_chained3():
    # Verify that unwrapping works
    x = chained(int, chained(float, complex), str, reverse=True)
    funcs = x.__reduce__()[1]
    assert funcs[0] is str
    assert funcs[1] is float  # still second!
    assert funcs[2] is complex  # still third!
    assert funcs[3] is int

def test_chained_from_chained4():
    # Verify that unwrapping works with multiple chained
    x = chained(chained(int, list), chained(float, complex), chained(str, tuple, set))
    funcs = x.__reduce__()[1]
    assert funcs[0] is int
    assert funcs[1] is list
    assert funcs[2] is float
    assert funcs[3] is complex
    assert funcs[4] is str
    assert funcs[5] is tuple
    assert funcs[6] is set

def test_chained_from_chained5():
    # Verify that unwrapping does NOT work if the inner chained has all
    x = chained(int, chained(float, complex, all=True), str)
    funcs = x.__reduce__()[1]
    assert funcs[0] is int
    assert type(funcs[1]) is chained  # no unwrapping
    assert funcs[2] is str

def test_chained_from_chained6():
    # Verify that unwrapping does NOT work if the outer chained has all
    x = chained(int, chained(float, complex), str, all=True)
    funcs = x.__reduce__()[1]
    assert funcs[0] is int
    assert type(funcs[1]) is chained  # no unwrapping
    assert funcs[2] is str

def test_chained_normal1():
    double_increment = chained(lambda x: x*2, lambda x: x+1)
    assert double_increment(T(10)) == T(21)
    assert double_increment(T(2)) == T(5)

def test_chained_reverse1():
    double_increment = chained(lambda x: x*2, lambda x: x+1, reverse=True)
    assert double_increment(T(10)) == T(22)
    assert double_increment(T(2)) == T(6)

def test_chained_all1():
    double_increment = chained(lambda x: x*2, lambda x: x+1, all=True)
    assert double_increment(T(10)) == (T(20), T(11))
    assert double_increment(T(2)) == (T(4), T(3))

def test_chained_attributes1():
    chd = chained(bool, int)
    assert chd.funcs == (bool, int)
    assert not chd.all

def test_chained_failure1():
    with pytest.raises(TypeError):
        # at least one func must be present
        chained()

def test_chained_failure2():
    with pytest.raises(TypeError):
        # kwarg not accepted
        chained(lambda x: x+1, invalidkwarg=lambda x: x*2)

def test_chained_failure3():
    with pytest.raises(TypeError):
        # func fails
        chained(lambda x: x+1)(T('a'))

def test_chained_failure4():
    with pytest.raises(TypeError):
        # second func fails
        chained(lambda x: x*2, lambda x: x+1)(T('a'))

def test_chained_failure5():
    with pytest.raises(TypeError):
        # second func fails
        chained(lambda x: x*2, lambda x: x+1, all=True)(T('a'))

def test_chained_failure_setstate1():
    helper_funcs.iterator_setstate_list_fail(chained(lambda x: x))

def test_chained_failure_setstate2():
    helper_funcs.iterator_setstate_empty_fail(chained(lambda x: x))

def test_chained_pickle1(protocol):
    cmp = chained(iteration_utilities.square, iteration_utilities.reciprocal)
    x = pickle.dumps(cmp, protocol=protocol)
    assert pickle.loads(x)(T(10)) == T(1/100)
    assert pickle.loads(x)(T(2)) == T(1/4)

def test_chained_pickle2(protocol):
    cmp = chained(iteration_utilities.square, iteration_utilities.double, reverse=True)
    x = pickle.dumps(cmp, protocol=protocol)
    assert pickle.loads(x)(T(10)) == T(400)
    assert pickle.loads(x)(T(3)) == T(36)

def test_chained_pickle3(protocol):
    cmp = chained(iteration_utilities.square, iteration_utilities.double, all=True)
    x = pickle.dumps(cmp, protocol=protocol)
    assert pickle.loads(x)(T(10)) == (T(100), T(20))
    assert pickle.loads(x)(T(3)) == (T(9), T(6))

# File: iteration_utilities-0.12.1/tests/test_clamp.py
# Licensed under Apache License Version 2.0 - see LICENSE
import operator
import pickle
import sys

import pytest

from iteration_utilities import clamp

import helper_funcs as _hf
from helper_cls import T, toT

def test_clamp_empty1():
    assert list(clamp([], T(10), T(100))) == []

def test_clamp_normal1():
    assert list(clamp(toT(range(10)), T(2), T(7))) == toT([2, 3, 4, 5, 6, 7])

def test_clamp_normal2():
    # only low
    assert list(clamp(toT(range(10)), T(2))) == toT([2, 3, 4, 5, 6, 7, 8, 9])

def test_clamp_normal3():
    # only high
    assert list(clamp(toT(range(10)), high=T(7))) == toT([0, 1, 2, 3, 4, 5, 6, 7])

def test_clamp_normal4():
    # both, inclusive
    assert list(clamp(toT(range(10)), low=T(2), high=T(7), inclusive=True)) == toT([3, 4, 5, 6])

def test_clamp_normal5():
    # no low/high
    assert list(clamp(toT(range(10)))) == toT([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])

def test_clamp_normal6():
    # only low without remove
    assert list(clamp(toT(range(10)), T(2), remove=False)) == (
        toT([2, 2, 2, 3, 4, 5, 6, 7, 8, 9]))

def test_clamp_normal7():
    # only high without remove
    assert list(clamp(toT(range(10)), high=T(7), remove=False)) == toT([0, 1, 2, 3, 4, 5, 6, 7, 7, 7])

def test_clamp_normal8():
    # both without remove
    assert list(clamp(toT(range(10)), low=T(2), high=T(7), remove=False)) == toT([2, 2, 2, 3, 4, 5, 6, 7, 7, 7])

def test_clamp_normal9():
    # no low/high (given as None)
    assert list(clamp(toT(range(10)), None, None)) == toT([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])

def test_clamp_attributes1():
    it = clamp(toT(range(5)), T(1))
    assert it.low == T(1)
    assert it.high is None
    assert it.remove
    assert not it.inclusive

def test_clamp_failure1():
    with pytest.raises(TypeError):
        list(clamp(toT(range(10)), T('a'), T(3)))

def test_clamp_failure2():
    with pytest.raises(TypeError):
        list(clamp(map(T, range(10)), T(3), T('a')))

def test_clamp_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(clamp(_hf.FailNext()))

def test_clamp_failure4():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        clamp(_hf.FailIter())

def test_clamp_failure5():
    # Too few arguments
    with pytest.raises(TypeError):
        clamp()

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_clamp_failure6():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        # needs to be outside of the range and "remove=True"
        list(clamp(_hf.CacheNext(1), 2, remove=True))

def test_clamp_copy1():
    _hf.iterator_copy(clamp([T(20), T(50)], T(10), T(100)))

def test_clamp_pickle1(protocol):
    clmp = clamp(toT(range(10)), T(2), T(7))
    assert next(clmp) == T(2)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([3, 4, 5, 6, 7])

def test_clamp_pickle2(protocol):
    # inclusive
    clmp = clamp(map(T, range(10)), T(2), T(7), inclusive=True)
    assert next(clmp) == T(3)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([4, 5, 6])

def test_clamp_pickle3(protocol):
    # only low
    clmp = clamp(map(T, range(10)), T(2))
    assert next(clmp) == T(2)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([3, 4, 5, 6, 7, 8, 9])

def test_clamp_pickle4(protocol):
    # only high
    clmp = clamp(map(T, range(10)), high=T(7))
    assert next(clmp) == T(0)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 2, 3, 4, 5, 6, 7])

def test_clamp_pickle5(protocol):
    # only high, with inclusive
    clmp = clamp(map(T, range(10)), high=T(7), inclusive=True)
    assert next(clmp) == T(0)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 2, 3, 4, 5, 6])

def test_clamp_pickle6(protocol):
    # only low, with inclusive
    clmp = clamp(map(T, range(10)), T(2), inclusive=True)
    assert next(clmp) == T(3)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([4, 5, 6, 7, 8, 9])

def test_clamp_pickle7(protocol):
    # no low no high
    clmp = clamp(map(T, range(10)))
    assert next(clmp) == T(0)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 2, 3, 4, 5, 6, 7, 8, 9])

def test_clamp_pickle8(protocol):
    # only high but without remove
    clmp = clamp(map(T, range(10)), high=T(7), remove=False)
    assert next(clmp) == T(0)
    x = pickle.dumps(clmp, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 2, 3, 4, 5, 6, 7, 7, 7])

def test_clamp_lengthhint1():
    # When remove=False we can determine the length-hint.
    it = clamp(toT(range(5)), low=T(2), high=T(5), remove=False)
    _hf.check_lengthhint_iteration(it, 5)

def test_clamp_lengthhint2():
    # When low and high are not given we can determine the length-hint
    it = clamp(toT(range(5)))
    _hf.check_lengthhint_iteration(it, 5)

def test_clamp_lengthhint3():
    # Only works if "remove=False", otherwise the length-hint simply returns 0.
    it = clamp(toT(range(5)), low=T(2), high=T(5), remove=True)
    assert operator.length_hint(it) == 0

def test_clamp_lengthhint_failure1():
    f_it = _hf.FailLengthHint(toT([1, 2, 3]))
    it = clamp(f_it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG):
        operator.length_hint(it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG):
        list(it)

def test_clamp_lengthhint_failure2():
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1)
    it = clamp(of_it)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)

# File: iteration_utilities-0.12.1/tests/test_complement.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pickle

import pytest

import iteration_utilities
from iteration_utilities import complement

import helper_funcs as _hf

def test_complement_repr1():
    x = complement(int)
    r = repr(x)
    assert 'complement' in r
    assert 'int' in r

def test_complement_attributes1():
    x = complement(int)
    assert x.func is int

def test_complement_normal1():
    assert not complement(lambda x: x is True)(True)

def test_complement_normal2():
    assert complement(lambda x: x is True)(False)

def test_complement_normal3():
    assert complement(lambda x: x is False)(True)

def test_complement_normal4():
    assert not complement(lambda x: x is False)(False)

def test_complement_normal5():
    assert not complement(iteration_utilities.is_None)(None)

def test_complement_normal6():
    assert complement(iteration_utilities.is_None)(False)

def test_complement_normal7():
    assert complement(iteration_utilities.is_None)(True)

def test_complement_failure1():
    # Function raises an Exception
    def failingfunction(x):
        raise ValueError('bad function')

    with pytest.raises(ValueError, match='bad function'):
        complement(failingfunction)(1)

def test_complement_failure2():
    # Function returns an object that cannot be interpreted as boolean
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        complement(lambda x: _hf.FailBool())(1)

def test_complement_failure3():
    # Too many arguments
    with pytest.raises(TypeError):
        complement(bool, int)

def test_complement_failure4():
    # Too few arguments
    with pytest.raises(TypeError):
        complement()

def test_complement_pickle1(protocol):
    x = pickle.dumps(complement(iteration_utilities.is_None), protocol=protocol)
    assert pickle.loads(x)(False)
    assert pickle.loads(x)(True)
    assert not pickle.loads(x)(None)

# File: iteration_utilities-0.12.1/tests/test_constant.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pickle

import pytest

from iteration_utilities import constant

from helper_cls import T

def test_constant_repr1():
    x = constant(2)
    r = repr(x)
    assert 'constant' in r
    assert '2' in r

def test_constant_attributes1():
    x = constant(T(2))
    assert x.item == T(2)

def test_constant_normal1():
    one = constant(T(1))
    assert one() == T(1)

def test_constant_normal2():
    one = constant(T(1))
    assert one(10, a=2) == T(1)

def test_constant_failure1():
    # Too few arguments
    with pytest.raises(TypeError):
        constant()

def test_constant_pickle1(protocol):
    x = pickle.dumps(constant(T(10)), protocol=protocol)
    assert pickle.loads(x)() == T(10)

# File: iteration_utilities-0.12.1/tests/test_count.py
# Licensed under Apache License Version 2.0 - see LICENSE
import pytest

import iteration_utilities
from iteration_utilities import count_items

import helper_funcs as _hf
from helper_cls import T

def test_count_empty1():
    assert count_items([]) == 0

def test_count_normal1():
    assert count_items([T(0), T(0)]) == 2

def test_count_normal2():
    assert count_items([T(0), T(0), T(1)], bool) == 1

def test_count_normal3():
    # None as pred is equal to not giving any predicate
    assert count_items([T(0), T(0), T(1), T(1)], None) == 4

def test_count_normal4():
    assert count_items([], iteration_utilities.return_identity) == 0

def test_count_normal5():
    assert count_items([T(1), T(2), T(3)], lambda x: x > T(2)) == 1

def test_count_normal6():
    assert count_items([T(1), T(2), T(3)], lambda x: x < T(3)) == 2

def test_count_normal7():
    assert count_items([T(3), T(1), T(2), T(3), T(3)], T(3), True) == 3

def test_count_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        count_items(_hf.FailIter())

def test_count_failure2():
    with pytest.raises(TypeError):
        count_items([T(1)], T(1))

def test_count_failure3():
    # Regression test when accessing the next item of the iterable resulted
    # in an Exception. For example when the iterable was a filter and the
    # filter function threw an exception.
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        count_items(_hf.FailNext())

def test_count_failure4():
    # Too few arguments
    with pytest.raises(TypeError):
        count_items()

def test_count_failure5():
    # eq True but no pred
    with pytest.raises(TypeError):
        count_items([T(0)], eq=True)

def test_count_failure6():
    # eq True but pred None (like not given)
    with pytest.raises(TypeError):
        count_items([T(0)], pred=None, eq=True)

def test_count_failure7():
    # function returns item without boolean interpretation
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        count_items([T(0)], lambda x: _hf.FailBool())

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_count_failure8():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        count_items(_hf.CacheNext(1))

# File: iteration_utilities-0.12.1/tests/test_deepflatten.py
# Licensed under Apache License Version 2.0 - see LICENSE
import collections
import pickle

import pytest

from iteration_utilities import deepflatten

import helper_funcs as _hf
from helper_cls import T, toT

def test_deepflatten_empty1():
    assert list(deepflatten([])) == []

def test_deepflatten_attributes1():
    it = deepflatten([[T(1)], T(2)])
    assert it.depth == -1
    assert it.currentdepth == 0
    assert it.ignore is None
    assert it.types is None
    assert next(it) == T(1)
    assert it.currentdepth == 1

def test_deepflatten_normal1():
    assert list(deepflatten([T(1), T(2), T(3)])) == [T(1), T(2), T(3)]

def test_deepflatten_normal2():
    assert list(deepflatten([[T(1)], T(2), [[T(3)]]])) == toT([1, 2, 3])

def test_deepflatten_normal3():
    # really deeply nested thingy
    assert list(deepflatten([[[[[[[[[[[T(5), T(4), T(3), T(2), T(1), T(0)]]]]],
                                 map(T, range(3))]]],
                              (T(i) for i in range(5))]]])
                ) == toT([5, 4, 3, 2, 1, 0, 0, 1, 2, 0, 1, 2, 3, 4])

def test_deepflatten_normal4():
    # really deeply nested thingy with types
    assert list(deepflatten([[[[[[[[[[[T(5), T(4), T(3), T(2), T(1), T(0)]]]]],
                                 [T(0), T(1), T(2)]]]],
                              [T(0), T(1), T(2), T(3), T(4)]]]], types=list)
                ) == toT([5, 4, 3, 2, 1, 0, 0, 1, 2, 0, 1, 2, 3, 4])

def test_deepflatten_containing_strings1():
    # no endless recursion even if we have strings in the iterable
    assert list(deepflatten(["abc", "def"])) == ['a', 'b', 'c', 'd', 'e', 'f']

def test_deepflatten_containing_strings2():
    # no endless recursion even if we have strings in the iterable and gave
    # strings as types
    assert list(deepflatten(["abc", "def"], types=str)) == ['a', 'b', 'c', 'd', 'e', 'f']

def test_deepflatten_containing_strings3():
    # mixed with strings
    assert list(deepflatten(["abc", ("def",), "g", [[{'h'}], 'i']],
                            )) == ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i']

def test_deepflatten_depth1():
    assert list(deepflatten([T(1), T(2), T(3)], 1)) == toT([1, 2, 3])

def test_deepflatten_depth2():
    assert list(deepflatten([[T(1)], T(2), [[T(3)]]], 1)) == [T(1), T(2), [T(3)]]

def test_deepflatten_types1():
    assert list(deepflatten([[T(1)], T(2), [[T(3)]]], types=list)) == toT([1, 2, 3])

def test_deepflatten_types2():
    assert list(deepflatten([[T(1)], T(2), [[T(3)]]], types=tuple)) == [[T(1)], T(2), [[T(3)]]]

def test_deepflatten_types3():
    assert list(deepflatten([[T(1)], T(2), ([T(3)], )], types=(list, tuple))) == toT([1, 2, 3])

def test_deepflatten_ignore1():
    assert list(deepflatten([[T(1)], T(2), [[T(3), 'abc']]], ignore=str)) == [T(1), T(2), T(3), 'abc']

def test_deepflatten_ignore2():
    assert list(deepflatten([[T(1)], T(2), ([T(3), 'abc'], )], ignore=(tuple, str))
                ) == [T(1), T(2), ([T(3), 'abc'], )]

def test_deepflatten_failure1():
    with pytest.raises(TypeError):
        list(deepflatten([T(1), T(2), T(3)], None, T('a')))

def test_deepflatten_failure2():
    # recursively iterable data structures like strings that return another
    # string in their iter.
    with pytest.raises(RecursionError):
        list(deepflatten([collections.UserString('abc')]))

def test_deepflatten_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(deepflatten(_hf.FailNext()))

def test_deepflatten_failure4():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(deepflatten([[_hf.FailNext()], 2]))

def test_deepflatten_failure5():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        deepflatten(_hf.FailIter())

def test_deepflatten_failure6():
    # specified not iterable type as types
    with pytest.raises(TypeError):
        list(deepflatten([T(1), 2., T(3), T(4)], types=float))

def test_deepflatten_failure7():
    # object that raises something else than TypeError when not iterable
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        list(deepflatten([T(1), _hf.FailIter(), T(3), T(4)]))

def test_deepflatten_failure8():
    # accessing iterator after exhausting the iterable
    df = deepflatten(toT([1, 2, 3, 4]))
    assert list(df) == toT([1, 2, 3, 4])
    nothing = object()
    assert next(df, nothing) is nothing

def test_deepflatten_failure9():
    # Check that everything is working even if isinstance fails
    df = deepflatten(toT([1, 2, 3, 4]), types=_hf.FailingIsinstanceClass)
    with pytest.raises(_hf.FailingIsinstanceClass.EXC_TYP, match=_hf.FailingIsinstanceClass.EXC_MSG):
        list(df)

def test_deepflatten_failure10():
    # Check that everything is working even if isinstance fails
    df = deepflatten(toT([1, 2, 3, 4]), ignore=_hf.FailingIsinstanceClass)
    with pytest.raises(_hf.FailingIsinstanceClass.EXC_TYP, match=_hf.FailingIsinstanceClass.EXC_MSG):
        list(df)

def test_deepflatten_copy1():
    _hf.iterator_copy(deepflatten(toT([1, 2, 3, 4])))

def test_deepflatten_failure_setstate1():
    # using __setstate__ to pass in an invalid iteratorlist
    df = deepflatten(toT([1, 2, 3, 4]))
    with pytest.raises(TypeError):
        df.__setstate__(({'a', 'b', 'c'}, 0, 0))

def test_deepflatten_failure_setstate2():
    # using __setstate__ to pass in an invalid iteratorlist (not iterator
    # inside)
    df = deepflatten(toT([1, 2, 3, 4]))
    with pytest.raises(TypeError):
        df.__setstate__(([set(toT([1, 2, 3, 4]))], 0, 0))

def test_deepflatten_failure_setstate3():
    # using __setstate__ to pass in an invalid currentdepth (too low)
    df = deepflatten(toT([1, 2, 3, 4]))
    with pytest.raises(ValueError):
        df.__setstate__(([iter(toT([1, 2, 3, 4]))], -3, 0))

def test_deepflatten_failure_setstate4():
    # using __setstate__ to pass in an invalid currentdepth (too high)
    df = deepflatten(toT([1, 2, 3, 4]))
    with pytest.raises(ValueError):
        df.__setstate__(([iter(toT([1, 2, 3, 4]))], 5, 0))

def test_deepflatten_failure_setstate5():
    _hf.iterator_setstate_list_fail(deepflatten(toT([1, 2, 3, 4])))

def test_deepflatten_failure_setstate6():
    _hf.iterator_setstate_empty_fail(deepflatten(toT([1, 2, 3, 4])))

def test_deepflatten_reduce1():
    # Earlier we were able to modify the iteratorlist (including deleting
    # parts of it). That could lead to segmentation faults.
    df = deepflatten(toT([1, 2, 3, 4, 5, 6]))
    next(df)
    # Clear the iteratorlist from all items.
    df.__reduce__()[2][0][:] = []
    next(df)

def test_deepflatten_setstate1():
    # We could keep a reference to the iteratorlist passed to setstate and
    # mutate it (leading to incorrect behavior and segfaults).
    df = deepflatten(toT([1, 2, 3, 4, 5, 6]))
    next(df)
    # Easiest way is to roundtrip the state but keep the state as variable so
    # we can modify it!
    state = df.__reduce__()[2]
    df.__setstate__(state)
    state[0][:] = []
    next(df)

def test_deepflatten_pickle1(protocol):
    dpflt = deepflatten([[T(1)], [T(2)], [T(3)], [T(4)]])
    assert next(dpflt) == T(1)
    x = pickle.dumps(dpflt, protocol=protocol)
    assert list(pickle.loads(x)) == toT([2, 3, 4])

def test_deepflatten_pickle2(protocol):
    dpflt = deepflatten([['abc', T(1)], [T(2)], [T(3)], [T(4)]])
    assert next(dpflt) == 'a'
    x = pickle.dumps(dpflt, protocol=protocol)
    assert list(pickle.loads(x)) == ['b', 'c'] + toT([1, 2, 3, 4])


# File: iteration_utilities-0.12.1/tests/test_dotproduct.py
# Licensed under Apache License Version 2.0 - see LICENSE

import pytest

from iteration_utilities import dotproduct

import helper_funcs as _hf
from helper_cls import T

def test_dotproduct_empty1():
    assert dotproduct([], []) == 0

def test_dotproduct_normal1():
    assert dotproduct([T(1), T(2), T(3)], [T(1), T(2), T(3)]) == T(14)

def test_dotproduct_normal2():
    assert dotproduct([T(100), T(200), T(300)], [T(100), T(200), T(300)]) == T(140000)

def test_dotproduct_normal3():
    # generators
    assert dotproduct((i for i in [T(1), T(2), T(3)]),
                      (i for i in [T(1), T(2), T(3)])) == T(14)

def test_dotproduct_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        dotproduct(_hf.FailIter(), [T(1)])

def test_dotproduct_failure2():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        dotproduct([T(1)], _hf.FailIter())

def test_dotproduct_failure3():
    # multiplication fails
    with pytest.raises(TypeError):
        dotproduct([T(1)], [1])

def test_dotproduct_failure4():
    # multiplication fails (later)
    with pytest.raises(TypeError):
        dotproduct([T(1), T(1)], [T(1), 1])

def test_dotproduct_failure5():
    # addition fails
    with pytest.raises(TypeError):
        dotproduct([T(1), 1], [T(1), 1])

def test_dotproduct_failure6():
    # addition fails (inverted)
    with pytest.raises(TypeError):
        dotproduct([1, T(1), 1], [1, T(1), 1])

def test_dotproduct_failure7():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        dotproduct(_hf.FailNext(), _hf.FailNext())

def test_dotproduct_failure8():
    # Too few arguments
    with pytest.raises(TypeError):
        dotproduct()

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_dotproduct_failure9():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        dotproduct(_hf.CacheNext(1), _hf.CacheNext(1))


# File: iteration_utilities-0.12.1/tests/test_duplicates.py
# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle

import pytest

import iteration_utilities
from iteration_utilities import duplicates

import helper_funcs as _hf
from helper_cls import T, toT

def test_duplicates_empty1():
    assert list(duplicates([])) == []

def test_duplicates_normal1():
    assert list(duplicates([T(1), T(2), T(1)])) == [T(1)]

def test_duplicates_key1():
    assert list(duplicates([T(1), T(2), T(1)], abs)) == [T(1)]

def test_duplicates_key2():
    assert list(duplicates([T(1), T(1), T(-1)], abs)) == toT([1, -1])

def test_duplicates_unhashable1():
    assert list(duplicates([{T(1): T(1)}, {T(2): T(2)}, {T(1): T(1)}])) == [{T(1): T(1)}]

def test_duplicates_unhashable2():
    assert list(duplicates([[T(1)], [T(2)], [T(1)]])) == [[T(1)]]

def test_duplicates_unhashable3():
    assert list(duplicates([[T(1), T(1)], [T(1), T(2)], [T(1), T(3)]],
                           operator.itemgetter(0))) == [[T(1), T(2)], [T(1), T(3)]]

def test_duplicates_getter1():
    t = duplicates([T(1), T([0, 0]), T(3), T(1)])
    assert not t.seen
    assert t.key is None
    assert next(t) == T(1)
    assert T(1) in t.seen
    assert T(3) in t.seen
    assert T([0, 0]) in t.seen
    assert t.key is None

def test_duplicates_getter2():
    t = duplicates([T(1), T([0, 0]), T(3), T(1)],
                   key=iteration_utilities.return_identity)
    assert t.key is iteration_utilities.return_identity

def test_duplicates_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        duplicates(_hf.FailIter())

def test_duplicates_failure2():
    with pytest.raises(TypeError):
        list(duplicates([T(1), T(2), T(3), T('a')], abs))

def test_duplicates_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(duplicates(_hf.FailNext()))

def test_duplicates_failure4():
    # Too few arguments
    with pytest.raises(TypeError):
        duplicates()

def test_duplicates_failure5():
    # Failure when comparing the object to the objects in the list
    with pytest.raises(_hf.FailEqNoHash.EXC_TYP, match=_hf.FailEqNoHash.EXC_MSG):
        list(duplicates([[T(1)], _hf.FailEqNoHash()]))

def test_duplicates_failure6():
    # Failure (no TypeError) when trying to hash the value
    with pytest.raises(_hf.FailHash.EXC_TYP, match=_hf.FailHash.EXC_MSG):
        list(duplicates([T(1), _hf.FailHash()]))

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_duplicates_failure7():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        list(duplicates(_hf.CacheNext(1)))

def test_duplicates_failure_setstate1():
    # __setstate__ only accepts Seen instances
    dp = duplicates(toT([1, 1]))
    with pytest.raises(TypeError):
        dp.__setstate__((set(toT(range(1, 3))),))

def test_duplicates_failure_setstate2():
    _hf.iterator_setstate_list_fail(duplicates(toT([1, 1])))

def test_duplicates_failure_setstate3():
    _hf.iterator_setstate_empty_fail(duplicates(toT([1, 1])))

def test_duplicates_copy1():
    _hf.iterator_copy(duplicates(toT([1, 1])))

def test_duplicates_pickle1(protocol):
    dpl = duplicates([T(1), T(2), T(1), T(2)])
    assert next(dpl) == T(1)
    x = pickle.dumps(dpl, protocol=protocol)
    assert list(pickle.loads(x)) == [T(2)]
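# The duplicates tests above exercise three behaviors: hashable items tracked
# in a set, unhashable items (lists, dicts) still detected via a fallback, and
# an optional `key` function applied before comparison. The following is a
# minimal pure-Python sketch of those semantics for readers without the C
# extension; the name `duplicates_sketch` is hypothetical and not part of the
# library.

```python
def duplicates_sketch(iterable, key=None):
    """Yield each item whose (keyed) value appeared earlier in the iterable.

    Pure-Python approximation: hashable values go into a set; unhashable
    values fall back to a linear scan over a list, mirroring the tests for
    unhashable inputs above.
    """
    seen_hashable = set()
    seen_unhashable = []
    for item in iterable:
        value = item if key is None else key(item)
        try:
            if value in seen_hashable:
                yield item
            else:
                seen_hashable.add(value)
        except TypeError:
            # unhashable value: linear membership test instead of hashing
            if value in seen_unhashable:
                yield item
            else:
                seen_unhashable.append(value)
```

# For example, `list(duplicates_sketch([1, 2, 1]))` yields `[1]`, and with
# `key=abs` the input `[1, 1, -1]` yields `[1, -1]`, matching
# test_duplicates_key2 above.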
# File: iteration_utilities-0.12.1/tests/test_empty.py
# Licensed under Apache License Version 2.0 - see LICENSE

import operator

import pytest

from iteration_utilities import empty

import helper_funcs as _hf

EmptyType = type(empty)

def test_empty():
    assert list(empty) == []
    assert tuple(empty) == ()
    assert set(empty) == set()

def test_empty_type_construct():
    with pytest.raises(TypeError, match=r"_EmptyType\.__new__` takes no arguments"):
        EmptyType(1)
    with pytest.raises(TypeError, match=r"_EmptyType\.__new__` takes no arguments"):
        EmptyType(a=1)

def test_empty_only_one_instance():
    e1 = EmptyType()
    e2 = EmptyType()
    assert e1 is e2

def test_empty_length_hint():
    assert operator.length_hint(empty, -1) == 0

def test_empty_pickle(protocol):
    e = EmptyType()
    assert _hf.round_trip_pickle(e, protocol=protocol) is e
    assert list(_hf.round_trip_pickle(e, protocol=protocol)) == []

def test_empty_subiter_pickle(protocol):
    e = enumerate(EmptyType())
    assert list(_hf.round_trip_pickle(e, protocol=protocol)) == []


# File: iteration_utilities-0.12.1/tests/test_flip.py
# Licensed under Apache License Version 2.0 - see LICENSE

import pickle

import pytest

from iteration_utilities import flip

from helper_cls import T

class FlipSubclass(flip):
    pass

def _one_arg(a):
    return a,

def _two_args(a, b):
    return a, b

def _three_args(a, b, c):
    return a, b, c

def _six_args(a, b, c, d, e, f):
    return a, b, c, d, e, f

def test_flip_repr1():
    x = flip(int)
    r = repr(x)
    assert 'flip' in r
    assert 'int' in r

def test_flip_attributes1():
    x = flip(int)
    assert x.func is int

def test_flip_double_flip1():
    x = flip(int)
    y = flip(x)
    # Simply returned the original function instead of flipping the
    # arguments twice.
    assert y is int

def test_flip_double_flip2():
    # A subclass should prevent the behavior that it simply returns the
    # original function when flipped.
    assert FlipSubclass(flip(int)) is not int
    assert flip(FlipSubclass(int)) is not int

def test_flip_normal1():
    assert not flip(isinstance)(float, 10)

def test_flip_normal2():
    assert flip(isinstance)(int, 10)

def test_flip_args0():
    def func():
        return ()
    assert flip(func)() == ()

def test_flip_args1():
    assert flip(_one_arg)(T(10)) == (T(10),)

def test_flip_args1_only_kwargs():
    assert flip(_one_arg)(a=T(10)) == (T(10),)

def test_flip_args2():
    assert flip(_two_args)(T(1), T(2)) == (T(2), T(1))

def test_flip_args2_with_kwargs():
    assert flip(_two_args)(T(1), b=T(2)) == (T(1), T(2))

def test_flip_args2_only_kwargs():
    assert flip(_two_args)(a=T(1), b=T(2)) == (T(1), T(2))

def test_flip_args3():
    assert flip(_three_args)(T(1), T(2), T(3)) == (T(3), T(2), T(1))

def test_flip_args3_with_kwargs():
    assert flip(_three_args)(T(1), T(2), c=T(3)) == (T(2), T(1), T(3))

def test_flip_args3_with_2_kwargs():
    assert flip(_three_args)(T(1), b=T(2), c=T(3)) == (T(1), T(2), T(3))

def test_flip_args3_only_kwargs():
    assert flip(_three_args)(a=T(1), b=T(2), c=T(3)) == (T(1), T(2), T(3))

def test_flip_args6():
    expected = T(6), T(5), T(4), T(3), T(2), T(1)
    assert flip(_six_args)(T(1), T(2), T(3), T(4), T(5), T(6)) == expected

def test_flip_args6_with_kwargs():
    expected = T(3), T(2), T(1), T(4), T(5), T(6)
    assert flip(_six_args)(T(1), T(2), T(3), d=T(4), e=T(5), f=T(6)) == expected

def test_flip_args6_with_5_kwargs():
    expected = T(1), T(2), T(3), T(4), T(5), T(6)
    assert flip(_six_args)(T(1), b=T(2), c=T(3), d=T(4), e=T(5), f=T(6)) == expected

def test_flip_args6_only_kwargs():
    expected = T(1), T(2), T(3), T(4), T(5), T(6)
    assert flip(_six_args)(a=T(1), b=T(2), c=T(3), d=T(4), e=T(5), f=T(6)) == expected

def test_flip_failure1():
    with pytest.raises(TypeError):
        flip(isinstance)(10, float)

def test_flip_failure2():
    # Too few arguments
    with pytest.raises(TypeError):
        flip()

def test_flip_failure3():
    # Too many arguments
    with pytest.raises(TypeError):
        flip(isinstance, bool)

def test_flip_pickle1(protocol):
    x = pickle.dumps(flip(isinstance), protocol=protocol)
    assert pickle.loads(x)(float, 10.)
    assert pickle.loads(x)(int, 10)
    assert not pickle.loads(x)(float, 10)


# File: iteration_utilities-0.12.1/tests/test_groupedby.py
# Licensed under Apache License Version 2.0 - see LICENSE

import operator

import pytest

import iteration_utilities
from iteration_utilities import groupedby

import helper_funcs as _hf
from helper_cls import T, toT

def test_groupedby_empty1():
    assert groupedby([], key=lambda x: x) == {}

def test_groupedby_normal1():
    assert groupedby([T('a'), T('ab'), T('abc')],
                     key=lambda x: x.value[0]) == {'a': toT(['a', 'ab', 'abc'])}

def test_groupedby_normal2():
    assert groupedby([T('a'), T('ba'), T('ab'), T('abc'), T('b')],
                     key=lambda x: x.value[0]
                     ) == {'a': toT(['a', 'ab', 'abc']), 'b': toT(['ba', 'b'])}

def test_groupedby_normal3():
    # generator
    assert groupedby((i for i in [T('a'), T('ab'), T('abc')]),
                     key=lambda x: x.value[0]) == {'a': toT(['a', 'ab', 'abc'])}

def test_groupedby_keep1():
    assert groupedby([T('a'), T('ba'), T('ab'), T('abc'), T('b')],
                     key=lambda x: x.value[0],
                     keep=len) == {'a': [1, 2, 3], 'b': [2, 1]}

def test_groupedby_reduce1():
    assert groupedby([(T('a'), T(1)), (T('a'), T(2)), (T('b'), T(5))],
                     key=operator.itemgetter(0),
                     keep=operator.itemgetter(1),
                     reduce=operator.add) == {T('a'): T(3), T('b'): T(5)}

def test_groupedby_reduce2():
    assert groupedby([(T('a'), T(1)), (T('a'), T(2)), (T('b'), T(5))],
                     key=operator.itemgetter(0),
                     reduce=lambda x, y: x + y[1],
                     reducestart=T(0)) == {T('a'): T(3), T('b'): T(5)}

def test_groupedby_reduce3():
    assert groupedby(map(T, range(500)),
                     key=lambda x: T(x.value % 5),
                     reduce=operator.add,
                     reducestart=T(0))

def test_groupedby_reduce4():
    # reduce=None is identical to no reduce
    assert groupedby([T(1), T(1), T(2), T(3)], lambda x: x,
                     reduce=None) == {T(1): [T(1), T(1)], T(2): [T(2)], T(3): [T(3)]}

def test_groupedby_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        groupedby(_hf.FailIter(), key=len)

def test_groupedby_failure2():
    # key func fails
    with pytest.raises(TypeError):
        groupedby([T(1), T(2), T(3)], key=lambda x: T(x.value + 'a'))

def test_groupedby_failure3():
    # keep func fails
    with pytest.raises(TypeError):
        groupedby([T(1), T(2), T(3)], key=lambda x: x,
                  keep=lambda x: T(x.value + 'a'))

@_hf.skip_on_pypy_not_investigated_why_it_segfaults
def test_groupedby_failure4():
    # unhashable
    with pytest.raises(TypeError):
        groupedby([{T('a'): T(10)}], key=lambda x: x)

def test_groupedby_failure5():
    # no reduce but reducestart
    with pytest.raises(TypeError):
        groupedby(toT(range(10)), lambda x: x, reducestart=T(0))

def test_groupedby_failure6():
    # reduce function fails with reducestart
    with pytest.raises(TypeError):
        groupedby(map(T, range(10)), lambda x: x.value % 2 == 0,
                  reduce=operator.add, reducestart=T('a'))

def test_groupedby_failure7():
    # reduce function fails
    with pytest.raises(TypeError):
        groupedby(map(T, [1, 2, 3, 4, 'a']), lambda x: True,
                  reduce=operator.add)

def test_groupedby_failure8():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        groupedby(_hf.FailNext(), bool)

def test_groupedby_failure9():
    # too few arguments
    with pytest.raises(TypeError):
        groupedby()

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_groupedby_failure10():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        groupedby(_hf.CacheNext(1), iteration_utilities.return_True)


# File: iteration_utilities-0.12.1/tests/test_grouper.py
# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle
import sys

import pytest

from iteration_utilities import grouper

import helper_funcs as _hf
from helper_cls import T, toT

def test_grouper_empty1():
    # Empty iterable
    assert list(grouper([], 2)) == []

def test_grouper_normal1():
    # no fillvalue + truncate
    assert list(grouper([T(1)], 3)) == [(T(1), )]

def test_grouper_normal2():
    assert list(grouper([T(1), T(2)], 3)) == [(T(1), T(2))]

def test_grouper_normal3():
    assert list(grouper([T(1), T(2), T(3)], 3)) == [(T(1), T(2), T(3))]

def test_grouper_normal4():
    assert list(grouper([T(1), T(2), T(3), T(4)], 3)) == [(T(1), T(2), T(3)), (T(4), )]

def test_grouper_normal5():
    assert list(grouper(toT([1, 2, 3, 4, 5]), 3)) == [(T(1), T(2), T(3)), (T(4), T(5))]

def test_grouper_normal6():
    assert list(grouper(toT([1, 2, 3, 4, 5, 6]), 3)) == [(T(1), T(2), T(3)), (T(4), T(5), T(6))]

def test_grouper_normal7():
    # generator
    assert list(grouper((i for i in toT([1, 2, 3, 4, 5, 6])), 3)
                ) == [(T(1), T(2), T(3)), (T(4), T(5), T(6))]

def test_grouper_fill1():
    # with fillvalue
    assert list(grouper(toT([1]), 3, fillvalue=T(0))) == [(T(1), T(0), T(0))]

def test_grouper_fill2():
    assert list(grouper(toT([1, 2]), 3, fillvalue=T(0))) == [(T(1), T(2), T(0))]

def test_grouper_fill3():
    assert list(grouper(toT([1, 2, 3]), 3, fillvalue=T(0))) == [(T(1), T(2), T(3))]

def test_grouper_fill4():
    assert list(grouper(toT([1, 2, 3, 4]), 3, fillvalue=T(0))) == [(T(1), T(2), T(3)), (T(4), T(0), T(0))]

def test_grouper_fill5():
    assert list(grouper(toT([1, 2, 3, 4, 5]), 3, fillvalue=T(0))) == [(T(1), T(2), T(3)), (T(4), T(5), T(0))]

def test_grouper_fill6():
    assert list(grouper(toT([1, 2, 3, 4, 5, 6]), 3, fillvalue=T(0))) == [(T(1), T(2), T(3)), (T(4), T(5), T(6))]

def test_grouper_truncate1():
    # with truncate
    assert list(grouper(toT([1]), 3, truncate=True)) == []

def test_grouper_truncate2():
    assert list(grouper(toT([1, 2]), 3, truncate=True)) == []

def test_grouper_truncate3():
    assert list(grouper(toT([1, 2, 3]), 3, truncate=True)) == [(T(1), T(2), T(3))]

def test_grouper_truncate4():
    assert list(grouper(toT([1, 2, 3, 4]), 3, truncate=True)) == [(T(1), T(2), T(3))]

def test_grouper_truncate5():
    assert list(grouper(toT([1, 2, 3, 4, 5]), 3, truncate=True)) == [(T(1), T(2), T(3))]

def test_grouper_truncate6():
    assert list(grouper(toT([1, 2, 3, 4, 5, 6]), 3, truncate=True)) == [(T(1), T(2), T(3)), (T(4), T(5), T(6))]

def test_grouper_attributes1():
    it = grouper(toT(range(10)), 2)
    assert it.times == 2
    assert not it.truncate
    with pytest.raises(AttributeError):
        it.fillvalue

def test_grouper_attributes2():
    it = grouper(toT(range(10)), 2, fillvalue=None)
    assert it.times == 2
    assert not it.truncate
    assert it.fillvalue is None

def test_grouper_failure1():
    # fillvalue + truncate is forbidden
    with pytest.raises(TypeError):
        grouper(toT([1, 2, 3]), 2, fillvalue=T(0), truncate=True)

def test_grouper_failure2():
    # n must be > 0
    with pytest.raises(ValueError):
        grouper(toT([1, 2, 3]), 0)

def test_grouper_failure3():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        grouper(_hf.FailIter(), 2)

def test_grouper_failure4():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(grouper(_hf.FailNext(), 2))

def test_grouper_failure5():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(grouper(_hf.FailNext(offset=1), 2))

def test_grouper_failure6():
    # Too few arguments
    with pytest.raises(TypeError):
        grouper()
    with pytest.raises(TypeError):
        grouper(toT([1, 2, 3, 4]))

@_hf.skip_on_pypy_because_cache_next_works_differently
def test_grouper_failure7():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        # I use "next" here otherwise it would be interpreted as last group...
        # because the original "next" indicates the end of the iterator.
        next(grouper(_hf.CacheNext(1), 2))

def test_grouper_copy1():
    _hf.iterator_copy(grouper(toT(range(10)), 3))

def test_grouper_failure_setstate1():
    _hf.iterator_setstate_list_fail(grouper(toT(range(10)), 3))

def test_grouper_failure_setstate2():
    _hf.iterator_setstate_empty_fail(grouper(toT(range(10)), 3))

def test_grouper_pickle1(protocol):
    grp = grouper(toT(range(10)), 3)
    assert next(grp) == (T(0), T(1), T(2))
    x = pickle.dumps(grp, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(3), T(4), T(5)), (T(6), T(7), T(8)), (T(9), )]

def test_grouper_pickle2(protocol):
    grp = grouper(toT(range(10)), 3, fillvalue=T(0))
    assert next(grp) == (T(0), T(1), T(2))
    x = pickle.dumps(grp, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(3), T(4), T(5)), (T(6), T(7), T(8)), (T(9), T(0), T(0))]

def test_grouper_pickle3(protocol):
    grp = grouper(toT(range(10)), 3, truncate=True)
    assert next(grp) == (T(0), T(1), T(2))
    x = pickle.dumps(grp, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(3), T(4), T(5)), (T(6), T(7), T(8))]

@pytest.mark.parametrize(
    'length, it',
    [
        (2, grouper([1, 2, 3, 4, 5], 2, truncate=True)),
        (3, grouper([1, 2, 3, 4, 5, 6], 2, truncate=True)),
        (3, grouper([1, 2, 3, 4, 5], 2)),
        (3, grouper([1, 2, 3, 4, 5, 6], 2)),
        (3, grouper([1, 2, 3, 4, 5], 2, fillvalue=None)),
        (3, grouper([1, 2, 3, 4, 5, 6], 2, fillvalue=None))
    ]
)
def test_grouper_lengthhint1(length, it):
    _hf.check_lengthhint_iteration(it, length)

def test_grouper_lengthhint_failure1():
    f_it = _hf.FailLengthHint(toT([1, 2, 3]))
    it = grouper(f_it, 2)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP,
                       match=_hf.FailLengthHint.EXC_MSG):
        operator.length_hint(it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP,
                       match=_hf.FailLengthHint.EXC_MSG):
        list(it)

def test_grouper_lengthhint_failure2():
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1)
    it = grouper(of_it, 2)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)
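# The grouper tests above pin down three rules: groups of size n with a
# possibly short last group, an optional fillvalue that pads the last group,
# a truncate flag that drops it, and a TypeError when fillvalue and truncate
# are combined. A minimal pure-Python sketch of those rules (the name
# `grouper_sketch` and the sentinel are hypothetical, not part of the library):

```python
import itertools

_NO_FILL = object()  # sentinel so fillvalue=None remains a legal fill value

def grouper_sketch(iterable, n, fillvalue=_NO_FILL, truncate=False):
    """Collect items into tuples of length n (pure-Python approximation)."""
    if fillvalue is not _NO_FILL and truncate:
        # mirrors test_grouper_failure1: both options together are forbidden
        raise TypeError("cannot specify both `fillvalue` and `truncate`")
    if n <= 0:
        # mirrors test_grouper_failure2: n must be > 0
        raise ValueError("`n` must be greater than 0")
    it = iter(iterable)
    while True:
        group = tuple(itertools.islice(it, n))
        if not group:
            return
        if len(group) < n:
            if truncate:
                return          # drop the short last group
            if fillvalue is not _NO_FILL:
                group += (fillvalue,) * (n - len(group))  # pad it instead
        yield group
```

# For example, `list(grouper_sketch([1, 2, 3, 4, 5], 3))` yields
# `[(1, 2, 3), (4, 5)]`, while `fillvalue=0` pads the tail to `(4, 5, 0)`
# and `truncate=True` drops it, matching the fill/truncate tests above.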
# File: iteration_utilities-0.12.1/tests/test_intersperse.py
# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle
import sys

import pytest

from iteration_utilities import intersperse

import helper_funcs as _hf
from helper_cls import T, toT

def test_intersperse_empty1():
    assert list(intersperse([], T(0))) == []

def test_intersperse_empty2():
    assert list(intersperse([T(1)], T(0))) == [T(1)]

def test_intersperse_normal1():
    assert list(intersperse([T(1), T(2)], T(0))) == toT([1, 0, 2])

def test_intersperse_attributes1():
    it = intersperse([T(1), T(2)], T(0))
    assert it.fillvalue == T(0)

def test_intersperse_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        intersperse(_hf.FailIter(), T(0))

def test_intersperse_failure2():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(intersperse(_hf.FailNext(), T(0)))

def test_intersperse_failure3():
    # Too few arguments
    with pytest.raises(TypeError):
        intersperse()

def test_intersperse_copy1():
    _hf.iterator_copy(intersperse(toT([1, 2, 3]), T(0)))

def test_intersperse_failure_setstate1():
    # When start==0 then no second item should be given to setstate
    its = intersperse(toT([1, 1]), None)
    with pytest.raises(ValueError):
        its.__setstate__((0, T(1)))

def test_intersperse_failure_setstate2():
    _hf.iterator_setstate_list_fail(intersperse(toT([1, 1]), None))

def test_intersperse_failure_setstate3():
    _hf.iterator_setstate_empty_fail(intersperse(toT([1, 1]), None))

def test_intersperse_pickle1(protocol):
    its = intersperse(toT([1, 2, 3]), T(0))
    x = pickle.dumps(its, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 0, 2, 0, 3])

def test_intersperse_pickle2(protocol):
    its = intersperse(toT([1, 2, 3]), T(0))
    assert next(its) == T(1)
    x = pickle.dumps(its, protocol=protocol)
    assert list(pickle.loads(x)) == toT([0, 2, 0, 3])

def test_intersperse_pickle3(protocol):
    its = intersperse([T(1), T(2), T(3)], T(0))
    assert next(its) == T(1)
    assert next(its) == T(0)
    x = pickle.dumps(its, protocol=protocol)
    assert list(pickle.loads(x)) == toT([2, 0, 3])

def test_intersperse_pickle4(protocol):
    its = intersperse([T(1), T(2), T(3)], T(0))
    assert next(its) == T(1)
    assert next(its) == T(0)
    assert next(its) == T(2)
    x = pickle.dumps(its, protocol=protocol)
    assert list(pickle.loads(x)) == toT([0, 3])

def test_intersperse_lengthhint1():
    it = intersperse([1, 2, 3], 2)
    _hf.check_lengthhint_iteration(it, 5)

def test_intersperse_lengthhint_failure1():
    f_it = _hf.FailLengthHint(toT([1, 2, 3]))
    it = intersperse(f_it, 2)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP,
                       match=_hf.FailLengthHint.EXC_MSG):
        operator.length_hint(it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP,
                       match=_hf.FailLengthHint.EXC_MSG):
        list(it)

def test_intersperse_lengthhint_failure2():
    # This is the easy way to overflow the length_hint: If the iterable itself
    # has a length_hint > sys.maxsize
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1)
    it = intersperse(of_it, 2)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)

def test_intersperse_lengthhint_failure3():
    # The length_hint method multiplies the length_hint of the iterable with
    # 2 (and adds/subtracts 1) so it's actually possible to have overflow even
    # if the length of the iterable doesn't trigger the overflow!
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize)
    it = intersperse(of_it, 2)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)


# File: iteration_utilities-0.12.1/tests/test_itemidxkey.py
# Licensed under Apache License Version 2.0 - see LICENSE

import pickle

import pytest

import iteration_utilities
from iteration_utilities import ItemIdxKey

from helper_cls import T

def test_itemidxkey_repr1():
    # Just make sure the representation does not fail.
    assert repr(ItemIdxKey(T(10), 2))
    assert repr(ItemIdxKey(T(10), 2, T(10)))

def test_itemidxkey_repr2():
    # Just make sure the representation uses the class name
    class Fun(ItemIdxKey):
        pass
    assert 'Fun' in repr(Fun(T(10), 2))
    assert 'Fun' in repr(Fun(T(10), 2, T(10)))

def test_itemidxkey_repr3():
    # Just make sure the representation does not fail.
    iik = ItemIdxKey(10, 2)
    iik.item = [iik]
    assert repr(iik) == 'iteration_utilities.ItemIdxKey(item=[...], idx=2)'

def test_itemidxkey_repr4():
    # Make sure the representation does not segfault if the representation of
    # the item deletes the "key"...
    iik = ItemIdxKey([10, 11], 2, [50, 100])

    class DeleteKey:
        def __repr__(self):
            del iik.key
            return 'DeleteKey()'

    iik.item = DeleteKey()
    iik
    # asserting the representation isn't really the point, the point of this
    # test is that the representation doesn't segfault. However making sure the
    # representation is like that seems a "good idea".
    assert repr(iik) == ('iteration_utilities.ItemIdxKey(item=DeleteKey(), '
                         'idx=2, key=[50, 100])')

def test_itemidxkey_failure1():
    # Too few arguments
    with pytest.raises(TypeError):
        ItemIdxKey()

def test_itemidxkey_failure2():
    # item may not be an ItemIdxKey
    iik = ItemIdxKey(T(10), 2)
    with pytest.raises(TypeError) as exc:
        ItemIdxKey(iik, 2)
    assert "`ItemIdxKey`" in str(exc.value) and '`item`' in str(exc.value)

def test_itemidxkey_failure3():
    # key may not be an ItemIdxKey
    iik = ItemIdxKey(T(10), 2)
    with pytest.raises(TypeError) as exc:
        ItemIdxKey(T(10), 2, iik)
    assert "`ItemIdxKey`" in str(exc.value) and '`key`' in str(exc.value)

def test_itemidxkey_failure4():
    # Cannot use <= or >=, these make no sense with ItemIdxKey but if these
    # are implemented just remove that test.
    with pytest.raises(TypeError):
        ItemIdxKey(T(10), 2) >= ItemIdxKey(T(10), 2)
    with pytest.raises(TypeError):
        ItemIdxKey(T(10), 2) <= ItemIdxKey(T(10), 2)

def test_itemidxkey_failure5():
    # Other argument in < and > must be another ItemIdxKey
    with pytest.raises(TypeError):
        ItemIdxKey(T(10), 2) < T(10)

def test_itemidxkey_failure6():
    # The item of the ItemIdxKey instances throws an Error when compared.
    with pytest.raises(TypeError, match='simulated failure'):
        ItemIdxKey(T(10), 2) < ItemIdxKey(T('a'), 2)

def test_itemidxkey_getter():
    iik = ItemIdxKey(T(10), 2)
    assert iik.item == T(10)
    assert iik.idx == 2
    with pytest.raises(AttributeError):
        iik.key

    iik = ItemIdxKey(T(10), 2, T(5))
    assert iik.item == T(10)
    assert iik.idx == 2
    assert iik.key == T(5)

def test_itemidxkey_setter():
    iik = ItemIdxKey(T(10), 2)
    iik.item = T(20)
    assert iik.item == T(20)
    iik.idx = 10
    assert iik.idx == 10
    iik.key = T(0)
    assert iik.key == T(0)

    iik = ItemIdxKey(T(10), 2, T(5))
    iik.item = T(20)
    assert iik.item == T(20)
    iik.idx = 10
    assert iik.idx == 10
    iik.key = T(0)
    assert iik.key == T(0)

def test_itemidxkey_setter_failure1():
    iik = ItemIdxKey(T(10), 2)
    with pytest.raises(TypeError):
        iik.idx = 'a'

    iik = ItemIdxKey(T(10), 2, T(5))
    with pytest.raises(TypeError):
        iik.idx = 'a'

def test_itemidxkey_setter_failure2():
    # cannot manually assign ItemIdxKey instance for item or key
    iik = ItemIdxKey(T(10), 2)
    with pytest.raises(TypeError) as exc:
        iik.item = iik
    assert "`ItemIdxKey`" in str(exc.value) and '`item`' in str(exc.value)

    iik = ItemIdxKey(T(10), 2, T(5))
    with pytest.raises(TypeError) as exc:
        iik.key = iik
    assert "`ItemIdxKey`" in str(exc.value) and '`key`' in str(exc.value)

def test_itemidxkey_deleter():
    iik = ItemIdxKey(T(10), 2)
    with pytest.raises(AttributeError):
        del iik.key
    with pytest.raises(AttributeError):
        iik.key

    iik = ItemIdxKey(T(10), 2, T(5))
    assert iik.key == T(5)
    del iik.key
    with pytest.raises(AttributeError):
        iik.key

def test_itemidxkey_deleter_failure():
    iik = ItemIdxKey(T(10), 2)
    with pytest.raises(TypeError):
        del iik.item
    with pytest.raises(TypeError):
        del iik.idx

    iik = ItemIdxKey(T(10), 2, T(5))
    with pytest.raises(TypeError):
        del iik.item
    with pytest.raises(TypeError):
        del iik.idx

def test_itemidxkey_pickle1(protocol):
    iik = ItemIdxKey(T(10), 2)
    x = pickle.dumps(iik, protocol=protocol)
    assert pickle.loads(x).item == T(10)
    assert pickle.loads(x).idx == 2

def test_itemidxkey_pickle2(protocol):
    iik = ItemIdxKey(T(10), 2, T(5))
    x = pickle.dumps(iik, protocol=protocol)
    assert pickle.loads(x).item == T(10)
    assert pickle.loads(x).idx == 2
    assert pickle.loads(x).key == T(5)


# File: iteration_utilities-0.12.1/tests/test_iter_except.py
# Licensed under Apache License Version 2.0 - see LICENSE

from functools import partial
import pickle

import pytest

from iteration_utilities import iter_except

from helper_cls import T
import helper_funcs as _hf

def test_iterexcept_normal1():
    dct = {T('a'): T(10)}
    assert list(iter_except(dct.popitem, KeyError)) == [(T('a'), T(10))]

def test_iterexcept_normal2():
    # None as "first" argument is equivalent to not passing in a "first".
    dct = {T('a'): T(10)}
    assert list(iter_except(dct.popitem, KeyError, None)) == [(T('a'), T(10))]

def test_iterexcept_first():
    d = {}

    def insert():
        d[T('a')] = T(10)

    exp_out = [None, (T('a'), T(10))]
    assert list(iter_except(d.popitem, KeyError, insert)) == exp_out

def test_iterexcept_attributes1():
    it = iter_except(list.append, ValueError)
    assert it.func is list.append
    assert it.exception is ValueError
    with pytest.raises(AttributeError):
        it.first

def test_iterexcept_attributes2():
    it = iter_except(list.append, ValueError, list)
    assert it.func is list.append
    assert it.exception is ValueError
    assert it.first is list

def test_iterexcept_failure1():
    # wrong exception
    with pytest.raises(KeyError):
        list(iter_except(({T('a'): T(10)}).popitem, ValueError))

def test_iterexcept_failure2():
    # too few arguments
    with pytest.raises(TypeError):
        iter_except()

def test_iterexcept_copy1():
    dct = {T('a'): T(10)}
    _hf.iterator_copy(iter_except(dct.popitem, KeyError))

def test_iterexcept_pickle1(protocol):
    dct = {T('a'): T(10)}
    ie = iter_except(dct.popitem, KeyError)
    x = pickle.dumps(ie, protocol=protocol)
    assert list(pickle.loads(x)) == [(T('a'), T(10))]

def test_iterexcept_pickle2(protocol):
    dct = {T('a'): T(10)}
    ie = iter_except(dct.popitem, KeyError, None)
    x = pickle.dumps(ie, protocol=protocol)
    assert list(pickle.loads(x)) == [(T('a'), T(10))]

def test_iterexcept_pickle3(protocol):
    dct = {}
    first = partial(dct.setdefault, T('a'), T(10))
    ie = iter_except(dct.popitem, KeyError, first)
    x = pickle.dumps(ie, protocol=protocol)
    assert list(pickle.loads(x)) == [T(10), (T('a'), T(10))]

def test_iterexcept_pickle4(protocol):
    dct = {}
    first = partial(dct.setdefault, T('a'), T(10))
    ie = iter_except(dct.popitem, KeyError, first)
    assert next(ie) == T(10)
    x = pickle.dumps(ie, protocol=protocol)
    assert list(pickle.loads(x)) == [(T('a'), T(10))]


# File: iteration_utilities-0.12.1/tests/test_merge.py
# Licensed under Apache License Version 2.0 - see LICENSE

import itertools
import operator
import pickle
import sys

import pytest

import iteration_utilities
from iteration_utilities import merge

import helper_funcs as _hf
from helper_cls import T, toT

def test_merge_empty1():
    assert list(merge()) == []

def test_merge_empty2():
    assert list(merge([])) == []

def test_merge_empty3():
    assert list(merge([], (), {})) == []

def test_merge_empty4():
    # generator, one ends immediately the other only after two items
    assert list(merge((i for i in []),
                      (i for i in (T(1), T(2))), {})) == [T(1), T(2)]

def test_merge_empty5():
    # generator
    assert list(merge((), {}, (i for i in []))) == []

def test_merge_normal1():
    for seq in itertools.permutations([[T(1)], [T(2)], [T(3)]]):
        assert list(merge(*seq)) == toT([1, 2, 3])

def test_merge_normal2():
    for seq in itertools.permutations([[T(1)], [T(2)], [T(3)], []]):
        assert list(merge(*seq)) == toT([1, 2, 3])

def test_merge_normal3():
    for seq in itertools.permutations([[T(1), T(3)], [T(2)], [T(4)]]):
        assert list(merge(*seq)) == toT([1, 2, 3, 4])

def test_merge_normal4():
    for seq in itertools.permutations([[T(1), T(3)], [T(0), T(2)], [T(4)]]):
        assert
list(merge(*seq)) == toT([0, 1, 2, 3, 4]) def test_merge_normal5(): perms = itertools.permutations([toT(range(5)), toT(range(3)), toT(range(4, 7))]) for seq in perms: assert list(merge(*seq)) == toT([0, 0, 1, 1, 2, 2, 3, 4, 4, 5, 6]) def test_merge_normal6(): # key=None is identical to no key assert list(merge([T(1)], [T(2)], key=None)) == [T(1), T(2)] def test_merge_stable1(): # Stability tests (no use of T on purpose!) it = merge([1], [1.]) item1 = next(it) assert isinstance(item1, int) item2 = next(it) assert isinstance(item2, float) def test_merge_key1(): # Key function tests seq = ([(T(1), T(0)), (T(2), T(0))], [(T(1), T(-1)), (T(2), T(-1))]) assert (list(merge(*seq, key=operator.itemgetter(0))) == [(T(1), T(0)), (T(1), T(-1)), (T(2), T(0)), (T(2), T(-1))]) def test_merge_reverse1(): # Reverse test for seq in itertools.permutations([[T(1)], [T(2)], [T(3)]]): assert list(merge(*seq, reverse=True)) == toT([3, 2, 1]) def test_merge_reverse2(): for seq in itertools.permutations([[T(1)], [T(2)], [T(3)], []]): assert list(merge(*seq, reverse=True)) == toT([3, 2, 1]) def test_merge_reverse3(): for seq in itertools.permutations([[T(3), T(1)], [T(2)], [T(4)]]): assert list(merge(*seq, reverse=True)) == toT([4, 3, 2, 1]) def test_merge_reverse4(): for seq in itertools.permutations([[T(3), T(1)], [T(2), T(0)], [T(4)]]): assert list(merge(*seq, reverse=True)) == toT([4, 3, 2, 1, 0]) def test_merge_keyreverse1(): # Key+reverse function tests seq = ([(T(2), T(0)), (T(1), T(0))], [(T(2), T(-1)), (T(1), T(-1))]) assert (list(merge(*seq, reverse=True, key=operator.itemgetter(0))) == [(T(2), T(0)), (T(2), T(-1)), (T(1), T(0)), (T(1), T(-1))]) def test_merge_attributes1(): # Key+reverse function tests it = merge(toT(range(5)), toT(range(5))) assert not it.reverse assert not it.key def test_merge_failure1(): with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG): merge(_hf.FailIter()) def test_merge_failure2(): with pytest.raises(_hf.FailIter.EXC_TYP, 
match=_hf.FailIter.EXC_MSG): merge([T(10), T(20)], _hf.FailIter()) def test_merge_failure3(): # Unexpected keyword argument with pytest.raises(TypeError): merge([T(10), T(20)], [T(20)], reverse=True, key=abs, wrongkwd=True) def test_merge_failure4(): # Unexpected keyword argument with pytest.raises(TypeError): merge([T(10), T(20)], [T(20), T(30)], reverse=True, wrongkwd=True) def test_merge_failure5(): # Unexpected keyword argument with pytest.raises(TypeError): merge([T(10), T(20)], [T(20), T(30)], key=abs, wrongkwd=True) def test_merge_failure6(): # Unexpected keyword argument with pytest.raises(TypeError): merge([T(10), T(20)], [T(20), T(30)], wrongkwd=True) def test_merge_failure7(): # Key function fails with pytest.raises(TypeError): list(merge([T(2), (T(2), T(0))], [(T(1), T(2)), (T(1), T(3))], key=operator.itemgetter(0))) def test_merge_failure8(): # Key function fails with pytest.raises(TypeError): list(merge([(T(2), T(0)), T(2)], [(T(1), T(2)), (T(1), T(3))], key=operator.itemgetter(0))) def test_merge_failure9(): # comparison fails with pytest.raises(TypeError): list(merge([T('a'), T('b')], [T(2), T(3)])) def test_merge_failure10(): # comparison fails with pytest.raises(TypeError): list(merge([T(1), T('b')], [T(2), T(3)])) def test_merge_failure11(): # Test that a failing iterator doesn't raise a SystemError with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG): next(merge(_hf.FailNext())) def test_merge_failure12(): # Test that a failing iterator doesn't raise a SystemError with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG): next(merge([T(1), T(1)], _hf.FailNext())) def test_merge_failure13(): # Test that a failing iterator doesn't raise a SystemError mge = merge(_hf.FailNext(offset=2, repeats=10)) assert next(mge) == T(1) with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG): next(mge) def test_merge_failure_setstate1(): # __setstate__ with numactive < 0 fails mg = merge(toT(range(5)), toT(range(3, 10, 
2))) with pytest.raises(ValueError): mg.__setstate__((None, 0, None, -1)) def test_merge_failure_setstate2(): # __setstate__ with numactive > len(iterators) fails mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(ValueError): mg.__setstate__((None, 0, None, 3)) def test_merge_failure_setstate3(): # __setstate__ with type(current) != tuple mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(TypeError): mg.__setstate__((None, 0, [], 2)) def test_merge_failure_setstate4(): # __setstate__ with len(current) != len(iteratortuple) from iteration_utilities import ItemIdxKey as IIK mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(ValueError): mg.__setstate__((None, 0, (IIK(-2, 0), ), 2)) def test_merge_failure_setstate5(): # __setstate__ with current containing non-itemidxkey instances mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(TypeError): mg.__setstate__((None, 0, (1, 2), 2)) def test_merge_failure_setstate6(): # __setstate__ with current containing itemidxkey with key even though # no key function is given from iteration_utilities import ItemIdxKey as IIK mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(TypeError): mg.__setstate__((None, 0, (IIK(-2, 0), IIK(-1, 1, 2)), 2)) def test_merge_failure_setstate7(): # __setstate__ with current containing itemidxkey without key even though # a key function is given from iteration_utilities import ItemIdxKey as IIK mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(TypeError): mg.__setstate__((lambda x: x, 0, (IIK(-2, 0), IIK(-1, 1, 2)), 2)) def test_merge_failure_setstate8(): # __setstate__ with current containing itemidxkey with index that is out # of bounds from iteration_utilities import ItemIdxKey as IIK mg = merge(toT(range(5)), toT(range(3, 10, 2))) with pytest.raises(ValueError): mg.__setstate__((None, 0, (IIK(-2, 0), IIK(-1, 20)), 2)) def test_merge_failure_setstate9(): 
_hf.iterator_setstate_list_fail(merge(toT(range(5)), toT(range(3, 10, 2)))) def test_merge_failure_setstate10(): _hf.iterator_setstate_empty_fail( merge(toT(range(5)), toT(range(3, 10, 2)))) def test_merge_copy1(): _hf.iterator_copy(merge([T(0)], [T(1), T(2)], [T(2)])) def test_merge_reduce1(): # We shouldn't be able to alter the ItemIdxKey instances in the "current" # tuple that is returned from reduce. We could remove or add the key # attribute which would break the comparisons df = merge([T(1), T(2), T(3)], [T(1), T(2), T(3)]) next(df) # add a key even though we have no key function df.__reduce__()[2][2][0].key = 10 list(df) def test_merge_setstate1(): # We shouldn't be able to alter the ItemIdxKey instances in the "current" # tuple that is used to setstate. We could remove or add the key # attribute which would break the comparisons df = merge([T(1), T(2), T(3)], [T(1), T(2), T(3)]) next(df) # we roundtrip the state but keep a reference so we can later add a key # even though we have no key function state = df.__reduce__()[2] df.__setstate__(state) state[2][0].key = 10 list(df) def test_merge_pickle1(protocol): # normal mge = merge([T(0)], [T(1), T(2)], [T(2)]) assert next(mge) == T(0) x = pickle.dumps(mge, protocol=protocol) assert list(pickle.loads(x)) == toT([1, 2, 2]) def test_merge_pickle2(protocol): # with key mge = merge([T(1), T(2)], [T(0)], [T(-2)], key=abs) assert next(mge) == T(0) x = pickle.dumps(mge, protocol=protocol) assert list(pickle.loads(x)) == toT([1, 2, -2]) def test_merge_pickle3(protocol): # reverse mge = merge([T(2), T(1)], [T(0)], [T(3)], reverse=True) assert next(mge) == T(3) x = pickle.dumps(mge, protocol=protocol) assert list(pickle.loads(x)) == toT([2, 1, 0]) def test_merge_pickle4(protocol): # pickle unstarted merge instance mge = merge([T(0)], [T(1), T(2)], [T(2)]) x = pickle.dumps(mge, protocol=protocol) assert list(pickle.loads(x)) == toT([0, 1, 2, 2]) def test_merge_pickle5(protocol): # pickle merge with no exhausted iterable 
mge = merge([T(0), T(1)], [T(1), T(2)]) assert next(mge) == T(0) x = pickle.dumps(mge, protocol=protocol) assert list(pickle.loads(x)) == toT([1, 1, 2]) def test_merge_lengthhint1(): it = merge([0], [1, 2, 3], [1]) _hf.check_lengthhint_iteration(it, 5) def test_merge_lengthhint_failure1(): f_it = _hf.FailLengthHint(toT([1, 2, 3])) it = merge(f_it) with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG): operator.length_hint(it) with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG): list(it) def test_merge_lengthhint_failure2(): # This is the easy way to overflow the length_hint: If the iterable itself # has a length_hint > sys.maxsize of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1) it = merge(of_it) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) def test_merge_lengthhint_failure3(): # Like the test case above but this time we take one item because # internally an unstarted "merge" and started "merge" are treated # differently of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1) it = merge(of_it) next(it) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) def test_merge_lengthhint_failure4(): # Overflow could also happen when adding length_hints that individually are # below the sys.maxsize # In this case we have 3 + sys.maxsize of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize) it = merge(toT([1, 2, 3]), of_it) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) def test_merge_lengthhint_failure5(): # Like the test above but this time with a "started" merge of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize) it = merge(toT([1, 2, 3]), of_it) next(it) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) 
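The merge tests above exercise a k-way merge of already-sorted iterables with `key`, `reverse`, and stability guarantees. These core semantics mirror the standard library's `heapq.merge`, so they can be illustrated without the C extension; this is a sketch for orientation, not the library's implementation:

```python
# Illustration only: heapq.merge from the standard library provides the same
# core semantics (k-way merge of sorted inputs, `key`/`reverse`, stability)
# that the tests above exercise for iteration_utilities.merge.
import heapq

# Ascending k-way merge of already-sorted inputs.
merged = list(heapq.merge([1, 3], [0, 2], [4]))
assert merged == [0, 1, 2, 3, 4]

# reverse=True merges descending inputs; each input must be sorted descending.
desc = list(heapq.merge([3, 1], [2], [4], reverse=True))
assert desc == [4, 3, 2, 1]

# Stability: on equal keys, items from earlier iterables come out first.
pairs = list(heapq.merge([(1, 'a')], [(1, 'b')], key=lambda t: t[0]))
assert pairs == [(1, 'a'), (1, 'b')]
```

The stability assertion is the same property `test_merge_stable1` checks by merging an `int` and a `float` that compare equal.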
# ---------------------------------------------------------------------------
# File: iteration_utilities-0.12.1/tests/test_minmax.py
# ---------------------------------------------------------------------------
# Licensed under Apache License Version 2.0 - see LICENSE

import pytest

from iteration_utilities import minmax

import helper_funcs as _hf
from helper_cls import T


def test_minmax_normal1():
    assert minmax([T(1)]) == (T(1), T(1))


def test_minmax_normal2():
    assert minmax([T(1), T(2)]) == (T(1), T(2))


def test_minmax_normal3():
    assert minmax([T(2), T(1)]) == (T(1), T(2))


def test_minmax_normal4():
    assert minmax([T(1), T(2), T(3)]) == (T(1), T(3))


def test_minmax_normal5():
    assert minmax([T(1), T(3), T(2)]) == (T(1), T(3))


def test_minmax_normal6():
    assert minmax(map(T, range(100))) == (T(0), T(99))


def test_minmax_normal7():
    assert minmax(map(T, range(101))) == (T(0), T(100))


def test_minmax_normal8():
    assert minmax({T(1), T(2), T(-3)}) == (T(-3), T(2))


def test_minmax_normal9():
    assert minmax({T(1): T(0), T(2): T(0), T(3): T(0)}) == (T(1), T(3))


def test_minmax_normal10():
    assert minmax(T(1), T(2), T(3)) == (T(1), T(3))


def test_minmax_normal11():
    assert minmax(T(4), T(3), T(2), T(1)) == (T(1), T(4))


def test_minmax_normal12():
    assert minmax((T(i) for i in [4, 3, 2, 5, 3])) == (T(2), T(5))


def test_minmax_normal13():
    assert minmax((T(i) for i in [4, 3, 2, 5, 3, 3])) == (T(2), T(5))


def test_minmax_normal14():
    assert minmax((T(i) for i in [4, 3, 2, 5, 3]), key=abs) == (T(2), T(5))


def test_minmax_normal15():
    assert minmax((T(i) for i in [4, 3, 2, 5, 3, 3]), key=abs) == (T(2), T(5))


def test_minmax_keyNone1():
    # key=None is identical to no key
    assert minmax([T(1), T(2)], key=None) == (T(1), T(2))


def test_minmax_key1():
    assert minmax(T('a'), T('b'), T('c'),
                  key=lambda x: x.value.upper()) == (T('a'), T('c'))


def test_minmax_key2():
    assert minmax(T((T(1), T(2))), T((T(2), T(3))), T((T(3), T(1))),
                  key=lambda x: x.value[1]) == (T((T(3), T(1))),
                                                T((T(2), T(3))))


def test_minmax_default1():
    assert minmax([], default=T(10)) == (T(10), T(10))


def test_minmax_stability1():
    assert minmax([T((T(1), T(5)))],
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(1), T(5))))


def test_minmax_stability2():
    assert minmax(T((T(1), T(5))), T((T(1), T(1))),
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(1), T(5))))


def test_minmax_stability3():
    assert minmax(T((T(1), T(5))), T((T(1), T(1))), T((T(1), T(2))),
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(1), T(5))))


def test_minmax_stability4():
    assert minmax(T((T(1), T(5))), T((T(1), T(1))), T((T(1), T(2))),
                  T((T(1), T(3))),
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(1), T(5))))


def test_minmax_stability5():
    assert minmax(T((T(5), T(5))), T((T(1), T(5))), T((T(1), T(2))),
                  T((T(1), T(3))),
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(5), T(5))))


def test_minmax_stability6():
    assert minmax(T((T(5), T(5))), T((T(3), T(5))), T((T(1), T(5))),
                  T((T(1), T(3))),
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(5), T(5))))


def test_minmax_stability7():
    assert minmax(T((T(5), T(5))), T((T(3), T(5))), T((T(4), T(5))),
                  T((T(1), T(5))),
                  key=lambda x: x.value[0]) == (T((T(1), T(5))),
                                                T((T(5), T(5))))


def test_minmax_failure1():
    # No args
    with pytest.raises(TypeError):
        minmax()


def test_minmax_failure2():
    # empty sequence, no default
    with pytest.raises(ValueError):
        minmax([])


def test_minmax_failure3():
    # invalid kwarg
    with pytest.raises(TypeError):
        minmax(T(1), T(2), invalid_kw='a')


def test_minmax_failure4():
    # default with multiple args
    with pytest.raises(TypeError):
        minmax(T(1), T(2), default=T(10))


def test_minmax_failure5():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        minmax(_hf.FailIter())


def test_minmax_failure6():
    # func fails on odd numbered arg
    with pytest.raises(TypeError):
        minmax(T(100), T('a'), key=lambda x: x.value + '')


def test_minmax_failure7():
    # func fails on even numbered arg
    with pytest.raises(TypeError):
        minmax(T('a'), T(100), key=lambda x: x.value + '')


def test_minmax_failure8():
    # unable to compare first and second
    with pytest.raises(TypeError):
        minmax(T(100), T('a'))


def test_minmax_failure9():
    # unable to compare third and fourth
    with pytest.raises(TypeError):
        minmax(T(100), T(20), T(100), T('a'))


def test_minmax_failure10():
    # unable to compare first and third
    with pytest.raises(TypeError):
        minmax(T(1), T(20), T('a'), T('c'))


def test_minmax_failure11():
    # unable to compare second and fourth
    # This is tricky. The elements are explicitly chosen so that
    # 1 compares with 2 without error: 1 current min, 2 current max
    # 3 compares with 4: 3 < 4
    # 3 compares with 1: 1 still current minimum
    # 4 does not compare cleanly with 2 because the first elements are equal,
    # so the comparison falls through to the second elements and throws an
    # error because str and int are not comparable.
    with pytest.raises(TypeError):
        minmax(T((100, 'a')), T((200, 10)), T((150, 'b')), T((200, 'd')))


def test_minmax_failure12():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        minmax(_hf.FailNext())


def test_minmax_failure13():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        minmax(_hf.FailNext(offset=1))


def test_minmax_failure14():
    # Test a weird class that has lt but no gt method
    class ltbutnogt:
        def __init__(self, val):
            self.val = val

        def __lt__(self, other):
            return self.val < other.val

        def __gt__(self, other):
            raise ValueError('no gt')

    with pytest.raises(ValueError, match='no gt'):
        minmax(ltbutnogt(10), ltbutnogt(5))


@_hf.skip_on_pypy_because_cache_next_works_differently
def test_minmax_failure15():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        minmax(_hf.CacheNext(1))


# ---------------------------------------------------------------------------
# File: iteration_utilities-0.12.1/tests/test_nth.py
# ---------------------------------------------------------------------------
# Licensed under Apache License Version 2.0 - see LICENSE

import pickle

import pytest

import iteration_utilities
from iteration_utilities import nth

import helper_funcs as _hf
from helper_cls import T, toT


def test_nth_repr1():
    x = nth(2)
    r = repr(x)
    assert 'nth' in r
    assert '2' in r


def test_nth_attributes1():
    assert iteration_utilities.first.n == 0
    assert iteration_utilities.second.n == 1
    assert iteration_utilities.third.n == 2
    assert iteration_utilities.last.n == -1
    assert nth(10).n == 10


def test_nth_normal1():
    assert nth(1)([T(1), T(2), T(3)]) == T(2)


def test_nth_normal2():
    assert nth(2)(map(T, range(10))) == T(2)


def test_nth_nopred_retpred1():
    assert nth(2)(toT(range(10)), retpred=True) == T(2)


def test_nth_retidx1():
    assert nth(2)(toT(range(10)), retidx=True) == 2


def test_nth_retidx2():
    assert nth(2)(toT(range(10)), pred=bool, retidx=True) == 3


def test_nth_pred1():
    # With pred
    assert nth(1)([T(0), T(1), T(2)], pred=bool) == T(2)


def test_nth_pred2():
    assert nth(1)([T(0), T(1), T(2)], pred=None) == T(2)


def test_nth_pred3():
    assert nth(0)([T(0)]*100 + [T(1)], pred=bool) == T(1)


def test_nth_pred4():
    assert nth(1)([[T(0)], [T(1), T(2)]]*2,
                  pred=lambda x: len(x) > 1) == [T(1), T(2)]


def test_nth_predtruthyretpred1():
    # pred with truthy/retpred
    assert nth(1)([T(0), T(2), T(3), T(0)], pred=bool, truthy=False) == T(0)


def test_nth_predtruthyretpred2():
    assert not nth(1)([T(0), T(1), T(2), T(3), T(0)],
                      pred=bool, truthy=False, retpred=True)


def test_nth_predtruthyretpred3():
    assert nth(1)([T(0), T(2), T(3), T(0)],
                  pred=lambda x: x**T(2), truthy=False) == T(0)


def test_nth_predtruthyretpred4():
    assert nth(1)(toT([0, 1, 2, 3, 0]),
                  pred=lambda x: x**T(2), truthy=False, retpred=True) == T(0)


def test_nth_predtruthyretpred5():
    assert nth(2)([T(0), T(1), T(2), T(3)], pred=bool) == T(3)


def test_nth_predtruthyretpred6():
    assert nth(2)([T(0), T(1), T(2), T(3)], pred=bool, retpred=True)


def test_nth_predtruthyretpred7():
    assert nth(2)([T(0), T(1), T(2), T(3)], pred=lambda x: x**T(2)) == T(3)


def test_nth_predtruthyretpred8():
    assert nth(2)([T(0), T(2), T(3), T(4)],
                  pred=lambda x: x**T(2), retpred=True) == T(16)


def test_nth_default1():
    # With default
    assert nth(2)([], default=None) is None


def test_nth_default2():
    assert nth(1)([T(0), T(0), T(0)], default=None, pred=bool) is None


def test_nth_default3():
    # generator
    assert nth(1)((i for i in [T(0), T(0), T(0)]),
                  default=None, pred=bool) is None


def test_nth_regressiontest():
    # This segfaulted in earlier versions because the "val" intermediate
    # variable was decref'd for each item in the iterable.
    lst = [1] + [0]*10000 + [2]*20
    assert nth(1)(lst, pred=bool, retpred=True)


def test_nth_failures1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        nth(10)(_hf.FailIter())


def test_nth_failures2():
    with pytest.raises(IndexError):
        nth(10)([])


def test_nth_failures3():
    with pytest.raises(IndexError):
        nth(1)([T(0)], pred=bool)


def test_nth_failures4():
    with pytest.raises(TypeError):
        nth(1)([T('a'), T('b')], pred=lambda x: abs(x.value))


def test_nth_failures5():
    # item not an integer
    with pytest.raises(TypeError):
        nth('a')


def test_nth_failures6():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        nth(1)(_hf.FailNext())


def test_nth_failures7():
    # too few arguments for __call__
    with pytest.raises(TypeError):
        nth(1)()


def test_nth_failures8():
    # too few arguments for __call__
    with pytest.raises(TypeError):
        nth(1)()


def test_nth_failures9():
    # retpred and retidx are mutually exclusive
    with pytest.raises(ValueError, match='`retpred` or `retidx`'):
        nth(1)([T(0), T(1), T(2)], retpred=True, retidx=True)


def test_nth_failures10():
    # indexerror with generator
    with pytest.raises(IndexError):
        nth(1)((i for i in [T(0)]), pred=bool)


def test_nth_failures11():
    # evaluating as boolean fails
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        nth(1)([T(0)], pred=lambda x: _hf.FailBool())


def test_nth_failures12():
    # evaluating as boolean fails
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        nth(1)([T(0)], pred=lambda x: _hf.FailBool(), retpred=True)


def test_nth_failures13():
    # evaluating as boolean fails
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        nth(1)([T(0)], pred=lambda x: _hf.FailBool(), retidx=True)


def test_nth_failures14():
    # evaluating as boolean fails
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        nth(1)([T(0)], pred=lambda x: _hf.FailBool(), truthy=False)


@_hf.skip_on_pypy_because_cache_next_works_differently
def test_nth_failure15():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        nth(2)(_hf.CacheNext(1))


def test_nth_pickle1(protocol):
    x = pickle.dumps(nth(2), protocol=protocol)
    assert pickle.loads(x)([T(1), T(2), T(3), T(4)]) == T(3)


# ---------------------------------------------------------------------------
# File: iteration_utilities-0.12.1/tests/test_one.py
# ---------------------------------------------------------------------------
# Licensed under Apache License Version 2.0 - see LICENSE

import pytest

from iteration_utilities import one

import helper_funcs as _hf
from helper_cls import T


def test_one_normal1():
    assert one([T(0)]) == T(0)


def test_one_normal2():
    assert one('a') == 'a'


def test_one_normal3():
    assert one({T('o'): T(10)}) == T('o')


def test_one_normal4():
    # generator with one item
    assert one(i for i in [T(0)]) == T(0)


def test_one_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        one(_hf.FailIter())


def test_one_failure2():
    # empty iterable
    with pytest.raises(ValueError):
        one([])


def test_one_failure3():
    # more than 1 element
    with pytest.raises(ValueError) as exc:
        one([T(1), T(2)])
    assert "'T(1), T(2)[, ...]'" in str(exc.value)


def test_one_failure4():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        one(_hf.FailNext())


def test_one_failure5():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        one(_hf.FailNext(offset=1))


def test_one_failure6():
    # generator without items
    with pytest.raises(ValueError):
        one(i for i in [])


@_hf.skip_on_pypy_because_cache_next_works_differently
def test_one_failure7():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        one(_hf.CacheNext(1))


# ---------------------------------------------------------------------------
# File: iteration_utilities-0.12.1/tests/test_packed.py
# ---------------------------------------------------------------------------
# Licensed under Apache License Version 2.0 - see LICENSE

import collections
import operator
import pickle

import pytest

from iteration_utilities import packed

from helper_cls import T


def _t3(a, b, c):
    return a, b, c


def _t6(a, b, c, d, e, f):
    return a, b, c, d, e, f


def test_packed_repr1():
    x = packed(int)
    r = repr(x)
    assert 'packed' in r
    assert 'int' in r


def test_packed_attributes1():
    x = packed(int)
    assert x.func is int


@pytest.mark.parametrize("kind", [tuple, list, collections.deque])
def test_packed_normal(kind):
    eq = packed(operator.eq)
    assert eq(kind([T(1), T(1)]))
    assert not eq(kind([T(1), T(2)]))


@pytest.mark.parametrize("kind", [tuple, list, collections.deque])
def test_packed_normal_with_kwargs(kind):
    t3 = packed(_t3)
    inp = kind([T(1), T(2)])
    assert t3(inp, c=T(3)) == (T(1), T(2), T(3))


@pytest.mark.parametrize("kind", [tuple, list, collections.deque])
def test_packed_normal_more_than_6(kind):
    t6 = packed(_t6)
    inp = kind([T(1), T(2), T(3), T(4), T(5), T(6)])
    assert t6(inp) == (T(1), T(2), T(3), T(4), T(5), T(6))


@pytest.mark.parametrize("kind", [tuple, list, collections.deque])
def test_packed_normal_more_than_6_with_kwargs(kind):
    t6 = packed(_t6)
    inp = kind([T(1), T(2), T(3), T(4)])
    assert t6(inp, e=T(5), f=T(6)) == (T(1), T(2), T(3), T(4), T(5), T(6))


def test_packed_failure1():
    # too many arguments when creating an instance
    with pytest.raises(TypeError):
        packed(1, 2)
def test_packed_failure2():
    # too few arguments when creating an instance
    with pytest.raises(TypeError):
        packed()


def test_packed_failure3():
    # too few arguments when calling the instance
    with pytest.raises(TypeError):
        packed(operator.eq)()


def test_packed_failure4():
    # too many arguments when calling the instance
    with pytest.raises(TypeError):
        packed(operator.eq)(1, 2)


def test_packed_failure5():
    # arguments for calling the instance are not convertible to tuple
    with pytest.raises(TypeError):
        packed(operator.eq)(1)


def test_packed_failure6():
    # function raised an Exception
    def failingfunc(a, b):
        raise ValueError('bad func')

    with pytest.raises(ValueError, match='bad func'):
        packed(failingfunc)((1, 2))


def test_packed_pickle1(protocol):
    eq = packed(operator.eq)
    x = pickle.dumps(eq, protocol=protocol)
    assert pickle.loads(x)((T(1), T(1)))
    assert not pickle.loads(x)((T(1), T(2)))


# ---------------------------------------------------------------------------
# File: iteration_utilities-0.12.1/tests/test_partial.py
# ---------------------------------------------------------------------------
# Licensed under Apache License Version 2.0 - see LICENSE

import collections
import copy
import pickle
import sys
import weakref

import pytest

import iteration_utilities
from iteration_utilities import partial

import helper_funcs as _hf
from helper_cls import T, toT


# =============================================================================
# These tests are taken from the Python tests.
#
# They were changed from unittest to pytest and made py2 and py3 compatible.
# =============================================================================


def capture(*args, **kw):
    """capture all positional and keyword arguments"""
    return args, kw


def signature(part):
    """return the signature of a partial object"""
    return (part.func, part.args, part.keywords, part.__dict__)


class AllowPickle:
    def __enter__(self):
        return self

    def __exit__(self, typ, value, tb):
        return False


class MyTuple(tuple):
    pass


class BadTuple(tuple):
    def __add__(self, other):
        return list(self) + list(other)


class MyDict(dict):
    pass


class MyStr(str):
    pass


def test_attributes_unwritable():
    p = partial(capture, T(1), T(2), a=T(10), b=T(20))
    with pytest.raises(AttributeError):
        p.func = map
    with pytest.raises(AttributeError):
        p.args = (T(1), T(2))
    with pytest.raises(AttributeError):
        p.keywords = {'a': T(1), 'b': T(2)}

    p = partial(hex)
    with pytest.raises(TypeError):
        del p.__dict__


@_hf.skip_on_pypy_not_investigated_why
def test_recursive_pickle():
    with AllowPickle():
        f = partial(capture)
        f.__setstate__((f, (), {}, {}))
        try:
            for proto in range(pickle.HIGHEST_PROTOCOL + 1):
                with pytest.raises(RecursionError):
                    pickle.dumps(f, proto)
        finally:
            f.__setstate__((capture, (), {}, {}))

        f = partial(capture)
        f.__setstate__((capture, (f,), {}, {}))
        try:
            for proto in range(pickle.HIGHEST_PROTOCOL + 1):
                f_copy = pickle.loads(pickle.dumps(f, proto))
                try:
                    assert f_copy.args[0] is f_copy
                finally:
                    f_copy.__setstate__((capture, (), {}, {}))
        finally:
            f.__setstate__((capture, (), {}, {}))

        f = partial(capture)
        f.__setstate__((capture, (), {'a': f}, {}))
        try:
            for proto in range(pickle.HIGHEST_PROTOCOL + 1):
                f_copy = pickle.loads(pickle.dumps(f, proto))
                try:
                    assert f_copy.keywords['a'] is f_copy
                finally:
                    f_copy.__setstate__((capture, (), {}, {}))
        finally:
            f.__setstate__((capture, (), {}, {}))


def test_repr():
    args = (object(), object())
    args_repr = ', '.join(repr(a) for a in args)
    kwargs = {'a': object(), 'b': object()}
    kwargs_reprs = ['a={a!r}, b={b!r}'.format(**kwargs),
                    'b={b!r}, a={a!r}'.format(**kwargs)]
    name = 'iteration_utilities.partial'

    f = partial(capture)
    compare = '{name}({capture!r})'.format(name=name, capture=capture)
    assert compare == repr(f)

    f = partial(capture, *args)
    compare = ('{name}({capture!r}, {args_repr})'
               ''.format(name=name, capture=capture, args_repr=args_repr))
    assert compare == repr(f)

    f = partial(capture, **kwargs)
    compare = ['{name}({capture!r}, {kwargs_repr})'
               ''.format(name=name, capture=capture, kwargs_repr=kwargs_repr)
               for kwargs_repr in kwargs_reprs]
    assert repr(f) in compare

    f = partial(capture, *args, **kwargs)
    compare = ['{name}({capture!r}, {args_repr}, {kwargs_repr})'
               ''.format(name=name, capture=capture,
                         args_repr=args_repr, kwargs_repr=kwargs_repr)
               for kwargs_repr in kwargs_reprs]
    assert repr(f) in compare


def test_basic_examples():
    p = partial(capture, T(1), T(2), a=T(10), b=T(20))
    assert callable(p)
    assert p(T(3), T(4), b=T(30), c=T(40)) == ((T(1), T(2), T(3), T(4)),
                                               dict(a=T(10), b=T(30),
                                                    c=T(40)))
    p = partial(map, lambda x: x*T(10))
    assert list(p([T(1), T(2), T(3), T(4)])) == toT([10, 20, 30, 40])


def test_attributes():
    p = partial(capture, T(1), T(2), a=T(10), b=T(20))
    # attributes should be readable
    assert p.func == capture
    assert p.args == (T(1), T(2))
    assert p.keywords == dict(a=T(10), b=T(20))


def test_argument_checking():
    # at least one argument
    with pytest.raises(TypeError):
        partial()
    # must be callable
    with pytest.raises(TypeError):
        partial(T(2))


def test_protection_of_callers_dict_argument():
    # a caller's dictionary should not be altered by partial
    def func(a=10, b=20):
        return a

    d = {'a': T(3)}
    p = partial(func, a=T(5))
    assert p(**d) == T(3)
    assert d == {'a': T(3)}
    p(b=7)
    assert d == {'a': T(3)}


def test_kwargs_copy():
    # Issue #29532: Altering a kwarg dictionary passed to a constructor
    # should not affect a partial object after creation
    d = {'a': T(3)}
    p = partial(capture, **d)
    assert p() == ((), {'a': T(3)})
    d['a'] = T(5)
    assert p() == ((), {'a': T(3)})


def test_arg_combinations():
    # exercise special code paths for zero args in either partial
    # object or the caller
    p = partial(capture)
    assert p() == ((), {})
    assert p(T(1), T(2)) == ((T(1), T(2)), {})
    p = partial(capture, T(1), T(2))
    assert p() == ((T(1), T(2)), {})
    assert p(T(3), T(4)) == ((T(1), T(2), T(3), T(4)), {})


def test_kw_combinations():
    # exercise special code paths for no keyword args in
    # either the partial object or the caller
    p = partial(capture)
    assert p.keywords == {}
    assert p() == ((), {})
    assert p(a=T(1)) == ((), {'a': T(1)})
    p = partial(capture, a=T(1))
    assert p.keywords == {'a': T(1)}
    assert p() == ((), {'a': T(1)})
    assert p(b=T(2)) == ((), {'a': T(1), 'b': T(2)})
    # keyword args in the call override those in the partial object
    assert p(a=T(3), b=T(2)) == ((), {'a': T(3), 'b': T(2)})


def test_positional():
    # make sure positional arguments are captured correctly
    for args in [(), (T(0),), (T(0), T(1)), (T(0), T(1), T(2)),
                 (T(0), T(1), T(2), T(3))]:
        p = partial(capture, *args)
        expected = args + (T('x'),)
        got, empty = p(T('x'))
        assert expected == got and empty == {}


def test_keyword():
    # make sure keyword arguments are captured correctly
    for a in [T('a'), T(0), T(None), T(3.5)]:
        p = partial(capture, a=T(a))
        expected = {'a': T(a), 'x': T(None)}
        empty, got = p(x=T(None))
        assert expected == got and empty == ()


def test_no_side_effects():
    # make sure there are no side effects that affect subsequent calls
    p = partial(capture, T(0), a=T(1))
    args1, kw1 = p(T(1), b=T(2))
    assert args1 == (T(0), T(1)) and kw1 == {'a': T(1), 'b': T(2)}
    args2, kw2 = p()
    assert args2 == (T(0),) and kw2 == {'a': T(1)}


def test_error_propagation():
    def f(x, y):
        x / y

    with pytest.raises(ZeroDivisionError):
        partial(f, 1, 0)()
    with pytest.raises(ZeroDivisionError):
        partial(f, 1)(0)
    with pytest.raises(ZeroDivisionError):
        partial(f)(1, 0)
    with pytest.raises(ZeroDivisionError):
        partial(f, y=0)(1)


@_hf.skip_on_pypy_not_investigated_why
def test_weakref():
    f = partial(int, base=16)
    p = weakref.proxy(f)
    assert f.func == p.func
    f = None
    with pytest.raises(ReferenceError):
        p.func


def test_with_bound_and_unbound_methods():
    data = list(map(str, range(10)))
    join = partial(str.join, '')
    assert join(data) == '0123456789'
    join = partial(''.join)
    assert join(data) == '0123456789'


def test_nested_optimization():
    inner = partial(signature, 'asdf')
    nested = partial(inner, bar=True)
    flat = partial(signature, 'asdf', bar=True)
    assert signature(nested) == signature(flat)


def test_nested_partial_with_attribute():
    # see issue 25137
    def foo(bar):
        return bar

    p = partial(foo, 'first')
    p2 = partial(p, 'second')
    p2.new_attr = 'spam'
    assert p2.new_attr == 'spam'


def test_recursive_repr():
    name = 'iteration_utilities.partial'

    f = partial(capture)
    f.__setstate__((f, (), {}, {}))
    try:
        assert repr(f) == '{}(...)'.format(name)
    finally:
        f.__setstate__((capture, (), {}, {}))

    f = partial(capture)
    f.__setstate__((capture, (f,), {}, {}))
    try:
        assert repr(f) == '{}({!r}, ...)'.format(name, capture)
    finally:
        f.__setstate__((capture, (), {}, {}))

    f = partial(capture)
    f.__setstate__((capture, (), {'a': f}, {}))
    try:
        assert repr(f) == '{}({!r}, a=...)'.format(name, capture)
    finally:
        f.__setstate__((capture, (), {}, {}))


def test_pickle():
    with AllowPickle():
        f = partial(signature, ['asdf'], bar=[True])
        f.attr = []
        for proto in range(pickle.HIGHEST_PROTOCOL + 1):
            f_copy = pickle.loads(pickle.dumps(f, proto))
            assert signature(f_copy) == signature(f)


@_hf.skip_on_pypy_not_investigated_why
def test_copy():
    f = partial(signature, ['asdf'], bar=[True])
    f.attr = []
    f_copy = copy.copy(f)
    assert signature(f_copy) == signature(f)
    assert f_copy.attr is f.attr
    assert f_copy.args is f.args
    assert f_copy.keywords is f.keywords


@_hf.skip_on_pypy_not_investigated_why
def test_deepcopy():
    f = partial(signature, ['asdf'], bar=[True])
    f.attr = []
    f_copy = copy.deepcopy(f)
    assert signature(f_copy) == signature(f)
    assert f_copy.attr is not f.attr
    assert f_copy.args is not f.args
    assert f_copy.args[0] is not f.args[0]
    assert f_copy.keywords is not f.keywords
    assert f_copy.keywords['bar'] is not f.keywords['bar']


def test_setstate():
    f = partial(signature)

    f.__setstate__((capture, (1,), dict(a=10), dict(attr=[])))
    assert signature(f) == (capture, (1,), dict(a=10), dict(attr=[]))
    assert f(2, b=20) == ((1, 2), {'a': 10, 'b': 20})

    f.__setstate__((capture, (1,), dict(a=10), None))
    assert signature(f) == (capture, (1,), dict(a=10), {})
    assert f(2, b=20) == ((1, 2), {'a': 10, 'b': 20})

    f.__setstate__((capture, (1,), None, None))
    # self.assertEqual(signature(f), (capture, (1,), {}, {}))
    assert f(2, b=20) == ((1, 2), {'b': 20})
    assert f(2) == ((1, 2), {})
    assert f() == ((1,), {})

    f.__setstate__((capture, (), {}, None))
    assert signature(f) == (capture, (), {}, {})
    assert f(2, b=20) == ((2,), {'b': 20})
    assert f(2) == ((2,), {})
    assert f() == ((), {})


def test_setstate_errors():
    f = partial(signature)
    with pytest.raises(TypeError):
        f.__setstate__((capture, (), {}))
    with pytest.raises(TypeError):
        f.__setstate__((capture, (), {}, {}, None))
    with pytest.raises(TypeError):
        f.__setstate__([capture, (), {}, None])
    with pytest.raises(TypeError):
        f.__setstate__((None, (), {}, None))
    with pytest.raises(TypeError):
        f.__setstate__((capture, None, {}, None))
    with pytest.raises(TypeError):
        f.__setstate__((capture, [], {}, None))
    with pytest.raises(TypeError):
        f.__setstate__((capture, (), [], None))


def test_setstate_subclasses():
    f = partial(signature)
    f.__setstate__((capture, MyTuple((1,)), MyDict(a=10), None))
    s = signature(f)
    assert s == (capture, (1,), dict(a=10), {})
    assert type(s[1]) is tuple
    assert type(s[2]) is dict
    r = f()
    assert r == ((1,), {'a': 10})
    assert type(r[0]) is tuple
    assert type(r[1]) is dict

    f.__setstate__((capture, BadTuple((1,)), {}, None))
    s = signature(f)
    assert s == (capture, (1,), {}, {})
    assert type(s[1]) is tuple
    r = f(2)
    assert r == ((1, 2), {})
    assert type(r[0]) is tuple


def test_setstate_refcount():
    # Issue 6083: Reference counting bug
    class BadSequence:
        def __len__(self):
return 4 def __getitem__(self, key): if key == 0: return max elif key == 1: return tuple(range(1000000)) elif key in (2, 3): return {} raise IndexError f = partial(object) with pytest.raises(TypeError): f.__setstate__(BadSequence()) # ============================================================================= # New tests (not taken from CPython source) # ============================================================================= def test_invalid_kwargs_with_setstate(): f = partial(object) f.__setstate__((object, (), {1: 1}, {})) # Shouldn't segfault!!! with pytest.raises(TypeError): repr(f) def test_partial_from_partial_kwargs(): f1 = partial(capture, b=10) f2 = partial(f1) assert f2() == ((), {'b': 10}) assert signature(f2) == (capture, (), {'b': 10}, {}) f1 = partial(capture, b=10) f2 = partial(f1, c=10) assert f2() == ((), {'b': 10, 'c': 10}) assert signature(f2) == (capture, (), {'b': 10, 'c': 10}, {}) f1 = partial(capture, b=10) f2 = partial(f1, b=20) assert f2() == ((), {'b': 20}) assert signature(f2) == (capture, (), {'b': 20}, {}) def test_partial_dict_setter(): p = partial(capture, b=10) with pytest.raises(TypeError): p.__dict__ = 10 p = partial(capture, b=10) p.__dict__ = {} assert signature(p) == (capture, (), {'b': 10}, {}) p = partial(capture, b=10) p.__dict__ = collections.OrderedDict() assert signature(p) == (capture, (), {'b': 10}, collections.OrderedDict()) assert isinstance(p.__dict__, collections.OrderedDict) def test_partial_has_placeholder(): assert hasattr(partial, '_') def test_partial_placeholder_basic(): p = partial(isinstance, partial._, int) assert p.num_placeholders == 1 assert p.args == (partial._, int) assert p(20) assert not p(1.2) assert not p(T(1.2)) def test_partial_placeholder_someone_holds_ref(): p = partial(isinstance, partial._, int) # hold a reference to args while calling the function x = p.args assert p(20) assert not p(1.2) assert not p(T(1.2)) del x def test_partial_placeholder_copy(): p = partial(isinstance, 
partial._, int) # call a copy of a partial with placeholders p2 = copy.copy(p) assert p2.num_placeholders == 1 assert p2(20) assert not p2(1.2) assert not p2(T(1.2)) @_hf.skip_on_pypy_because_sizeof_makes_no_sense_there def test_partial_sizeof(): p1 = partial(isinstance, 10, int) p2 = partial(isinstance, partial._, int) p3 = partial(isinstance, partial._, partial._) # The sizes should be different because each placeholder leads to one more # element in the posph array. sizes = [sys.getsizeof(p) for p in (p1, p2, p3)] assert sizes[2] > sizes[1] assert sizes[1] > sizes[0] # Also make sure that the difference is the same between 3 vs. 2 and 2 vs. 1 assert sizes[2] - sizes[1] == sizes[1] - sizes[0] def test_partial_placeholder_deepcopy(): p = partial(isinstance, partial._, int) p2 = copy.deepcopy(p) assert p2.num_placeholders == 1 assert p2(20) assert not p2(1.2) assert not p2(T(1.2)) def test_partial_placeholder_setstate_frees_old_array(): p = partial(isinstance, partial._, int) p.__setstate__((isinstance, (10, int), {}, {})) # TODO: How to check the memory is freed? 
:( def test_partial_placeholder_missing_args(): p = partial(isinstance, partial._, int) with pytest.raises(TypeError, match='not enough values'): p() # partial with multiple placeholders and too many or too few arguments p = partial(isinstance, partial._, partial._) assert p.num_placeholders == 2 with pytest.raises(TypeError, match='not enough values'): p() with pytest.raises(TypeError, match='not enough values'): p(T(1)) def test_partial_placeholder_more_args(): p = partial(capture, partial._, T(2)) assert p(T(1), T(3), T(4)) == ((T(1), T(2), T(3), T(4)), {}) def test_partial_from_partial_with_one_placeholder(): # One placeholder, no additional arguments, placeholder in different # positions p1 = partial(capture, partial._, T(2), T(3)) p2 = partial(p1) assert p1.args is p2.args assert p1.keywords == p2.keywords assert p1(T(1)) == ((T(1), T(2), T(3)), {}) assert p2(T(1)) == ((T(1), T(2), T(3)), {}) assert p1(T(1)) == p2(T(1)) p1 = partial(capture, T(1), partial._, T(3)) p2 = partial(p1) assert p1.args is p2.args assert p1.keywords == p2.keywords assert p1(T(2)) == ((T(1), T(2), T(3)), {}) assert p2(T(2)) == ((T(1), T(2), T(3)), {}) assert p1(T(2)) == p2(T(2)) p1 = partial(capture, T(1), T(2), partial._) p2 = partial(p1) assert p1.args is p2.args assert p1.keywords == p2.keywords assert p1(T(3)) == ((T(1), T(2), T(3)), {}) assert p2(T(3)) == ((T(1), T(2), T(3)), {}) assert p1(T(3)) == p2(T(3)) def test_partial_from_partial_with_one_placeholder_fail(): p1 = partial(capture, partial._, T(2), T(3)) p2 = partial(p1) with pytest.raises(TypeError, match='not enough values'): p2() def test_partial_from_partial_basic1(): # One placeholder, one argument given p1 = partial(capture, partial._, T(2), T(3)) p2 = partial(p1, T(1)) assert p1.args == (partial._, T(2), T(3)) assert p1(T(1)) == ((T(1), T(2), T(3)), {}) assert p1(T(1), T(4)) == ((T(1), T(2), T(3), T(4)), {}) assert p2.args == (T(1), T(2), T(3)) assert p2() == ((T(1), T(2), T(3)), {}) assert p2(T(4)) == ((T(1), T(2), 
T(3), T(4)), {}) def test_partial_from_partial_basic2(): # Two placeholders, one argument given p1 = partial(capture, partial._, T(2), partial._) p2 = partial(p1, T(1)) assert p1.args == (partial._, T(2), partial._) assert p1(T(1), T(3)) == ((T(1), T(2), T(3)), {}) assert p1(T(1), T(3), T(4)) == ((T(1), T(2), T(3), T(4)), {}) assert p2.args == (T(1), T(2), partial._) assert p2(T(3)) == ((T(1), T(2), T(3)), {}) assert p2(T(3), T(4)) == ((T(1), T(2), T(3), T(4)), {}) def test_partial_from_partial_basic3(): # One placeholders, two arguments given p1 = partial(capture, partial._, T(2)) p2 = partial(p1, T(1), T(3)) assert p1.args == (partial._, T(2)) assert p1(T(1)) == ((T(1), T(2)), {}) assert p1(T(1), T(3)) == ((T(1), T(2), T(3)), {}) assert p1(T(1), T(3), T(4)) == ((T(1), T(2), T(3), T(4)), {}) assert p2.args == (T(1), T(2), T(3)) assert p2() == ((T(1), T(2), T(3)), {}) assert p2(T(4)) == ((T(1), T(2), T(3), T(4)), {}) def test_partial_from_partial_basic4(): # Two placeholders, two arguments given p1 = partial(capture, partial._, partial._, T(3)) p2 = partial(p1, T(1), T(2)) assert p1.args == (partial._, partial._, T(3)) assert p1(T(1), T(2)) == ((T(1), T(2), T(3)), {}) assert p1(T(1), T(2), T(4)) == ((T(1), T(2), T(3), T(4)), {}) assert p2.args == (T(1), T(2), T(3)) assert p2() == ((T(1), T(2), T(3)), {}) assert p2(T(4)) == ((T(1), T(2), T(3), T(4)), {}) def test_partial_from_partial_basic5(): # Two placeholders, three arguments given p1 = partial(capture, partial._, partial._, T(3)) p2 = partial(p1, T(1), T(2), T(4)) assert p1.args == (partial._, partial._, T(3)) assert p1(T(1), T(2)) == ((T(1), T(2), T(3)), {}) assert p1(T(1), T(2), T(4)) == ((T(1), T(2), T(3), T(4)), {}) assert p2.args == (T(1), T(2), T(3), T(4)) assert p2() == ((T(1), T(2), T(3), T(4)), {}) assert p2(T(5)) == ((T(1), T(2), T(3), T(4), T(5)), {}) def test_partial_with_function_that_keeps_args(): # A function that keeps its args as-is was a problem with partial because # it reused the arguments. 
chained is such a function (currently). chained = iteration_utilities.chained assert partial(chained, partial._, str)(complex)(10) == '(10+0j)' @_hf.skip_if_vectorcall_is_not_used def test_partial_with_str_subclasses_fails1(): p = partial(capture, **{MyStr('a'): 10}) with pytest.raises(TypeError): p() @_hf.skip_if_vectorcall_is_not_used def test_partial_with_str_subclasses_fails2(): p = partial(capture, **{MyStr('a'): 10}) with pytest.raises(TypeError): p(b=20) @_hf.skip_if_vectorcall_is_not_used def test_partial_with_str_subclasses_fails3(): p = partial(capture) with pytest.raises(TypeError): p(**{MyStr('a'): 10}) def test_partial_with_lots_of_kwargs(): """The purpose of this test is to test the vectorcall implementation which converts the kwargs passed to the call to a set to speed-up the lookup behavior. """ p = partial(capture, a=1, b=2, c=3, d=4, e=5, f=6, g=7, h=8, i=9, j=10, k=11) r = p(l=12, m=13, n=14, o=15, p=16, q=17, r=18, s=19, t=20, u=21, v=22, w=23) assert r == (tuple(), dict(zip('abcdefghijklmnopqrstuvw', range(1, 24)))) def test_partial_with_lots_of_kwargs_with_duplicate(): p = partial(capture, a=1, b=2, c=3, d=4, e=5, f=6, g=7, h=8, i=9, j=10, k=11) r = p(a=12, b=13, c=14, d=15, e=16, f=17, g=18, h=19, i=20, j=21, k=22, l=23) assert r == (tuple(), dict(zip('abcdefghijkl', range(12, 24)))) 07070100000144000081A400000000000000000000000165E3BCDA000009CB000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/tests/test_partition.py# Licensed under Apache License Version 2.0 - see LICENSE import pytest from iteration_utilities import partition import helper_funcs as _hf from helper_cls import T, toT def test_partition_empty1(): assert partition([]) == (toT([]), toT([])) def test_partition_normal1(): assert partition([T(0), T(1), T(2)]) == (toT([0]), toT([1, 2])) def test_partition_normal2(): assert partition([T(3), T(1), T(0)]) == (toT([0]), toT([3, 1])) def test_partition_normal3(): assert partition([T(0), T(0), T(0)]) == (toT([0, 
0, 0]), []) def test_partition_normal4(): assert partition([T(1), T(1), T(1)]) == ([], toT([1, 1, 1])) def test_partition_normal5(): # using a generator assert partition((i for i in [T(0), T(1)])) == ([T(0)], [T(1)]) def test_partition_normal6(): # pred=None is identical to no pred assert partition([T(0), T(1), T(2)], None) == (toT([0]), toT([1, 2])) def test_partition_pred1(): assert partition([T(0), T(1), T(2)], lambda x: x.value > 1) == (toT([0, 1]), toT([2])) def test_partition_pred2(): assert partition([T(0), T(1), T(2)], lambda x: x.value < 1) == (toT([1, 2]), toT([0])) def test_partition_failure1(): with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG): partition(_hf.FailIter()) def test_partition_failure2(): with pytest.raises(TypeError): partition([T(1), T('a')], lambda x: x.value + 3) def test_partition_failure3(): with pytest.raises(TypeError): partition([T(1), T('a')], lambda x: x.value - 1) def test_partition_failure4(): with pytest.raises(TypeError): partition([T(1), T('a')], lambda x: x.value + 'a') def test_partition_failure5(): # Test that a failing iterator doesn't raise a SystemError with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG): partition(_hf.FailNext(), bool) def test_partition_failure6(): # too few arguments with pytest.raises(TypeError): partition() def test_partition_failure7(): # object has no boolean interpretation class NoBoolWithT(_hf.FailBool, T): ... 
with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG): partition([NoBoolWithT(10)]) @_hf.skip_on_pypy_because_cache_next_works_differently def test_partition_failure8(): # Changing next method with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG): partition(_hf.CacheNext(1)) 07070100000145000081A400000000000000000000000165E3BCDA000003CD000000000000000000000000000000000000003500000000iteration_utilities-0.12.1/tests/test_placeholder.py# Licensed under Apache License Version 2.0 - see LICENSE import copy import pytest from iteration_utilities import Placeholder, partial import helper_funcs as _hf PlaceholderType = type(Placeholder) def test_placeholder_only_one_instance(): p1 = PlaceholderType() p2 = PlaceholderType() p1 is p2 def test_placeholder_pickle(protocol): p = PlaceholderType() assert _hf.round_trip_pickle(p, protocol=protocol) is p def test_placeholder(): assert partial._ is partial._ assert copy.copy(partial._) is partial._ assert copy.deepcopy(partial._) is partial._ # PlaceholderType.__new__() assert type(partial._)() is partial._ assert repr(partial._) == '_' def test_placeholder_new(): with pytest.raises(TypeError, match=r"_PlaceholderType\.__new__` takes no arguments"): type(partial._)(1) with pytest.raises(TypeError, match=r"_PlaceholderType\.__new__` takes no arguments"): type(partial._)(a=1) 07070100000146000081A400000000000000000000000165E3BCDA00001189000000000000000000000000000000000000003300000000iteration_utilities-0.12.1/tests/test_replicate.py# Licensed under Apache License Version 2.0 - see LICENSE import operator import pickle import sys import pytest from iteration_utilities import replicate import helper_funcs as _hf from helper_cls import T, toT def test_replicate_empty1(): assert list(replicate([], 3)) == [] def test_replicate_normal1(): assert list(replicate([T(1), T(2)], 3)) == toT([1, 1, 1, 2, 2, 2]) def test_replicate_normal2(): # using a generator assert list(replicate((i for i in toT([1, 2])), 2)) 
== toT([1, 1, 2, 2]) def test_replicate_attributes1(): # Key+reverse function tests it = replicate(toT(range(5)), 3) assert it.times == 3 assert it.timescurrent == 0 with pytest.raises(AttributeError): it.current assert next(it) == T(0) assert it.times == 3 assert it.timescurrent == 1 assert it.current == T(0) def test_replicate_copy1(): _hf.iterator_copy(replicate([T(1), T(2)], 3)) def test_replicate_failure1(): # not enough arguments with pytest.raises(TypeError): replicate([T(1), T(2)]) def test_replicate_failure2(): with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG): replicate(_hf.FailIter(), 2) def test_replicate_failure3(): # second argument <= 1 with pytest.raises(ValueError): replicate([T(1), T(2)], 0) def test_replicate_failure4(): # second argument <= 1 with pytest.raises(ValueError): replicate([T(1), T(2)], 1) def test_replicate_failure5(): # iterator throws an exception different from StopIteration with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG): list(replicate(_hf.FailNext(), 2)) def test_replicate_failure_setstate1(): # "state" is not a tuple mg = replicate(toT(range(5)), 3) with pytest.raises(TypeError): mg.__setstate__([None, 0]) def test_replicate_failure_setstate2(): # setstate has an invalid second item in "state" < 0 mg = replicate(toT(range(5)), 3) with pytest.raises(ValueError): mg.__setstate__((None, -1)) def test_replicate_pickle1(protocol): # normal rpl = replicate([T(1), T(2)], 3) x = pickle.dumps(rpl, protocol=protocol) assert list(pickle.loads(x)) == toT([1, 1, 1, 2, 2, 2]) def test_replicate_pickle2(protocol): # normal rpl = replicate([T(1), T(2)], 3) assert next(rpl) == T(1) x = pickle.dumps(rpl, protocol=protocol) assert list(pickle.loads(x)) == toT([1, 1, 2, 2, 2]) def test_replicate_lengthhint1(): it = replicate([T(1), T(2)], 3) _hf.check_lengthhint_iteration(it, 6) def test_replicate_failure_lengthhint1(): f_it = _hf.FailLengthHint(toT([1, 2, 3])) it = replicate(f_it, 3) with 
pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG): operator.length_hint(it) with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG): list(it) def test_replicate_failure_lengthhint2(): # This only checks for overflow if the length_hint is above PY_SSIZE_T_MAX of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1) it = replicate(of_it, 3) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) def test_replicate_failure_lengthhint3(): # It is also possible that the length_hint overflows when the length is # below maxsize but "times * length" is above maxsize. # In this case length = maxsize / 2 but times = 3 of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize // 2) it = replicate(of_it, 3) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) def test_replicate_failure_lengthhint4(): # There is also the possibility that "length * times" does not overflow # but adding the "times - timescurrent" afterwards will overflow. # That's a bit tricky, but it seems that ((2**x-1) // 10) * 10 + 9 > 2**x-1 # is true for x=15, 31, 63 and 127 so it's possible by setting the times to # 10 and the length to sys.maxsize // 10. The 9 are because the first item # is already popped and should be replicated 9 more times. 
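    # Editorial aside (illustrative check, not part of the upstream test):
    # the arithmetic claim in the comment above can be verified directly for
    # the bit widths PY_SSIZE_T_MAX typically has.
    for x in (15, 31, 63, 127):
        maxval = 2 ** x - 1
        # (maxval // 10) * 10 + 9 exceeds maxval, so the final addition
        # of "times - timescurrent" can overflow even when the product fit.
        assert (maxval // 10) * 10 + 9 > maxval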
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize // 10)
    it = replicate(of_it, 10)
    next(it)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)

# =============================================================================
# File: iteration_utilities-0.12.1/tests/test_roundrobin.py
# =============================================================================

# Licensed under Apache License Version 2.0 - see LICENSE

import itertools
import operator
import pickle
import sys

import pytest

from iteration_utilities import roundrobin

import helper_funcs as _hf
from helper_cls import T, toT

def test_roundrobin_empty1():
    assert list(roundrobin()) == []

def test_roundrobin_empty2():
    assert list(roundrobin([])) == []

def test_roundrobin_empty3():
    assert list(roundrobin([], (), {})) == []

def test_roundrobin_normal1():
    assert list(roundrobin([T(1)], [T(1), T(2)], [T(1), T(2), T(3)])) == toT([1, 1, 1, 2, 2, 3])

def test_roundrobin_normal2():
    assert list(roundrobin([T(1), T(2), T(3)], [T(1)], [T(1), T(2)])) == toT([1, 1, 1, 2, 2, 3])

def test_roundrobin_normal3():
    assert list(roundrobin([T(1), T(2)], [T(1), T(2), T(3)], [T(1)])) == toT([1, 1, 1, 2, 2, 3])

def test_roundrobin_normal4():
    # generator
    assert list(roundrobin((i for i in [T(1), T(2), T(3)]),
                           (i for i in [T(1)]),
                           (i for i in [T(1), T(2)]))) == toT([1, 1, 1, 2, 2, 3])

def test_roundrobin_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        roundrobin(_hf.FailIter())

def test_roundrobin_failure2():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        roundrobin([T(1)], _hf.FailIter())

def test_roundrobin_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(roundrobin(_hf.FailNext()))

def test_roundrobin_failure4():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        list(roundrobin([T(1), T(2)], _hf.FailNext()))

def test_roundrobin_failure5():
    # Test that a failing iterator doesn't raise a SystemError
    rr = roundrobin(_hf.FailNext(offset=1, repeats=10),
                    [T(1), T(2), T(3), T(4)])
    assert next(rr) == T(1)
    assert next(rr) == T(1)
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(rr)

def test_roundrobin_copy1():
    _hf.iterator_copy(roundrobin([T(1), T(2), T(3), T(4)]))

def test_roundrobin_failure_setstate1():
    # setstate active < 0
    rr = roundrobin([T(1), T(2), T(3), T(4)])
    with pytest.raises(ValueError):
        rr.__setstate__((1, -1))

def test_roundrobin_failure_setstate2():
    # setstate numactive < 0
    rr = roundrobin([T(1), T(2), T(3), T(4)])
    with pytest.raises(ValueError):
        rr.__setstate__((-1, 0))

def test_roundrobin_failure_setstate3():
    # setstate numactive <= active
    rr = roundrobin([T(1), T(2), T(3), T(4)])
    with pytest.raises(ValueError):
        rr.__setstate__((1, 1))

def test_roundrobin_failure_setstate4():
    # setstate numactive <= active (numactive = 0)
    rr = roundrobin()
    with pytest.raises(ValueError):
        rr.__setstate__((0, 1))

def test_roundrobin_failure_setstate5():
    # setstate numactive > len(iteratortuple)
    rr = roundrobin([T(1), T(2), T(3), T(4)])
    with pytest.raises(ValueError):
        rr.__setstate__((2, 1))

def test_roundrobin_failure_setstate6():
    # setstate numactive > len(iteratortuple) (after exhausting one iterable)
    rr = roundrobin([T(1)], [T(1), T(2), T(3), T(4)])
    assert [i for i in itertools.islice(rr, 3)] == toT([1, 1, 2])
    with pytest.raises(ValueError):
        rr.__setstate__((2, 1))

def test_roundrobin_failure_setstate7():
    _hf.iterator_setstate_list_fail(
        roundrobin([T(1)], [T(1), T(2), T(3), T(4)]))

def test_roundrobin_failure_setstate8():
    _hf.iterator_setstate_empty_fail(
        roundrobin([T(1)], [T(1), T(2), T(3), T(4)]))

def test_roundrobin_pickle1(protocol):
    rr = roundrobin([T(1), T(2), T(3)], [T(1), T(2), T(3)])
    assert next(rr) == T(1)
    x = pickle.dumps(rr, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 2, 2, 3, 3])

def test_roundrobin_pickle2(protocol):
    rr2 = roundrobin([T(1)], [T(1), T(2), T(3)])
    assert next(rr2) == T(1)
    assert next(rr2) == T(1)
    assert next(rr2) == T(2)
    x = pickle.dumps(rr2, protocol=protocol)
    assert list(pickle.loads(x)) == [T(3)]

def test_roundrobin_lengthhint1():
    it = roundrobin([0], [1, 2, 3], [1])
    _hf.check_lengthhint_iteration(it, 5)

def test_roundrobin_failure_lengthhint1():
    f_it = _hf.FailLengthHint(toT([1, 2, 3]))
    it = roundrobin(f_it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG):
        operator.length_hint(it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG):
        list(it)

def test_roundrobin_failure_lengthhint2():
    # This only checks for overflow if the length_hint is above PY_SSIZE_T_MAX
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1)
    it = roundrobin(of_it)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)

def test_roundrobin_failure_lengthhint3():
    # Check if by adding the different lengths it could lead to overflow.
    # We use two iterables both with sys.maxsize length.
    it1 = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize)
    it2 = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize)
    it = roundrobin(it1, it2)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)

# =============================================================================
# File: iteration_utilities-0.12.1/tests/test_seen.py
# =============================================================================

# Licensed under Apache License Version 2.0 - see LICENSE

import pytest

import iteration_utilities
from iteration_utilities import Seen

import helper_funcs as _hf
from helper_cls import T

def test_seen_new_None1():
    # seenset=None is identical to no seenset
    assert Seen(None) == Seen()

def test_seen_new_None2():
    # seenlist=None is identical to no seenlist
    assert Seen(set(), None) == Seen(set())

def test_seen_equality0():
    assert Seen() == Seen()
    assert not Seen() != Seen()

def test_seen_equality1():
    # only sets, identical contents
    assert Seen({T(1), T(2)}) == Seen({T(1), T(2)})
    assert not Seen({T(1), T(2)}) != Seen({T(1), T(2)})

def test_seen_equality2():
    # only sets, not identical contents
    assert not Seen({T(1), T(2), T(3)}) == Seen({T(1), T(2)})
    assert Seen({T(1), T(2), T(3)}) != Seen({T(1), T(2)})

def test_seen_equality3():
    # set and list, identical contents
    assert Seen({T(1)}, [T([0, 0])]) == Seen({T(1)}, [T([0, 0])])
    assert not Seen({T(1)}, [T([0, 0])]) != Seen({T(1)}, [T([0, 0])])

def test_seen_equality4():
    # set and list, not identical list contents
    assert not Seen({T(1)}, [T([0, 0])]) == Seen({T(1)}, [T([0, 1])])
    assert Seen({T(1)}, [T([0, 0])]) != Seen({T(1)}, [T([0, 1])])

def test_seen_equality5():
    # set and list, not identical list contents
    assert not Seen({T(1)}, [T([0, 0]), T([1, 0])]) == Seen({T(1)}, [T([0, 1])])
    assert Seen({T(1)}, [T([0, 0]), [T(1), T(0)]]) != Seen({T(1)}, [T([0, 0])])

def test_seen_equality6():
    # empty sets, one has empty list
    assert Seen(set()) == Seen(set(), [])
    assert not Seen(set()) != Seen(set(), [])

def test_seen_equality7():
    # empty sets, one has empty list
    assert Seen(set(), []) == Seen(set())
    assert not Seen(set(), []) != Seen(set())

def test_seen_equality8():
    # empty sets, one has not-empty list
    assert not Seen(set(), [[T(0)]]) == Seen(set())
    assert Seen(set(), [[T(0)]]) != Seen(set())

def test_seen_equality9():
    # empty sets, one has not-empty list
    assert not Seen(set()) == Seen(set(), [[T(0)]])
    assert Seen(set()) != Seen(set(), [[T(0)]])

def test_seen_cmpfailure1():
    s1 = Seen({_hf.FailEqWithHash()})
    s2 = Seen({_hf.FailEqWithHash()})
    with pytest.raises(_hf.FailEqWithHash.EXC_TYP, match=_hf.FailEqWithHash.EXC_MSG):
        s1 == s2
    with pytest.raises(_hf.FailEqWithHash.EXC_TYP, match=_hf.FailEqWithHash.EXC_MSG):
        s1 != s2

def test_seen_cmpfailure2():
    s1 = Seen(set(), [_hf.FailEqWithHash()])
    s2 = Seen(set(), [_hf.FailEqWithHash()])
    with pytest.raises(_hf.FailEqWithHash.EXC_TYP, match=_hf.FailEqWithHash.EXC_MSG):
        s1 == s2
    with pytest.raises(_hf.FailEqWithHash.EXC_TYP, match=_hf.FailEqWithHash.EXC_MSG):
        s1 != s2

def test_seen_othercmp1():
    # other comparisons than == or != fail
    with pytest.raises(TypeError):
        Seen(set()) < Seen(set())
    with pytest.raises(TypeError):
        Seen(set()) <= Seen(set())
    with pytest.raises(TypeError):
        Seen(set()) >= Seen(set())
    with pytest.raises(TypeError):
        Seen(set()) > Seen(set())

def test_seen_len0():
    assert not len(Seen())
    assert len(Seen({T(1), T(2), T(3)})) == 3
    assert len(Seen(seenlist=[[T(0), T(0)], [T(1), T(1)], [T(2), T(2)]])) == 3
    assert len(Seen({T(1), T(2), T(3)},
                    seenlist=[[T(0), T(0)], [T(1), T(1)], [T(2), T(2)]])) == 6

def test_seen_repr0():
    assert repr(Seen()) == 'iteration_utilities.Seen(set())'
    assert repr(Seen({T(1)})) == 'iteration_utilities.Seen({T(1)})'
    assert repr(Seen(set(), [])) == repr(Seen())
    expected = 'iteration_utilities.Seen(set(), seenlist=[T(1)])'
    assert repr(Seen(set(), [T(1)])) == expected

def test_seen_repr1():
    # check that even though it can't be immediately set that recursive
    # representations are caught
    s = Seen()
    s.contains_add([s])
    assert repr(s) == 'iteration_utilities.Seen(set(), seenlist=[[...]])'

def test_seen_repr2():
    # Check that the representation is class name aware
    class Fun(Seen):
        pass

    assert 'Fun' in repr(Fun())
    assert 'Fun' in repr(Fun({T(1)}))
    assert 'Fun' in repr(Fun(set(), []))
    assert 'Fun' in repr(Fun(set(), [T(1)]))

def test_seen_attributes1():
    x = Seen()
    assert isinstance(x.seenset, set)
    assert x.seenlist is None

def test_seen_contains0():
    x = Seen()
    assert T(1) not in x
    assert x == Seen(set())
    assert T([0, 0]) not in x
    assert x == Seen(set())

def test_seen_contains_failure1():
    # Failure (no TypeError) when trying to hash the value
    x = Seen({T(0)})
    with pytest.raises(_hf.FailHash.EXC_TYP, match=_hf.FailHash.EXC_MSG):
        _hf.FailHash() in x

@_hf.skip_on_pypy_not_investigated_why
def test_seen_contains_failure2():
    # Failure when comparing the object to the objects in the list
    x = Seen(set(), [_hf.FailEqNoHash()])
    with pytest.raises(_hf.FailEqNoHash.EXC_TYP, match=_hf.FailEqNoHash.EXC_MSG):
        _hf.FailEqNoHash() in x

def test_seen_containsadd0():
    x = Seen()
    assert not x.contains_add(T(1))
    assert not x.contains_add(T([0, 0]))
    assert T(1) in x
    assert T([0, 0]) in x
    assert x == Seen({T(1)}, [T([0, 0])])

def test_seen_containsadd_failure1():
    # Failure (no TypeError) when trying to hash the value
    x = Seen({T(0)})
    with pytest.raises(_hf.FailHash.EXC_TYP, match=_hf.FailHash.EXC_MSG):
        x.contains_add(_hf.FailHash())

def test_seen_containsadd_failure2():
    # Failure when comparing the object to the objects in the list.
    x = Seen(set(), [_hf.FailEqNoHash()])
    with pytest.raises(_hf.FailEqNoHash.EXC_TYP, match=_hf.FailEqNoHash.EXC_MSG):
        x.contains_add(_hf.FailEqNoHash())

# Pickle tests and most failure tests are implemented implicitly as part of
# unique_everseen, duplicates, all_distinct so there should be no need to
# repeat these here. But if "Seen" is expanded these should be included!!!
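# Editorial aside (hypothetical helper, not part of the upstream suite): the
# Seen behavior exercised above boils down to a set for hashable values plus
# a list fallback for unhashable ones; a pure-Python sketch of that idea:
def _contains_add_sketch(seenset, seenlist, value):
    # Return True if `value` was seen before, otherwise remember it.
    try:
        if value in seenset:
            return True
        seenset.add(value)
        return False
    except TypeError:
        # unhashable values fall back to a (slower) linear scan of the list
        if value in seenlist:
            return True
        seenlist.append(value)
        return False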
def test_seen_failures1():
    # too many arguments
    with pytest.raises(TypeError):
        Seen({10, 20}, [1, 2, 3], [1, 2, 3])

def test_seen_failures2():
    # seenset not a set
    with pytest.raises(TypeError) as exc:
        Seen(frozenset({10, 20}))
    assert '`seenset`' in str(exc.value) and 'set' in str(exc.value)

def test_seen_failures3():
    # seenlist must be a list
    with pytest.raises(TypeError) as exc:
        Seen({10, 20}, tuple([1, 2, 3]))
    assert '`seenlist`' in str(exc.value) and 'list' in str(exc.value)

def test_seen_failures4():
    # seen can only be compared to other seen's.
    with pytest.raises(TypeError,
                       match='`Seen` instances can only compared to other '
                             '`Seen` instances'):
        Seen() == set()

# =============================================================================
# File: iteration_utilities-0.12.1/tests/test_sideeffects.py
# =============================================================================

# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle
import sys

import pytest

from iteration_utilities import sideeffects, return_None

import helper_funcs as _hf
from helper_cls import T, toT

def raise_error_when_below10(val):
    if isinstance(val, tuple):
        if val[0].value < 10:
            raise ValueError()
    else:
        if val.value < 10:
            raise ValueError()

def test_sideeffects_empty1():
    assert list(sideeffects([], return_None)) == []

def test_sideeffects_empty2():
    assert list(sideeffects([], return_None, 0)) == []

def test_sideeffects_empty3():
    assert list(sideeffects([], return_None, 1)) == []

def test_sideeffects_empty4():
    assert list(sideeffects([], return_None, 10)) == []

def test_sideeffects_normal1():
    l = []
    assert list(sideeffects([T(1), T(2)], l.append)) == [T(1), T(2)]
    assert l == [T(1), T(2)]

def test_sideeffects_normal2():
    l = []
    assert list(sideeffects([T(1), T(2)], l.append, 0)) == [T(1), T(2)]
    assert l == [T(1), T(2)]

def test_sideeffects_normal3():
    l = []
    assert list(sideeffects([T(1), T(2)], l.append, 1)) == [T(1), T(2)]
    assert l == [(T(1),), (T(2),)]

def test_sideeffects_normal4():
    l = []
    assert list(sideeffects([T(1), T(2)], l.append, 2)) == [T(1), T(2)]
    assert l == [(T(1), T(2))]

def test_sideeffects_normal5():
    l = []
    assert list(sideeffects([T(1), T(2), T(3)], l.append, times=2)) == [T(1), T(2), T(3)]
    assert l == [(T(1), T(2)), (T(3),)]

def test_sideeffects_normal6():
    # generator
    l = []
    assert list(sideeffects((i for i in [T(1), T(2)]), l.append, 2)) == [T(1), T(2)]
    assert l == [(T(1), T(2))]

def test_sideeffects_normal7():
    # useless side-effect
    assert list(sideeffects([T(1), T(2)], return_None)) == [T(1), T(2)]

def test_sideeffects_normal8():
    # useless side-effect
    assert list(sideeffects(toT(range(10)), return_None, 3)) == toT(range(10))

def test_sideeffects_attribute1():
    it = sideeffects(toT(range(10)), return_None)
    assert it.times == 0
    assert it.func is return_None
    assert it.count == 0

def test_sideeffects_failure1():
    l = []
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        sideeffects(_hf.FailIter(), l.append)

def test_sideeffects_failure2():
    l = []
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        sideeffects(_hf.FailIter(), l.append, 1)

def test_sideeffects_failure3():
    with pytest.raises(ValueError):
        list(sideeffects([T(1), T(2), T(3)], raise_error_when_below10))

def test_sideeffects_failure4():
    with pytest.raises(ValueError):
        list(sideeffects([T(11), T(12), T(3)], raise_error_when_below10))

def test_sideeffects_failure5():
    with pytest.raises(ValueError):
        list(sideeffects([T(11), T(12), T(3)], raise_error_when_below10, 2))

def test_sideeffects_failure6():
    with pytest.raises(ValueError):
        list(sideeffects([T(3), T(12), T(11)], raise_error_when_below10, 2))

def test_sideeffects_failure7():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        list(sideeffects(_hf.FailNext(), lambda x: x))

def test_sideeffects_failure8():
    # Too few arguments
    with pytest.raises(TypeError):
        sideeffects()

def test_sideeffects_copy1():
    _hf.iterator_copy(sideeffects(toT([1, 2, 3, 4]), return_None))

def test_sideeffects_failure_setstate1():
    # If times==0 then the second argument must be None
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None)
    with pytest.raises(TypeError):
        se.__setstate__((0, ()))

def test_sideeffects_failure_setstate2():
    # The first argument must be smaller than the length of the second
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None, 1)
    with pytest.raises(ValueError):
        se.__setstate__((1, (T(1),)))

def test_sideeffects_failure_setstate3():
    # The first argument must not be smaller than zero
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None, 1)
    with pytest.raises(ValueError):
        se.__setstate__((-1, (T(1),)))

def test_sideeffects_failure_setstate4():
    # The length of the second argument must be equal to the "times".
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None, 1)
    with pytest.raises(ValueError):
        se.__setstate__((1, (T(1), T(2))))

def test_sideeffects_failure_setstate5():
    # If the second argument is None then the times must be zero
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None, 1)
    with pytest.raises(TypeError):
        se.__setstate__((0, None))

def test_sideeffects_failure_setstate6():
    # If the second argument is None then the first argument must be zero
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None, 0)
    with pytest.raises(TypeError):
        se.__setstate__((1, None))

def test_sideeffects_failure_setstate7():
    # The second argument must be a tuple or None
    se = sideeffects([T(1), T(2), T(3), T(4)], return_None, 2)
    with pytest.raises(TypeError):
        se.__setstate__((1, [T(1), T(2)]))

def test_sideeffects_failure_setstate8():
    _hf.iterator_setstate_list_fail(
        sideeffects([T(1), T(2), T(3), T(4)], return_None, 2))

def test_sideeffects_failure_setstate9():
    _hf.iterator_setstate_empty_fail(
        sideeffects([T(1), T(2), T(3), T(4)], return_None, 2))

def test_sideeffects_pickle1(protocol):
    suc = sideeffects([T(1), T(2), T(3), T(4)], return_None)
    assert next(suc) == T(1)
    x =
pickle.dumps(suc, protocol=protocol) assert list(pickle.loads(x)) == [T(2), T(3), T(4)] def test_sideeffects_pickle2(protocol): suc = sideeffects([T(1), T(2), T(3), T(4)], return_None) x = pickle.dumps(suc, protocol=protocol) assert list(pickle.loads(x)) == [T(1), T(2), T(3), T(4)] def test_sideeffects_pickle3(protocol): suc = sideeffects([T(1), T(2), T(3), T(4)], return_None, 1) x = pickle.dumps(suc, protocol=protocol) assert list(pickle.loads(x)) == [T(1), T(2), T(3), T(4)] def test_sideeffects_pickle4(protocol): suc = sideeffects([T(1), T(2), T(3), T(4)], return_None, 1) assert next(suc) == T(1) x = pickle.dumps(suc, protocol=protocol) assert list(pickle.loads(x)) == [T(2), T(3), T(4)] def test_sideeffects_pickle5(protocol): suc = sideeffects([T(1), T(2), T(3), T(4)], return_None, 2) assert next(suc) == T(1) x = pickle.dumps(suc, protocol=protocol) assert list(pickle.loads(x)) == [T(2), T(3), T(4)] def test_sideeffects_pickle6(protocol): suc = sideeffects([T(1), T(2), T(3), T(4)], return_None, 2) assert next(suc) == T(1) assert next(suc) == T(2) x = pickle.dumps(suc, protocol=protocol) assert list(pickle.loads(x)) == [T(3), T(4)] def test_sideeffects_lengthhint1(): it = sideeffects([1, 2, 3, 4, 5, 6], return_None) _hf.check_lengthhint_iteration(it, 6) def test_sideeffects_failure_lengthhint1(): f_it = _hf.FailLengthHint(toT([1, 2, 3])) it = sideeffects(f_it, return_None) with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG): operator.length_hint(it) with pytest.raises(_hf.FailLengthHint.EXC_TYP, match=_hf.FailLengthHint.EXC_MSG): list(it) def test_sideeffects_failure_lengthhint2(): # This only checks for overflow if the length_hint is above PY_SSIZE_T_MAX of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1) it = sideeffects(of_it, return_None) with pytest.raises(OverflowError): operator.length_hint(it) with pytest.raises(OverflowError): list(it) 
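The batching behaviour exercised by the `sideeffects` tests above can be sketched in pure Python. This is a simplified, hypothetical stand-in for the C implementation (the name `sideeffects_py` is made up for illustration): with `times == 0` the function is called on each element, otherwise on tuples of `times` elements, with a shorter final tuple if the input does not divide evenly.

```python
def sideeffects_py(iterable, func, times=0):
    """Yield items unchanged while calling func purely for its side effect.

    With times == 0 func receives each item individually; with times > 0
    it receives tuples of `times` items (the last tuple may be shorter).
    """
    if times == 0:
        for item in iterable:
            func(item)
            yield item
    else:
        batch = []
        for item in iterable:
            batch.append(item)
            if len(batch) == times:
                func(tuple(batch))  # full batch collected
                batch = []
            yield item
        if batch:
            func(tuple(batch))  # leftover partial batch
```

This mirrors `test_sideeffects_normal5`: collecting three items via `l.append` with `times=2` records `[(1, 2), (3,)]` while the items themselves pass through unchanged.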
# File: iteration_utilities-0.12.1/tests/test_split.py

# Licensed under Apache License Version 2.0 - see LICENSE

from functools import partial
import operator
import pickle

import pytest

import iteration_utilities
from iteration_utilities import split

import helper_funcs as _hf
from helper_cls import T, toT


equalsthreeT = partial(operator.eq, T(3))


def test_split_empty1():
    assert list(split([], lambda x: False)) == []


def test_split_normal1():
    assert list(split([T(1), T(2), T(3)],
                      lambda x: x.value == 2)) == [[T(1)], [T(3)]]


def test_split_normal2():
    assert list(split([T(1), T(2), T(3)],
                      lambda x: x.value == 3)) == [toT([1, 2])]


def test_split_normal3():
    # using a generator
    assert list(split((i for i in [T(1), T(2), T(3)]),
                      lambda x: x.value == 2)) == [[T(1)], [T(3)]]


def test_split_keep1():
    assert list(split([T(1), T(2), T(3)],
                      lambda x: x.value == 2,
                      keep=True)) == [[T(1)], [T(2)], [T(3)]]


def test_split_keep2():
    assert list(split([T(1), T(2), T(3)],
                      lambda x: x.value == 3,
                      keep=True)) == [[T(1), T(2)], [T(3)]]


def test_split_keep_before1():
    assert list(split([T(1), T(2), T(3), T(4)],
                      lambda x: x.value == 3,
                      keep_before=True)) == [[T(1), T(2), T(3)], [T(4)]]


def test_split_keep_before2():
    assert list(split([T(1), T(2), T(3)],
                      lambda x: x.value == 3,
                      keep_before=True)) == [[T(1), T(2), T(3)]]


def test_split_keep_after1():
    assert list(split([T(1), T(2), T(3), T(4)],
                      lambda x: x.value == 3,
                      keep_after=True)) == [[T(1), T(2)], [T(3), T(4)]]


def test_split_keep_after2():
    assert list(split([T(1), T(2), T(3)],
                      lambda x: x.value == 3,
                      keep_after=True)) == [[T(1), T(2)], [T(3)]]


def test_split_maxsplit1():
    assert list(split([T(1), T(2), T(3), T(4), T(5)],
                      lambda x: x.value % 2 == 0,
                      maxsplit=1)) == [[T(1)], [T(3), T(4), T(5)]]


def test_split_maxsplit2():
    assert list(split([T(1), T(2), T(3), T(4), T(5)],
                      lambda x: x.value % 2 == 0,
                      maxsplit=2)) == [[T(1)], [T(3)], [T(5)]]


def test_split_eq1():
    assert list(split([T(1), T(2), T(3), T(2), T(5)],
                      T(2),
                      eq=True)) == [[T(1)], [T(3)], [T(5)]]


def test_split_attributes1():
    it = split([], iteration_utilities.return_False)
    assert it.key is iteration_utilities.return_False
    assert it.maxsplit == -1
    assert not it.keep
    assert not it.keep_before
    assert not it.keep_after
    assert not it.eq

    it = split([], iteration_utilities.return_False, keep=True)
    assert it.keep
    assert not it.keep_before
    assert not it.keep_after

    it = split([], iteration_utilities.return_False, keep_before=True)
    assert not it.keep
    assert it.keep_before
    assert not it.keep_after

    it = split([], iteration_utilities.return_False, keep_after=True)
    assert not it.keep
    assert not it.keep_before
    assert it.keep_after


def test_split_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        split(_hf.FailIter(), lambda x: False)


def test_split_failure2():
    # func fails
    with pytest.raises(TypeError):
        list(split([T(1), T(2), T(3)], lambda x: T(x.value + 'a')))


def test_split_failure3():
    # cmp fails
    with pytest.raises(TypeError):
        list(split([T(1), T(2), T(3)], T('a'), eq=True))


def test_split_failure4():
    # more than one keep* parameter
    with pytest.raises(ValueError) as exc:
        split(toT([1, 2, 3, 4]), T(2), eq=True, keep=True, keep_before=True)
    assert '`keep`, `keep_before`, `keep_after`' in str(exc.value)


def test_split_failure5():
    # more than one keep* parameter
    with pytest.raises(ValueError) as exc:
        split(toT([1, 2, 3, 4]), T(2), eq=True, keep=True, keep_after=True)
    assert '`keep`, `keep_before`, `keep_after`' in str(exc.value)


def test_split_failure6():
    # more than one keep* parameter
    with pytest.raises(ValueError) as exc:
        split(toT([1, 2, 3, 4]), T(2), eq=True,
              keep_before=True, keep_after=True)
    assert '`keep`, `keep_before`, `keep_after`' in str(exc.value)


def test_split_failure7():
    # more than one keep* parameter
    with pytest.raises(ValueError) as exc:
        split(toT([1, 2, 3, 4]), T(2), eq=True,
              keep=True, keep_before=True, keep_after=True)
    assert '`keep`, `keep_before`, `keep_after`' in str(exc.value)


def test_split_failure8():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(split(_hf.FailNext(), iteration_utilities.return_False))


def test_split_failure9():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(split(_hf.FailNext(offset=1), iteration_utilities.return_False))


def test_split_failure10():
    # Too few arguments
    with pytest.raises(TypeError):
        split()


def test_split_failure11():
    # maxsplit <= -2
    with pytest.raises(ValueError, match='`maxsplit`'):
        split(toT([1, 2, 3, 4]), T(2), eq=True, maxsplit=-2)


@_hf.skip_on_pypy_because_cache_next_works_differently
def test_split_failure12():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        list(split(_hf.CacheNext(1), lambda x: x == 10))


def test_split_copy1():
    _hf.iterator_copy(split(toT(range(1, 9)), equalsthreeT))


def test_split_failure_setstate1():
    _hf.iterator_setstate_list_fail(split(toT(range(1, 9)), equalsthreeT))


def test_split_failure_setstate2():
    _hf.iterator_setstate_empty_fail(split(toT(range(1, 9)), equalsthreeT))


def test_split_pickle1(protocol):
    l = [T(1), T(2), T(3), T(4), T(5), T(3), T(7), T(8)]
    spl = split(l, equalsthreeT)
    x = pickle.dumps(spl, protocol=protocol)
    assert list(pickle.loads(x)) == [[T(1), T(2)], [T(4), T(5)], [T(7), T(8)]]


def test_split_pickle2(protocol):
    l = [T(1), T(2), T(3), T(4), T(5), T(3), T(7), T(8)]
    spl = split(l, equalsthreeT)
    assert next(spl) == toT([1, 2])
    x = pickle.dumps(spl, protocol=protocol)
    assert list(pickle.loads(x)) == [toT([4, 5]), toT([7, 8])]


def test_split_pickle3(protocol):
    l = [T(1), T(2), T(3), T(4), T(5), T(3), T(7), T(8)]
    spl = split(l, equalsthreeT, keep=True)
    assert next(spl) == toT([1, 2])
    x = pickle.dumps(spl, protocol=protocol)
    assert list(pickle.loads(x)) == [toT(i) for i in [[3], [4, 5], [3], [7, 8]]]


def test_split_pickle4(protocol):
    l = [T(1), T(2), T(3), T(4), T(5), T(3), T(7), T(8)]
    spl = split(l, equalsthreeT, maxsplit=1)
    assert next(spl) == toT([1, 2])
    x = pickle.dumps(spl, protocol=protocol)
    assert list(pickle.loads(x)) == [toT([4, 5, 3, 7, 8])]


def test_split_pickle5(protocol):
    l = [T(1), T(2), T(3), T(4), T(5), T(3), T(7), T(8)]
    spl = split(l, T(3), eq=True)
    assert next(spl) == toT([1, 2])
    x = pickle.dumps(spl, protocol=protocol)
    assert list(pickle.loads(x)) == [toT([4, 5]), toT([7, 8])]


# File: iteration_utilities-0.12.1/tests/test_starfilter.py

# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle

import pytest

import iteration_utilities
from iteration_utilities import starfilter

import helper_funcs as _hf
from helper_cls import T


def test_starfilter_empty1():
    assert list(starfilter(operator.eq, [])) == []


def test_starfilter_normal1():
    inp = [(T(1), T(1)), (T(2), T(2))]
    assert list(starfilter(operator.eq, inp)) == [(T(1), T(1)), (T(2), T(2))]


def test_starfilter_normal2():
    # same test as above but with lists inside.
    inp = [[T(1), T(1)], [T(2), T(2)]]
    assert list(starfilter(operator.eq, inp)) == [[T(1), T(1)], [T(2), T(2)]]


def test_starfilter_normal3():
    inp = [(T(1), T(2)), (T(2), T(1))]
    assert list(starfilter(operator.eq, inp)) == []


def test_starfilter_normal4():
    # same test as above but with lists inside.
    inp = [[T(1), T(2)], [T(2), T(1)]]
    assert list(starfilter(operator.eq, inp)) == []


def test_starfilter_attributes1():
    it = starfilter(operator.eq, [])
    assert it.pred is operator.eq


def test_starfilter_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        starfilter(operator.eq, _hf.FailIter())


def test_starfilter_failure2():
    # item not convertable to tuple
    with pytest.raises(TypeError):
        next(starfilter(operator.eq, [T(1)]))


def test_starfilter_failure3():
    # not enough arguments for function call
    with pytest.raises(TypeError):
        next(starfilter(operator.eq, [(T(1), )]))


def test_starfilter_failure4():
    # too many arguments for function call
    with pytest.raises(TypeError):
        next(starfilter(operator.eq, [(T(1), T(1), T(1))]))


def test_starfilter_failure5():
    # too many arguments for function call
    with pytest.raises(TypeError):
        next(starfilter(operator.eq, [(T(1), T(1), T(1))]))


def test_starfilter_failure6():
    # function itself fails
    def failingfunc(a, b):
        raise ValueError('bad func')

    with pytest.raises(ValueError, match='bad func'):
        next(starfilter(failingfunc, [(T(1), T(1))]))


def test_starfilter_failure7():
    # result of function has no boolean interpretation
    with pytest.raises(_hf.FailBool.EXC_TYP, match=_hf.FailBool.EXC_MSG):
        next(starfilter(lambda x, y: _hf.FailBool(), [(T(1), T(1))]))


def test_starfilter_failure8():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(starfilter(operator.ne, _hf.FailNext()))


def test_starfilter_failure9():
    # Too few arguments
    with pytest.raises(TypeError):
        starfilter()


@_hf.skip_on_pypy_because_cache_next_works_differently
def test_starfilter_failure10():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        # won't work with return_True because then "iternext" is refreshed
        # before the failure comes.
        list(starfilter(iteration_utilities.return_False, _hf.CacheNext([1])))


def test_starfilter_copy1():
    _hf.iterator_copy(starfilter(operator.eq, [(T(1), T(1)), (T(2), T(2))]))


def test_starfilter_pickle1(protocol):
    sf = starfilter(operator.eq, [(T(1), T(1)), (T(2), T(2))])
    x = pickle.dumps(sf, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(1), T(1)), (T(2), T(2))]


def test_starfilter_pickle2(protocol):
    sf = starfilter(operator.eq, [(T(1), T(1)), (T(2), T(2))])
    assert next(sf) == (T(1), T(1))
    x = pickle.dumps(sf, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(2), T(2))]


# File: iteration_utilities-0.12.1/tests/test_successive.py

# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle
import sys

import pytest

from iteration_utilities import successive

import helper_funcs as _hf
from helper_cls import T, toT


def test_successive_empty1():
    assert list(successive([])) == []


def test_successive_empty2():
    assert list(successive([T(1)])) == []


def test_successive_empty3():
    assert list(successive([], times=10)) == []


def test_successive_empty4():
    assert list(successive([T(1), T(2), T(3)], times=10)) == []


def test_successive_normal1():
    assert (list(successive([T(1), T(2), T(3), T(4)])) ==
            [(T(1), T(2)), (T(2), T(3)), (T(3), T(4))])


def test_successive_normal2():
    assert (list(successive([T(1), T(2), T(3), T(4)], times=3)) ==
            [(T(1), T(2), T(3)), (T(2), T(3), T(4))])


def test_successive_normal3():
    assert (list(successive([T(1), T(2), T(3), T(4)], times=4)) ==
            [(T(1), T(2), T(3), T(4))])


def test_successive_normal4():
    assert (dict(successive([T(1), T(2), T(3), T(4)])) ==
            {T(1): T(2), T(2): T(3), T(3): T(4)})


def test_successive_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        successive(_hf.FailIter())


def test_successive_failure2():
    with pytest.raises(ValueError):
        # times must be > 0
        successive([T(1),
                    T(2), T(3)], 0)


def test_successive_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(successive(_hf.FailNext(), 1))


def test_successive_failure4():
    # Too few arguments
    with pytest.raises(TypeError):
        successive()


@_hf.skip_on_pypy_because_cache_next_works_differently
def test_successive_failure5():
    # Changing next method
    with pytest.raises(_hf.CacheNext.EXC_TYP, match=_hf.CacheNext.EXC_MSG):
        list(successive(_hf.CacheNext(1), 3))


def test_successive_copy1():
    _hf.iterator_copy(successive(toT([1, 2, 3, 4])))


def test_successive_failure_setstate1():
    # first argument must be a tuple
    suc = successive([T(1), T(2), T(3), T(4)], 2)
    with pytest.raises(TypeError):
        suc.__setstate__(([T(1), T(2)], ))


def test_successive_failure_setstate2():
    # length of first argument not equal to times
    suc = successive([T(1), T(2), T(3), T(4)], 2)
    with pytest.raises(ValueError):
        suc.__setstate__(((T(1), ), ))


def test_successive_failure_setstate3():
    # length of first argument not equal to times
    suc = successive([T(1), T(2), T(3), T(4)], 2)
    with pytest.raises(ValueError):
        suc.__setstate__(((T(1), T(2), T(3)), ))


def test_successive_failure_setstate4():
    _hf.iterator_setstate_list_fail(successive([T(1), T(2), T(3), T(4)], 2))


def test_successive_failure_setstate5():
    _hf.iterator_setstate_empty_fail(successive([T(1), T(2), T(3), T(4)], 2))


def test_successive_pickle1(protocol):
    suc = successive([T(1), T(2), T(3), T(4)])
    assert next(suc) == (T(1), T(2))
    x = pickle.dumps(suc, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(2), T(3)), (T(3), T(4))]


def test_successive_pickle2(protocol):
    suc = successive([T(1), T(2), T(3), T(4)])
    x = pickle.dumps(suc, protocol=protocol)
    assert list(pickle.loads(x)) == [(T(1), T(2)), (T(2), T(3)), (T(3), T(4))]


def test_successive_lengthhint1():
    it = successive([0]*6, 4)
    _hf.check_lengthhint_iteration(it, 3)


def test_successive_lengthhint2():
    assert operator.length_hint(successive([0]*6, 11)) == 0


def test_successive_failure_lengthhint1():
    f_it = _hf.FailLengthHint(toT([1, 2, 3]))
    it = successive(f_it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP,
                       match=_hf.FailLengthHint.EXC_MSG):
        operator.length_hint(it)
    with pytest.raises(_hf.FailLengthHint.EXC_TYP,
                       match=_hf.FailLengthHint.EXC_MSG):
        list(it)


def test_successive_failure_lengthhint2():
    # This only checks for overflow if the length_hint is above PY_SSIZE_T_MAX.
    # In theory that would be possible because with times the length would be
    # shorter but "length_hint" throws the exception so we propagate it.
    of_it = _hf.OverflowLengthHint(toT([1, 2, 3]), sys.maxsize + 1)
    it = successive(of_it)
    with pytest.raises(OverflowError):
        operator.length_hint(it)
    with pytest.raises(OverflowError):
        list(it)


# File: iteration_utilities-0.12.1/tests/test_tabulate.py

# Licensed under Apache License Version 2.0 - see LICENSE

import pickle

import pytest

import iteration_utilities
from iteration_utilities import tabulate, getitem

import helper_funcs as _hf
from helper_cls import T


class T2(T):
    def __add__(self, other):
        if isinstance(other, T):
            return self.__class__(self.value + other.value)
        return self.__class__(self.value + other)

    def __mul__(self, other):
        if isinstance(other, T):
            return self.__class__(self.value * other.value)
        return self.__class__(self.value * other)


def test_tabulate_normal1():
    assert list(getitem(tabulate(lambda x: x, T2(0)),
                        stop=5)) == [T2(0), T2(1), T2(2), T2(3), T2(4)]


def test_tabulate_normal2():
    assert list(getitem(tabulate(T), stop=5)) == [T(0), T(1), T(2), T(3), T(4)]


def test_tabulate_attributes1():
    it = tabulate(T)
    assert it.func is T
    assert it.current == 0

    next(it)
    assert it.current == 1


def test_tabulate_failure1():
    class T:
        def __init__(self, val):
            self.val = val

        def __truediv__(self, other):
            return self.__class__(self.val / other.val)

    # Function call fails
    with pytest.raises(ZeroDivisionError):
        next(tabulate(lambda x: T(1)/x, T(0)))


def test_tabulate_failure2():
    # incrementing with one fails
    with pytest.raises(TypeError):
        next(tabulate(iteration_utilities.return_identity, T(0.5)))


def test_tabulate_failure3():
    tab = tabulate(iteration_utilities.return_identity, T(0))
    # Fail once while incrementing, this will set cnt to NULL
    with pytest.raises(TypeError):
        next(tab)
    with pytest.raises(StopIteration):
        next(tab)


def test_tabulate_failure4():
    # Too few arguments
    with pytest.raises(TypeError):
        tabulate()


def test_tabulate_copy1():
    _hf.iterator_copy(tabulate(T))


def test_tabulate_pickle1(protocol):
    rr = tabulate(T)
    assert next(rr) == T(0)
    x = pickle.dumps(rr, protocol=protocol)
    assert next(pickle.loads(x)) == T(1)


def test_tabulate_pickle2(protocol):
    rr = tabulate(T, 2)
    assert next(rr) == T(2)
    x = pickle.dumps(rr, protocol=protocol)
    assert next(pickle.loads(x)) == T(3)


def test_tabulate_pickle3(protocol):
    rr = tabulate(T)
    x = pickle.dumps(rr, protocol=protocol)
    assert next(pickle.loads(x)) == T(0)


# File: iteration_utilities-0.12.1/tests/test_unique_everseen.py

# Licensed under Apache License Version 2.0 - see LICENSE

import operator
import pickle

import pytest

import iteration_utilities
from iteration_utilities import unique_everseen, Seen

import helper_funcs as _hf
from helper_cls import T, toT


def test_uniqueeverseen_empty1():
    assert list(unique_everseen([])) == []


def test_uniqueeverseen_normal1():
    assert list(unique_everseen([T(1), T(2), T(1)])) == [T(1), T(2)]


def test_uniqueeverseen_normal2():
    # key=None is identical to no key
    assert list(unique_everseen([T(1), T(2), T(1)], None)) == [T(1), T(2)]


def test_uniqueeverseen_key1():
    assert list(unique_everseen([T(1), T(2), T(1)], abs)) == [T(1), T(2)]


def test_uniqueeverseen_key2():
    assert list(unique_everseen([T(1), T(1), T(-1)], abs)) == [T(1)]


def test_uniqueeverseen_unhashable1():
    assert list(unique_everseen([{T(1): T(1)}, {T(2): T(2)},
                                 {T(1): T(1)}])) == [{T(1): T(1)},
                                                     {T(2): T(2)}]


def test_uniqueeverseen_unhashable2():
    assert list(unique_everseen([[T(1)], [T(2)], [T(1)]])) == [[T(1)], [T(2)]]


def test_uniqueeverseen_unhashable3():
    assert list(unique_everseen([[T(1), T(1)], [T(1), T(2)], [T(1), T(3)]],
                                operator.itemgetter(0))) == [[T(1), T(1)]]


def test_uniqueeverseen_getter1():
    t = unique_everseen([T(1), T([0, 0]), T(3)])
    assert not t.seen
    assert t.key is None
    assert next(t) == T(1)
    assert t.seen == Seen({T(1)})
    assert t.key is None
    assert next(t) == T([0, 0])
    assert T(1) in t.seen
    assert T([0, 0]) in t.seen
    assert t.key is None
    assert next(t) == T(3)
    assert t.seen == Seen({T(1), T(3)}, [T([0, 0])])
    assert t.key is None


def test_uniqueeverseen_getter2():
    t = unique_everseen([T(1), T([0, 0]), T(3)],
                        iteration_utilities.return_identity)
    assert not t.seen
    assert t.key is iteration_utilities.return_identity
    assert next(t) == T(1)
    assert t.seen == Seen({T(1)})
    assert t.key is iteration_utilities.return_identity
    assert next(t) == T([0, 0])
    assert T(1) in t.seen
    assert T([0, 0]) in t.seen
    assert t.key is iteration_utilities.return_identity
    assert next(t) == T(3)
    assert t.seen == Seen({T(1), T(3)}, [T([0, 0])])
    assert t.key is iteration_utilities.return_identity


def test_uniqueeverseen_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        unique_everseen(_hf.FailIter())


def test_uniqueeverseen_failure2():
    with pytest.raises(TypeError):
        list(unique_everseen([T(1), T(2), T(3), T('a')], abs))


def test_uniqueeverseen_failure3():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(unique_everseen(_hf.FailNext()))


def test_uniqueeverseen_failure4():
    # Too few arguments
    with pytest.raises(TypeError):
        unique_everseen()


def test_uniqueeverseen_failure5():
    # Failure when comparing the object to the objects in the list
    with pytest.raises(_hf.FailEqNoHash.EXC_TYP,
                       match=_hf.FailEqNoHash.EXC_MSG):
        list(unique_everseen([[T(1)], _hf.FailEqNoHash()]))


def test_uniqueeverseen_failure6():
    # Failure (no TypeError) when trying to hash the value
    with pytest.raises(_hf.FailHash.EXC_TYP, match=_hf.FailHash.EXC_MSG):
        list(unique_everseen([T(1), _hf.FailHash()]))


def test_uniqueeverseen_copy1():
    _hf.iterator_copy(unique_everseen(toT([1, 2, 1, 2])))


def test_uniqueeverseen_failure_setstate1():
    # __setstate__ only accepts Seen instances
    dp = unique_everseen(toT([1, 1]))
    with pytest.raises(TypeError):
        dp.__setstate__((set(toT(range(1, 3))),))


def test_uniqueeverseen_failure_setstate2():
    _hf.iterator_setstate_list_fail(unique_everseen(toT([1, 1])))


def test_uniqueeverseen_failure_setstate3():
    _hf.iterator_setstate_empty_fail(unique_everseen(toT([1, 1])))


def test_uniqueeverseen_pickle1(protocol):
    uqe = unique_everseen([T(1), T(2), T(1), T(2)])
    assert next(uqe) == T(1)
    x = pickle.dumps(uqe, protocol=protocol)
    assert list(pickle.loads(x)) == [T(2)]


# File: iteration_utilities-0.12.1/tests/test_unique_justseen.py

# Licensed under Apache License Version 2.0 - see LICENSE

import pickle

import pytest

from iteration_utilities import unique_justseen

import helper_funcs as _hf
from helper_cls import T, toT


class T2:
    def __init__(self, value):
        self.value = value

    def __eq__(self, other):
        raise TypeError()

    def __ne__(self, other):
        raise TypeError()


def test_unique_justseen_empty1():
    assert list(unique_justseen([])) == []


def test_unique_justseen_normal1():
    assert list(unique_justseen(toT([1, 1, 2, 2, 3, 3]))) == toT([1, 2, 3])


def test_unique_justseen_normal2():
    assert list(unique_justseen('aAabBb')) == ['a', 'A', 'a', 'b', 'B', 'b']


def test_unique_justseen_normal3():
    assert list(unique_justseen('aAabBb', key=str.lower)) == ['a', 'b']


def test_unique_justseen_normal4():
    # key=None is identical to no key
    assert list(unique_justseen(toT([1, 1, 2, 2, 3, 3]),
                                None)) == toT([1, 2, 3])


def test_unique_justseen_attributes1():
    it = unique_justseen(toT([1, 1, 2, 2, 3, 3]), None)
    assert it.key is None
    with pytest.raises(AttributeError):
        it.lastseen

    next(it)
    assert it.lastseen == T(1)


def test_unique_justseen_failure1():
    with pytest.raises(_hf.FailIter.EXC_TYP, match=_hf.FailIter.EXC_MSG):
        unique_justseen(_hf.FailIter())


def test_unique_justseen_failure2():
    with pytest.raises(TypeError):
        # function call fails
        list(unique_justseen([T(1), T(2), T(3)], key=lambda x: x + 'a'))


def test_unique_justseen_failure3():
    # objects do not support eq or ne
    with pytest.raises(TypeError):
        list(unique_justseen([T2(1), T2(2)]))


def test_unique_justseen_failure4():
    # Test that a failing iterator doesn't raise a SystemError
    with pytest.raises(_hf.FailNext.EXC_TYP, match=_hf.FailNext.EXC_MSG):
        next(unique_justseen(_hf.FailNext()))


def test_unique_justseen_failure5():
    # Too few arguments
    with pytest.raises(TypeError):
        unique_justseen()


def test_unique_justseen_copy1():
    _hf.iterator_copy(unique_justseen([T(1), T(2), T(3)]))


def test_unique_justseen_failure_setstate1():
    _hf.iterator_setstate_list_fail(unique_justseen(toT([1, 2, 3])))


def test_unique_justseen_failure_setstate2():
    _hf.iterator_setstate_empty_fail(unique_justseen(toT([1, 2, 3])))


def test_unique_justseen_pickle1(protocol):
    ujs = unique_justseen([T(1), T(2), T(3)])
    x = pickle.dumps(ujs, protocol=protocol)
    assert list(pickle.loads(x)) == toT([1, 2, 3])


def test_unique_justseen_pickle2(protocol):
    ujs = unique_justseen([T(1), T(2), T(3)])
    assert next(ujs) == T(1)
    x = pickle.dumps(ujs, protocol=protocol)
    assert list(pickle.loads(x)) == toT([2, 3])


def test_unique_justseen_pickle3(protocol):
    ujs = unique_justseen(['a', 'A', 'a'], key=str.lower)
    x = pickle.dumps(ujs, protocol=protocol)
    assert list(pickle.loads(x)) == ['a']


def test_unique_justseen_pickle4(protocol):
    ujs = unique_justseen(['a', 'A', 'a'], key=str.lower)
    assert \
        next(ujs) == 'a'
    x = pickle.dumps(ujs, protocol=protocol)
    assert list(pickle.loads(x)) == []
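The two-tier tracking covered by the `unique_everseen` and `Seen` tests above (hashable keys in a set, unhashable ones in a fallback list) can be sketched in pure Python. This is a simplified stand-in, not the C implementation; the name `unique_everseen_py` is invented for illustration:

```python
def unique_everseen_py(iterable, key=None):
    """Yield elements never seen before, tolerating unhashable keys."""
    seen_set = set()   # fast O(1) membership for hashable keys
    seen_list = []     # linear-scan fallback for unhashable keys
    for item in iterable:
        k = item if key is None else key(item)
        try:
            if k in seen_set:
                continue
            seen_set.add(k)
        except TypeError:  # key is unhashable, e.g. a list or dict
            if k in seen_list:
                continue
            seen_list.append(k)
        yield item
```

This mirrors, for example, `test_uniqueeverseen_unhashable2`: `[[1], [2], [1]]` yields `[[1], [2]]` because the lists end up in the fallback list rather than the set.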