File mysql-to-sqlite3-2.1.6.obscpio of Package python-mysql-to-sqlite3
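The dump below interleaves each archive member's cpio "newc" header (the `070701…` hex runs) with that member's file contents. As a minimal sketch of how such a header decodes — assuming the standard newc layout of a `070701` magic followed by thirteen 8-digit hex fields and then the member path (the helper name `decode_newc` is my own, not part of the package):

```python
# Decode a cpio "newc" member header like the ones prefixing each file below.
# Assumed layout: "070701" magic + 13 eight-hex-digit fields + the path name.
NEWC_FIELDS = [
    "ino", "mode", "uid", "gid", "nlink", "mtime", "filesize",
    "devmajor", "devminor", "rdevmajor", "rdevminor", "namesize", "check",
]

def decode_newc(header: str) -> dict:
    if not header.startswith("070701"):
        raise ValueError("not a newc header")
    fields = {}
    pos = 6  # skip the magic
    for name in NEWC_FIELDS:
        fields[name] = int(header[pos:pos + 8], 16)
        pos += 8
    # namesize counts the trailing NUL byte, so drop one character
    fields["name"] = header[pos:pos + fields["namesize"] - 1]
    return fields

# First header of this dump: a regular file (mode 0o100644), 0x2011 bytes long.
hdr = decode_newc(
    "07070100000000000081A40000000000000000000000016575A15F00002011"
    "000000000000000000000000000000000000002300000000"
    "mysql-to-sqlite3-2.1.6/.bandit.yml"
)
print(oct(hdr["mode"]), hdr["filesize"], hdr["name"])
# → 0o100644 8209 mysql-to-sqlite3-2.1.6/.bandit.yml
```

Directory members (mode `0x41ED`, i.e. `drwxr-xr-x` with a zero filesize) carry no content, which is why the `.github` entries below contribute only a path.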
==> mysql-to-sqlite3-2.1.6/.bandit.yml <==

### This config may optionally select a subset of tests to run or skip by
### filling out the 'tests' and 'skips' lists given below. If no tests are
### specified for inclusion then it is assumed all tests are desired. The skips
### set will remove specific tests from the include set. This can be controlled
### using the -t/-s CLI options. Note that the same test ID should not appear
### in both 'tests' and 'skips', this would be nonsensical and is detected by
### Bandit at runtime.

# Available tests:
# B101 : assert_used
# B102 : exec_used
# B103 : set_bad_file_permissions
# B104 : hardcoded_bind_all_interfaces
# B105 : hardcoded_password_string
# B106 : hardcoded_password_funcarg
# B107 : hardcoded_password_default
# B108 : hardcoded_tmp_directory
# B110 : try_except_pass
# B112 : try_except_continue
# B201 : flask_debug_true
# B301 : pickle
# B302 : marshal
# B303 : md5
# B304 : ciphers
# B305 : cipher_modes
# B306 : mktemp_q
# B307 : eval
# B308 : mark_safe
# B309 : httpsconnection
# B310 : urllib_urlopen
# B311 : random
# B312 : telnetlib
# B313 : xml_bad_cElementTree
# B314 : xml_bad_ElementTree
# B315 : xml_bad_expatreader
# B316 : xml_bad_expatbuilder
# B317 : xml_bad_sax
# B318 : xml_bad_minidom
# B319 : xml_bad_pulldom
# B320 : xml_bad_etree
# B321 : ftplib
# B322 : input
# B323 : unverified_context
# B324 : hashlib_new_insecure_functions
# B325 : tempnam
# B401 : import_telnetlib
# B402 : import_ftplib
# B403 : import_pickle
# B404 : import_subprocess
# B405 : import_xml_etree
# B406 : import_xml_sax
# B407 : import_xml_expat
# B408 : import_xml_minidom
# B409 : import_xml_pulldom
# B410 : import_lxml
# B411 : import_xmlrpclib
# B412 : import_httpoxy
# B413 : import_pycrypto
# B501 : request_with_no_cert_validation
# B502 : ssl_with_bad_version
# B503 : ssl_with_bad_defaults
# B504 : ssl_with_no_version
# B505 : weak_cryptographic_key
# B506 : yaml_load
# B507 : ssh_no_host_key_verification
# B601 : paramiko_calls
# B602 : subprocess_popen_with_shell_equals_true
# B603 : subprocess_without_shell_equals_true
# B604 : any_other_function_with_shell_equals_true
# B605 : start_process_with_a_shell
# B606 : start_process_with_no_shell
# B607 : start_process_with_partial_path
# B608 : hardcoded_sql_expressions
# B609 : linux_commands_wildcard_injection
# B610 : django_extra_used
# B611 : django_rawsql_used
# B701 : jinja2_autoescape_false
# B702 : use_of_mako_templates
# B703 : django_mark_safe

# (optional) list included test IDs here, eg '[B101, B406]':
tests:

# (optional) list skipped test IDs here, eg '[B101, B406]':
skips:
  - B404
  - B603
  - B607
  - B608

### (optional) plugin settings - some test plugins require configuration data
### that may be given here, per-plugin. All bandit test plugins have a built in
### set of sensible defaults and these will be used if no configuration is
### provided. It is not necessary to provide settings for every (or any) plugin
### if the defaults are acceptable.

any_other_function_with_shell_equals_true:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
hardcoded_tmp_directory:
  tmp_dirs: [/tmp, /var/tmp, /dev/shm]
linux_commands_wildcard_injection:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
ssl_with_bad_defaults:
  bad_protocol_versions: [PROTOCOL_SSLv2, SSLv2_METHOD, SSLv23_METHOD, PROTOCOL_SSLv3, PROTOCOL_TLSv1, SSLv3_METHOD, TLSv1_METHOD]
ssl_with_bad_version:
  bad_protocol_versions: [PROTOCOL_SSLv2, SSLv2_METHOD, SSLv23_METHOD, PROTOCOL_SSLv3, PROTOCOL_TLSv1, SSLv3_METHOD, TLSv1_METHOD]
start_process_with_a_shell:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
start_process_with_no_shell:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
start_process_with_partial_path:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
subprocess_popen_with_shell_equals_true:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
subprocess_without_shell_equals_true:
  no_shell: [os.execl, os.execle, os.execlp, os.execlpe, os.execv, os.execve, os.execvp, os.execvpe, os.spawnl, os.spawnle, os.spawnlp, os.spawnlpe, os.spawnv, os.spawnve, os.spawnvp, os.spawnvpe, os.startfile]
  shell: [os.system, os.popen, os.popen2, os.popen3, os.popen4, popen2.popen2, popen2.popen3, popen2.popen4, popen2.Popen3, popen2.Popen4, commands.getoutput, commands.getstatusoutput]
  subprocess: [subprocess.Popen, subprocess.call, subprocess.check_call, subprocess.check_output, subprocess.run]
try_except_continue:
  check_typed_exception: false
try_except_pass:
  check_typed_exception: false
weak_cryptographic_key:
  weak_key_size_dsa_high: 1024
  weak_key_size_dsa_medium: 2048
  weak_key_size_ec_high: 160
  weak_key_size_ec_medium: 224
  weak_key_size_rsa_high: 1024
  weak_key_size_rsa_medium: 2048

==> mysql-to-sqlite3-2.1.6/.flake8 <==

[flake8]
ignore = I100,I201,I202,D203,D401,W503,E203,F401,F403,C901,E501
exclude = tests
max-line-length = 88

==> mysql-to-sqlite3-2.1.6/.github/ <== (directory)

==> mysql-to-sqlite3-2.1.6/.github/FUNDING.yml <==

github: techouse
custom: [ "https://paypal.me/ktusar" ]

==> mysql-to-sqlite3-2.1.6/.github/ISSUE_TEMPLATE/ <== (directory)

==> mysql-to-sqlite3-2.1.6/.github/ISSUE_TEMPLATE/bug_report.md <==

---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
assignees: techouse
---

**Describe the bug**
A clear and concise description of what the bug is.

**Expected behaviour**
What you expected.

**Actual result**
What happened instead.
**System Information**
```bash
$ mysql2sqlite --version
```
```
<paste here>
```
This command is only available on v1.3.2 and greater. Otherwise, please provide some basic information about your system (Python version, operating system, etc.).

**Additional context**
Add any other context about the problem here. In case of errors please run the same command with `--debug`. This option is only available on v1.4.12 or greater.

==> mysql-to-sqlite3-2.1.6/.github/ISSUE_TEMPLATE/feature_request.md <==

---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: techouse
---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.

==> mysql-to-sqlite3-2.1.6/.github/dependabot.yml <==

# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates

version: 2
updates:
  - package-ecosystem: "pip" # See documentation for possible values
    directory: "/" # Location of package manifests
    schedule:
      interval: "weekly"

==> mysql-to-sqlite3-2.1.6/.github/workflows/ <== (directory)

==> mysql-to-sqlite3-2.1.6/.github/workflows/codeql-analysis.yml <==

name: "CodeQL"

on:
  push:
    branches: [ master ]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [ master ]
  schedule:
    - cron: '0 12 * * 6'

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        language: [ 'python' ]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v2
        with:
          languages: ${{ matrix.language }}
      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v2

==> mysql-to-sqlite3-2.1.6/.github/workflows/docker.yml <==

name: Publish Docker image

on:
  workflow_call:

defaults:
  run:
    shell: bash

jobs:
  push_to_registry:
    name: Push Docker image to Docker Hub
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    environment:
      name: docker
      url: https://hub.docker.com/r/${{ vars.DOCKERHUB_REPOSITORY }}
    steps:
      - name: Check out the repo
        uses: actions/checkout@v4
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: |
            ${{ vars.DOCKERHUB_REPOSITORY }}
            ghcr.io/${{ github.repository }}
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}.{{patch}}
      - name: Log in to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

==> mysql-to-sqlite3-2.1.6/.github/workflows/publish.yml <==

name: Publish package

on:
  push:
    tags:
      - 'v[0-9]+.[0-9]+.[0-9]+*'

defaults:
  run:
    shell: bash

permissions: read-all

jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    environment:
      name: pypi
      url: https://pypi.org/p/mysql-to-sqlite3
    permissions:
      id-token: write
      contents: write
    steps:
      - uses: actions/checkout@v4
      - name: Compare package version with ref/tag
        id: compare
        run: |
          set -e
          VERSION=$(awk -F'"' '/__version__/ {print $2}' mysql_to_sqlite3/__init__.py)
          TAG=${GITHUB_REF_NAME#v}
          if [[ "$VERSION" != "$TAG" ]]; then
            echo "Version in mysql_to_sqlite3/__init__.py ($VERSION) does not match tag ($TAG)"
            exit 1
          fi
          echo "VERSION=$VERSION" >> $GITHUB_ENV
      - name: Set up Python
        id: setup_python
        uses: actions/setup-python@v4
        with:
          python-version: "3.x"
      - name: Install build dependencies
        id: install_build_dependencies
        run: |
          set -e
          python3 -m pip install --upgrade pip
          pip install build setuptools wheel
      - name: Build a binary wheel and a source tarball
        id: build
        run: |
          set -e
          python3 -m build --sdist --wheel --outdir dist/ .
      - name: Publish distribution package to Test PyPI
        id: publish_test
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/
      - name: Publish distribution package to PyPI
        id: publish
        if: startsWith(github.ref, 'refs/tags')
        uses: pypa/gh-action-pypi-publish@release/v1
      - name: Install pyproject-parser
        id: install_pyproject_parser
        run: |
          set -e
          pip install pyproject-parser[cli]
      - name: Read project name from pyproject.toml
        id: read_project_name
        run: |
          set -e
          NAME=$(pyproject-parser info project.name -r | tr -d '"')
          echo "NAME=$NAME" >> $GITHUB_ENV
      - name: Create tag-specific CHANGELOG
        id: create_changelog
        run: |
          set -e
          CHANGELOG_PATH=$RUNNER_TEMP/CHANGELOG.md
          awk '/^#[[:space:]].*/ { if (count == 1) exit; count++; print } count == 1 && !/^#[[:space:]].*/ { print }' CHANGELOG.md | sed -e :a -e '/^\n*$/{$d;N;ba' -e '}' > $CHANGELOG_PATH
          echo -en "\n[https://pypi.org/project/$NAME/$VERSION/](https://pypi.org/project/$NAME/$VERSION/)" >> $CHANGELOG_PATH
          echo "CHANGELOG_PATH=$CHANGELOG_PATH" >> $GITHUB_ENV
      - name: Github Release
        id: github_release
        uses: softprops/action-gh-release@v1
        with:
          name: ${{ env.VERSION }}
          tag_name: ${{ github.ref }}
          body_path: ${{ env.CHANGELOG_PATH }}
          files: |
            dist/*.whl
            dist/*.tar.gz
      - name: Cleanup
        if: ${{ always() }}
        run: |
          rm -rf dist
          rm -rf $CHANGELOG_PATH
  docker:
    needs: build-and-publish
    permissions:
      packages: write
      contents: read
    uses: ./.github/workflows/docker.yml
    secrets: inherit

==> mysql-to-sqlite3-2.1.6/.github/workflows/test.yml <==

name: Test

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master

defaults:
  run:
    shell: bash

permissions: read-all

jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.x"
      - name: Install dependencies
        run: |
          python3 -m pip install --upgrade pip
          pip install -r requirements_dev.txt
      - name: Run static analysis
        run: tox -e linters
  test:
    needs: analyze
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - { toxenv: "python3.8", db: "mariadb:5.5", legacy_db: 1, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:5.5", legacy_db: 1, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:5.5", legacy_db: 1, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:5.5", legacy_db: 1, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:5.5", legacy_db: 1, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.0", legacy_db: 1, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.0", legacy_db: 1, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.0", legacy_db: 1, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.0", legacy_db: 1, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.0", legacy_db: 1, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.1", legacy_db: 1, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.1", legacy_db: 1, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.1", legacy_db: 1, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.1", legacy_db: 1, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.1", legacy_db: 1, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.2", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.2", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.2", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.2", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.2", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.3", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.3", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.3", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.3", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.3", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.4", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.4", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.4", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.4", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.4", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.5", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.5", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.5", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.5", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.5", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.6", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.6", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.6", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.6", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.6", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mariadb:10.11", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mariadb:10.11", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mariadb:10.11", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mariadb:10.11", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mariadb:10.11", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mysql:5.5", legacy_db: 1, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mysql:5.5", legacy_db: 1, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mysql:5.5", legacy_db: 1, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mysql:5.5", legacy_db: 1, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mysql:5.5", legacy_db: 1, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mysql:5.6", legacy_db: 1, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mysql:5.6", legacy_db: 1, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mysql:5.6", legacy_db: 1, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mysql:5.6", legacy_db: 1, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mysql:5.6", legacy_db: 1, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mysql:5.7", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mysql:5.7", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mysql:5.7", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mysql:5.7", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mysql:5.7", legacy_db: 0, experimental: false, py: "3.12" }
          - { toxenv: "python3.8", db: "mysql:8.0", legacy_db: 0, experimental: false, py: "3.8" }
          - { toxenv: "python3.9", db: "mysql:8.0", legacy_db: 0, experimental: false, py: "3.9" }
          - { toxenv: "python3.10", db: "mysql:8.0", legacy_db: 0, experimental: false, py: "3.10" }
          - { toxenv: "python3.11", db: "mysql:8.0", legacy_db: 0, experimental: false, py: "3.11" }
          - { toxenv: "python3.12", db: "mysql:8.0", legacy_db: 0, experimental: false, py: "3.12" }
    continue-on-error: ${{ matrix.experimental }}
    services:
      mysql:
        image: "${{ matrix.db }}"
        ports:
          - 3306:3306
        env:
          MYSQL_ALLOW_EMPTY_PASSWORD: yes
        options: "--name=mysqld"
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.py }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.py }}
      - uses: actions/cache@v3
        with:
          path: ~/.cache/pip
          key: ${{ runner.os }}-pip-1
          restore-keys: |
            ${{ runner.os }}-pip-
      - name: Install dependencies
        run: |
          set -e
          python -m pip install --upgrade pip
          python -m pip install -U codecov tox-gh-actions
          pip install -r requirements_dev.txt
      - name: Set up MySQL
        env:
          DB: ${{ matrix.db }}
          MYSQL_USER: ${{ secrets.MYSQL_USER }}
          MYSQL_PASSWORD: ${{ secrets.MYSQL_PASSWORD }}
          MYSQL_DATABASE: ${{ vars.MYSQL_DATABASE }}
          MYSQL_HOST: ${{ vars.MYSQL_HOST }}
          MYSQL_PORT: ${{ vars.MYSQL_PORT }}
        run: |
          set -e
          while :
          do
            sleep 1
            mysql -h127.0.0.1 -uroot -e 'select version()' && break
          done
          if [ "$DB" == 'mysql:8.0' ]; then
            WITH_PLUGIN='with mysql_native_password'
            mysql -h127.0.0.1 -uroot -e "SET GLOBAL local_infile=on"
            docker cp mysqld:/var/lib/mysql/public_key.pem "${HOME}"
            docker cp mysqld:/var/lib/mysql/ca.pem "${HOME}"
            docker cp mysqld:/var/lib/mysql/server-cert.pem "${HOME}"
            docker cp mysqld:/var/lib/mysql/client-key.pem "${HOME}"
            docker cp mysqld:/var/lib/mysql/client-cert.pem "${HOME}"
            mysql -uroot -h127.0.0.1 -e '
              CREATE USER user_sha256 IDENTIFIED WITH "sha256_password" BY "pass_sha256",
                          nopass_sha256 IDENTIFIED WITH "sha256_password",
                          user_caching_sha2 IDENTIFIED WITH "caching_sha2_password" BY "pass_caching_sha2",
                          nopass_caching_sha2 IDENTIFIED WITH "caching_sha2_password" PASSWORD EXPIRE NEVER;'
            mysql -uroot -h127.0.0.1 -e 'GRANT RELOAD ON *.* TO user_caching_sha2;'
          else
            WITH_PLUGIN=''
          fi
          mysql -h127.0.0.1 -uroot -e "create database $MYSQL_DATABASE DEFAULT CHARACTER SET utf8mb4"
          mysql -h127.0.0.1 -uroot -e "create user $MYSQL_USER identified $WITH_PLUGIN by '${MYSQL_PASSWORD}'; grant all on ${MYSQL_DATABASE}.* to ${MYSQL_USER};"
          mysql -h127.0.0.1 -uroot -e "create user ${MYSQL_USER}@localhost identified $WITH_PLUGIN by '${MYSQL_PASSWORD}'; grant all on ${MYSQL_DATABASE}.* to ${MYSQL_USER}@localhost;"
      - name: Create db_credentials.json
        env:
          MYSQL_USER: ${{ secrets.MYSQL_USER }}
          MYSQL_PASSWORD: ${{ secrets.MYSQL_PASSWORD }}
          MYSQL_DATABASE: ${{ vars.MYSQL_DATABASE }}
          MYSQL_HOST: ${{ vars.MYSQL_HOST }}
          MYSQL_PORT: ${{ vars.MYSQL_PORT }}
        run: |
          set -e
          jq -n \
            --arg mysql_user "$MYSQL_USER" \
            --arg mysql_password "$MYSQL_PASSWORD" \
            --arg mysql_database "$MYSQL_DATABASE" \
            --arg mysql_host "$MYSQL_HOST" \
            --arg mysql_port $MYSQL_PORT \
            '$ARGS.named' > tests/db_credentials.json
      - name: Test with tox
        env:
          LEGACY_DB: ${{ matrix.legacy_db }}
        run: tox
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v3
        with:
          files: ./coverage.xml
          env_vars: OS,PYTHON
          verbose: true
      - name: Cleanup
        if: ${{ always() }}
        run: |
          rm -rf tests/db_credentials.json

==> mysql-to-sqlite3-2.1.6/.gitignore <==

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# IDE specific
.idea

# macOS specific
.DS_Store

# Potential leftovers
tests/db_credentials.json
log.txt

==> mysql-to-sqlite3-2.1.6/CHANGELOG.md <==

# 2.1.6

* [FEAT] build both linux/amd64 and linux/arm64 Docker images

# 2.1.5

* [CHORE] fix Docker package publishing from Github Workflow

# 2.1.4

* [FIX] fix invalid column_type error message

# 2.1.3

* [CHORE] maintenance release to publish first containerized release

# 2.1.2

* [FIX] throw more comprehensive error messages when translating column types

# 2.1.1

* [CHORE] add support for Python 3.12
* [CHORE] bump minimum version of MySQL Connector/Python to 8.2.0

# 2.1.0

* [CHORE] drop support for Python 3.7

# 2.0.3

* [FIX] import MySQLConnectionAbstract instead of concrete implementations

# 2.0.2

* [FIX] properly import CMySQLConnection

# 2.0.1

* [FEAT] add support for MySQL character set introducers in DEFAULT clause

# 2.0.0

* [CHORE] drop support for Python 2.7, 3.5 and 3.6
* [CHORE] migrate pytest.ini configuration into pyproject.toml
* [CHORE] migrate from setuptools to hatch / hatchling
* [CHORE] update dependencies
* [CHORE] add types
* [CHORE] add types to tests
* [CHORE] update dependencies
* [CHORE] use f-strings where appropriate

# 1.4.18

* [CHORE] update dependencies
* [CHORE] use [black](https://github.com/psf/black) and [isort](https://github.com/PyCQA/isort) in tox linters

# 1.4.17

* [CHORE] migrate from setup.py to pyproject.toml
* [CHORE] update the publishing workflow

# 1.4.16

* [CHORE] add MariaDB 10.11 CI tests
* [CHORE] add Python 3.11 support

# 1.4.15

* [FIX] fix BLOB default value
* [CHORE] remove CI tests for Python 3.5, 3.6, add tests for Python 3.11

# 1.4.14

* [FIX] pin mysql-connector-python to <8.0.30
* [CHORE] update CI actions/checkout to v3
* [CHORE] update CI actions/setup-python to v4
* [CHORE] update CI actions/cache to v3
* [CHORE] update CI github/codeql-action/init to v2
* [CHORE] update CI github/codeql-action/analyze to v2

# 1.4.13

* [FEAT] add option to exclude specific MySQL tables
* [CHORE] update CI codecov/codecov-action to v2

# 1.4.12

* [FIX] fix SQLite convert_date converter
* [CHORE] update tests

# 1.4.11

* [FIX] pin python-slugify to <6.0.0

# 1.4.10

* [FEAT] add feature to transfer tables without any data (DDL only)

# 1.4.9

* [CHORE] add Python 3.10 support
* [CHORE] add Python 3.10 tests

# 1.4.8

* [FEAT] transfer JSON columns as JSON

# 1.4.7

* [CHORE] add experimental tests for Python 3.10-dev
* [CHORE] add tests for MariaDB 10.6

# 1.4.6

* [FIX] pin Click to <8.0

# 1.4.5

* [FEAT] add -K, --prefix-indices CLI option to prefix indices with table names. This used to be the default behavior until now. To keep the old behavior simply use this CLI option.
# 1.4.4 * [FEAT] add --limit-rows CLI option * [FEAT] add --collation CLI option to specify SQLite collation sequence # 1.4.3 * [FIX] pin python-tabulate to <0.8.6 for Python 3.4 or less * [FIX] pin python-slugify to <5.0.0 for Python 3.5 or less * [FIX] pin Click to 7.x for Python 3.5 or less # 1.4.2 * [FIX] fix default column value not getting converted # 1.4.1 * [FIX] get table list error when Click package is 8.0+ # 1.4.0 * [FEAT] add password prompt. This changes the default behavior of -p * [FEAT] add option to disable MySQL connection encryption * [FEAT] add non-chunked progress bar * [FIX] pin mysql-connector-python to <8.0.24 for Python 3.5 or lower * [FIX] require sqlalchemy <1.4.0 to make compatible with sqlalchemy-utils # 1.3.8 * [FIX] some MySQL integer column definitions result in TEXT fields in sqlite3 * [FIX] fix CI tests # 1.3.7 * [CHORE] transition from Travis CI to GitHub Actions # 1.3.6 * [FIX] Fix Python 3.9 tests # 1.3.5 * [FIX] add IF NOT EXISTS to the CREATE INDEX SQL command * [CHORE] add Python 3.9 CI tests # 1.3.4 * [FEAT] add --quiet option # 1.3.3 * [FIX] test for mysql client more gracefully # 1.3.2 * [FEAT] simpler access to the debug version info using the --version switch * [FEAT] add debug_info module to be used in bug reports * [CHORE] remove PyPy and PyPy3 CI tests * [CHORE] add tabulate to development dependencies * [CHORE] use pytest fixture fom Faker 4.1.0 in Python 3 tests * [CHORE] omit debug_info.py in coverage reports # 1.3.1 * [FIX] fix information_schema issue introduced with MySQL 8.0.21 * [FIX] fix MySQL 8 bug where column types would sometimes be returned as bytes instead of strings * [FIX] sqlalchemy-utils dropped Python 2.7 support in v0.36.7 * [CHORE] use MySQL Client instead of PyMySQL in tests * [CHORE] add MySQL version output to CI tests * [CHORE] add Python 3.9 to the CI tests * [CHORE] add MariaDB 10.5 to the CI tests * [CHORE] remove Python 2.7 from allowed CI test failures * [CHORE] use Ubuntu Bionic 
instead of Ubuntu Xenial in CI tests * [CHORE] use Ubuntu Xenial only for MariaDB 10.4 CI tests * [CHORE] test legacy databases in CI tests # 1.3.0 * [FEAT] add option to transfer only specific tables using -t * [CHORE] add tests for transferring only certain tables # 1.2.11 * [FIX] duplicate foreign keys # 1.2.10 * [FIX] properly escape SQLite index names * [FIX] fix SQLite global index name scoping * [CHORE] test the successful transfer of an unorthodox table name * [CHORE] test the successful transfer of indices with same names # 1.2.9 * [FIX] differentiate better between MySQL and SQLite errors * [CHORE] add Python 3.8 and 3.8-dev test build # 1.2.8 * [CHORE] add support for Python 3.8 * [CHORE] update mysql-connector-python to a minimum version of 8.0.18 to support Python 3.8 * [CHORE] update development dependencies * [CHORE] add [bandit](https://github.com/PyCQA/bandit) tests # 1.2.7 * [FEAT] transfer unique indices * [FIX] improve index transport * [CHORE] test transfer of indices # 1.2.6 * [CHORE] include tests in the PyPI package # 1.2.5 * [FEAT] transfer foreign keys * [CHORE] removed duplicate import in test database models # 1.2.4 * [CHORE] reformat MySQLtoSQLite constructor * [CHORE] reformat translator function * [CHORE] add more tests # 1.2.3 * [CHORE] add more tests # 1.2.2 * [CHORE] refactor package * [CHORE] fix CI tests * [CHORE] add linter rules # 1.2.1 * [FEAT] add Python 2.7 support # 1.2.0 * [CHORE] add CI tests * [CHORE] achieve 100% test coverage # 1.1.2 * [FIX] fix error of transferring tables without primary keys * [FIX] fix error of transferring empty tables # 1.1.1 * [FEAT] add option to use MySQLCursorBuffered cursors * [FEAT] add MySQL port * [FEAT] update --help hints * [FIX] fix slugify import * [FIX] cursor error # 1.1.0 * [FEAT] add VACUUM option # 1.0.0 Initial commit 0707010000000F000081A40000000000000000000000016575A15F00001589000000000000000000000000000000000000002A00000000mysql-to-sqlite3-2.1.6/CODE-OF-CONDUCT.md# 
Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.

Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at [techouse@gmail.com](mailto:techouse@gmail.com). All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of actions.

**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.1, available at [https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1].

Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder][Mozilla CoC].

For answers to common questions about this code of conduct, see the FAQ at [https://www.contributor-covenant.org/faq][FAQ]. Translations are available at [https://www.contributor-covenant.org/translations][translations].
[homepage]: https://www.contributor-covenant.org
[v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html
[Mozilla CoC]: https://github.com/mozilla/diversity
[FAQ]: https://www.contributor-covenant.org/faq
[translations]: https://www.contributor-covenant.org/translations
07070100000010000081A40000000000000000000000016575A15F00000BDF000000000000000000000000000000000000002700000000mysql-to-sqlite3-2.1.6/CONTRIBUTING.md# Contributing

I greatly appreciate your interest in reading this message, as this project requires volunteer developers to assist in developing and maintaining it.

Before making any changes to this repository, please first discuss the proposed modifications with the repository owners through an issue, email, or any other appropriate communication channel.

Please be aware that a [code of conduct](CODE-OF-CONDUCT.md) is in place, and should be adhered to during all interactions related to the project.

## Python version support

Ensuring backward compatibility is an imperative requirement. Currently, the tool supports Python versions 3.8, 3.9, 3.10, 3.11, and 3.12.

## MySQL version support

This tool is intended to fully support MySQL versions 5.5, 5.6, 5.7, and 8.0, including major forks like MariaDB. We should prioritize and be dedicated to maintaining compatibility with these versions for a smooth user experience.

## Testing

As this project/tool involves the critical process of transferring data between different database types, it is of utmost importance to ensure thorough testing. Please remember to write tests for any new code you create, utilizing the [pytest](https://docs.pytest.org/en/latest/) framework for all test cases.

### Running the test suite

To run the test suite, run these commands against a Docker MySQL image. **Requires a running Docker instance!**

```bash
git clone https://github.com/techouse/mysql-to-sqlite3
cd mysql-to-sqlite3
python3 -m venv env
source env/bin/activate
pip install -e .
pip install -r requirements_dev.txt
tox
```

## Submitting changes

To contribute to this project, please submit a new [pull request](https://github.com/techouse/mysql-to-sqlite3/pull/new/master) and provide a clear list of your modifications. For guidance on creating pull requests, you can refer to [this resource](http://help.github.com/pull-requests/).

When sending a pull request, we highly appreciate the inclusion of [pytest](https://docs.pytest.org/en/latest/) tests, as we strive to enhance our test coverage. Following our coding conventions is essential, and it would be ideal if you ensure that each commit focuses on a single feature.

For commits, please write clear log messages. While concise one-line messages are suitable for small changes, more substantial modifications should follow a format similar to the example below:

```bash
git commit -m "A brief summary of the commit
>
> A paragraph describing what changed and its impact."
```

## Coding standards

It is essential to prioritize code readability and conciseness. To achieve this, we recommend using [Black](https://github.com/psf/black) for code formatting. Once your work is deemed complete, it is advisable to run the following command:

```bash
tox -e flake8,linters
```

This command executes various linters and checkers to identify any potential issues or inconsistencies in your code. By following these guidelines, you can ensure a high-quality codebase.
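To make the pytest expectation above concrete, here is a minimal, self-contained sketch of the kind of round-trip test we welcome. It mirrors the Decimal adapter/converter approach used in `mysql_to_sqlite3.sqlite_utils`, but redefines the two helpers locally (with an explicit `decode()` step) so the snippet runs standalone — the names and table are illustrative, not the package's exact code:

```python
import sqlite3
from decimal import Decimal


def adapt_decimal(value):
    """Store Decimal values as their string representation."""
    return str(value)


def convert_decimal(value):
    """Convert the bytes SQLite hands back into a Decimal."""
    return Decimal(value.decode())


def test_decimal_round_trip():
    """A DECIMAL value should survive a write/read cycle unchanged."""
    sqlite3.register_adapter(Decimal, adapt_decimal)
    sqlite3.register_converter("DECIMAL", convert_decimal)
    # PARSE_DECLTYPES makes sqlite3 apply the converter to DECIMAL columns.
    con = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_DECLTYPES)
    con.execute("CREATE TABLE prices (price DECIMAL)")
    con.execute("INSERT INTO prices (price) VALUES (?)", (Decimal("19.99"),))
    assert con.execute("SELECT price FROM prices").fetchone()[0] == Decimal("19.99")
    con.close()
```

Any function named `test_*` like the one above will be picked up by `pytest` (and therefore by `tox`).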
Thanks, Klemen Tusar
07070100000011000081A40000000000000000000000016575A15F000000C7000000000000000000000000000000000000002200000000mysql-to-sqlite3-2.1.6/Dockerfile
FROM python:3.12-alpine

LABEL maintainer="https://github.com/techouse"

RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir mysql-to-sqlite3

ENTRYPOINT ["mysql2sqlite"]
07070100000012000081A40000000000000000000000016575A15F0000042D000000000000000000000000000000000000001F00000000mysql-to-sqlite3-2.1.6/LICENSE
MIT License

Copyright (c) 2023 Klemen Tusar

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
07070100000013000081A40000000000000000000000016575A15F000014BB000000000000000000000000000000000000002100000000mysql-to-sqlite3-2.1.6/README.md
[![PyPI](https://img.shields.io/pypi/v/mysql-to-sqlite3)](https://pypi.org/project/mysql-to-sqlite3/)
[![Downloads](https://pepy.tech/badge/mysql-to-sqlite3)](https://pepy.tech/project/mysql-to-sqlite3)
[![GitHub Sponsors](https://img.shields.io/github/sponsors/techouse)](https://github.com/sponsors/techouse)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/mysql-to-sqlite3)](https://pypi.org/project/mysql-to-sqlite3/)
[![MySQL Support](https://img.shields.io/static/v1?label=MySQL&message=5.5+|+5.6+|+5.7+|+8.0&color=2b5d80)](https://img.shields.io/static/v1?label=MySQL&message=5.6+|+5.7+|+8.0&color=2b5d80)
[![MariaDB Support](https://img.shields.io/static/v1?label=MariaDB&message=5.5+|+10.0+|+10.1+|+10.2+|+10.3+|+10.4+|+10.5+|+10.6|+10.11&color=C0765A)](https://img.shields.io/static/v1?label=MariaDB&message=10.0+|+10.1+|+10.2+|+10.3+|+10.4+|+10.5&color=C0765A)
[![GitHub license](https://img.shields.io/github/license/techouse/mysql-to-sqlite3)](https://github.com/techouse/mysql-to-sqlite3/blob/master/LICENSE)
[![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](CODE-OF-CONDUCT.md)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/64aae8e9599746d58d277852b35cc2bd)](https://www.codacy.com/manual/techouse/mysql-to-sqlite3?utm_source=github.com&utm_medium=referral&utm_content=techouse/mysql-to-sqlite3&utm_campaign=Badge_Grade)
[![Build Status](https://github.com/techouse/mysql-to-sqlite3/workflows/Test/badge.svg)](https://github.com/techouse/mysql-to-sqlite3/actions?query=workflow%3ATest)
[![codecov](https://codecov.io/gh/techouse/mysql-to-sqlite3/branch/master/graph/badge.svg)](https://codecov.io/gh/techouse/mysql-to-sqlite3)
[![GitHub stars](https://img.shields.io/github/stars/techouse/mysql-to-sqlite3.svg?style=social&label=Star&maxAge=2592000)](https://github.com/techouse/mysql-to-sqlite3/stargazers)

# MySQL to SQLite3

#### A simple Python tool to transfer data from MySQL to SQLite 3.

### How to run

```bash
pip install mysql-to-sqlite3
mysql2sqlite --help
```

### Usage

```
Usage: mysql2sqlite [OPTIONS]

  Transfer MySQL to SQLite using the provided CLI options.

Options:
  -f, --sqlite-file PATH          SQLite3 database file  [required]
  -d, --mysql-database TEXT       MySQL database name  [required]
  -u, --mysql-user TEXT           MySQL user  [required]
  -p, --prompt-mysql-password     Prompt for MySQL password
  --mysql-password TEXT           MySQL password
  -t, --mysql-tables TUPLE        Transfer only these specific tables (space
                                  separated table names). Implies --without-
                                  foreign-keys which inhibits the transfer of
                                  foreign keys. Can not be used together with
                                  --exclude-mysql-tables.
  -e, --exclude-mysql-tables TUPLE
                                  Transfer all tables except these specific
                                  tables (space separated table names).
                                  Implies --without-foreign-keys which
                                  inhibits the transfer of foreign keys. Can
                                  not be used together with --mysql-tables.
  -L, --limit-rows INTEGER        Transfer only a limited number of rows from
                                  each table.
  -C, --collation [BINARY|NOCASE|RTRIM]
                                  Create datatypes of TEXT affinity using a
                                  specified collation sequence.  [default:
                                  BINARY]
  -K, --prefix-indices            Prefix indices with their corresponding
                                  tables. This ensures that their names
                                  remain unique across the SQLite database.
  -X, --without-foreign-keys      Do not transfer foreign keys.
  -W, --without-data              Do not transfer table data, DDL only.
  -h, --mysql-host TEXT           MySQL host. Defaults to localhost.
  -P, --mysql-port INTEGER        MySQL port. Defaults to 3306.
  -S, --skip-ssl                  Disable MySQL connection encryption.
  -c, --chunk INTEGER             Chunk reading/writing SQL records
  -l, --log-file PATH             Log file
  --json-as-text                  Transfer JSON columns as TEXT.
  -V, --vacuum                    Use the VACUUM command to rebuild the
                                  SQLite database file, repacking it into a
                                  minimal amount of disk space
  --use-buffered-cursors          Use MySQLCursorBuffered for reading the
                                  MySQL database. This can be useful in
                                  situations where multiple queries, with
                                  small result sets, need to be combined or
                                  computed with each other.
  -q, --quiet                     Quiet. Display only errors.
  --debug                         Debug mode. Will throw exceptions.
  --version                       Show the version and exit.
  --help                          Show this message and exit.
```
07070100000014000041ED0000000000000000000000026575A15F00000000000000000000000000000000000000000000002800000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3
07070100000015000081A40000000000000000000000016575A15F00000075000000000000000000000000000000000000003400000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/__init__.py
"""Utility to transfer data from MySQL to SQLite 3."""

__version__ = "2.1.6"

from .transporter import MySQLtoSQLite
07070100000016000081A40000000000000000000000016575A15F00001739000000000000000000000000000000000000002F00000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/cli.py
"""The command line interface of MySQLtoSQLite."""

import os
import sys
import typing as t

import click
from tabulate import tabulate

from . 
import MySQLtoSQLite
from .click_utils import OptionEatAll, prompt_password, validate_positive_integer
from .debug_info import info
from .sqlite_utils import CollatingSequences


@click.command()
@click.option(
    "-f",
    "--sqlite-file",
    type=click.Path(),
    default=None,
    help="SQLite3 database file",
    required=True,
)
@click.option("-d", "--mysql-database", default=None, help="MySQL database name", required=True)
@click.option("-u", "--mysql-user", default=None, help="MySQL user", required=True)
@click.option(
    "-p",
    "--prompt-mysql-password",
    is_flag=True,
    default=False,
    callback=prompt_password,
    help="Prompt for MySQL password",
)
@click.option("--mysql-password", default=None, help="MySQL password")
@click.option(
    "-t",
    "--mysql-tables",
    type=tuple,
    cls=OptionEatAll,
    help="Transfer only these specific tables (space separated table names). "
    "Implies --without-foreign-keys which inhibits the transfer of foreign keys. "
    "Can not be used together with --exclude-mysql-tables.",
)
@click.option(
    "-e",
    "--exclude-mysql-tables",
    type=tuple,
    cls=OptionEatAll,
    help="Transfer all tables except these specific tables (space separated table names). "
    "Implies --without-foreign-keys which inhibits the transfer of foreign keys. "
    "Can not be used together with --mysql-tables.",
)
@click.option(
    "-L",
    "--limit-rows",
    type=int,
    callback=validate_positive_integer,
    default=0,
    help="Transfer only a limited number of rows from each table.",
)
@click.option(
    "-C",
    "--collation",
    type=click.Choice(
        [
            CollatingSequences.BINARY,
            CollatingSequences.NOCASE,
            CollatingSequences.RTRIM,
        ],
        case_sensitive=False,
    ),
    default=CollatingSequences.BINARY,
    show_default=True,
    help="Create datatypes of TEXT affinity using a specified collation sequence.",
)
@click.option(
    "-K",
    "--prefix-indices",
    is_flag=True,
    help="Prefix indices with their corresponding tables. "
    "This ensures that their names remain unique across the SQLite database.",
)
@click.option("-X", "--without-foreign-keys", is_flag=True, help="Do not transfer foreign keys.")
@click.option(
    "-W",
    "--without-data",
    is_flag=True,
    help="Do not transfer table data, DDL only.",
)
@click.option("-h", "--mysql-host", default="localhost", help="MySQL host. Defaults to localhost.")
@click.option("-P", "--mysql-port", type=int, default=3306, help="MySQL port. Defaults to 3306.")
@click.option("-S", "--skip-ssl", is_flag=True, help="Disable MySQL connection encryption.")
@click.option(
    "-c",
    "--chunk",
    type=int,
    default=200000,  # this default is here for performance reasons
    help="Chunk reading/writing SQL records",
)
@click.option("-l", "--log-file", type=click.Path(), help="Log file")
@click.option("--json-as-text", is_flag=True, help="Transfer JSON columns as TEXT.")
@click.option(
    "-V",
    "--vacuum",
    is_flag=True,
    help="Use the VACUUM command to rebuild the SQLite database file, "
    "repacking it into a minimal amount of disk space",
)
@click.option(
    "--use-buffered-cursors",
    is_flag=True,
    help="Use MySQLCursorBuffered for reading the MySQL database. This "
    "can be useful in situations where multiple queries, with small "
    "result sets, need to be combined or computed with each other.",
)
@click.option("-q", "--quiet", is_flag=True, help="Quiet. Display only errors.")
@click.option("--debug", is_flag=True, help="Debug mode. Will throw exceptions.")
@click.version_option(message=tabulate(info(), headers=["software", "version"], tablefmt="github"))
def cli(
    sqlite_file: t.Union[str, "os.PathLike[t.Any]"],
    mysql_user: str,
    prompt_mysql_password: bool,
    mysql_password: str,
    mysql_database: str,
    mysql_tables: t.Optional[t.Sequence[str]],
    exclude_mysql_tables: t.Optional[t.Sequence[str]],
    limit_rows: int,
    collation: t.Optional[str],
    prefix_indices: bool,
    without_foreign_keys: bool,
    without_data: bool,
    mysql_host: str,
    mysql_port: int,
    skip_ssl: bool,
    chunk: int,
    log_file: t.Union[str, "os.PathLike[t.Any]"],
    json_as_text: bool,
    vacuum: bool,
    use_buffered_cursors: bool,
    quiet: bool,
    debug: bool,
) -> None:
    """Transfer MySQL to SQLite using the provided CLI options."""
    try:
        if mysql_tables and exclude_mysql_tables:
            raise click.UsageError("Illegal usage: --mysql-tables and --exclude-mysql-tables are mutually exclusive!")
        converter = MySQLtoSQLite(
            sqlite_file=sqlite_file,
            mysql_user=mysql_user,
            mysql_password=mysql_password or prompt_mysql_password,
            mysql_database=mysql_database,
            mysql_tables=mysql_tables,
            exclude_mysql_tables=exclude_mysql_tables,
            limit_rows=limit_rows,
            collation=collation,
            prefix_indices=prefix_indices,
            without_foreign_keys=without_foreign_keys or (mysql_tables is not None and len(mysql_tables) > 0),
            without_data=without_data,
            mysql_host=mysql_host,
            mysql_port=mysql_port,
            mysql_ssl_disabled=skip_ssl,
            chunk=chunk,
            json_as_text=json_as_text,
            vacuum=vacuum,
            buffered=use_buffered_cursors,
            log_file=log_file,
            quiet=quiet,
        )
        converter.transfer()
    except KeyboardInterrupt:
        if debug:
            raise
        print("\nProcess interrupted. Exiting...")
        sys.exit(1)
    except Exception as err:  # pylint: disable=W0703
        if debug:
            raise
        print(err)
        sys.exit(1)
07070100000017000081A40000000000000000000000016575A15F00000A65000000000000000000000000000000000000003700000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/click_utils.py
"""Click utilities."""

import typing as t

import click


class OptionEatAll(click.Option):
    """Taken from https://stackoverflow.com/questions/48391777/nargs-equivalent-for-options-in-click#answer-48394004."""  # noqa: ignore=E501 pylint: disable=C0301

    def __init__(self, *args, **kwargs):
        """Override."""
        self.save_other_options = kwargs.pop("save_other_options", True)
        nargs = kwargs.pop("nargs", -1)
        if nargs != -1:
            raise ValueError(f"nargs, if set, must be -1 not {nargs}")
        super(OptionEatAll, self).__init__(*args, **kwargs)
        self._previous_parser_process = None
        self._eat_all_parser = None

    def add_to_parser(self, parser, ctx) -> None:
        """Override."""

        def parser_process(value, state):
            # method to hook to the parser.process
            done = False
            value = [value]
            if self.save_other_options:
                # grab everything up to the next option
                while state.rargs and not done:
                    for prefix in self._eat_all_parser.prefixes:
                        if state.rargs[0].startswith(prefix):
                            done = True
                    if not done:
                        value.append(state.rargs.pop(0))
            else:
                # grab everything remaining
                value += state.rargs
                state.rargs[:] = []
            value = tuple(value)
            # call the actual process
            self._previous_parser_process(value, state)

        retval = super(OptionEatAll, self).add_to_parser(parser, ctx)  # pylint: disable=E1111
        for name in self.opts:
            # pylint: disable=W0212
            our_parser = parser._long_opt.get(name) or parser._short_opt.get(name)
            if our_parser:
                self._eat_all_parser = our_parser
                self._previous_parser_process = our_parser.process
                our_parser.process = parser_process
                break
        return retval


def prompt_password(ctx: click.core.Context, param: t.Any, use_password: bool):  # pylint: disable=W0613
    """Prompt for password."""
    if use_password:
        mysql_password = ctx.params.get("mysql_password")
        if not mysql_password:
            mysql_password = click.prompt("MySQL password", hide_input=True)
        return mysql_password


def validate_positive_integer(ctx: click.core.Context, param: t.Any, value: int):  # pylint: disable=W0613
    """Allow only positive integers and 0."""
    if value < 0:
        raise click.BadParameter("Should be a positive integer or 0.")
    return value
07070100000018000081A40000000000000000000000016575A15F00000D02000000000000000000000000000000000000003600000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/debug_info.py
"""Module containing bug report helper(s).

Adapted from https://github.com/psf/requests/blob/master/requests/help.py
"""

import platform
import sqlite3
import sys
import typing as t
from shutil import which
from subprocess import check_output

import click
import mysql.connector
import pytimeparse2
import simplejson
import slugify
import tabulate
import tqdm

from . import __version__ as package_version


def _implementation() -> str:
    """Return a dict with the Python implementation and version.

    Provide both the name and the version of the Python implementation
    currently running. For example, on CPython 2.7.5 it will return
    {'name': 'CPython', 'version': '2.7.5'}.

    This function works best on CPython and PyPy: in particular, it probably
    doesn't work for Jython or IronPython. Future investigation should be done
    to work out the correct shape of the code for those platforms.
""" implementation: str = platform.python_implementation() if implementation == "CPython": implementation_version = platform.python_version() elif implementation == "PyPy": implementation_version = "%s.%s.%s" % ( sys.pypy_version_info.major, # type: ignore # noqa: ignore=E1101 pylint: disable=E1101 sys.pypy_version_info.minor, # type: ignore # noqa: ignore=E1101 pylint: disable=E1101 sys.pypy_version_info.micro, # type: ignore # noqa: ignore=E1101 pylint: disable=E1101 ) rel = sys.pypy_version_info.releaselevel # type: ignore # noqa: ignore=E1101 pylint: disable=E1101 if rel != "final": implementation_version = "".join([implementation_version, rel]) elif implementation == "Jython": implementation_version = platform.python_version() # Complete Guess elif implementation == "IronPython": implementation_version = platform.python_version() # Complete Guess else: implementation_version = "Unknown" return f"{implementation} {implementation_version}" def _mysql_version() -> str: if which("mysql") is not None: try: mysql_version: t.Union[str, bytes] = check_output(["mysql", "-V"]) try: return mysql_version.decode().strip() # type: ignore except (UnicodeDecodeError, AttributeError): return str(mysql_version) except Exception: # nosec pylint: disable=W0703 pass return "MySQL client not found on the system" def info() -> t.List[t.List[str]]: """Generate information for a bug report.""" try: platform_info: str = f"{platform.system()} {platform.release()}" except IOError: platform_info = "Unknown" return [ ["mysql-to-sqlite3", package_version], ["", ""], ["Operating System", platform_info], ["Python", _implementation()], ["MySQL", _mysql_version()], ["SQLite", sqlite3.sqlite_version], ["", ""], ["click", click.__version__], ["mysql-connector-python", mysql.connector.__version__], ["python-slugify", slugify.__version__], ["pytimeparse2", pytimeparse2.__version__], ["simplejson", simplejson.__version__], # type: ignore ["tabulate", tabulate.__version__], ["tqdm", 
tqdm.__version__], ] 07070100000019000081A40000000000000000000000016575A15F000000F8000000000000000000000000000000000000003700000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/mysql_utils.py"""Miscellaneous MySQL utilities.""" import typing as t from mysql.connector.charsets import MYSQL_CHARACTER_SETS CHARSET_INTRODUCERS: t.Tuple[str, ...] = tuple( f"_{charset[0]}" for charset in MYSQL_CHARACTER_SETS if charset is not None ) 0707010000001A000081A40000000000000000000000016575A15F00000000000000000000000000000000000000000000003100000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/py.typed0707010000001B000081A40000000000000000000000016575A15F000005E4000000000000000000000000000000000000003800000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/sqlite_utils.py"""SQLite adapters and converters for unsupported data types.""" import sqlite3 import typing as t from datetime import date, timedelta from decimal import Decimal from pytimeparse2 import parse def adapt_decimal(value: t.Any) -> str: """Convert decimal.Decimal to string.""" return str(value) def convert_decimal(value: t.Any) -> Decimal: """Convert string to decimal.Decimal.""" return Decimal(value) def adapt_timedelta(value: t.Any) -> str: """Convert datetime.timedelta to %H:%M:%S string.""" hours, remainder = divmod(value.total_seconds(), 3600) minutes, seconds = divmod(remainder, 60) return "{:02}:{:02}:{:02}".format(int(hours), int(minutes), int(seconds)) def convert_timedelta(value: t.Any) -> timedelta: """Convert %H:%M:%S string to datetime.timedelta.""" return timedelta(seconds=parse(value)) def encode_data_for_sqlite(value: t.Any) -> t.Any: """Fix encoding bytes.""" try: return value.decode() except (UnicodeDecodeError, AttributeError): return sqlite3.Binary(value) class CollatingSequences: """Taken from https://www.sqlite.org/datatype3.html#collating_sequences.""" BINARY: str = "BINARY" NOCASE: str = "NOCASE" RTRIM: str = "RTRIM" def convert_date(value: t.Any) -> date: """Handle SQLite date conversion.""" try: return 
        date.fromisoformat(value.decode())
    except ValueError as err:
        raise ValueError(f"DATE field contains {err}")  # pylint: disable=W0707
0707010000001C000081A40000000000000000000000016575A15F00006DEB000000000000000000000000000000000000003700000000mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/transporter.py
"""Use to transfer a MySQL database to SQLite."""

import logging
import os
import re
import sqlite3
import typing as t
from datetime import timedelta
from decimal import Decimal
from math import ceil
from os.path import realpath
from sys import stdout

import mysql.connector
import typing_extensions as tx
from mysql.connector import errorcode
from mysql.connector.abstracts import MySQLConnectionAbstract
from mysql.connector.types import ToPythonOutputTypes
from tqdm import tqdm, trange

from mysql_to_sqlite3.mysql_utils import CHARSET_INTRODUCERS
from mysql_to_sqlite3.sqlite_utils import (
    CollatingSequences,
    adapt_decimal,
    adapt_timedelta,
    convert_date,
    convert_decimal,
    convert_timedelta,
    encode_data_for_sqlite,
)
from mysql_to_sqlite3.types import MySQLtoSQLiteAttributes, MySQLtoSQLiteParams


class MySQLtoSQLite(MySQLtoSQLiteAttributes):
    """Use this class to transfer a MySQL database to SQLite."""

    COLUMN_PATTERN: t.Pattern[str] = re.compile(r"^[^(]+")
    COLUMN_LENGTH_PATTERN: t.Pattern[str] = re.compile(r"\(\d+\)$")

    def __init__(self, **kwargs: tx.Unpack[MySQLtoSQLiteParams]) -> None:
        """Constructor."""
        if kwargs.get("mysql_database") is not None:
            self._mysql_database = str(kwargs.get("mysql_database"))
        else:
            raise ValueError("Please provide a MySQL database")

        if kwargs.get("mysql_user") is not None:
            self._mysql_user = str(kwargs.get("mysql_user"))
        else:
            raise ValueError("Please provide a MySQL user")

        if kwargs.get("sqlite_file") is None:
            raise ValueError("Please provide an SQLite file")
        else:
            self._sqlite_file = realpath(str(kwargs.get("sqlite_file")))

        self._mysql_password = str(kwargs.get("mysql_password")) or None
        self._mysql_host = kwargs.get("mysql_host") or "localhost"
        self._mysql_port = kwargs.get("mysql_port") or 3306
        self._mysql_tables = kwargs.get("mysql_tables") or tuple()
        self._exclude_mysql_tables = kwargs.get("exclude_mysql_tables") or tuple()
        if len(self._mysql_tables) > 0 and len(self._exclude_mysql_tables) > 0:
            raise ValueError("mysql_tables and exclude_mysql_tables are mutually exclusive")
        self._limit_rows = kwargs.get("limit_rows") or 0
        if kwargs.get("collation") is not None and str(kwargs.get("collation")).upper() in {
            CollatingSequences.BINARY,
            CollatingSequences.NOCASE,
            CollatingSequences.RTRIM,
        }:
            self._collation = str(kwargs.get("collation")).upper()
        else:
            self._collation = CollatingSequences.BINARY
        self._prefix_indices = kwargs.get("prefix_indices") or False
        if len(self._mysql_tables) > 0 or len(self._exclude_mysql_tables) > 0:
            self._without_foreign_keys = True
        else:
            self._without_foreign_keys = kwargs.get("without_foreign_keys") or False
        self._without_data = kwargs.get("without_data") or False
        self._mysql_ssl_disabled = kwargs.get("mysql_ssl_disabled") or False
        self._current_chunk_number = 0
        self._chunk_size = kwargs.get("chunk") or None
        self._buffered = kwargs.get("buffered") or False
        self._vacuum = kwargs.get("vacuum") or False
        self._quiet = kwargs.get("quiet") or False
        self._logger = self._setup_logger(log_file=kwargs.get("log_file") or None, quiet=self._quiet)
        sqlite3.register_adapter(Decimal, adapt_decimal)
        sqlite3.register_converter("DECIMAL", convert_decimal)
        sqlite3.register_adapter(timedelta, adapt_timedelta)
        sqlite3.register_converter("DATE", convert_date)
        sqlite3.register_converter("TIME", convert_timedelta)
        self._sqlite = sqlite3.connect(realpath(self._sqlite_file), detect_types=sqlite3.PARSE_DECLTYPES)
        self._sqlite.row_factory = sqlite3.Row
        self._sqlite_cur = self._sqlite.cursor()
        self._json_as_text = kwargs.get("json_as_text") or False
        self._sqlite_json1_extension_enabled = not self._json_as_text and self._check_sqlite_json1_extension_enabled()
        try:
            _mysql_connection = mysql.connector.connect(
                user=self._mysql_user,
                password=self._mysql_password,
                host=self._mysql_host,
                port=self._mysql_port,
                ssl_disabled=self._mysql_ssl_disabled,
            )
            if isinstance(_mysql_connection, MySQLConnectionAbstract):
                self._mysql = _mysql_connection
            else:
                raise ConnectionError("Unable to connect to MySQL")
            if not self._mysql.is_connected():
                raise ConnectionError("Unable to connect to MySQL")
            self._mysql_cur = self._mysql.cursor(buffered=self._buffered, raw=True)  # type: ignore[assignment]
            self._mysql_cur_prepared = self._mysql.cursor(prepared=True)  # type: ignore[assignment]
            self._mysql_cur_dict = self._mysql.cursor(  # type: ignore[assignment]
                buffered=self._buffered,
                dictionary=True,
            )
            try:
                self._mysql.database = self._mysql_database
            except (mysql.connector.Error, Exception) as err:
                if hasattr(err, "errno") and err.errno == errorcode.ER_BAD_DB_ERROR:
                    self._logger.error("MySQL Database does not exist!")
                    raise
                self._logger.error(err)
                raise
        except mysql.connector.Error as err:
            self._logger.error(err)
            raise

    @classmethod
    def _setup_logger(
        cls, log_file: t.Optional[t.Union[str, "os.PathLike[t.Any]"]] = None, quiet: bool = False
    ) -> logging.Logger:
        formatter: logging.Formatter = logging.Formatter(
            fmt="%(asctime)s %(levelname)-8s %(message)s", datefmt="%Y-%m-%d %H:%M:%S"
        )
        logger: logging.Logger = logging.getLogger(cls.__name__)
        logger.setLevel(logging.DEBUG)
        if not quiet:
            screen_handler = logging.StreamHandler(stream=stdout)
            screen_handler.setFormatter(formatter)
            logger.addHandler(screen_handler)
        if log_file:
            file_handler = logging.FileHandler(realpath(log_file), mode="w")
            file_handler.setFormatter(formatter)
            logger.addHandler(file_handler)
        return logger

    @classmethod
    def _valid_column_type(cls, column_type: str) -> t.Optional[t.Match[str]]:
        return cls.COLUMN_PATTERN.match(column_type.strip())

    @classmethod
    def _column_type_length(cls, column_type: str) -> str:
        suffix: t.Optional[t.Match[str]] = cls.COLUMN_LENGTH_PATTERN.search(column_type)
        if suffix:
            return suffix.group(0)
        return ""
    @staticmethod
    def _decode_column_type(column_type: t.Union[str, bytes]) -> str:
        if isinstance(column_type, str):
            return column_type
        if isinstance(column_type, bytes):
            try:
                return column_type.decode()
            except (UnicodeDecodeError, AttributeError):
                pass
        return str(column_type)

    @classmethod
    def _translate_type_from_mysql_to_sqlite(
        cls, column_type: t.Union[str, bytes], sqlite_json1_extension_enabled=False
    ) -> str:
        _column_type: str = cls._decode_column_type(column_type)

        # This could be optimized even further, however it seems adequate.
        match: t.Optional[t.Match[str]] = cls._valid_column_type(_column_type)
        if not match:
            raise ValueError(f'"{_column_type}" is not a valid column_type!')

        data_type: str = match.group(0).upper()

        if data_type.endswith(" UNSIGNED"):
            data_type = data_type.replace(" UNSIGNED", "")

        if data_type in {
            "BIGINT",
            "BLOB",
            "BOOLEAN",
            "DATE",
            "DATETIME",
            "DECIMAL",
            "DOUBLE",
            "FLOAT",
            "INTEGER",
            "MEDIUMINT",
            "NUMERIC",
            "REAL",
            "SMALLINT",
            "TIME",
            "TINYINT",
            "YEAR",
        }:
            return data_type
        if data_type in {
            "BIT",
            "BINARY",
            "LONGBLOB",
            "MEDIUMBLOB",
            "TINYBLOB",
            "VARBINARY",
        }:
            return "BLOB"
        if data_type in {"NCHAR", "NVARCHAR", "VARCHAR"}:
            return data_type + cls._column_type_length(_column_type)
        if data_type == "CHAR":
            return "CHARACTER" + cls._column_type_length(_column_type)
        if data_type == "INT":
            return "INTEGER"
        if data_type == "TIMESTAMP":
            return "DATETIME"
        if data_type == "JSON" and sqlite_json1_extension_enabled:
            return "JSON"
        return "TEXT"

    @classmethod
    def _translate_default_from_mysql_to_sqlite(
        cls,
        column_default: ToPythonOutputTypes = None,
        column_type: t.Optional[str] = None,
        column_extra: ToPythonOutputTypes = None,
    ) -> str:
        is_binary: bool
        is_hex: bool
        if isinstance(column_default, bytes):
            if column_type in {
                "BIT",
                "BINARY",
                "BLOB",
                "LONGBLOB",
                "MEDIUMBLOB",
                "TINYBLOB",
                "VARBINARY",
            }:
                if column_extra in {"DEFAULT_GENERATED", "default_generated"}:
                    for charset_introducer in CHARSET_INTRODUCERS:
                        if column_default.startswith(charset_introducer.encode()):
                            is_binary = False
                            is_hex = False
                            for b_prefix in ("B", "b"):
                                if column_default.startswith(rf"{charset_introducer} {b_prefix}\'".encode()):
                                    is_binary = True
                                    break
                            for x_prefix in ("X", "x"):
                                if column_default.startswith(rf"{charset_introducer} {x_prefix}\'".encode()):
                                    is_hex = True
                                    break
                            column_default = (
                                column_default.replace(charset_introducer.encode(), b"")
                                .replace(rb"x\'", b"")
                                .replace(rb"X\'", b"")
                                .replace(rb"b\'", b"")
                                .replace(rb"B\'", b"")
                                .replace(rb"\'", b"")
                                .replace(rb"'", b"")
                                .strip()
                            )
                            if is_binary:
                                return f"DEFAULT '{chr(int(column_default, 2))}'"
                            if is_hex:
                                return f"DEFAULT x'{column_default.decode()}'"
                            break
                return f"DEFAULT x'{column_default.hex()}'"
            try:
                column_default = column_default.decode()
            except (UnicodeDecodeError, AttributeError):
                pass
        if column_default is None:
            return ""
        if isinstance(column_default, bool):
            if column_type == "BOOLEAN" and sqlite3.sqlite_version >= "3.23.0":
                if column_default:
                    return "DEFAULT(TRUE)"
                return "DEFAULT(FALSE)"
            return f"DEFAULT '{int(column_default)}'"
        if isinstance(column_default, str):
            if column_extra in {"DEFAULT_GENERATED", "default_generated"}:
                if column_default.upper() in {
                    "CURRENT_TIME",
                    "CURRENT_DATE",
                    "CURRENT_TIMESTAMP",
                }:
                    return f"DEFAULT {column_default.upper()}"
                for charset_introducer in CHARSET_INTRODUCERS:
                    if column_default.startswith(charset_introducer):
                        is_binary = False
                        is_hex = False
                        for b_prefix in ("B", "b"):
                            if column_default.startswith(rf"{charset_introducer} {b_prefix}\'"):
                                is_binary = True
                                break
                        for x_prefix in ("X", "x"):
                            if column_default.startswith(rf"{charset_introducer} {x_prefix}\'"):
                                is_hex = True
                                break
                        column_default = (
                            column_default.replace(charset_introducer, "")
                            .replace(r"x\'", "")
                            .replace(r"X\'", "")
                            .replace(r"b\'", "")
                            .replace(r"B\'", "")
                            .replace(r"\'", "")
                            .replace(r"'", "")
                            .strip()
                        )
                        if is_binary:
                            return f"DEFAULT '{chr(int(column_default, 2))}'"
                        if is_hex:
                            return f"DEFAULT x'{column_default}'"
                        return \
                            f"DEFAULT '{column_default}'"
            return "DEFAULT '{}'".format(column_default.replace(r"\'", r"''"))
        return "DEFAULT '{}'".format(str(column_default).replace(r"\'", r"''"))

    @classmethod
    def _data_type_collation_sequence(
        cls, collation: str = CollatingSequences.BINARY, column_type: t.Optional[str] = None
    ) -> str:
        if column_type and collation != CollatingSequences.BINARY:
            if column_type.startswith(
                (
                    "CHARACTER",
                    "NCHAR",
                    "NVARCHAR",
                    "TEXT",
                    "VARCHAR",
                )
            ):
                return f"COLLATE {collation}"
        return ""

    def _check_sqlite_json1_extension_enabled(self) -> bool:
        try:
            self._sqlite_cur.execute("PRAGMA compile_options")
            return "ENABLE_JSON1" in set(row[0] for row in self._sqlite_cur.fetchall())
        except sqlite3.Error:
            return False

    def _build_create_table_sql(self, table_name: str) -> str:
        sql: str = f'CREATE TABLE IF NOT EXISTS "{table_name}" ('
        primary: str = ""
        indices: str = ""

        self._mysql_cur_dict.execute(f"SHOW COLUMNS FROM `{table_name}`")

        for row in self._mysql_cur_dict.fetchall():
            if row is not None:
                column_type = self._translate_type_from_mysql_to_sqlite(
                    column_type=row["Type"],  # type: ignore[arg-type]
                    sqlite_json1_extension_enabled=self._sqlite_json1_extension_enabled,
                )
                sql += '\n\t"{name}" {type} {notnull} {default} {collation},'.format(
                    name=row["Field"].decode() if isinstance(row["Field"], bytes) else row["Field"],
                    type=column_type,
                    notnull="NULL" if row["Null"] == "YES" else "NOT NULL",
                    default=self._translate_default_from_mysql_to_sqlite(row["Default"], column_type, row["Extra"]),
                    collation=self._data_type_collation_sequence(self._collation, column_type),
                )

        self._mysql_cur_dict.execute(
            """
            SELECT INDEX_NAME AS `name`,
                   IF (NON_UNIQUE = 0 AND INDEX_NAME = 'PRIMARY', 1, 0) AS `primary`,
                   IF (NON_UNIQUE = 0 AND INDEX_NAME <> 'PRIMARY', 1, 0) AS `unique`,
                   GROUP_CONCAT(COLUMN_NAME ORDER BY SEQ_IN_INDEX) AS `columns`
            FROM information_schema.STATISTICS
            WHERE TABLE_SCHEMA = %s
            AND TABLE_NAME = %s
            GROUP BY INDEX_NAME, NON_UNIQUE
            """,
            (self._mysql_database, table_name),
        )
        for index in self._mysql_cur_dict.fetchall():
            if index is not None:
                columns: str = ""
                if isinstance(index["columns"], bytes):
                    columns = index["columns"].decode()
                elif isinstance(index["columns"], str):
                    columns = index["columns"]

                if len(columns) > 0:
                    if index["primary"] in {1, "1"}:
                        primary += "\n\tPRIMARY KEY ({})".format(
                            ", ".join(f'"{column}"' for column in columns.split(","))
                        )
                    else:
                        indices += """CREATE {unique} INDEX IF NOT EXISTS "{name}" ON "{table}" ({columns});""".format(
                            unique="UNIQUE" if index["unique"] in {1, "1"} else "",
                            name="{table}_{name}".format(
                                table=table_name,
                                name=index["name"].decode() if isinstance(index["name"], bytes) else index["name"],
                            )
                            if self._prefix_indices
                            else index["name"].decode()
                            if isinstance(index["name"], bytes)
                            else index["name"],
                            table=table_name,
                            columns=", ".join(f'"{column}"' for column in columns.split(",")),
                        )

        sql += primary
        sql = sql.rstrip(", ")

        if not self._without_foreign_keys:
            server_version: t.Tuple[int, ...] = self._mysql.get_server_version()
            self._mysql_cur_dict.execute(
                """
                SELECT k.COLUMN_NAME AS `column`,
                       k.REFERENCED_TABLE_NAME AS `ref_table`,
                       k.REFERENCED_COLUMN_NAME AS `ref_column`,
                       c.UPDATE_RULE AS `on_update`,
                       c.DELETE_RULE AS `on_delete`
                FROM information_schema.TABLE_CONSTRAINTS AS i
                {JOIN} information_schema.KEY_COLUMN_USAGE AS k ON i.CONSTRAINT_NAME = k.CONSTRAINT_NAME
                {JOIN} information_schema.REFERENTIAL_CONSTRAINTS AS c ON c.CONSTRAINT_NAME = i.CONSTRAINT_NAME
                WHERE i.TABLE_SCHEMA = %s
                AND i.TABLE_NAME = %s
                AND i.CONSTRAINT_TYPE = %s
                GROUP BY i.CONSTRAINT_NAME,
                         k.COLUMN_NAME,
                         k.REFERENCED_TABLE_NAME,
                         k.REFERENCED_COLUMN_NAME,
                         c.UPDATE_RULE,
                         c.DELETE_RULE
                """.format(
                    JOIN="JOIN" if (server_version[0] == 8 and server_version[2] > 19) else "LEFT JOIN"
                ),
                (self._mysql_database, table_name, "FOREIGN KEY"),
            )
            for foreign_key in self._mysql_cur_dict.fetchall():
                if foreign_key is not None:
                    sql += (
                        ',\n\tFOREIGN KEY("{column}") REFERENCES "{ref_table}" ("{ref_column}") '
                        "ON UPDATE {on_update} "
                        "ON DELETE {on_delete}".format(**foreign_key)  # type: ignore[str-bytes-safe]
                    )

        sql += "\n);"
        sql += indices
        return sql

    def _create_table(self, table_name: str, attempting_reconnect: bool = False) -> None:
        try:
            if attempting_reconnect:
                self._mysql.reconnect()
            self._sqlite_cur.executescript(self._build_create_table_sql(table_name))
            self._sqlite.commit()
        except mysql.connector.Error as err:
            if err.errno == errorcode.CR_SERVER_LOST:
                if not attempting_reconnect:
                    self._logger.warning("Connection to MySQL server lost.\nAttempting to reconnect.")
                    self._create_table(table_name, True)
                else:
                    self._logger.warning("Connection to MySQL server lost.\nReconnection attempt aborted.")
                    raise
            self._logger.error(
                "MySQL failed reading table definition from table %s: %s",
                table_name,
                err,
            )
            raise
        except sqlite3.Error as err:
            self._logger.error("SQLite failed creating table %s: %s", table_name, err)
            raise

    def _transfer_table_data(
        self, table_name: str, sql: str, total_records: int = 0, attempting_reconnect: bool = False
    ) -> None:
        if attempting_reconnect:
            self._mysql.reconnect()
        try:
            if self._chunk_size is not None and self._chunk_size > 0:
                for chunk in trange(
                    self._current_chunk_number,
                    int(ceil(total_records / self._chunk_size)),
                    disable=self._quiet,
                ):
                    self._current_chunk_number = chunk
                    self._sqlite_cur.executemany(
                        sql,
                        (
                            tuple(encode_data_for_sqlite(col) if col is not None else None for col in row)
                            for row in self._mysql_cur.fetchmany(self._chunk_size)
                        ),
                    )
            else:
                self._sqlite_cur.executemany(
                    sql,
                    (
                        tuple(encode_data_for_sqlite(col) if col is not None else None for col in row)
                        for row in tqdm(
                            self._mysql_cur.fetchall(),
                            total=total_records,
                            disable=self._quiet,
                        )
                    ),
                )
            self._sqlite.commit()
        except mysql.connector.Error as err:
            if err.errno == errorcode.CR_SERVER_LOST:
                if not attempting_reconnect:
                    self._logger.warning("Connection to MySQL server lost.\nAttempting to reconnect.")
                    self._transfer_table_data(
                        table_name=table_name,
                        sql=sql,
                        total_records=total_records,
                        attempting_reconnect=True,
                    )
                else:
                    self._logger.warning("Connection to MySQL server lost.\nReconnection attempt aborted.")
                    raise
            self._logger.error(
                "MySQL transfer failed reading table data from table %s: %s",
                table_name,
                err,
            )
            raise
        except sqlite3.Error as err:
            self._logger.error(
                "SQLite transfer failed inserting data into table %s: %s",
                table_name,
                err,
            )
            raise

    def transfer(self) -> None:
        """The primary and only method with which we transfer all the data."""
        if len(self._mysql_tables) > 0 or len(self._exclude_mysql_tables) > 0:
            # transfer only specific tables
            specific_tables: t.Sequence[str] = (
                self._exclude_mysql_tables if len(self._exclude_mysql_tables) > 0 else self._mysql_tables
            )
            self._mysql_cur_prepared.execute(
                """
                SELECT TABLE_NAME
                FROM information_schema.TABLES
                WHERE TABLE_SCHEMA = SCHEMA()
                AND TABLE_NAME {exclude} IN ({placeholders})
                """.format(
                    exclude="NOT" if len(self._exclude_mysql_tables) > 0 else "",
                    placeholders=("%s, " * len(specific_tables)).rstrip(" ,"),
                ),
                specific_tables,
            )
            tables: t.Iterable[ToPythonOutputTypes] = (row[0] for row in self._mysql_cur_prepared.fetchall())
        else:
            # transfer all tables
            self._mysql_cur.execute(
                """
                SELECT TABLE_NAME
                FROM information_schema.TABLES
                WHERE TABLE_SCHEMA = SCHEMA()
                """
            )
            tables = (row[0].decode() for row in self._mysql_cur.fetchall())  # type: ignore[union-attr]

        try:
            # turn off foreign key checking in SQLite while transferring data
            self._sqlite_cur.execute("PRAGMA foreign_keys=OFF")

            for table_name in tables:
                if isinstance(table_name, bytes):
                    table_name = table_name.decode()

                self._logger.info(
                    "%sTransferring table %s",
                    "[WITHOUT DATA] " if self._without_data else "",
                    table_name,
                )

                # reset the chunk
                self._current_chunk_number = 0

                # create the table
                self._create_table(table_name)  # type: ignore[arg-type]

                if not self._without_data:
                    # get the size of the data
                    if self._limit_rows > 0:
                        # limit to the requested number of rows
                        self._mysql_cur_dict.execute(
                            "SELECT COUNT(*) AS `total_records` "
                            f"FROM (SELECT * FROM `{table_name}` "
                            f"LIMIT {self._limit_rows}) AS `table`"
                        )
                    else:
                        # get all rows
                        self._mysql_cur_dict.execute(f"SELECT COUNT(*) AS `total_records` FROM `{table_name}`")
                    total_records: t.Optional[t.Dict[str, ToPythonOutputTypes]] = self._mysql_cur_dict.fetchone()
                    if total_records is not None:
                        total_records_count: int = int(total_records["total_records"])  # type: ignore[arg-type]
                    else:
                        total_records_count = 0

                    # only continue if there is anything to transfer
                    if total_records_count > 0:
                        # populate it
                        self._mysql_cur.execute(
                            "SELECT * FROM `{table_name}` {limit}".format(
                                table_name=table_name,
                                limit=f"LIMIT {self._limit_rows}" if self._limit_rows > 0 else "",
                            )
                        )
                        columns: t.Tuple[str, ...] = tuple(column[0] for column in self._mysql_cur.description)  # type: ignore[union-attr]
                        # build the SQL string
                        sql = """
                            INSERT OR IGNORE
                            INTO "{table}" ({fields})
                            VALUES ({placeholders})
                        """.format(
                            table=table_name,
                            fields=('"{}", ' * len(columns)).rstrip(" ,").format(*columns),
                            placeholders=("?, " * len(columns)).rstrip(" ,"),
                        )
                        self._transfer_table_data(
                            table_name=table_name,  # type: ignore[arg-type]
                            sql=sql,
                            total_records=total_records_count,
                        )
        except Exception:  # pylint: disable=W0706
            raise
        finally:
            # re-enable foreign key checking once done transferring
            self._sqlite_cur.execute("PRAGMA foreign_keys=ON")
            if self._vacuum:
                self._logger.info("Vacuuming created SQLite database file.\nThis might take a while.")
                self._sqlite_cur.execute("VACUUM")
            self._logger.info("Done!")

--- mysql-to-sqlite3-2.1.6/mysql_to_sqlite3/types.py ---

"""Types for mysql-to-sqlite3."""
import os
import typing as t
from logging import Logger
from sqlite3 import Connection, Cursor

import typing_extensions as tx
from mysql.connector.abstracts import MySQLConnectionAbstract
from mysql.connector.cursor import MySQLCursorDict, MySQLCursorPrepared, MySQLCursorRaw


class MySQLtoSQLiteParams(tx.TypedDict):
    """MySQLtoSQLite parameters."""

    buffered: t.Optional[bool]
    chunk: t.Optional[int]
    collation: t.Optional[str]
    exclude_mysql_tables: t.Optional[t.Sequence[str]]
    json_as_text: t.Optional[bool]
    limit_rows: t.Optional[int]
    log_file: t.Optional[t.Union[str, "os.PathLike[t.Any]"]]
    mysql_database: str
    mysql_host: str
    mysql_password: t.Optional[t.Union[str, bool]]
    mysql_port: int
    mysql_ssl_disabled: t.Optional[bool]
    mysql_tables: t.Optional[t.Sequence[str]]
    mysql_user: str
    prefix_indices: t.Optional[bool]
    quiet: t.Optional[bool]
    sqlite_file: t.Union[str, "os.PathLike[t.Any]"]
    vacuum: t.Optional[bool]
    without_data: t.Optional[bool]
    without_foreign_keys: t.Optional[bool]


class MySQLtoSQLiteAttributes:
    """MySQLtoSQLite attributes."""

    _buffered: bool
    _chunk_size: t.Optional[int]
    _collation: str
    _current_chunk_number: int
    _exclude_mysql_tables: t.Sequence[str]
    _json_as_text: bool
    _limit_rows: int
    _logger: Logger
    _mysql: MySQLConnectionAbstract
    _mysql_cur: MySQLCursorRaw
    _mysql_cur_dict: MySQLCursorDict
    _mysql_cur_prepared: MySQLCursorPrepared
    _mysql_database: str
    _mysql_host: str
    _mysql_password: t.Optional[str]
    _mysql_port: int
    _mysql_ssl_disabled: bool
    _mysql_tables: t.Sequence[str]
    _mysql_user: str
    _prefix_indices: bool
    _quiet: bool
    _sqlite: Connection
    _sqlite_cur: Cursor
    _sqlite_file: t.Union[str, "os.PathLike[t.Any]"]
    _sqlite_json1_extension_enabled: bool
    _vacuum: bool
    _without_data: bool
    _without_foreign_keys: bool

--- mysql-to-sqlite3-2.1.6/pyproject.toml ---

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "mysql-to-sqlite3"
description = "A simple Python tool to transfer data from MySQL to SQLite 3"
readme = "README.md"
license = { text = "MIT" }
requires-python = ">=3.8"
authors = [
    { name = "Klemen Tusar", email = "techouse@gmail.com" },
]
keywords = [
    "mysql",
    "sqlite3",
    "transfer",
    "data",
    "migrate",
    "migration",
]
classifiers = [
    "Development Status :: 5 - Production/Stable",
    "Environment :: Console",
    "Intended Audience :: End Users/Desktop",
    "Intended Audience :: Developers",
    "Intended Audience :: System Administrators",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
    "Programming Language :: Python",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: Implementation :: CPython",
    "Topic :: Database",
]
dependencies = [
    "Click>=8.1.3",
    "mysql-connector-python>=8.2.0",
    "pytimeparse2",
    "python-slugify>=7.0.0",
    "simplejson>=3.19.0",
    "tqdm>=4.65.0",
    "tabulate",
    "typing_extensions",
]
dynamic = ["version"]

[project.urls]
Changelog = "https://github.com/techouse/mysql-to-sqlite3/blob/master/CHANGELOG.md"
Source = "https://github.com/techouse/mysql-to-sqlite3"
Sponsor = "https://github.com/sponsors/techouse"
PayPal = "https://paypal.me/ktusar"

[tool.hatch.version]
path = "mysql_to_sqlite3/__init__.py"

[tool.hatch.build.targets.sdist]
include = [
    "mysql_to_sqlite3",
    "tests",
    "README.md",
    "CHANGELOG.md",
    "CODE-OF-CONDUCT.md",
    "LICENSE",
    "requirements_dev.txt",
]

[project.scripts]
mysql2sqlite = "mysql_to_sqlite3.cli:cli"

[tool.black]
line-length = 120
target-version = ["py38", "py39", "py310", "py311", "py312"]
include = '\.pyi?$'
exclude = '''
(
  /(
      \.eggs
    | \.git
    | \.hg
    | \.mypy_cache
    | \.tox
    | \.venv
    | _build
    | buck-out
    | build
    | dist
  )/
  | foo.py
)
'''

[tool.isort]
line_length = 120
profile = "black"
lines_after_imports = 2
known_first_party = "mysql_to_sqlite3"
skip_gitignore = true

[tool.pytest.ini_options]
testpaths = ["tests"]
norecursedirs = [".*", "venv", "env", "*.egg", "dist", "build"]
minversion = "7.3.1"
addopts = "-rsxX -l --tb=short --strict-markers"
timeout = 300
markers = [
    "init: Run the initialisation test functions",
    "transfer: Run the main transfer test functions",
    "exceptions: Run SQL exception test functions",
    "cli: Run the cli test functions",
]

[tool.mypy]
python_version = "3.8"
exclude = [
    "tests",
    "build",
    "dist",
    "venv",
    "env",
]
warn_return_any = true
warn_unused_configs = true

[[tool.mypy.overrides]]
module = "pytimeparse2.*,factory.*,docker.*"
ignore_missing_imports = true

--- mysql-to-sqlite3-2.1.6/requirements_dev.txt ---

Click>=8.1.3
docker>=6.1.3
factory-boy
Faker>=18.10.0
mysql-connector-python>=8.2.0
mysqlclient>=2.1.1
pytest>=7.3.1
pytest-cov
pytest-mock
pytest-timeout
pytimeparse2
python-slugify>=7.0.0
types-python-slugify
simplejson>=3.19.1
types-simplejson
sqlalchemy>=2.0.0
sqlalchemy-utils
types-sqlalchemy-utils
tox
tqdm>=4.65.0
types-tqdm
packaging
tabulate
types-tabulate
typing_extensions
requests
types-requests
mypy>=1.3.0

--- mysql-to-sqlite3-2.1.6/tests/__init__.py ---

--- mysql-to-sqlite3-2.1.6/tests/conftest.py ---

import json
import os
import socket
import typing as t
from codecs import open
from contextlib import contextmanager
from os.path import abspath, dirname, isfile, join
from pathlib import Path
from random import choice
from string import ascii_lowercase, ascii_uppercase, digits
from time import sleep

import docker
import mysql.connector
import pytest
from _pytest._py.path import LocalPath
from _pytest.config import Config
from _pytest.config.argparsing import Parser
from _pytest.legacypath import TempdirFactory
from click.testing import CliRunner
from docker import DockerClient
from docker.errors import \
    NotFound
from docker.models.containers import Container
from faker import Faker
from mysql.connector import MySQLConnection, errorcode
from mysql.connector.connection_cext import CMySQLConnection
from mysql.connector.pooling import PooledMySQLConnection
from requests import HTTPError
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session
from sqlalchemy_utils import database_exists, drop_database

from . import database, factories, models


def pytest_addoption(parser: "Parser"):
    parser.addoption(
        "--mysql-user",
        dest="mysql_user",
        default="tester",
        help="MySQL user. Defaults to 'tester'.",
    )
    parser.addoption(
        "--mysql-password",
        dest="mysql_password",
        default="testpass",
        help="MySQL password. Defaults to 'testpass'.",
    )
    parser.addoption(
        "--mysql-database",
        dest="mysql_database",
        default="test_db",
        help="MySQL database name. Defaults to 'test_db'.",
    )
    parser.addoption(
        "--mysql-host",
        dest="mysql_host",
        default="0.0.0.0",
        help="Test against a MySQL server running on this host. Defaults to '0.0.0.0'.",
    )
    parser.addoption(
        "--mysql-port",
        dest="mysql_port",
        type=int,
        default=None,
        help="The TCP port of the MySQL server.",
    )
    parser.addoption(
        "--no-docker",
        dest="use_docker",
        default=True,
        action="store_false",
        help="Do not use a Docker MySQL image to run the tests. "
        "If you decide to use this switch you will have to use a physical MySQL server.",
    )
    parser.addoption(
        "--docker-mysql-image",
        dest="docker_mysql_image",
        default="mysql:latest",
        help="Run the tests against a specific MySQL Docker image. Defaults to mysql:latest. "
        "Check all supported versions here https://hub.docker.com/_/mysql",
    )


@pytest.fixture(scope="session", autouse=True)
def cleanup_hanged_docker_containers() -> None:
    try:
        client: DockerClient = docker.from_env()
        for container in client.containers.list():
            if container.name == "pytest_mysql_to_sqlite3":
                container.kill()
                break
    except Exception:
        pass


def pytest_keyboard_interrupt() -> None:
    try:
        client: DockerClient = docker.from_env()
        for container in client.containers.list():
            if container.name == "pytest_mysql_to_sqlite3":
                container.kill()
                break
    except Exception:
        pass


class Helpers:
    @staticmethod
    @contextmanager
    def not_raises(exception: t.Type[Exception]) -> t.Generator:
        try:
            yield
        except exception:
            raise pytest.fail(f"DID RAISE {exception}")

    @staticmethod
    @contextmanager
    def session_scope(db: database.Database) -> t.Generator:
        """Provide a transactional scope around a series of operations."""
        session: Session = db.Session()
        try:
            yield session
            session.commit()
        except Exception:
            session.rollback()
            raise
        finally:
            session.close()


@pytest.fixture
def helpers() -> t.Type[Helpers]:
    return Helpers


@pytest.fixture()
def sqlite_database(tmpdir: LocalPath) -> t.Union[str, Path, "os.PathLike[t.Any]"]:
    db_name: str = "".join(choice(ascii_uppercase + ascii_lowercase + digits) for _ in range(32))
    return Path(tmpdir.join(Path(f"{db_name}.sqlite3")))


def is_port_in_use(port: int, host: str = "0.0.0.0") -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0


class MySQLCredentials(t.NamedTuple):
    """MySQL credentials."""

    user: str
    password: str
    host: str
    port: int
    database: str


@pytest.fixture(scope="session")
def mysql_credentials(pytestconfig: Config) -> MySQLCredentials:
    db_credentials_file: str = abspath(join(dirname(__file__), "db_credentials.json"))
    if isfile(db_credentials_file):
        with open(db_credentials_file, "r", "utf-8") as fh:
            db_credentials: t.Dict[str, t.Any] = json.load(fh)
            return MySQLCredentials(
                user=db_credentials["mysql_user"],
                password=db_credentials["mysql_password"],
                database=db_credentials["mysql_database"],
                host=db_credentials["mysql_host"],
                port=db_credentials["mysql_port"],
            )

    port: int = pytestconfig.getoption("mysql_port") or 3306
    if pytestconfig.getoption("use_docker"):
        while is_port_in_use(port, pytestconfig.getoption("mysql_host")):
            if port >= 2**16 - 1:
                pytest.fail(f"No ports appear to be available on the host {pytestconfig.getoption('mysql_host')}")
            port += 1

    return MySQLCredentials(
        user=pytestconfig.getoption("mysql_user") or "tester",
        password=pytestconfig.getoption("mysql_password") or "testpass",
        database=pytestconfig.getoption("mysql_database") or "test_db",
        host=pytestconfig.getoption("mysql_host") or "0.0.0.0",
        port=port,
    )


@pytest.fixture(scope="session")
def mysql_instance(mysql_credentials: MySQLCredentials, pytestconfig: Config) -> t.Iterator[MySQLConnection]:
    container: t.Optional[Container] = None
    mysql_connection: t.Optional[t.Union[PooledMySQLConnection, MySQLConnection, CMySQLConnection]] = None
    mysql_available: bool = False
    mysql_connection_retries: int = 15  # failsafe

    db_credentials_file = abspath(join(dirname(__file__), "db_credentials.json"))
    if isfile(db_credentials_file):
        use_docker = False
    else:
        use_docker = pytestconfig.getoption("use_docker")

    if use_docker:
        """Connecting to a MySQL server within a Docker container is quite tricky :P
        Read more on the issue here https://hub.docker.com/_/mysql#no-connections-until-mysql-init-completes
        """
        try:
            client = docker.from_env()
        except Exception as err:
            pytest.fail(str(err))

        docker_mysql_image = pytestconfig.getoption("docker_mysql_image") or "mysql:latest"

        if not any(docker_mysql_image in image.tags for image in client.images.list()):
            print(f"Attempting to download Docker image {docker_mysql_image}")
            try:
                client.images.pull(docker_mysql_image)
            except (HTTPError, NotFound) as err:
                pytest.fail(str(err))

        container = client.containers.run(
            image=docker_mysql_image,
            name="pytest_mysql_to_sqlite3",
            ports={"3306/tcp": (mysql_credentials.host, f"{mysql_credentials.port}/tcp")},
            environment={
                "MYSQL_RANDOM_ROOT_PASSWORD": "yes",
                "MYSQL_USER": mysql_credentials.user,
                "MYSQL_PASSWORD": mysql_credentials.password,
                "MYSQL_DATABASE": mysql_credentials.database,
            },
            command=[
                "--character-set-server=utf8mb4",
                "--collation-server=utf8mb4_unicode_ci",
            ],
            detach=True,
            auto_remove=True,
        )

    while not mysql_available and mysql_connection_retries > 0:
        try:
            mysql_connection = mysql.connector.connect(
                user=mysql_credentials.user,
                password=mysql_credentials.password,
                host=mysql_credentials.host,
                port=mysql_credentials.port,
            )
        except mysql.connector.Error as err:
            if err.errno == errorcode.CR_SERVER_LOST:
                # sleep for two seconds and retry the connection
                sleep(2)
            else:
                raise
        finally:
            mysql_connection_retries -= 1
            if mysql_connection and mysql_connection.is_connected():
                mysql_available = True
                mysql_connection.close()
    else:
        if not mysql_available and mysql_connection_retries <= 0:
            raise ConnectionAbortedError("Maximum MySQL connection retries exhausted! Are you sure MySQL is running?")

    yield  # type: ignore[misc]

    if use_docker and container is not None:
        container.kill()


@pytest.fixture(scope="session")
def mysql_database(
    tmpdir_factory: TempdirFactory,
    mysql_instance: MySQLConnection,
    mysql_credentials: MySQLCredentials,
    _session_faker: Faker,
) -> t.Iterator[database.Database]:
    temp_image_dir: LocalPath = tmpdir_factory.mktemp("images")
    db: database.Database = database.Database(
        f"mysql+mysqldb://{mysql_credentials.user}:{mysql_credentials.password}@{mysql_credentials.host}:{mysql_credentials.port}/{mysql_credentials.database}"
    )
    with Helpers.session_scope(db) as session:
        for _ in range(_session_faker.pyint(min_value=12, max_value=24)):
            article: models.Article = factories.ArticleFactory()
            article.authors.append(factories.AuthorFactory())
            article.tags.append(factories.TagFactory())
            article.misc.append(factories.MiscFactory())
            for _ in range(_session_faker.pyint(min_value=1, max_value=4)):
                article.images.append(
                    factories.ImageFactory(
                        path=join(
                            str(temp_image_dir),
                            _session_faker.year(),
                            _session_faker.month(),
                            _session_faker.day_of_month(),
                            _session_faker.file_name(extension="jpg"),
                        )
                    )
                )
            session.add(article)
        for _ in range(_session_faker.pyint(min_value=12, max_value=24)):
            session.add(factories.CrazyNameFactory())
        try:
            session.commit()
        except IntegrityError:
            session.rollback()

    yield db

    if database_exists(db.engine.url):
        drop_database(db.engine.url)


@pytest.fixture()
def cli_runner() -> t.Iterator[CliRunner]:
    yield CliRunner()

--- mysql-to-sqlite3-2.1.6/tests/database.py ---

import typing as t
from datetime import datetime, timedelta
from decimal import Decimal

import simplejson as json
from sqlalchemy import create_engine
from sqlalchemy.engine import Engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy_utils import create_database, database_exists

from .models import Base


class Database:
    engine: \
        Engine
    Session: sessionmaker

    def __init__(self, database_uri):
        self.Session = sessionmaker()
        self.engine = create_engine(database_uri, json_serializer=self.dumps, json_deserializer=json.loads)
        if not database_exists(self.engine.url):
            create_database(self.engine.url)
        self._create_db_tables()
        self.Session.configure(bind=self.engine)

    def _create_db_tables(self) -> None:
        Base.metadata.create_all(self.engine)

    @classmethod
    def dumps(cls, data: t.Any) -> str:
        return json.dumps(data, default=cls.json_serializer)

    @staticmethod
    def json_serializer(data: t.Any) -> t.Optional[str]:
        if isinstance(data, datetime):
            return data.isoformat()
        if isinstance(data, Decimal):
            return str(data)
        if isinstance(data, timedelta):
            hours, remainder = divmod(data.total_seconds(), 3600)
            minutes, seconds = divmod(remainder, 60)
            return "{:02}:{:02}:{:02}".format(int(hours), int(minutes), int(seconds))
        return None

--- mysql-to-sqlite3-2.1.6/tests/factories.py ---

import typing as t
from os import environ

import factory

from . import faker_providers, models


factory.Faker.add_provider(faker_providers.DateTimeProviders)


class AuthorFactory(factory.Factory):
    class Meta:
        model: t.Type[models.Author] = models.Author

    name: factory.Faker = factory.Faker("name")


class ImageFactory(factory.Factory):
    class Meta:
        model: t.Type[models.Image] = models.Image

    path: factory.Faker = factory.Faker("file_path", depth=3, extension="jpg")
    description: factory.Faker = factory.Faker("sentence", nb_words=12, variable_nb_words=True)


class TagFactory(factory.Factory):
    class Meta:
        model: t.Type[models.Tag] = models.Tag

    name: factory.Faker = factory.Faker("sentence", nb_words=3, variable_nb_words=True)


class MiscFactory(factory.Factory):
    class Meta:
        model: t.Type[models.Misc] = models.Misc

    big_integer_field: factory.Faker = factory.Faker("pyint", max_value=10**9)
    large_binary_field: factory.Faker = factory.Faker("binary", length=1024 * 10)
    boolean_field: factory.Faker = factory.Faker("boolean")
    char_field: factory.Faker = factory.Faker("text", max_nb_chars=255)
    date_field: factory.Faker = factory.Faker("date_this_decade")
    date_time_field: factory.Faker = factory.Faker("date_time_this_century_without_microseconds")
    decimal_field: factory.Faker = factory.Faker("pydecimal", left_digits=8, right_digits=2)
    float_field: factory.Faker = factory.Faker("pyfloat", left_digits=8, right_digits=4)
    integer_field: factory.Faker = factory.Faker("pyint", min_value=-(2**31), max_value=2**31 - 1)
    if environ.get("LEGACY_DB", "0") == "0":
        json_field: factory.Faker = factory.Faker("pydict")
    nchar_field: factory.Faker = factory.Faker("text", max_nb_chars=255)
    numeric_field: factory.Faker = factory.Faker("pyfloat", left_digits=8, right_digits=4)
    unicode_field: factory.Faker = factory.Faker("text", max_nb_chars=255)
    real_field: factory.Faker = factory.Faker("pyfloat", left_digits=8, right_digits=4)
    small_integer_field: factory.Faker = factory.Faker("pyint", min_value=-(2**15), max_value=2**15 - 1)
    string_field: factory.Faker = factory.Faker("text", max_nb_chars=255)
    text_field: factory.Faker = factory.Faker("text", max_nb_chars=1024)
    time_field: factory.Faker = factory.Faker("time_object_without_microseconds")
    varbinary_field: factory.Faker = factory.Faker("binary", length=255)
    varchar_field: factory.Faker = factory.Faker("text", max_nb_chars=255)
    timestamp_field: factory.Faker = factory.Faker("date_time_this_century_without_microseconds")


class ArticleFactory(factory.Factory):
    class Meta:
        model: t.Type[models.Article] = models.Article

    hash: factory.Faker = factory.Faker("md5")
    title: factory.Faker = factory.Faker("sentence", nb_words=6)
    slug: factory.Faker = factory.Faker("slug")
    content: factory.Faker = factory.Faker("text", max_nb_chars=1024)
    status: factory.Faker = factory.Faker("pystr", max_chars=1)
    published: factory.Faker = factory.Faker("date_between", start_date="-1y", end_date="-1d")

    @factory.post_generation
    def authors(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if extracted:
            # A list of authors were passed in, use them
            for author in extracted:
                self.authors.add(author)

    @factory.post_generation
    def tags(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if extracted:
            # A list of tags were passed in, use them
            for tag in extracted:
                self.tags.add(tag)

    @factory.post_generation
    def images(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if extracted:
            # A list of images were passed in, use them
            for image in extracted:
                self.images.add(image)

    @factory.post_generation
    def misc(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if extracted:
            # A list of misc records were passed in, use them
            for misc in extracted:
                self.misc.add(misc)


class CrazyNameFactory(factory.Factory):
    class Meta:
        model: t.Type[models.CrazyName] = models.CrazyName

    name: factory.Faker = factory.Faker("name")

--- mysql-to-sqlite3-2.1.6/tests/faker_providers.py ---

import datetime
from typing import Optional

from faker.providers import BaseProvider, date_time
from faker.typing import DateParseType


class DateTimeProviders(BaseProvider):
    def time_object_without_microseconds(self, end_datetime: Optional[DateParseType] = None) -> datetime.time:
        return date_time.Provider(self.generator).time_object(end_datetime).replace(microsecond=0)

    def date_time_this_century_without_microseconds(
        self,
        before_now: bool = True,
        after_now: bool = False,
        tzinfo: Optional[datetime.tzinfo] = None,
    ) -> datetime.datetime:
        return (
            date_time.Provider(self.generator)
            .date_time_this_century(before_now, after_now, tzinfo)
            .replace(microsecond=0)
        )

--- mysql-to-sqlite3-2.1.6/tests/func/__init__.py ---

--- mysql-to-sqlite3-2.1.6/tests/func/mysql_to_sqlite3_test.py ---

import logging
import os
import re
import typing as t
from collections import namedtuple
from decimal import Decimal
from pathlib import Path
from random import choice, sample

import mysql.connector
import pytest
import simplejson as json
from _pytest._py.path import LocalPath
from _pytest.logging import LogCaptureFixture
from faker import Faker
from mysql.connector import MySQLConnection, errorcode
from mysql.connector.connection_cext import \
CMySQLConnection from mysql.connector.cursor import MySQLCursor from mysql.connector.pooling import PooledMySQLConnection from pytest_mock import MockFixture from sqlalchemy import ( Connection, CursorResult, Engine, Inspector, MetaData, Row, Select, Table, TextClause, create_engine, inspect, select, text, ) from sqlalchemy.engine.interfaces import ReflectedIndex from mysql_to_sqlite3 import MySQLtoSQLite from tests.conftest import Helpers, MySQLCredentials from tests.database import Database @pytest.mark.usefixtures("mysql_instance") class TestMySQLtoSQLite: @pytest.mark.init @pytest.mark.parametrize( "quiet", [ pytest.param(False, id="verbose"), pytest.param(True, id="quiet"), ], ) def test_missing_mysql_user_raises_exception(self, mysql_credentials: MySQLCredentials, quiet: bool) -> None: with pytest.raises(ValueError) as excinfo: MySQLtoSQLite(mysql_database=mysql_credentials.database, quiet=quiet) # type: ignore[call-arg] assert "Please provide a MySQL user" in str(excinfo.value) @pytest.mark.init @pytest.mark.parametrize( "quiet", [ pytest.param(False, id="verbose"), pytest.param(True, id="quiet"), ], ) def test_missing_mysql_database_raises_exception(self, faker: Faker, quiet: bool) -> None: with pytest.raises(ValueError) as excinfo: MySQLtoSQLite(mysql_user=faker.first_name().lower(), quiet=quiet) # type: ignore[call-arg] assert "Please provide a MySQL database" in str(excinfo.value) @pytest.mark.init @pytest.mark.parametrize( "quiet", [ pytest.param(False, id="verbose"), pytest.param(True, id="quiet"), ], ) def test_invalid_mysql_credentials_raises_access_denied_exception( self, sqlite_database: "os.PathLike[t.Any]", mysql_database: Database, mysql_credentials: MySQLCredentials, faker: Faker, quiet: bool, ) -> None: with pytest.raises(mysql.connector.Error) as excinfo: MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=faker.first_name().lower(), mysql_password=faker.password(length=16), 
mysql_database=mysql_credentials.database, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, quiet=quiet, ) assert "Access denied for user" in str(excinfo.value) @pytest.mark.init @pytest.mark.parametrize( "quiet", [ pytest.param(False, id="verbose"), pytest.param(True, id="quiet"), ], ) def test_bad_mysql_connection( self, sqlite_database: "os.PathLike[t.Any]", mysql_credentials: MySQLCredentials, mocker: MockFixture, quiet: bool, ) -> None: FakeConnector = namedtuple("FakeConnector", ["is_connected"]) mocker.patch.object( mysql.connector, "connect", return_value=FakeConnector(is_connected=lambda: False), ) with pytest.raises((ConnectionError, IOError)) as excinfo: MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=mysql_credentials.user, mysql_password=mysql_credentials.password, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, mysql_database=mysql_credentials.database, chunk=1000, quiet=quiet, ) assert "Unable to connect to MySQL" in str(excinfo.value) @pytest.mark.init @pytest.mark.parametrize( "exception, quiet", [ pytest.param( mysql.connector.Error(msg="Unknown database 'test_db'", errno=errorcode.ER_BAD_DB_ERROR), False, id="mysql.connector.Error verbose", ), pytest.param( mysql.connector.Error(msg="Unknown database 'test_db'", errno=errorcode.ER_BAD_DB_ERROR), True, id="mysql.connector.Error quiet", ), pytest.param(Exception("Unknown database 'test_db'"), False, id="Exception verbose"), pytest.param(Exception("Unknown database 'test_db'"), True, id="Exception quiet"), ], ) def test_non_existing_mysql_database_raises_exception( self, sqlite_database: "os.PathLike[t.Any]", mysql_database: Database, mysql_credentials: MySQLCredentials, faker: Faker, mocker: MockFixture, caplog: LogCaptureFixture, exception: Exception, quiet: bool, ) -> None: class FakeMySQLConnection(MySQLConnection): @property def database(self) -> str: return self._database @database.setter def database(self, value) -> 
None: self._database = value # raise a fake exception raise exception def is_connected(self) -> bool: return True def cursor( self, buffered: t.Optional[bool] = None, raw: t.Optional[bool] = None, prepared: t.Optional[bool] = None, cursor_class: t.Optional[t.Type[MySQLCursor]] = None, dictionary: t.Optional[bool] = None, named_tuple: t.Optional[bool] = None, ) -> t.Union[t.Any, MySQLCursor]: return True caplog.set_level(logging.DEBUG) mocker.patch.object(mysql.connector, "connect", return_value=FakeMySQLConnection()) with pytest.raises((mysql.connector.Error, Exception)) as excinfo: MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=mysql_credentials.user, mysql_password=mysql_credentials.password, mysql_database=mysql_credentials.database, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, quiet=quiet, ) assert any("MySQL Database does not exist!" in message for message in caplog.messages) assert "Unknown database" in str(excinfo.value) @pytest.mark.xfail @pytest.mark.init @pytest.mark.parametrize( "quiet", [ pytest.param(False, id="verbose"), pytest.param(True, id="quiet"), ], ) def test_log_to_file( self, sqlite_database: "os.PathLike[t.Any]", mysql_database: Database, mysql_credentials: MySQLCredentials, caplog: LogCaptureFixture, tmpdir: LocalPath, faker: Faker, quiet: bool, ) -> None: log_file: LocalPath = tmpdir.join(Path("db.log")) caplog.set_level(logging.DEBUG) with pytest.raises(mysql.connector.Error): MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=faker.first_name().lower(), mysql_password=faker.password(length=16), mysql_database=mysql_credentials.database, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, log_file=str(log_file), quiet=quiet, ) assert any("Access denied for user" in message for message in caplog.messages) with log_file.open("r") as log_fh: log = log_fh.read() if len(caplog.messages) > 1: assert caplog.messages[1] in log else: assert 
caplog.messages[0] in log assert re.match(r"^\d{4,}-\d{2,}-\d{2,}\s+\d{2,}:\d{2,}:\d{2,}\s+\w+\s+", log) is not None @pytest.mark.transfer @pytest.mark.parametrize( "chunk, vacuum, buffered, prefix_indices", [ # 0000 pytest.param( None, False, False, False, id="no chunk, no vacuum, no buffered cursor, no prefix indices", ), # 0001 pytest.param( None, False, False, True, id="no chunk, no vacuum, no buffered cursor, prefix indices", ), # 1110 pytest.param( 10, True, True, False, id="chunk, vacuum, buffered cursor, no prefix indices", ), # 1111 pytest.param( 10, True, True, True, id="chunk, vacuum, buffered cursor, prefix indices", ), # 1100 pytest.param( 10, True, False, False, id="chunk, vacuum, no buffered cursor, no prefix indices", ), # 1101 pytest.param( 10, True, False, True, id="chunk, vacuum, no buffered cursor, prefix indices", ), # 0110 pytest.param( None, True, True, False, id="no chunk, vacuum, buffered cursor, no prefix indices", ), # 0111 pytest.param( None, True, True, True, id="no chunk, vacuum, buffered cursor, prefix indices", ), # 0100 pytest.param( None, True, False, False, id="no chunk, vacuum, no buffered cursor, no prefix indices", ), # 0101 pytest.param( None, True, False, True, id="no chunk, vacuum, no buffered cursor, prefix indices", ), # 1000 pytest.param( 10, False, False, False, id="chunk, no vacuum, no buffered cursor, no prefix indices", ), # 1001 pytest.param( 10, False, False, True, id="chunk, no vacuum, no buffered cursor, prefix indices", ), # 0010 pytest.param( None, False, True, False, id="no chunk, no vacuum, buffered cursor, no prefix indices", ), # 0011 pytest.param( None, False, True, True, id="no chunk, no vacuum, buffered cursor, prefix indices", ), # 1010 pytest.param( 10, False, True, False, id="chunk, no vacuum, buffered cursor, no prefix indices", ), # 1011 pytest.param( 10, False, True, True, id="chunk, no vacuum, buffered cursor, prefix indices", ), ], ) def test_transfer_transfers_all_tables_from_mysql_to_sqlite( 
self, sqlite_database: "os.PathLike[t.Any]", mysql_database: Database, mysql_credentials: MySQLCredentials, helpers: Helpers, caplog: LogCaptureFixture, chunk: t.Optional[int], vacuum: bool, buffered: bool, prefix_indices: bool, ) -> None: proc: MySQLtoSQLite = MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=mysql_credentials.user, mysql_password=mysql_credentials.password, mysql_database=mysql_credentials.database, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, chunk=chunk, vacuum=vacuum, buffered=buffered, prefix_indices=prefix_indices, ) caplog.set_level(logging.DEBUG) proc.transfer() assert all( message in [record.message for record in caplog.records] for message in { "Transferring table article_authors", "Transferring table article_images", "Transferring table article_tags", "Transferring table articles", "Transferring table authors", "Transferring table images", "Transferring table tags", "Transferring table misc", "Done!", } ) assert all(record.levelname == "INFO" for record in caplog.records) assert not any(record.levelname == "ERROR" for record in caplog.records) sqlite_engine: Engine = create_engine( f"sqlite:///{sqlite_database}", json_serializer=json.dumps, json_deserializer=json.loads, ) sqlite_cnx: Connection = sqlite_engine.connect() sqlite_inspect: Inspector = inspect(sqlite_engine) sqlite_tables: t.List[str] = sqlite_inspect.get_table_names() mysql_engine: Engine = create_engine( f"mysql+mysqldb://{mysql_credentials.user}:{mysql_credentials.password}@{mysql_credentials.host}:{mysql_credentials.port}/{mysql_credentials.database}" ) mysql_cnx: Connection = mysql_engine.connect() mysql_inspect: Inspector = inspect(mysql_engine) mysql_tables: t.List[str] = mysql_inspect.get_table_names() mysql_connector_connection: t.Union[ PooledMySQLConnection, MySQLConnection, CMySQLConnection ] = mysql.connector.connect( user=mysql_credentials.user, password=mysql_credentials.password, 
host=mysql_credentials.host, port=mysql_credentials.port, database=mysql_credentials.database, ) server_version: t.Tuple[int, ...] = mysql_connector_connection.get_server_version() """ Test if both databases have the same table names """ assert sqlite_tables == mysql_tables """ Test if all the tables have the same column names """ for table_name in sqlite_tables: assert [column["name"] for column in sqlite_inspect.get_columns(table_name)] == [ column["name"] for column in mysql_inspect.get_columns(table_name) ] """ Test if all the tables have the same indices """ index_keys: t.Tuple[str, ...] = ("name", "column_names", "unique") mysql_indices: t.List[ReflectedIndex] = [] for table_name in mysql_tables: for index in mysql_inspect.get_indexes(table_name): mysql_index: t.Dict[str, t.Any] = {} for key in index_keys: if key == "name" and prefix_indices: mysql_index[key] = f"{table_name}_{index[key]}" # type: ignore[literal-required] else: mysql_index[key] = index[key] # type: ignore[literal-required] mysql_indices.append(t.cast(ReflectedIndex, mysql_index)) for table_name in sqlite_tables: for sqlite_index in sqlite_inspect.get_indexes(table_name): sqlite_index["unique"] = bool(sqlite_index["unique"]) if "dialect_options" in sqlite_index: sqlite_index.pop("dialect_options", None) assert sqlite_index in mysql_indices """ Test if all the tables have the same foreign keys """ for table_name in mysql_tables: mysql_fk_stmt: TextClause = text( """ SELECT k.COLUMN_NAME AS `from`, k.REFERENCED_TABLE_NAME AS `table`, k.REFERENCED_COLUMN_NAME AS `to`, c.UPDATE_RULE AS `on_update`, c.DELETE_RULE AS `on_delete` FROM information_schema.TABLE_CONSTRAINTS AS i {JOIN} information_schema.KEY_COLUMN_USAGE AS k ON i.CONSTRAINT_NAME = k.CONSTRAINT_NAME {JOIN} information_schema.REFERENTIAL_CONSTRAINTS c ON c.CONSTRAINT_NAME = i.CONSTRAINT_NAME WHERE i.TABLE_SCHEMA = :table_schema AND i.TABLE_NAME = :table_name AND i.CONSTRAINT_TYPE = :constraint_type """.format( # MySQL 8.0.19 still works 
with "LEFT JOIN" everything above requires "JOIN" JOIN="JOIN" if (server_version[0] == 8 and server_version[2] > 19) else "LEFT JOIN" ) ).bindparams( table_schema=mysql_credentials.database, table_name=table_name, constraint_type="FOREIGN KEY", ) mysql_fk_result: CursorResult = mysql_cnx.execute(mysql_fk_stmt) mysql_foreign_keys: t.List[t.Dict[str, t.Any]] = [dict(row) for row in mysql_fk_result.mappings()] sqlite_fk_stmt: TextClause = text(f'PRAGMA foreign_key_list("{table_name}")') sqlite_fk_result: CursorResult = sqlite_cnx.execute(sqlite_fk_stmt) if sqlite_fk_result.returns_rows: for row in sqlite_fk_result.mappings(): fk: t.Dict[str, t.Any] = dict(row) assert { "table": fk["table"], "from": fk["from"], "to": fk["to"], "on_update": fk["on_update"], "on_delete": fk["on_delete"], } in mysql_foreign_keys """ Check if all the data was transferred correctly """ sqlite_results: t.List[t.Tuple[t.Tuple[t.Any, ...], ...]] = [] mysql_results: t.List[t.Tuple[t.Tuple[t.Any, ...], ...]] = [] meta: MetaData = MetaData() for table_name in sqlite_tables: sqlite_table: Table = Table(table_name, meta, autoload_with=sqlite_engine) sqlite_stmt: Select = select(sqlite_table) sqlite_result: t.List[Row[t.Any]] = list(sqlite_cnx.execute(sqlite_stmt).fetchall()) sqlite_result.sort() sqlite_result_adapted: t.Tuple[t.Tuple[t.Any, ...], ...] = tuple( tuple(float(data) if isinstance(data, Decimal) else data for data in row) for row in sqlite_result ) sqlite_results.append(sqlite_result_adapted) for table_name in mysql_tables: mysql_table: Table = Table(table_name, meta, autoload_with=mysql_engine) mysql_stmt: Select = select(mysql_table) mysql_result: t.List[Row[t.Any]] = list(mysql_cnx.execute(mysql_stmt).fetchall()) mysql_result.sort() mysql_result_adapted: t.Tuple[t.Tuple[t.Any, ...], ...] 
= tuple( tuple(float(data) if isinstance(data, Decimal) else data for data in row) for row in mysql_result ) mysql_results.append(mysql_result_adapted) assert sqlite_results == mysql_results mysql_cnx.close() sqlite_cnx.close() mysql_engine.dispose() sqlite_engine.dispose() @pytest.mark.transfer def test_specific_tables_include_and_exclude_are_mutually_exclusive_options( self, sqlite_database: "os.PathLike[t.Any]", mysql_credentials: MySQLCredentials, caplog: LogCaptureFixture, faker: Faker, ) -> None: with pytest.raises(ValueError) as excinfo: MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=mysql_credentials.user, mysql_password=mysql_credentials.password, mysql_database=mysql_credentials.database, mysql_tables=faker.words(nb=3), exclude_mysql_tables=faker.words(nb=3), mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, ) assert "mysql_tables and exclude_mysql_tables are mutually exclusive" in str(excinfo.value) @pytest.mark.transfer @pytest.mark.parametrize( "chunk, vacuum, buffered, prefix_indices, exclude_tables", [ # 00000 pytest.param( None, False, False, False, False, id="no chunk, no vacuum, no buffered cursor, no prefix indices, include tables", ), # 00001 pytest.param( None, False, False, False, True, id="no chunk, no vacuum, no buffered cursor, no prefix indices, exclude tables", ), # 00010 pytest.param( None, False, False, True, False, id="no chunk, no vacuum, no buffered cursor, prefix indices, include tables", ), # 00011 pytest.param( None, False, False, True, True, id="no chunk, no vacuum, no buffered cursor, prefix indices, exclude tables", ), # 11100 pytest.param( 10, True, True, False, False, id="chunk, vacuum, buffered cursor, no prefix indices, include tables", ), # 11101 pytest.param( 10, True, True, False, True, id="chunk, vacuum, buffered cursor, no prefix indices, exclude tables", ), # 11110 pytest.param( 10, True, True, True, False, id="chunk, vacuum, buffered cursor, prefix indices, include 
tables", ), # 11111 pytest.param( 10, True, True, True, True, id="chunk, vacuum, buffered cursor, prefix indices, exclude tables", ), # 11000 pytest.param( 10, True, False, False, False, id="chunk, vacuum, no buffered cursor, no prefix indices, include tables", ), # 11001 pytest.param( 10, True, False, False, True, id="chunk, vacuum, no buffered cursor, no prefix indices, exclude tables", ), # 11010 pytest.param( 10, True, False, True, False, id="chunk, vacuum, no buffered cursor, prefix indices, include tables", ), # 11011 pytest.param( 10, True, False, True, True, id="chunk, vacuum, no buffered cursor, prefix indices, exclude tables", ), # 01100 pytest.param( None, True, True, False, False, id="no chunk, vacuum, buffered cursor, no prefix indices, include tables", ), # 01101 pytest.param( None, True, True, False, True, id="no chunk, vacuum, buffered cursor, no prefix indices, exclude tables", ), # 01110 pytest.param( None, True, True, True, False, id="no chunk, vacuum, buffered cursor, prefix indices, include tables", ), # 01111 pytest.param( None, True, True, True, True, id="no chunk, vacuum, buffered cursor, prefix indices, exclude tables", ), # 01000 pytest.param( None, True, False, False, False, id="no chunk, vacuum, no buffered cursor, no prefix indices, include tables", ), # 01001 pytest.param( None, True, False, False, True, id="no chunk, vacuum, no buffered cursor, no prefix indices, exclude tables", ), # 01010 pytest.param( None, True, False, True, False, id="no chunk, vacuum, no buffered cursor, prefix indices, include tables", ), # 01011 pytest.param( None, True, False, True, True, id="no chunk, vacuum, no buffered cursor, prefix indices, exclude tables", ), # 10000 pytest.param( 10, False, False, False, False, id="chunk, no vacuum, no buffered cursor, no prefix indices, include tables", ), # 10001 pytest.param( 10, False, False, False, True, id="chunk, no vacuum, no buffered cursor, no prefix indices, exclude tables", ), # 10010 pytest.param( 10, 
False, False, True, False, id="chunk, no vacuum, no buffered cursor, prefix indices, include tables", ), # 10011 pytest.param( 10, False, False, True, True, id="chunk, no vacuum, no buffered cursor, prefix indices, exclude tables", ), # 00100 pytest.param( None, False, True, False, False, id="no chunk, no vacuum, buffered cursor, no prefix indices, include tables", ), # 00101 pytest.param( None, False, True, False, True, id="no chunk, no vacuum, buffered cursor, no prefix indices, exclude tables", ), # 00110 pytest.param( None, False, True, True, False, id="no chunk, no vacuum, buffered cursor, prefix indices, include tables", ), # 00111 pytest.param( None, False, True, True, True, id="no chunk, no vacuum, buffered cursor, prefix indices, exclude tables", ), # 10100 pytest.param( 10, False, True, False, False, id="chunk, no vacuum, buffered cursor, no prefix indices, include tables", ), # 10101 pytest.param( 10, False, True, False, True, id="chunk, no vacuum, buffered cursor, no prefix indices, exclude tables", ), # 10110 pytest.param( 10, False, True, True, False, id="chunk, no vacuum, buffered cursor, prefix indices, include tables", ), # 10111 pytest.param( 10, False, True, True, True, id="chunk, no vacuum, buffered cursor, prefix indices, exclude tables", ), ], ) def test_transfer_specific_tables_transfers_only_specified_tables_from_mysql_to_sqlite( self, sqlite_database: "os.PathLike[t.Any]", mysql_database: Database, mysql_credentials: MySQLCredentials, helpers: Helpers, caplog: LogCaptureFixture, chunk: t.Optional[int], vacuum: bool, buffered: bool, prefix_indices: bool, exclude_tables: bool, ) -> None: mysql_engine: Engine = create_engine( f"mysql+mysqldb://{mysql_credentials.user}:{mysql_credentials.password}@{mysql_credentials.host}:{mysql_credentials.port}/{mysql_credentials.database}" ) mysql_cnx: Connection = mysql_engine.connect() mysql_inspect: Inspector = inspect(mysql_engine) mysql_tables: t.List[str] = mysql_inspect.get_table_names() table_number: 
int = choice(range(1, len(mysql_tables))) random_mysql_tables: t.List[str] = sample(mysql_tables, table_number) random_mysql_tables.sort() remaining_tables: t.List[str] = list(set(mysql_tables) - set(random_mysql_tables)) remaining_tables.sort() proc: MySQLtoSQLite = MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=mysql_credentials.user, mysql_password=mysql_credentials.password, mysql_database=mysql_credentials.database, mysql_tables=None if exclude_tables else random_mysql_tables, exclude_mysql_tables=random_mysql_tables if exclude_tables else None, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, prefix_indices=prefix_indices, ) caplog.set_level(logging.DEBUG) proc.transfer() assert all( message in [record.message for record in caplog.records] for message in set( [ f"Transferring table {table}" for table in (remaining_tables if exclude_tables else random_mysql_tables) ] + ["Done!"] ) ) assert all(record.levelname == "INFO" for record in caplog.records) assert not any(record.levelname == "ERROR" for record in caplog.records) sqlite_engine: Engine = create_engine( f"sqlite:///{sqlite_database}", json_serializer=json.dumps, json_deserializer=json.loads, ) sqlite_cnx: Connection = sqlite_engine.connect() sqlite_inspect: Inspector = inspect(sqlite_engine) sqlite_tables: t.List[str] = sqlite_inspect.get_table_names() """ Test if both databases have the same table names """ if exclude_tables: assert set(sqlite_tables) == set(remaining_tables) else: assert set(sqlite_tables) == set(random_mysql_tables) """ Test if all the tables have the same column names """ for table_name in sqlite_tables: assert [column["name"] for column in sqlite_inspect.get_columns(table_name)] == [ column["name"] for column in mysql_inspect.get_columns(table_name) ] """ Test if all the tables have the same indices """ index_keys: t.Tuple[str, ...] 
= ("name", "column_names", "unique") mysql_indices: t.List[ReflectedIndex] = [] for table_name in remaining_tables if exclude_tables else random_mysql_tables: for index in mysql_inspect.get_indexes(table_name): mysql_index: t.Dict[str, t.Any] = {} for key in index_keys: if key == "name" and prefix_indices: mysql_index[key] = f"{table_name}_{index[key]}" # type: ignore[literal-required] else: mysql_index[key] = index[key] # type: ignore[literal-required] mysql_indices.append(t.cast(ReflectedIndex, mysql_index)) for table_name in sqlite_tables: for sqlite_index in sqlite_inspect.get_indexes(table_name): sqlite_index["unique"] = bool(sqlite_index["unique"]) if "dialect_options" in sqlite_index: sqlite_index.pop("dialect_options", None) assert sqlite_index in mysql_indices """ Check if all the data was transferred correctly """ sqlite_results: t.List[t.Tuple[t.Tuple[t.Any, ...], ...]] = [] mysql_results: t.List[t.Tuple[t.Tuple[t.Any, ...], ...]] = [] meta: MetaData = MetaData() for table_name in sqlite_tables: sqlite_table: Table = Table(table_name, meta, autoload_with=sqlite_engine) sqlite_stmt: Select = select(sqlite_table) sqlite_result: t.List[Row[t.Any]] = list(sqlite_cnx.execute(sqlite_stmt).fetchall()) sqlite_result.sort() sqlite_result_adapted = tuple( tuple(float(data) if isinstance(data, Decimal) else data for data in row) for row in sqlite_result ) sqlite_results.append(sqlite_result_adapted) for table_name in remaining_tables if exclude_tables else random_mysql_tables: mysql_table: Table = Table(table_name, meta, autoload_with=mysql_engine) mysql_stmt: Select = select(mysql_table) mysql_result: t.List[Row[t.Any]] = list(mysql_cnx.execute(mysql_stmt).fetchall()) mysql_result.sort() mysql_result_adapted = tuple( tuple(float(data) if isinstance(data, Decimal) else data for data in row) for row in mysql_result ) mysql_results.append(mysql_result_adapted) assert sqlite_results == mysql_results mysql_cnx.close() sqlite_cnx.close() mysql_engine.dispose() 
sqlite_engine.dispose() @pytest.mark.transfer @pytest.mark.parametrize( "chunk, vacuum, buffered, prefix_indices", [ # 0000 pytest.param( None, False, False, False, id="no chunk, no vacuum, no buffered cursor, no prefix indices", ), # 0001 pytest.param( None, False, False, True, id="no chunk, no vacuum, no buffered cursor, prefix indices", ), # 1110 pytest.param( 10, True, True, False, id="chunk, vacuum, buffered cursor, no prefix indices", ), # 1111 pytest.param( 10, True, True, True, id="chunk, vacuum, buffered cursor, prefix indices", ), # 1100 pytest.param( 10, True, False, False, id="chunk, vacuum, no buffered cursor, no prefix indices", ), # 1101 pytest.param( 10, True, False, True, id="chunk, vacuum, no buffered cursor, prefix indices", ), # 0110 pytest.param( None, True, True, False, id="no chunk, vacuum, buffered cursor, no prefix indices", ), # 0111 pytest.param( None, True, True, True, id="no chunk, vacuum, buffered cursor, prefix indices", ), # 0100 pytest.param( None, True, False, False, id="no chunk, vacuum, no buffered cursor, no prefix indices", ), # 0101 pytest.param( None, True, False, True, id="no chunk, vacuum, no buffered cursor, prefix indices", ), # 1000 pytest.param( 10, False, False, False, id="chunk, no vacuum, no buffered cursor, no prefix indices", ), # 1001 pytest.param( 10, False, False, True, id="chunk, no vacuum, no buffered cursor, prefix indices", ), # 0010 pytest.param( None, False, True, False, id="no chunk, no vacuum, buffered cursor, no prefix indices", ), # 0011 pytest.param( None, False, True, True, id="no chunk, no vacuum, buffered cursor, prefix indices", ), # 1010 pytest.param( 10, False, True, False, id="chunk, no vacuum, buffered cursor, no prefix indices", ), # 1011 pytest.param( 10, False, True, True, id="chunk, no vacuum, buffered cursor, prefix indices", ), ], ) def test_transfer_limited_rows_from_mysql_to_sqlite( self, sqlite_database: "os.PathLike[t.Any]", mysql_database: Database, mysql_credentials: 
MySQLCredentials, helpers: Helpers, caplog: LogCaptureFixture, chunk: t.Optional[int], vacuum: bool, buffered: bool, prefix_indices: bool, ) -> None: limit_rows: int = choice(range(1, 10)) proc: MySQLtoSQLite = MySQLtoSQLite( # type: ignore[call-arg] sqlite_file=sqlite_database, mysql_user=mysql_credentials.user, mysql_password=mysql_credentials.password, mysql_database=mysql_credentials.database, limit_rows=limit_rows, mysql_host=mysql_credentials.host, mysql_port=mysql_credentials.port, prefix_indices=prefix_indices, ) caplog.set_level(logging.DEBUG) proc.transfer() assert all( message in [record.message for record in caplog.records] for message in { "Transferring table article_authors", "Transferring table article_images", "Transferring table article_tags", "Transferring table articles", "Transferring table authors", "Transferring table images", "Transferring table tags", "Transferring table misc", "Done!", } ) assert all(record.levelname == "INFO" for record in caplog.records) assert not any(record.levelname == "ERROR" for record in caplog.records) sqlite_engine: Engine = create_engine( f"sqlite:///{sqlite_database}", json_serializer=json.dumps, json_deserializer=json.loads, ) sqlite_cnx: Connection = sqlite_engine.connect() sqlite_inspect: Inspector = inspect(sqlite_engine) sqlite_tables: t.List[str] = sqlite_inspect.get_table_names() mysql_engine: Engine = create_engine( f"mysql+mysqldb://{mysql_credentials.user}:{mysql_credentials.password}@{mysql_credentials.host}:{mysql_credentials.port}/{mysql_credentials.database}" ) mysql_cnx: Connection = mysql_engine.connect() mysql_inspect: Inspector = inspect(mysql_engine) mysql_tables: t.List[str] = mysql_inspect.get_table_names() mysql_connector_connection: t.Union[ PooledMySQLConnection, MySQLConnection, CMySQLConnection ] = mysql.connector.connect( user=mysql_credentials.user, password=mysql_credentials.password, host=mysql_credentials.host, port=mysql_credentials.port, database=mysql_credentials.database, ) 
server_version: t.Tuple[int, ...] = mysql_connector_connection.get_server_version() """ Test if both databases have the same table names """ assert sqlite_tables == mysql_tables """ Test if all the tables have the same column names """ for table_name in sqlite_tables: assert [column["name"] for column in sqlite_inspect.get_columns(table_name)] == [ column["name"] for column in mysql_inspect.get_columns(table_name) ] """ Test if all the tables have the same indices """ index_keys: t.Tuple[str, ...] = ("name", "column_names", "unique") mysql_indices: t.List[ReflectedIndex] = [] for table_name in mysql_tables: for index in mysql_inspect.get_indexes(table_name): mysql_index: t.Dict[str, t.Any] = {} for key in index_keys: if key == "name" and prefix_indices: mysql_index[key] = f"{table_name}_{index[key]}" # type: ignore[literal-required] else: mysql_index[key] = index[key] # type: ignore[literal-required] mysql_indices.append(t.cast(ReflectedIndex, mysql_index)) for table_name in sqlite_tables: for sqlite_index in sqlite_inspect.get_indexes(table_name): sqlite_index["unique"] = bool(sqlite_index["unique"]) if "dialect_options" in sqlite_index: sqlite_index.pop("dialect_options", None) assert sqlite_index in mysql_indices """ Test if all the tables have the same foreign keys """ for table_name in mysql_tables: mysql_fk_stmt: TextClause = text( """ SELECT k.COLUMN_NAME AS `from`, k.REFERENCED_TABLE_NAME AS `table`, k.REFERENCED_COLUMN_NAME AS `to`, c.UPDATE_RULE AS `on_update`, c.DELETE_RULE AS `on_delete` FROM information_schema.TABLE_CONSTRAINTS AS i {JOIN} information_schema.KEY_COLUMN_USAGE AS k ON i.CONSTRAINT_NAME = k.CONSTRAINT_NAME {JOIN} information_schema.REFERENTIAL_CONSTRAINTS c ON c.CONSTRAINT_NAME = i.CONSTRAINT_NAME WHERE i.TABLE_SCHEMA = :table_schema AND i.TABLE_NAME = :table_name AND i.CONSTRAINT_TYPE = :constraint_type """.format( # MySQL 8.0.19 still works with "LEFT JOIN" everything above requires "JOIN" JOIN="JOIN" if (server_version[0] == 8 and 
server_version[2] > 19) else "LEFT JOIN" ) ).bindparams( table_schema=mysql_credentials.database, table_name=table_name, constraint_type="FOREIGN KEY", ) mysql_fk_result: CursorResult = mysql_cnx.execute(mysql_fk_stmt) mysql_foreign_keys: t.List[t.Dict[str, t.Any]] = [dict(row) for row in mysql_fk_result.mappings()] sqlite_fk_stmt: TextClause = text(f'PRAGMA foreign_key_list("{table_name}")') sqlite_fk_result: CursorResult = sqlite_cnx.execute(sqlite_fk_stmt) if sqlite_fk_result.returns_rows: for row in sqlite_fk_result.mappings(): fk: t.Dict[str, t.Any] = dict(row) assert { "table": fk["table"], "from": fk["from"], "to": fk["to"], "on_update": fk["on_update"], "on_delete": fk["on_delete"], } in mysql_foreign_keys """ Check if all the data was transferred correctly """ sqlite_results: t.List[t.Tuple[t.Tuple[t.Any, ...], ...]] = [] mysql_results: t.List[t.Tuple[t.Tuple[t.Any, ...], ...]] = [] meta: MetaData = MetaData() for table_name in sqlite_tables: sqlite_table: Table = Table(table_name, meta, autoload_with=sqlite_engine) sqlite_stmt: Select = select(sqlite_table) sqlite_result: t.List[Row[t.Any]] = list(sqlite_cnx.execute(sqlite_stmt).fetchall()) sqlite_result.sort() sqlite_result_adapted = tuple( tuple(float(data) if isinstance(data, Decimal) else data for data in row) for row in sqlite_result ) sqlite_results.append(sqlite_result_adapted) for table_name in mysql_tables: mysql_table: Table = Table(table_name, meta, autoload_with=mysql_engine) mysql_stmt: Select = select(mysql_table).limit(limit_rows) mysql_result: t.List[Row[t.Any]] = list(mysql_cnx.execute(mysql_stmt).fetchall()) mysql_result.sort() sqlite_result_adapted = tuple( tuple(float(data) if isinstance(data, Decimal) else data for data in row) for row in mysql_result ) mysql_results.append(sqlite_result_adapted) assert sqlite_results == mysql_results mysql_cnx.close() sqlite_cnx.close() mysql_engine.dispose() sqlite_engine.dispose() 
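The row-comparison step repeated in each transfer test above coerces `Decimal` values to `float` before comparing MySQL rows against SQLite rows, since SQLite has no native decimal type. The idea can be distilled into a small standalone helper (a sketch for illustration only; `adapt_row` is not part of the package or its test suite):

```python
import typing as t
from decimal import Decimal


def adapt_row(row: t.Tuple[t.Any, ...]) -> t.Tuple[t.Any, ...]:
    """Coerce Decimal values to float so rows read from MySQL and SQLite compare equal."""
    return tuple(float(value) if isinstance(value, Decimal) else value for value in row)


print(adapt_row((Decimal("1.50"), "title", 3)))  # (1.5, 'title', 3)
```

Applying the same adaptation to both result sets makes the final `assert sqlite_results == mysql_results` insensitive to the two drivers returning different numeric types for the same column.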
07070100000029000081A40000000000000000000000016575A15F00003CD4000000000000000000000000000000000000002E00000000mysql-to-sqlite3-2.1.6/tests/func/test_cli.py
import os
import typing as t
from random import choice, sample

import pytest
from click.testing import CliRunner, Result
from faker import Faker
from pytest_mock import MockFixture
from sqlalchemy import Connection, Engine, Inspector, create_engine, inspect

from mysql_to_sqlite3 import MySQLtoSQLite
from mysql_to_sqlite3.cli import cli as mysql2sqlite
from tests.conftest import MySQLCredentials
from tests.database import Database


@pytest.mark.cli
@pytest.mark.usefixtures("mysql_instance")
class TestMySQLtoSQLite:
    def test_no_arguments(self, cli_runner: CliRunner) -> None:
        result: Result = cli_runner.invoke(mysql2sqlite)
        assert result.exit_code > 0
        assert any(
            message in result.output
            for message in {
                'Error: Missing option "-f" / "--sqlite-file"',
                "Error: Missing option '-f' / '--sqlite-file'",
            }
        )

    def test_non_existing_sqlite_file(self, cli_runner: CliRunner, faker: Faker) -> None:
        result: Result = cli_runner.invoke(mysql2sqlite, ["-f", faker.file_path(depth=1, extension=".sqlite3")])
        assert result.exit_code > 0
        assert any(
            message in result.output
            for message in {
                'Error: Missing option "-d" / "--mysql-database"',
                "Error: Missing option '-d' / '--mysql-database'",
            }
        )

    def test_no_database_name(self, cli_runner: CliRunner, sqlite_database: "os.PathLike[t.Any]") -> None:
        result: Result = cli_runner.invoke(mysql2sqlite, ["-f", str(sqlite_database)])
        assert result.exit_code > 0
        assert any(
            message in result.output
            for message in {
                'Error: Missing option "-d" / "--mysql-database"',
                "Error: Missing option '-d' / '--mysql-database'",
            }
        )

    def test_no_database_user(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_credentials: MySQLCredentials,
    ) -> None:
        result = cli_runner.invoke(mysql2sqlite, ["-f", str(sqlite_database), "-d", mysql_credentials.database])
        assert result.exit_code > 0
        assert any(
            message in result.output
            for message in {
                'Error: Missing option "-u" / "--mysql-user"',
                "Error: Missing option '-u' / '--mysql-user'",
            }
        )

    def test_invalid_database_name(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        faker: Faker,
    ) -> None:
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", "_".join(faker.words(nb=3)),
                "-u", faker.first_name().lower(),
            ],
        )
        assert result.exit_code > 0
        assert "1045 (28000): Access denied" in result.output

    def test_invalid_database_user(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        faker: Faker,
    ) -> None:
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-u", faker.first_name().lower(),
            ],
        )
        assert result.exit_code > 0
        assert "1045 (28000): Access denied" in result.output

    def test_invalid_database_password(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        faker: Faker,
    ) -> None:
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-u", mysql_credentials.user,
                "--mysql-password", faker.password(length=16),
            ],
        )
        assert result.exit_code > 0
        assert "1045 (28000): Access denied" in result.output

    def test_database_password_prompt(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_credentials: MySQLCredentials,
        mysql_database: Database,
    ) -> None:
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            args=[
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-u", mysql_credentials.user,
                "-p",
            ],
            input=mysql_credentials.password,
        )
        assert result.exit_code == 0

    def test_invalid_database_password_prompt(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_credentials: MySQLCredentials,
        mysql_database: Database,
        faker: Faker,
    ) -> None:
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            args=[
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-u", mysql_credentials.user,
                "-p",
            ],
            input=faker.password(length=16),
        )
        assert result.exit_code > 0
        assert "1045 (28000): Access denied" in result.output

    def test_invalid_database_port(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        faker: Faker,
    ) -> None:
        port: int = choice(range(2, 2**16 - 1))
        if port == mysql_credentials.port:
            port -= 1
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-u", mysql_credentials.user,
                "--mysql-password", mysql_credentials.password,
                "-h", mysql_credentials.host,
                "-P", str(port),
            ],
        )
        assert result.exit_code > 0
        assert any(
            message in result.output
            for message in {
                "2003 (HY000): Can't connect to MySQL server on",
                "2003: Can't connect to MySQL server",
            }
        )

    @pytest.mark.parametrize(
        "chunk, vacuum, use_buffered_cursors, quiet",
        [
            # 0000
            pytest.param(None, False, False, False, id="no chunk, no vacuum, no buffered cursor, verbose"),
            # 1110
            pytest.param(10, True, True, False, id="chunk, vacuum, buffered cursor, verbose"),
            # 1100
            pytest.param(10, True, False, False, id="chunk, vacuum, no buffered cursor, verbose"),
            # 0110
            pytest.param(None, True, True, False, id="no chunk, vacuum, buffered cursor, verbose"),
            # 0100
            pytest.param(None, True, False, False, id="no chunk, vacuum, no buffered cursor, verbose"),
            # 1000
            pytest.param(10, False, False, False, id="chunk, no vacuum, no buffered cursor, verbose"),
            # 0010
            pytest.param(None, False, True, False, id="no chunk, no vacuum, buffered cursor, verbose"),
            # 1010
            pytest.param(10, False, True, False, id="chunk, no vacuum, buffered cursor, verbose"),
            # 0001
            pytest.param(None, False, False, True, id="no chunk, no vacuum, no buffered cursor, quiet"),
            # 1111
            pytest.param(10, True, True, True, id="chunk, vacuum, buffered cursor, quiet"),
            # 1101
            pytest.param(10, True, False, True, id="chunk, vacuum, no buffered cursor, quiet"),
            # 0111
            pytest.param(None, True, True, True, id="no chunk, vacuum, buffered cursor, quiet"),
            # 0101
            pytest.param(None, True, False, True, id="no chunk, vacuum, no buffered cursor, quiet"),
            # 1001
            pytest.param(10, False, False, True, id="chunk, no vacuum, no buffered cursor, quiet"),
            # 0011
            pytest.param(None, False, True, True, id="no chunk, no vacuum, buffered cursor, quiet"),
            # 1011
            pytest.param(10, False, True, True, id="chunk, no vacuum, buffered cursor, quiet"),
        ],
    )
    def test_minimum_valid_parameters(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        chunk: t.Optional[int],
        vacuum: bool,
        use_buffered_cursors: bool,
        quiet: bool,
    ) -> None:
        arguments: t.List[str] = [
            "-f", str(sqlite_database),
            "-d", mysql_credentials.database,
            "-u", mysql_credentials.user,
            "--mysql-password", mysql_credentials.password,
            "-h", mysql_credentials.host,
            "-P", str(mysql_credentials.port),
        ]
        if chunk:
            arguments.append("-c")
            arguments.append(str(chunk))
        if vacuum:
            arguments.append("-V")
        if use_buffered_cursors:
            arguments.append("--use-buffered-cursors")
        if quiet:
            arguments.append("-q")
        result: Result = cli_runner.invoke(mysql2sqlite, arguments)
        assert result.exit_code == 0
        if quiet:
            assert result.output == ""
        else:
            assert result.output != ""

    def test_keyboard_interrupt(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_credentials: MySQLCredentials,
        mysql_database: Database,
        mocker: MockFixture,
    ) -> None:
        mocker.patch.object(MySQLtoSQLite, "transfer", side_effect=KeyboardInterrupt())
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-u", mysql_credentials.user,
                "--mysql-password", mysql_credentials.password,
                "-h", mysql_credentials.host,
                "-P", str(mysql_credentials.port),
            ],
        )
        assert result.exit_code > 0
        assert "Process interrupted" in result.output

    def test_specific_tables_include_and_exclude_are_mutually_exclusive_options(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_credentials: MySQLCredentials,
        mysql_database: Database,
    ) -> None:
        mysql_engine: Engine = create_engine(
            f"mysql+mysqldb://{mysql_credentials.user}:{mysql_credentials.password}@{mysql_credentials.host}:{mysql_credentials.port}/{mysql_credentials.database}"
        )
        mysql_cnx: Connection = mysql_engine.connect()
        mysql_inspect: Inspector = inspect(mysql_engine)
        mysql_tables: t.List[str] = mysql_inspect.get_table_names()
        table_number: int = choice(range(1, len(mysql_tables) // 2))
        include_mysql_tables: t.List[str] = sample(mysql_tables, table_number)
        include_mysql_tables.sort()
        exclude_mysql_tables = list(set(sample(mysql_tables, table_number)) - set(include_mysql_tables))
        exclude_mysql_tables.sort()
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-t", " ".join(include_mysql_tables),
                "-e", " ".join(exclude_mysql_tables),
                "-u", mysql_credentials.user,
                "--mysql-password", mysql_credentials.password,
                "-h", mysql_credentials.host,
                "-P", str(mysql_credentials.port),
            ],
        )
        assert result.exit_code > 0
        assert "Illegal usage: --mysql-tables and --exclude-mysql-tables are mutually exclusive!" in result.output
        mysql_cnx.close()
        mysql_engine.dispose()

    def test_transfer_specific_tables_only(
        self,
        cli_runner: CliRunner,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_credentials: MySQLCredentials,
        mysql_database: Database,
    ) -> None:
        mysql_engine: Engine = create_engine(
            f"mysql+mysqldb://{mysql_credentials.user}:{mysql_credentials.password}@{mysql_credentials.host}:{mysql_credentials.port}/{mysql_credentials.database}"
        )
        mysql_inspect: Inspector = inspect(mysql_engine)
        mysql_tables: t.List[str] = mysql_inspect.get_table_names()
        table_number: int = choice(range(1, len(mysql_tables)))
        result: Result = cli_runner.invoke(
            mysql2sqlite,
            [
                "-f", str(sqlite_database),
                "-d", mysql_credentials.database,
                "-t", " ".join(sample(mysql_tables, table_number)),
                "-u", mysql_credentials.user,
                "--mysql-password", mysql_credentials.password,
                "-h", mysql_credentials.host,
                "-P", str(mysql_credentials.port),
            ],
        )
        assert result.exit_code == 0

    @pytest.mark.xfail
    def test_version(self, cli_runner: CliRunner) -> None:
        result = cli_runner.invoke(mysql2sqlite, ["--version"])
        assert result.exit_code == 0
        assert all(
            message in result.output
            for message in {
                "mysql-to-sqlite3",
                "Operating",
                "System",
                "Python",
                "MySQL",
                "SQLite",
                "click",
                "mysql-connector-python",
                "python-slugify",
                "pytimeparse2",
                "simplejson",
                "tabulate",
                "tqdm",
            }
        )
0707010000002A000081A40000000000000000000000016575A15F00001B0C000000000000000000000000000000000000002700000000mysql-to-sqlite3-2.1.6/tests/models.py
import typing as t
from datetime import date, datetime, time
from decimal import Decimal
from os import environ

from sqlalchemy import (
    CHAR,
    DECIMAL,
    JSON,
    NCHAR,
    REAL,
    TIMESTAMP,
    VARBINARY,
    VARCHAR,
    BigInteger,
    Column,
    ForeignKey,
    Integer,
    LargeBinary,
    Numeric,
    SmallInteger,
    String,
    Table,
    Text,
    Time,
    Unicode,
)
from sqlalchemy.dialects.mysql import BIGINT, INTEGER, MEDIUMINT, SMALLINT, TINYINT
from sqlalchemy.orm import DeclarativeBase, Mapped, backref, mapped_column, relationship
from sqlalchemy.sql.functions import current_timestamp


class Base(DeclarativeBase):
    pass


class Author(Base):
    __tablename__ = "authors"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(128), nullable=False, index=True)
    dupe: Mapped[bool] = mapped_column(index=True, default=False)

    def __repr__(self):
        return f"<Author(id='{self.id}', name='{self.name}')>"


article_authors = Table(
    "article_authors",
    Base.metadata,
    Column("article_id", Integer, ForeignKey("articles.id"), primary_key=True),
    Column("author_id", Integer, ForeignKey("authors.id"), primary_key=True),
)


class Image(Base):
    __tablename__ = "images"
    id: Mapped[int] = mapped_column(primary_key=True)
    path: Mapped[str] = mapped_column(String(255), index=True)
    description: Mapped[str] = mapped_column(String(255), nullable=True)
    dupe: Mapped[bool] = mapped_column(index=True, default=False)

    def __repr__(self):
        return f"<Image(id='{self.id}', path='{self.path}')>"


article_images = Table(
    "article_images",
    Base.metadata,
    Column("article_id", Integer, ForeignKey("articles.id"), primary_key=True),
    Column("image_id", Integer, ForeignKey("images.id"), primary_key=True),
)


class Tag(Base):
    __tablename__ = "tags"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(128), nullable=False, index=True)
    dupe: Mapped[bool] = mapped_column(index=True, default=False)

    def __repr__(self):
        return f"<Tag(id='{self.id}', name='{self.name}')>"


article_tags = Table(
    "article_tags",
    Base.metadata,
    Column("article_id", Integer, ForeignKey("articles.id"), primary_key=True),
    Column("tag_id", Integer, ForeignKey("tags.id"), primary_key=True),
)


class Misc(Base):
    """This model contains all possible MySQL types"""

    __tablename__ = "misc"
    id: Mapped[int] = mapped_column(primary_key=True)
    big_integer_field: Mapped[int] = mapped_column(BigInteger, default=0)
    big_integer_unsigned_field: Mapped[int] = mapped_column(BIGINT(unsigned=True), default=0)
    if environ.get("LEGACY_DB", "0") == "0":
        large_binary_field: Mapped[bytes] = mapped_column(LargeBinary, nullable=True, default=b"Lorem ipsum dolor")
    else:
        large_binary_field = mapped_column(LargeBinary, nullable=True)
    boolean_field: Mapped[bool] = mapped_column(default=False)
    char_field: Mapped[str] = mapped_column(CHAR(255), nullable=True)
    date_field: Mapped[date] = mapped_column(nullable=True)
    date_time_field: Mapped[datetime] = mapped_column(nullable=True)
    decimal_field: Mapped[Decimal] = mapped_column(DECIMAL(10, 2), nullable=True)
    float_field: Mapped[Decimal] = mapped_column(DECIMAL(12, 4), default=0)
    integer_field: Mapped[int] = mapped_column(default=0)
    integer_unsigned_field: Mapped[int] = mapped_column(INTEGER(unsigned=True), default=0)
    tinyint_field: Mapped[int] = mapped_column(TINYINT, default=0)
    tinyint_unsigned_field: Mapped[int] = mapped_column(TINYINT(unsigned=True), default=0)
    mediumint_field: Mapped[int] = mapped_column(MEDIUMINT, default=0)
    mediumint_unsigned_field: Mapped[int] = mapped_column(MEDIUMINT(unsigned=True), default=0)
    if environ.get("LEGACY_DB", "0") == "0":
        json_field: Mapped[t.Mapping[str, t.Any]] = mapped_column(JSON, nullable=True)
    nchar_field: Mapped[str] = mapped_column(NCHAR(255), nullable=True)
    numeric_field: Mapped[float] = mapped_column(Numeric(12, 4), default=0)
    unicode_field: Mapped[str] = mapped_column(Unicode(255), nullable=True)
    real_field: Mapped[float] = mapped_column(REAL(12), default=0)
    small_integer_field: Mapped[int] = mapped_column(SmallInteger, default=0)
    small_integer_unsigned_field: Mapped[int] = mapped_column(SMALLINT(unsigned=True), default=0)
    string_field: Mapped[str] = mapped_column(String(255), nullable=True)
    text_field: Mapped[str] = mapped_column(Text, nullable=True)
    time_field: Mapped[time] = mapped_column(Time, nullable=True)
    varbinary_field: Mapped[bytes] = mapped_column(VARBINARY(255), nullable=True)
    varchar_field: Mapped[str] = mapped_column(VARCHAR(255), nullable=True)
    timestamp_field: Mapped[datetime] = mapped_column(TIMESTAMP, default=current_timestamp())
    dupe: Mapped[bool] = mapped_column(index=True, default=False)


article_misc = Table(
    "article_misc",
    Base.metadata,
    Column("article_id", Integer, ForeignKey("articles.id"), primary_key=True),
    Column("misc_id", Integer, ForeignKey("misc.id"), primary_key=True),
)


class Article(Base):
    __tablename__ = "articles"
    id: Mapped[int] = mapped_column(primary_key=True)
    hash: Mapped[str] = mapped_column(String(32), unique=True)
    slug: Mapped[str] = mapped_column(String(255), index=True)
    title: Mapped[str] = mapped_column(String(255), index=True)
    content: Mapped[str] = mapped_column(Text, nullable=True)
    status: Mapped[str] = mapped_column(CHAR(1), index=True)
    published: Mapped[datetime] = mapped_column(nullable=True)
    dupe: Mapped[bool] = mapped_column(index=True, default=False)

    # relationships
    authors: Mapped[t.List[Author]] = relationship(
        "Author",
        secondary=article_authors,
        backref=backref("authors", lazy="dynamic"),
        lazy="dynamic",
    )
    tags: Mapped[t.List[Tag]] = relationship(
        "Tag",
        secondary=article_tags,
        backref=backref("tags", lazy="dynamic"),
        lazy="dynamic",
    )
    images: Mapped[t.List[Image]] = relationship(
        "Image",
        secondary=article_images,
        backref=backref("images", lazy="dynamic"),
        lazy="dynamic",
    )
    misc: Mapped[t.List[Misc]] = relationship(
        "Misc",
        secondary=article_misc,
        backref=backref("misc", lazy="dynamic"),
        lazy="dynamic",
    )

    def __repr__(self):
        return f"<Article(id='{self.id}', title='{self.title}')>"


class CrazyName(Base):
    __tablename__ = "crazy_name."
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(128), nullable=False, index=True)
    dupe: Mapped[bool] = mapped_column(index=True, default=False)

    def __repr__(self):
        return f"<CrazyName(id='{self.id}', name='{self.name}')>"
0707010000002B000041ED0000000000000000000000026575A15F00000000000000000000000000000000000000000000002200000000mysql-to-sqlite3-2.1.6/tests/unit
0707010000002C000081A40000000000000000000000016575A15F00000000000000000000000000000000000000000000002E00000000mysql-to-sqlite3-2.1.6/tests/unit/__init__.py
0707010000002D000081A40000000000000000000000016575A15F0000636A000000000000000000000000000000000000003B00000000mysql-to-sqlite3-2.1.6/tests/unit/mysql_to_sqlite3_test.py
import logging
import os
import sqlite3
import typing as t
from random import choice

import mysql.connector
import pytest
from _pytest.logging import LogCaptureFixture
from mysql.connector import errorcode
from pytest_mock import MockerFixture, MockFixture
from sqlalchemy import Inspector, inspect
from sqlalchemy.dialects.mysql import __all__ as mysql_column_types

from mysql_to_sqlite3 import MySQLtoSQLite
from mysql_to_sqlite3.sqlite_utils import CollatingSequences
from tests.conftest import MySQLCredentials
from tests.database import Database


class TestMySQLtoSQLiteClassmethods:
    def test_translate_type_from_mysql_to_sqlite_invalid_column_type(
        self,
        mocker: MockFixture,
    ) -> None:
        with pytest.raises(ValueError) as excinfo:
            mocker.patch.object(MySQLtoSQLite, "_valid_column_type", return_value=False)
            MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type="text")
        assert "is not a valid column_type!" in str(excinfo.value)

    def test_translate_type_from_mysql_to_sqlite_all_valid_columns(self) -> None:
        for column_type in mysql_column_types + (
            "BIGINT UNSIGNED",
            "INTEGER UNSIGNED",
            "INT",
            "INT UNSIGNED",
            "SMALLINT UNSIGNED",
            "TINYINT UNSIGNED",
            "MEDIUMINT UNSIGNED",
            "CHAR(2)",
            "NCHAR(7)",
            "NVARCHAR(17)",
            "VARCHAR(123)",
        ):
            if any(c for c in column_type if c.islower()):
                continue
            elif column_type == "INT":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "INTEGER"
            elif column_type == "DECIMAL":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "DECIMAL"
            elif column_type == "YEAR":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "YEAR"
            elif column_type == "TIME":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "TIME"
            elif column_type == "TIMESTAMP":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "DATETIME"
            elif column_type in {
                "BINARY",
                "BIT",
                "LONGBLOB",
                "MEDIUMBLOB",
                "TINYBLOB",
                "VARBINARY",
            }:
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "BLOB"
            elif column_type == "CHAR":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "CHARACTER"
            elif column_type == "CHAR(2)":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "CHARACTER(2)"
            elif column_type == "NCHAR(7)":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "NCHAR(7)"
            elif column_type == "NVARCHAR(17)":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "NVARCHAR(17)"
            elif column_type == "VARCHAR(123)":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "VARCHAR(123)"
            elif column_type in {
                "ENUM",
                "LONGTEXT",
                "MEDIUMTEXT",
                "SET",
                "TINYTEXT",
            }:
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "TEXT"
            elif column_type == "JSON":
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "TEXT"
                assert (
                    MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type, sqlite_json1_extension_enabled=True)
                    == "JSON"
                )
            elif column_type.endswith(" UNSIGNED"):
                if column_type.startswith("INT "):
                    assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == "INTEGER"
                else:
                    assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == column_type.replace(
                        " UNSIGNED", ""
                    )
            else:
                assert MySQLtoSQLite._translate_type_from_mysql_to_sqlite(column_type) == column_type

    @pytest.mark.parametrize(
        "column_default, column_extra, sqlite_default_translation",
        [
            pytest.param(None, None, "", id="None"),
            pytest.param("", None, "DEFAULT ''", id='""'),
            pytest.param("lorem", None, "DEFAULT 'lorem'", id='"lorem"'),
            pytest.param("lorem ipsum dolor", None, "DEFAULT 'lorem ipsum dolor'", id='"lorem ipsum dolor"'),
            pytest.param("CURRENT_TIME", "DEFAULT_GENERATED", "DEFAULT CURRENT_TIME", id='"CURRENT_TIME"'),
            pytest.param("current_time", "DEFAULT_GENERATED", "DEFAULT CURRENT_TIME", id='"current_time"'),
            pytest.param("CURRENT_DATE", "DEFAULT_GENERATED", "DEFAULT CURRENT_DATE", id='"CURRENT_DATE"'),
            pytest.param("current_date", "DEFAULT_GENERATED", "DEFAULT CURRENT_DATE", id='"current_date"'),
            pytest.param("CURRENT_TIMESTAMP", "DEFAULT_GENERATED", "DEFAULT CURRENT_TIMESTAMP", id='"CURRENT_TIMESTAMP"'),
            pytest.param("current_timestamp", "DEFAULT_GENERATED", "DEFAULT CURRENT_TIMESTAMP", id='"current_timestamp"'),
            pytest.param(r"""_utf8mb4\'[]\'""", "DEFAULT_GENERATED", "DEFAULT '[]'", id=r"""_utf8mb4\'[]\'"""),
            pytest.param(r"""_latin1\'abc\'""", "DEFAULT_GENERATED", "DEFAULT 'abc'", id=r"""_latin1\'abc\'"""),
            pytest.param(r"""_binary\'abc\'""", "DEFAULT_GENERATED", "DEFAULT 'abc'", id=r"""_binary\'abc\'"""),
            pytest.param(
                r"""_latin1 X\'4D7953514C\'""",
                "DEFAULT_GENERATED",
                "DEFAULT x'4D7953514C'",
                id=r"""_latin1 X\'4D7953514C\'""",
            ),
            pytest.param(
                r"""_latin1 b\'1000001\'""", "DEFAULT_GENERATED", "DEFAULT 'A'", id=r"""_latin1 b\'1000001\'"""
            ),
        ],
    )
    def test_translate_default_from_mysql_to_sqlite(
        self,
        column_default: t.Optional[str],
        column_extra: t.Optional[str],
        sqlite_default_translation: str,
    ) -> None:
        assert (
            MySQLtoSQLite._translate_default_from_mysql_to_sqlite(column_default, column_extra=column_extra)
            == sqlite_default_translation
        )

    @pytest.mark.parametrize(
        "column_default, sqlite_default_translation, sqlite_version",
        [
            pytest.param(False, "DEFAULT(FALSE)", "3.23.0", id="False (NEW)"),
            pytest.param(True, "DEFAULT(TRUE)", "3.23.0", id="True (NEW)"),
            pytest.param(False, "DEFAULT '0'", "3.22.0", id="False (OLD)"),
            pytest.param(True, "DEFAULT '1'", "3.22.0", id="True (OLD)"),
        ],
    )
    def test_translate_default_booleans_from_mysql_to_sqlite(
        self,
        mocker: MockerFixture,
        column_default: bool,
        sqlite_default_translation: str,
        sqlite_version: str,
    ) -> None:
        mocker.patch.object(sqlite3, "sqlite_version", sqlite_version)
        assert (
            MySQLtoSQLite._translate_default_from_mysql_to_sqlite(column_default, "BOOLEAN")
            == sqlite_default_translation
        )

    @pytest.mark.parametrize(
        "column_default, sqlite_default_translation, column_type",
        [
            pytest.param("0", "DEFAULT '0'", "NUMERIC", id='"0" (NUMERIC)'),
            pytest.param("1", "DEFAULT '1'", "NUMERIC", id='"1" (NUMERIC)'),
            pytest.param("0", "DEFAULT '0'", "TEXT", id='"0" (TEXT)'),
            pytest.param("1", "DEFAULT '1'", "TEXT", id='"1" (TEXT)'),
            pytest.param(0, "DEFAULT '0'", "NUMERIC", id="0 (NUMERIC)"),
            pytest.param(1, "DEFAULT '1'", "NUMERIC", id="1 (NUMERIC)"),
            pytest.param(0, "DEFAULT '0'", "TEXT", id="0 (TEXT)"),
            pytest.param(1, "DEFAULT '1'", "TEXT", id="1 (TEXT)"),
            pytest.param(123456789, "DEFAULT '123456789'", "NUMERIC", id="123456789 (NUMERIC)"),
            pytest.param(1234.56789, "DEFAULT '1234.56789'", "NUMERIC", id="1234.56789 (NUMERIC)"),
            pytest.param(123456789, "DEFAULT '123456789'", "TEXT", id="123456789 (TEXT)"),
            pytest.param(1234.56789, "DEFAULT '1234.56789'", "TEXT", id="1234.56789 (TEXT)"),
        ],
    )
    def test_translate_default_numbers_from_mysql_to_sqlite(
        self,
        column_default: t.Union[int, float, str],
        sqlite_default_translation: str,
        column_type: str,
    ) -> None:
        assert (
            MySQLtoSQLite._translate_default_from_mysql_to_sqlite(column_default, column_type)
            == sqlite_default_translation
        )

    @pytest.mark.parametrize(
        "column_default, sqlite_default_translation",
        [
            pytest.param(b"", "DEFAULT x''", id="b''"),
            pytest.param(b"-1", "DEFAULT x'2d31'", id="b'-1'"),
            pytest.param(b"0", "DEFAULT x'30'", id="b'0'"),
            pytest.param(b"1", "DEFAULT x'31'", id="b'1'"),
            pytest.param(b"-1234567890", "DEFAULT x'2d31323334353637383930'", id="b'-1234567890'"),
            pytest.param(b"1234567890", "DEFAULT x'31323334353637383930'", id="b'1234567890'"),
            pytest.param(b"SQLite", "DEFAULT x'53514c697465'", id="b'SQLite'"),
            pytest.param(
                b"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nam pretium, purus vitae sollicitudin varius, nisi lectus vehicula dui, ut dignissim felis dolor blandit justo. Donec eleifend lectus ut feugiat rhoncus. Donec erat nibh, dapibus nec diam id, lacinia lacinia nisl. Mauris sagittis efficitur nisl. Ut tincidunt elementum rhoncus. Cras suscipit dolor sed est ultricies, quis dapibus neque suscipit. Etiam ac enim eu ligula bibendum blandit quis sit amet felis. Praesent mi nisi, luctus sit amet nunc ut, fermentum tempus purus. Suspendisse vel purus a nibh aliquam hendrerit. Aliquam sit amet tristique lorem. Sed elementum congue ante id mollis. Donec vitae pretium neque.",
                "DEFAULT x'4c6f72656d20697073756d20646f6c6f722073697420616d65742c20636f6e73656374657475722061646970697363696e6720656c69742e204e616d207072657469756d2c20707572757320766974616520736f6c6c696369747564696e207661726975732c206e697369206c6563747573207665686963756c61206475692c207574206469676e697373696d2066656c697320646f6c6f7220626c616e646974206a7573746f2e20446f6e656320656c656966656e64206c656374757320757420666575676961742072686f6e6375732e20446f6e65632065726174206e6962682c2064617069627573206e6563206469616d2069642c206c6163696e6961206c6163696e6961206e69736c2e204d617572697320736167697474697320656666696369747572206e69736c2e2055742074696e636964756e7420656c656d656e74756d2072686f6e6375732e204372617320737573636970697420646f6c6f72207365642065737420756c747269636965732c20717569732064617069627573206e657175652073757363697069742e20457469616d20616320656e696d206575206c6967756c6120626962656e64756d20626c616e64697420717569732073697420616d65742066656c69732e205072616573656e74206d69206e6973692c206c75637475732073697420616d6574206e756e632075742c206665726d656e74756d2074656d7075732070757275732e2053757370656e64697373652076656c2070757275732061206e69626820616c697175616d2068656e6472657269742e20416c697175616d2073697420616d657420747269737469717565206c6f72656d2e2053656420656c656d656e74756d20636f6e67756520616e7465206964206d6f6c6c69732e20446f6e6563207669746165207072657469756d206e657175652e'",
                id="b'Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nam pretium, purus vitae sollicitudin varius, nisi lectus vehicula dui, ut dignissim felis dolor blandit justo. Donec eleifend lectus ut feugiat rhoncus. Donec erat nibh, dapibus nec diam id, lacinia lacinia nisl. Mauris sagittis efficitur nisl. Ut tincidunt elementum rhoncus. Cras suscipit dolor sed est ultricies, quis dapibus neque suscipit. Etiam ac enim eu ligula bibendum blandit quis sit amet felis. Praesent mi nisi, luctus sit amet nunc ut, fermentum tempus purus. Suspendisse vel purus a nibh aliquam hendrerit. Aliquam sit amet tristique lorem. Sed elementum congue ante id mollis. Donec vitae pretium neque.'",
            ),
        ],
    )
    def test_translate_default_blob_bytes_from_mysql_to_sqlite(
        self,
        column_default: bytes,
        sqlite_default_translation: str,
    ) -> None:
        assert (
            MySQLtoSQLite._translate_default_from_mysql_to_sqlite(column_default, "BLOB")
            == sqlite_default_translation
        )

    @pytest.mark.parametrize(
        "collation, resulting_column_collation, column_type",
        [
            pytest.param(CollatingSequences.BINARY, "", "CHARACTER", id=f"{CollatingSequences.BINARY} (CHARACTER)"),
            pytest.param(CollatingSequences.NOCASE, f"COLLATE {CollatingSequences.NOCASE}", "CHARACTER", id=f"{CollatingSequences.NOCASE} (CHARACTER)"),
            pytest.param(CollatingSequences.RTRIM, f"COLLATE {CollatingSequences.RTRIM}", "CHARACTER", id=f"{CollatingSequences.RTRIM} (CHARACTER)"),
            pytest.param(CollatingSequences.BINARY, "", "NCHAR", id=f"{CollatingSequences.BINARY} (NCHAR)"),
            pytest.param(CollatingSequences.NOCASE, f"COLLATE {CollatingSequences.NOCASE}", "NCHAR", id=f"{CollatingSequences.NOCASE} (NCHAR)"),
            pytest.param(CollatingSequences.RTRIM, f"COLLATE {CollatingSequences.RTRIM}", "NCHAR", id=f"{CollatingSequences.RTRIM} (NCHAR)"),
            pytest.param(CollatingSequences.BINARY, "", "NVARCHAR", id=f"{CollatingSequences.BINARY} (NVARCHAR)"),
            pytest.param(CollatingSequences.NOCASE, f"COLLATE {CollatingSequences.NOCASE}", "NVARCHAR", id=f"{CollatingSequences.NOCASE} (NVARCHAR)"),
            pytest.param(CollatingSequences.RTRIM, f"COLLATE {CollatingSequences.RTRIM}", "NVARCHAR", id=f"{CollatingSequences.RTRIM} (NVARCHAR)"),
            pytest.param(CollatingSequences.BINARY, "", "TEXT", id=f"{CollatingSequences.BINARY} (TEXT)"),
            pytest.param(CollatingSequences.NOCASE, f"COLLATE {CollatingSequences.NOCASE}", "TEXT", id=f"{CollatingSequences.NOCASE} (TEXT)"),
            pytest.param(CollatingSequences.RTRIM, f"COLLATE {CollatingSequences.RTRIM}", "TEXT", id=f"{CollatingSequences.RTRIM} (TEXT)"),
            pytest.param(CollatingSequences.BINARY, "", "VARCHAR", id=f"{CollatingSequences.BINARY} (VARCHAR)"),
            pytest.param(CollatingSequences.NOCASE, f"COLLATE {CollatingSequences.NOCASE}", "VARCHAR", id=f"{CollatingSequences.NOCASE} (VARCHAR)"),
            pytest.param(CollatingSequences.RTRIM, f"COLLATE {CollatingSequences.RTRIM}", "VARCHAR", id=f"{CollatingSequences.RTRIM} (VARCHAR)"),
        ],
    )
    def test_data_type_collation_sequence_is_applied_on_textual_data_types(
        self,
        collation: str,
        resulting_column_collation: str,
        column_type: str,
    ) -> None:
        assert MySQLtoSQLite._data_type_collation_sequence(collation, column_type) == resulting_column_collation

    def test_data_type_collation_sequence_is_not_applied_on_non_textual_data_types(self) -> None:
        for column_type in (
            "BIGINT",
            "BINARY",
            "BIT",
            "BLOB",
            "BOOLEAN",
            "DATE",
            "DATETIME",
            "DATETIME",
            "DECIMAL",
            "DOUBLE",
            "FLOAT",
            "INTEGER",
            "INTEGER",
            "LONGBLOB",
            "MEDIUMBLOB",
            "MEDIUMINT",
            "NUMERIC",
            "REAL",
            "SMALLINT",
            "TIME",
            "TINYBLOB",
            "TINYINT",
            "VARBINARY",
            "YEAR",
        ):
            for collation in (
                CollatingSequences.BINARY,
                CollatingSequences.NOCASE,
                CollatingSequences.RTRIM,
            ):
                assert MySQLtoSQLite._data_type_collation_sequence(collation, column_type) == ""


@pytest.mark.exceptions
@pytest.mark.usefixtures("mysql_instance")
class TestMySQLtoSQLiteSQLExceptions:
    @pytest.mark.parametrize(
        "quiet",
        [
            pytest.param(False, id="verbose"),
            pytest.param(True, id="quiet"),
        ],
    )
    def test_create_table_server_lost_connection_error(
        self,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        mocker: MockerFixture,
        caplog: LogCaptureFixture,
        quiet: bool,
    ) -> None:
        proc: MySQLtoSQLite = MySQLtoSQLite(  # type: ignore[call-arg]
            sqlite_file=sqlite_database,
            mysql_user=mysql_credentials.user,
            mysql_password=mysql_credentials.password,
            mysql_database=mysql_credentials.database,
            mysql_host=mysql_credentials.host,
            mysql_port=mysql_credentials.port,
            quiet=quiet,
        )

        class FakeSQLiteCursor:
            def executescript(self, *args, **kwargs) -> t.Any:
                raise mysql.connector.Error(
                    msg="Error Code: 2013. Lost connection to MySQL server during query",
                    errno=errorcode.CR_SERVER_LOST,
                )

        class FakeSQLiteConnector:
            def commit(self, *args, **kwargs) -> t.Any:
                return True

        mysql_inspect: Inspector = inspect(mysql_database.engine)
        mysql_tables: t.List[str] = mysql_inspect.get_table_names()
        mocker.patch.object(proc, "_sqlite_cur", FakeSQLiteCursor())
        mocker.patch.object(proc._mysql, "reconnect", return_value=True)
        mocker.patch.object(proc, "_sqlite", FakeSQLiteConnector())
        caplog.set_level(logging.DEBUG)
        with pytest.raises(mysql.connector.Error):
            proc._create_table(choice(mysql_tables))

    @pytest.mark.parametrize(
        "quiet",
        [
            pytest.param(False, id="verbose"),
            pytest.param(True, id="quiet"),
        ],
    )
    def test_create_table_unknown_mysql_connector_error(
        self,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        mocker: MockerFixture,
        caplog: LogCaptureFixture,
        quiet: bool,
    ) -> None:
        proc: MySQLtoSQLite = MySQLtoSQLite(  # type: ignore[call-arg]
            sqlite_file=sqlite_database,
            mysql_user=mysql_credentials.user,
            mysql_password=mysql_credentials.password,
            mysql_database=mysql_credentials.database,
            mysql_host=mysql_credentials.host,
            mysql_port=mysql_credentials.port,
            quiet=quiet,
        )

        class FakeSQLiteCursor:
            def executescript(self, statement: t.Any) -> t.Any:
                raise mysql.connector.Error(
                    msg="Error Code: 2000. Unknown MySQL error",
                    errno=errorcode.CR_UNKNOWN_ERROR,
                )

        mysql_inspect: Inspector = inspect(mysql_database.engine)
        mysql_tables: t.List[str] = mysql_inspect.get_table_names()
        mocker.patch.object(proc, "_sqlite_cur", FakeSQLiteCursor())
        caplog.set_level(logging.DEBUG)
        with pytest.raises(mysql.connector.Error):
            proc._create_table(choice(mysql_tables))

    @pytest.mark.parametrize(
        "quiet",
        [
            pytest.param(False, id="verbose"),
            pytest.param(True, id="quiet"),
        ],
    )
    def test_create_table_sqlite3_error(
        self,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        mocker: MockerFixture,
        caplog: LogCaptureFixture,
        quiet: bool,
    ) -> None:
        proc: MySQLtoSQLite = MySQLtoSQLite(  # type: ignore[call-arg]
            sqlite_file=sqlite_database,
            mysql_user=mysql_credentials.user,
            mysql_password=mysql_credentials.password,
            mysql_database=mysql_credentials.database,
            mysql_host=mysql_credentials.host,
            mysql_port=mysql_credentials.port,
            quiet=quiet,
        )

        class FakeSQLiteCursor:
            def executescript(self, *args, **kwargs) -> t.Any:
                raise sqlite3.Error("Unknown SQLite error")

        mysql_inspect: Inspector = inspect(mysql_database.engine)
        mysql_tables: t.List[str] = mysql_inspect.get_table_names()
        mocker.patch.object(proc, "_sqlite_cur", FakeSQLiteCursor())
        caplog.set_level(logging.DEBUG)
        with pytest.raises(sqlite3.Error):
            proc._create_table(choice(mysql_tables))

    @pytest.mark.parametrize(
        "exception, quiet",
        [
            pytest.param(
                mysql.connector.Error(
                    msg="Error Code: 2013. Lost connection to MySQL server during query",
                    errno=errorcode.CR_SERVER_LOST,
                ),
                False,
                id="errorcode.CR_SERVER_LOST verbose",
            ),
            pytest.param(
                mysql.connector.Error(
                    msg="Error Code: 2013. Lost connection to MySQL server during query",
                    errno=errorcode.CR_SERVER_LOST,
                ),
                True,
                id="errorcode.CR_SERVER_LOST quiet",
            ),
            pytest.param(
                mysql.connector.Error(
                    msg="Error Code: 2000. Unknown MySQL error",
                    errno=errorcode.CR_UNKNOWN_ERROR,
                ),
                False,
                id="errorcode.CR_UNKNOWN_ERROR verbose",
            ),
            pytest.param(
                mysql.connector.Error(
                    msg="Error Code: 2000. Unknown MySQL error",
                    errno=errorcode.CR_UNKNOWN_ERROR,
                ),
                True,
                id="errorcode.CR_UNKNOWN_ERROR quiet",
            ),
            pytest.param(sqlite3.Error("Unknown SQLite error"), False, id="sqlite3.Error verbose"),
            pytest.param(sqlite3.Error("Unknown SQLite error"), True, id="sqlite3.Error quiet"),
        ],
    )
    def test_transfer_table_data_exceptions(
        self,
        sqlite_database: "os.PathLike[t.Any]",
        mysql_database: Database,
        mysql_credentials: MySQLCredentials,
        mocker: MockerFixture,
        caplog: LogCaptureFixture,
        exception: Exception,
        quiet: bool,
    ) -> None:
        proc: MySQLtoSQLite = MySQLtoSQLite(  # type: ignore[call-arg]
            sqlite_file=sqlite_database,
            mysql_user=mysql_credentials.user,
            mysql_password=mysql_credentials.password,
            mysql_database=mysql_credentials.database,
            mysql_host=mysql_credentials.host,
            mysql_port=mysql_credentials.port,
            quiet=quiet,
        )

        class FakeMySQLCursor:
            def fetchall(self) -> t.Any:
                raise exception

            def fetchmany(self, size: int = 1) -> t.Any:
                raise exception

        mysql_inspect: Inspector = inspect(mysql_database.engine)
        mysql_tables: t.List[str] = mysql_inspect.get_table_names()
        table_name: str = choice(mysql_tables)
        columns: t.List[str] = [column["name"] for column in mysql_inspect.get_columns(table_name)]
        sql: str = 'INSERT OR IGNORE INTO "{table}" ({fields}) VALUES ({placeholders})'.format(
            table=table_name,
            fields=('"{}", ' * len(columns)).rstrip(" ,").format(*columns),
            placeholders=("?, " * len(columns)).rstrip(" ,"),
        )
        mocker.patch.object(proc, "_mysql_cur", FakeMySQLCursor())
        with pytest.raises((mysql.connector.Error, sqlite3.Error)):
            proc._transfer_table_data(table_name, sql)
0707010000002E000081A40000000000000000000000016575A15F00000849000000000000000000000000000000000000001F00000000mysql-to-sqlite3-2.1.6/tox.ini
[tox]
isolated_build = true
envlist =
    python3.8,
    python3.9,
    python3.10,
    python3.11,
    python3.12,
black, flake8, linters skip_missing_interpreters = true [gh-actions] python = 3.8: python3.8 3.9: python3.9 3.10: python3.10 3.11: python3.11 3.12: python3.12 [testenv] passenv = LANG LEGACY_DB deps = -rrequirements_dev.txt commands = pytest -v --cov=mysql_to_sqlite3 --cov-report=xml [testenv:black] basepython = python3 skip_install = true deps = black commands = black mysql_to_sqlite3 tests/ [testenv:isort] basepython = python3 skip_install = true deps = isort commands = isort --check-only --diff . [testenv:flake8] basepython = python3 skip_install = true deps = flake8 flake8-colors flake8-docstrings flake8-import-order flake8-typing-imports pep8-naming commands = flake8 mysql_to_sqlite3 [testenv:pylint] basepython = python3 skip_install = true deps = pylint -rrequirements_dev.txt disable = C0209,C0301,C0411,R,W0107,W0622 commands = pylint --rcfile=tox.ini mysql_to_sqlite3 [testenv:bandit] basepython = python3 skip_install = true deps = bandit commands = bandit -r mysql_to_sqlite3 -c .bandit.yml [testenv:mypy] basepython = python3 skip_install = true deps = mypy>=1.3.0 -rrequirements_dev.txt commands = mypy mysql_to_sqlite3 --enable-incomplete-feature=Unpack [testenv:linters] basepython = python3 skip_install = true deps = {[testenv:black]deps} {[testenv:isort]deps} {[testenv:flake8]deps} {[testenv:pylint]deps} {[testenv:bandit]deps} {[testenv:mypy]deps} commands = {[testenv:black]commands} {[testenv:isort]commands} {[testenv:flake8]commands} {[testenv:pylint]commands} {[testenv:bandit]commands} {[testenv:mypy]commands} [flake8] ignore = I100,I201,I202,D203,D401,W503,E203,F401,F403,C901,E501 exclude = *__init__.py *__version__.py .tox max-complexity = 10 max-line-length = 88 import-order-style = pycharm application-import-names = flake8 [pylint] disable = C0209,C0301,C0411,R,W0107,W062207070100000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000B00000000TRAILER!!!435 blocks