
Conversation


dependabot bot commented on behalf of github Nov 1, 2025

Bumps the actions group with 7 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| actions/setup-python | 5 | 6 |
| scientific-python/action-towncrier-changelog | 1 | 2 |
| github/codeql-action | 3 | 4 |
| pypa/cibuildwheel | 3.1.4 | 3.2.1 |
| actions/upload-artifact | 4 | 5 |
| actions/download-artifact | 5 | 6 |
| pypa/gh-action-pypi-publish | 1.12.4 | 1.13.0 |
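
For context, a minimal sketch of how the bumped versions would appear in a workflow; the workflow name, triggers, and inputs below are illustrative rather than taken from this repository, and only the `uses:` version tags reflect this PR:

```yaml
# Illustrative workflow excerpt: names, triggers, and inputs are placeholders;
# only the action version tags correspond to the bumps in this PR.
name: wheels-example
on: [push]
jobs:
  build_wheels:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v5            # not part of this group update
      - uses: actions/setup-python@v6        # previously @v5
        with:
          python-version: "3.12"             # placeholder value
      - uses: pypa/cibuildwheel@v3.2.1       # previously 3.1.4
      - uses: actions/upload-artifact@v5     # previously @v4
        with:
          name: wheels
          path: ./wheelhouse/*.whl
```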

Updates actions/setup-python from 5 to 6

Release notes

Sourced from actions/setup-python's releases.

v6.0.0

What's Changed

Breaking Changes

Make sure your runner is on version v2.327.1 or later to ensure compatibility with this release. See Release Notes

Enhancements:

Bug fixes:

Dependency updates:

New Contributors

Full Changelog: actions/setup-python@v5...v6.0.0

v5.6.0

What's Changed

Full Changelog: actions/setup-python@v5...v5.6.0

v5.5.0

What's Changed

Enhancements:

Bug fixes:

... (truncated)

Commits
  • e797f83 Upgrade to node 24 (#1164)
  • 3d1e2d2 Revert "Enhance cache-dependency-path handling to support files outside the w...
  • 65b0712 Clarify pythonLocation behavior for PyPy and GraalPy in environment variables...
  • 5b668cf Bump actions/checkout from 4 to 5 (#1181)
  • f62a0e2 Change missing cache directory error to warning (#1182)
  • 9322b3c Upgrade setuptools to 78.1.1 to fix path traversal vulnerability in PackageIn...
  • fbeb884 Bump form-data to fix critical vulnerabilities #182 & #183 (#1163)
  • 03bb615 Bump idna from 2.9 to 3.7 in /tests/data (#843)
  • 36da51d Add version parsing from Pipfile (#1067)
  • 3c6f142 update documentation (#1156)
  • Additional commits viewable in compare view

Updates scientific-python/action-towncrier-changelog from 1 to 2

Release notes

Sourced from scientific-python/action-towncrier-changelog's releases.

v2.0.0 Release Notes

Also see CHANGES.rst.

What's Changed

Full Changelog: scientific-python/action-towncrier-changelog@v1.0.0...v2.0.0

Commits

Updates github/codeql-action from 3 to 4

Release notes

Sourced from github/codeql-action's releases.

v3.31.2

CodeQL Action Changelog

See the releases page for the relevant changes to the CodeQL CLI and language packs.

3.31.2 - 30 Oct 2025

No user facing changes.

See the full CHANGELOG.md for more information.

v3.31.1

CodeQL Action Changelog

See the releases page for the relevant changes to the CodeQL CLI and language packs.

3.31.1 - 30 Oct 2025

  • The add-snippets input has been removed from the analyze action. This input has been deprecated since CodeQL Action 3.26.4 in August 2024 when this removal was announced.

See the full CHANGELOG.md for more information.

v3.31.0

CodeQL Action Changelog

See the releases page for the relevant changes to the CodeQL CLI and language packs.

3.31.0 - 24 Oct 2025

  • Bump minimum CodeQL bundle version to 2.17.6. #3223
  • When SARIF files are uploaded by the analyze or upload-sarif actions, the CodeQL Action automatically performs post-processing steps to prepare the data for the upload. Previously, these post-processing steps were only performed before an upload took place. We are now changing this so that the post-processing steps will always be performed, even when the SARIF files are not uploaded. This does not change anything for the upload-sarif action. For analyze, this may affect Advanced Setup for CodeQL users who specify a value other than always for the upload input. #3222

See the full CHANGELOG.md for more information.

v3.30.9

CodeQL Action Changelog

See the releases page for the relevant changes to the CodeQL CLI and language packs.

3.30.9 - 17 Oct 2025

  • Update default CodeQL bundle version to 2.23.3. #3205
  • Experimental: A new setup-codeql action has been added which is similar to init, except it only installs the CodeQL CLI and does not initialize a database. Do not use this in production as it is part of an internal experiment and subject to change at any time. #3204

See the full CHANGELOG.md for more information.

v3.30.8

CodeQL Action Changelog

See the releases page for the relevant changes to the CodeQL CLI and language packs.

... (truncated)

Changelog

Sourced from github/codeql-action's changelog.

4.31.2 - 30 Oct 2025

No user facing changes.

4.31.1 - 30 Oct 2025

  • The add-snippets input has been removed from the analyze action. This input has been deprecated since CodeQL Action 3.26.4 in August 2024 when this removal was announced.

4.31.0 - 24 Oct 2025

  • Bump minimum CodeQL bundle version to 2.17.6. #3223
  • When SARIF files are uploaded by the analyze or upload-sarif actions, the CodeQL Action automatically performs post-processing steps to prepare the data for the upload. Previously, these post-processing steps were only performed before an upload took place. We are now changing this so that the post-processing steps will always be performed, even when the SARIF files are not uploaded. This does not change anything for the upload-sarif action. For analyze, this may affect Advanced Setup for CodeQL users who specify a value other than always for the upload input. #3222

4.30.9 - 17 Oct 2025

  • Update default CodeQL bundle version to 2.23.3. #3205
  • Experimental: A new setup-codeql action has been added which is similar to init, except it only installs the CodeQL CLI and does not initialize a database. Do not use this in production as it is part of an internal experiment and subject to change at any time. #3204

4.30.8 - 10 Oct 2025

No user facing changes.

4.30.7 - 06 Oct 2025

  • [v4+ only] The CodeQL Action now runs on Node.js v24. #3169

3.30.6 - 02 Oct 2025

  • Update default CodeQL bundle version to 2.23.2. #3168

3.30.5 - 26 Sep 2025

  • We fixed a bug that was introduced in 3.30.4 with upload-sarif which resulted in files without a .sarif extension not getting uploaded. #3160

3.30.4 - 25 Sep 2025

  • We have improved the CodeQL Action's ability to validate that the workflow it is used in does not use different versions of the CodeQL Action for different workflow steps. Mixing different versions of the CodeQL Action in the same workflow is unsupported and can lead to unpredictable results. A warning will now be emitted from the codeql-action/init step if different versions of the CodeQL Action are detected in the workflow file. Additionally, an error will now be thrown by the other CodeQL Action steps if they load a configuration file that was generated by a different version of the codeql-action/init step. #3099 and #3100
  • We added support for reducing the size of dependency caches for Java analyses, which will reduce cache usage and speed up workflows. This will be enabled automatically at a later time. #3107
  • You can now run the latest CodeQL nightly bundle by passing tools: nightly to the init action. In general, the nightly bundle is unstable and we only recommend running it when directed by GitHub staff. #3130
  • Update default CodeQL bundle version to 2.23.1. #3118

3.30.3 - 10 Sep 2025

No user facing changes.

3.30.2 - 09 Sep 2025

  • Fixed a bug which could cause language autodetection to fail. #3084
  • Experimental: The quality-queries input that was added in 3.29.2 as part of an internal experiment is now deprecated and will be removed in an upcoming version of the CodeQL Action. It has been superseded by a new analysis-kinds input, which is part of the same internal experiment. Do not use this in production as it is subject to change at any time. #3064

... (truncated)

Commits
  • 74c8748 Update analyze/action.yml
  • 34c50c1 Merge pull request #3251 from github/mbg/user-error/enablement
  • 4ae68af Warn if the add-snippets input is used
  • 52a7bd7 Check for 403 status
  • 194ba0e Make error message tests less brittle
  • 53acf0b Turn enablement errors into configuration errors
  • ac9aeee Merge pull request #3249 from github/henrymercer/api-logging
  • d49e837 Merge branch 'main' into henrymercer/api-logging
  • 3d988b2 Pass minimal copy of core
  • 8cc18ac Merge pull request #3250 from github/henrymercer/prefer-fs-delete
  • Additional commits viewable in compare view

Updates pypa/cibuildwheel from 3.1.4 to 3.2.1

Release notes

Sourced from pypa/cibuildwheel's releases.

v3.2.1

  • 🛠 Update to CPython 3.14.0 final (#2614)
  • 🐛 Fix the default MACOSX_DEPLOYMENT_TARGET on Python 3.14 (#2613)
  • 📚 Docs improvements (#2617)

v3.2.0

  • ✨ Adds GraalPy v25 (Python 3.12) support (#2597)
  • 🛠 Update to CPython 3.14.0rc3 (#2602)
  • 🛠 Adds CPython 3.14.0 prerelease support for Android, and a number of improvements to Android builds (#2568, #2591)
  • 🛠 Improvements to testing on Android, passing environment markers when installing the venv, and providing more debug output when build-verbosity is set (#2575)
  • ⚠️ PyPy 3.10 was moved to pypy-eol in the enable option, as it is now end-of-life. (#2521)
  • 📚 Docs improvements (#2574, #2601, #2598)
Changelog

Sourced from pypa/cibuildwheel's changelog.



Changelog

v3.2.1

12 October 2025

  • 🛠 Update to CPython 3.14.0 final (#2614)
  • 🐛 Fix the default MACOSX_DEPLOYMENT_TARGET on Python 3.14 (#2613)
  • 📚 Docs improvements (#2617)

v3.2.0

22 September 2025

  • ✨ Adds GraalPy v25 (Python 3.12) support (#2597)
  • 🛠 Update to CPython 3.14.0rc3 (#2602)
  • 🛠 Adds CPython 3.14.0 prerelease support for Android, and a number of improvements to Android builds (#2568, #2591)
  • 🛠 Improvements to testing on Android, passing environment markers when installing the venv, and providing more debug output when build-verbosity is set (#2575)
  • ⚠️ PyPy 3.10 was moved to pypy-eol in the enable option, as it is now end-of-life. (#2521)
  • 📚 Docs improvements (#2574, #2601, #2598)

v3.1.4

19 August 2025

  • ✨ Add a --clean-cache command to clean up our cache (#2489)
  • 🛠 Update Python to 3.14rc2 and other patch version bumps (#2542, #2556)
  • 🛠 Update Pyodide to 0.28.2 (#2562, #2558)
  • 🐛 Fix resolution with pyodide-build when dependency-versions is set (#2548)
  • 🐛 Set CMAKE_FIND_ROOT_PATH_MODE_PACKAGE to BOTH on Android (#2547)
  • 🐛 Add patchelf dependency for platforms that can build Android wheels (#2552)
  • 🐛 Ignore empty values for CIBW_ARCHS like most other environment variables (#2541)
  • 💼 The color and suggest_on_error argparse options are now default in 3.14rc1+ (#2554)
  • 💼 Use the virtualenv release URL instead of blob URL (should be more robust) (#2555)
  • 🧪 For iOS, lowering to macos-14 is needed for now due to issues with GitHub's runner images (#2557)
  • 🧪 Split out platforms iOS and Android in our tests (#2519)
  • 🧪 Fix and enable doctests (#2546)
  • 📚 Improve our docs on free-threading (#2549)

v3.1.3

1 August 2025

  • 🐛 Fix bug where "latest" dependencies couldn't update to pip 25.2 on Windows (#2537)
  • 🧪 Use pytest-rerunfailures to improve some of our iOS/Android tests (#2527, #2539)

... (truncated)

Commits

Updates actions/upload-artifact from 4 to 5

Release notes

Sourced from actions/upload-artifact's releases.

v5.0.0

What's Changed

BREAKING CHANGE: this update supports Node v24.x. This is not a breaking change per-se but we're treating it as such.

New Contributors

Full Changelog: actions/upload-artifact@v4...v5.0.0

v4.6.2

What's Changed

New Contributors

Full Changelog: actions/upload-artifact@v4...v4.6.2

v4.6.1

What's Changed

Full Changelog: actions/upload-artifact@v4...v4.6.1

v4.6.0

What's Changed

Full Changelog: actions/upload-artifact@v4...v4.6.0

v4.5.0

What's Changed

New Contributors

... (truncated)

Commits
  • 330a01c Merge pull request #734 from actions/danwkennedy/prepare-5.0.0
  • 03f2824 Update github.dep.yml
  • 905a1ec Prepare v5.0.0
  • 2d9f9cd Merge pull request #725 from patrikpolyak/patch-1
  • 9687587 Merge branch 'main' into patch-1
  • 2848b2c Merge pull request #727 from danwkennedy/patch-1
  • 9b51177 Spell out the first use of GHES
  • cd231ca Update GHES guidance to include reference to Node 20 version
  • de65e23 Merge pull request #712 from actions/nebuk89-patch-1
  • 8747d8c Update README.md
  • Additional commits viewable in compare view

Updates actions/download-artifact from 5 to 6

Release notes

Sourced from actions/download-artifact's releases.

v6.0.0

What's Changed

BREAKING CHANGE: this update supports Node v24.x. This is not a breaking change per-se but we're treating it as such.

New Contributors

Full Changelog: actions/download-artifact@v5...v6.0.0

Commits
  • 018cc2c Merge pull request #438 from actions/danwkennedy/prepare-6.0.0
  • 815651c Revert "Remove github.dep.yml"
  • bb3a066 Remove github.dep.yml
  • fa1ce46 Prepare v6.0.0
  • 4a24838 Merge pull request #431 from danwkennedy/patch-1
  • 5e3251c Readme: spell out the first use of GHES
  • abefc31 Merge pull request #424 from actions/yacaovsnc/update_readme
  • ac43a60 Update README with artifact extraction details
  • de96f46 Merge pull request #417 from actions/yacaovsnc/update_readme
  • 7993cb4 Remove migration guide for artifact download changes
  • Additional commits viewable in compare view

Updates pypa/gh-action-pypi-publish from 1.12.4 to 1.13.0

Release notes

Sourced from pypa/gh-action-pypi-publish's releases.

v1.13.0

[!important] 🚨 This release includes fixes for GHSA-vxmw-7h4f-hqxh discovered by @woodruffw. We've also integrated Zizmor to catch similar issues in the future and you should too.

✨ New Stuff

@woodruffw updated the README to no longer mention the attestations feature being experimental in #347: it's been rather stable for a year already 🎉 He also added more diagnostic output which includes printing out the GitHub Environment claim via #371 and warning about the unsupported reusable workflows configurations #306, when using Trusted Publishing.

[!tip] The official support for reusable workflows is currently blocked on changes to PyPI. To get updates about progress on the action side, you may want to subscribe to #166. At PyCon US 2025 Sprints, @facutuesca, @miketheman, @woodruffw and I spent several hours IRL brainstorming how to fix this and migrate projects that happen to rely on an obscure corner case with reusable workflows that temporarily allows them to function by accident. The result of that discussion is posted @ pypi/warehouse#11096. Note that this is a volunteer-led effort and there is no ETA. If you need this soon, make your employer sponsor the PSF and maybe they'll be able to hire somebody for this work on Warehouse.

In addition to that, @konstin sent #378 to pin actions/setup-python to a SHA hash. This makes pypi-publish compatible with new GitHub policies that allow organizations to mandate hash-pinning actions used in workflows.
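
As an illustration of what hash-pinning looks like in a workflow step (the 40-character SHA below is a placeholder, not the commit actually pinned in #378):

```yaml
# Illustrative only: pin an action to a full commit SHA instead of a mutable tag.
# The SHA here is a placeholder; resolve the real one from the action's repository.
steps:
  - uses: actions/setup-python@0123456789abcdef0123456789abcdef01234567  # v6
    with:
      python-version: "3.13"
```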

🛠️ Internal Dependencies

@webknjaz made a bunch of updates to the action runtime which includes bumping it to Python 3.13 in #331 and updating the dependency tree across the board. pip-with-requires-python is no longer being installed (#332). Some related bumps were contributed by @woodruffw (#359) and @kurtmckee sent a contributor-facing PR, bumping the linting configuration via #335.

💪 New Contributors

🪞 Full Diff: pypa/gh-action-pypi-publish@v1.12.4...v1.13.0

🧔‍♂️ Release Manager: @​webknjaz 🇺🇦

💬 Discuss on Bluesky 🦋, on Mastodon 🐘 and on GitHub.


Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore <dependency name> major version will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
  • @dependabot ignore <dependency name> minor version will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
  • @dependabot ignore <dependency name> will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
  • @dependabot unignore <dependency name> will remove all of the ignore conditions of the specified dependency
  • @dependabot unignore <dependency name> <ignore condition> will remove the ignore condition of the specified dependency and ignore conditions

Important

Bump versions of GitHub Actions and tools across multiple workflows for improved features and fixes.

  • Actions Version Updates:
    • Update actions/setup-python from v5 to v6 in arm-unit-tests.yml, check-sdist.yml, labeler-title-regex.yml, lint.yml, publish_pypi.yml, update_tracking_issue.yml, and wheels.yml.
    • Update actions/upload-artifact from v4 to v5 in cuda-ci.yml, emscripten.yml, lint.yml, and wheels.yml.
    • Update actions/download-artifact from v5 to v6 in cuda-ci.yml, emscripten.yml, and wheels.yml.
  • Tool Version Updates:
    • Update scientific-python/action-towncrier-changelog from v1 to v2 in check-changelog.yml.
    • Update github/codeql-action from v3 to v4 in codeql.yml.
    • Update pypa/cibuildwheel from 3.1.4 to 3.2.1 in cuda-ci.yml and emscripten.yml.
    • Update pypa/gh-action-pypi-publish from 1.12.4 to 1.13.0 in publish_pypi.yml.

This description was created by Ellipsis for 95f6e97. You can customize this summary. It will automatically update as commits are pushed.

Bumps the actions group with 7 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| [actions/setup-python](https://github.com/actions/setup-python) | `5` | `6` |
| [scientific-python/action-towncrier-changelog](https://github.com/scientific-python/action-towncrier-changelog) | `1` | `2` |
| [github/codeql-action](https://github.com/github/codeql-action) | `3` | `4` |
| [pypa/cibuildwheel](https://github.com/pypa/cibuildwheel) | `3.1.4` | `3.2.1` |
| [actions/upload-artifact](https://github.com/actions/upload-artifact) | `4` | `5` |
| [actions/download-artifact](https://github.com/actions/download-artifact) | `5` | `6` |
| [pypa/gh-action-pypi-publish](https://github.com/pypa/gh-action-pypi-publish) | `1.12.4` | `1.13.0` |



Updates `actions/setup-python` from 5 to 6
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](actions/setup-python@v5...v6)

Updates `scientific-python/action-towncrier-changelog` from 1 to 2
- [Release notes](https://github.com/scientific-python/action-towncrier-changelog/releases)
- [Changelog](https://github.com/scientific-python/action-towncrier-changelog/blob/main/check_changelog.py)
- [Commits](scientific-python/action-towncrier-changelog@v1...v2)

Updates `github/codeql-action` from 3 to 4
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](github/codeql-action@v3...v4)

Updates `pypa/cibuildwheel` from 3.1.4 to 3.2.1
- [Release notes](https://github.com/pypa/cibuildwheel/releases)
- [Changelog](https://github.com/pypa/cibuildwheel/blob/main/docs/changelog.md)
- [Commits](pypa/cibuildwheel@c923d83...9c00cb4)

Updates `actions/upload-artifact` from 4 to 5
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](actions/upload-artifact@v4...v5)

Updates `actions/download-artifact` from 5 to 6
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](actions/download-artifact@v5...v6)

Updates `pypa/gh-action-pypi-publish` from 1.12.4 to 1.13.0
- [Release notes](https://github.com/pypa/gh-action-pypi-publish/releases)
- [Commits](pypa/gh-action-pypi-publish@76f52bc...ed0c539)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: actions
- dependency-name: scientific-python/action-towncrier-changelog
  dependency-version: '2'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: actions
- dependency-name: github/codeql-action
  dependency-version: '4'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: actions
- dependency-name: pypa/cibuildwheel
  dependency-version: 3.2.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
- dependency-name: actions/upload-artifact
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: actions
- dependency-name: actions/download-artifact
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: actions
- dependency-name: pypa/gh-action-pypi-publish
  dependency-version: 1.13.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions
...

Signed-off-by: dependabot[bot] <support@github.com>

dependabot bot commented on behalf of github Nov 1, 2025

Labels

The following labels could not be found: Build / CI, dependencies. Please create them before Dependabot can add them to a pull request.

Please fix the above issues or remove invalid values from dependabot.yml.
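
For reference, labels are declared per update entry in .github/dependabot.yml; the sketch below shows where they would live, assuming a github-actions entry with an actions group like the one in this PR (the schedule and other details are placeholders, and the repository's actual file may differ):

```yaml
# .github/dependabot.yml -- illustrative entry; adjust to the repository's actual config
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "monthly"          # placeholder
    groups:
      actions:
        patterns:
          - "*"
    labels:
      - "Build / CI"               # both labels must already exist in the repository
      - "dependencies"
```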


github-actions bot commented Nov 1, 2025

❌ Linting issues

This PR introduces linting issues; a summary is given below. Note that you can avoid such issues by enabling pre-commit hooks. Instructions to enable them can be found here.
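
A minimal sketch of a pre-commit configuration that runs ruff automatically, assuming the astral-sh/ruff-pre-commit hook and the ruff version reported further down; the project's actual .pre-commit-config.yaml may differ:

```yaml
# .pre-commit-config.yaml -- illustrative; align rev with the ruff version used in CI
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.11.7
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
```

After adding the file, running `pre-commit install` once enables the hooks for every subsequent commit.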

You can see the details of the linting issues under the lint job here.


ruff check

ruff detected issues. Please run `ruff check --fix --output-format=full` locally, fix the remaining issues, and push the changes. The detected issues are listed below. Note that the installed ruff version is 0.11.7.


doc/conf.py:271:60: W291 [*] Trailing whitespace
    |
269 |         "alt_text": "JAX-sklearn homepage",
270 |         "image_relative": "logos/scikit-learn-logo-small.png",
271 |         "image_light": "logos/scikit-learn-logo-small.png", 
    |                                                            ^ W291
272 |         "image_dark": "logos/scikit-learn-logo-small.png",
273 |     },
    |
    = help: Remove trailing whitespace

examples/applications/plot_cyclical_feature_engineering.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_digits_denoising.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_face_recognition.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_model_complexity_influence.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_out_of_core_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_outlier_detection_wine.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_prediction_latency.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_species_distribution_modeling.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_stock_market.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_time_series_lagged_features.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_tomography_l1_reconstruction.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/plot_topics_extraction_with_nmf_lda.py:1:1: CPY001 Missing copyright notice at top of file
examples/applications/wikipedia_principal_eigenvector.py:1:1: CPY001 Missing copyright notice at top of file
examples/bicluster/plot_bicluster_newsgroups.py:1:1: CPY001 Missing copyright notice at top of file
examples/bicluster/plot_spectral_biclustering.py:1:1: CPY001 Missing copyright notice at top of file
examples/bicluster/plot_spectral_coclustering.py:1:1: CPY001 Missing copyright notice at top of file
examples/calibration/plot_calibration.py:1:1: CPY001 Missing copyright notice at top of file
examples/calibration/plot_calibration_curve.py:1:1: CPY001 Missing copyright notice at top of file
examples/calibration/plot_calibration_multiclass.py:1:1: CPY001 Missing copyright notice at top of file
examples/calibration/plot_compare_calibration.py:1:1: CPY001 Missing copyright notice at top of file
examples/classification/plot_classification_probability.py:1:1: CPY001 Missing copyright notice at top of file
examples/classification/plot_classifier_comparison.py:1:1: CPY001 Missing copyright notice at top of file
examples/classification/plot_digits_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/classification/plot_lda.py:1:1: CPY001 Missing copyright notice at top of file
examples/classification/plot_lda_qda.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_adjusted_for_chance_measures.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_affinity_propagation.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_agglomerative_clustering.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_agglomerative_clustering_metrics.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_agglomerative_dendrogram.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_birch_vs_minibatchkmeans.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_bisect_kmeans.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_cluster_comparison.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_coin_segmentation.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_coin_ward_segmentation.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_dbscan.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_dict_face_patches.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_digits_agglomeration.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_digits_linkage.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_face_compress.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_feature_agglomeration_vs_univariate_selection.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_hdbscan.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_inductive_clustering.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_kmeans_assumptions.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_kmeans_digits.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_kmeans_plusplus.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_kmeans_silhouette_analysis.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_kmeans_stability_low_dim_dense.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_linkage_comparison.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_mean_shift.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_mini_batch_kmeans.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_optics.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_segmentation_toy.py:1:1: CPY001 Missing copyright notice at top of file
examples/cluster/plot_ward_structured_vs_unstructured.py:1:1: CPY001 Missing copyright notice at top of file
examples/compose/plot_column_transformer.py:1:1: CPY001 Missing copyright notice at top of file
examples/compose/plot_column_transformer_mixed_types.py:1:1: CPY001 Missing copyright notice at top of file
examples/compose/plot_compare_reduction.py:1:1: CPY001 Missing copyright notice at top of file
examples/compose/plot_digits_pipe.py:1:1: CPY001 Missing copyright notice at top of file
examples/compose/plot_feature_union.py:1:1: CPY001 Missing copyright notice at top of file
examples/compose/plot_transformed_target.py:1:1: CPY001 Missing copyright notice at top of file
examples/covariance/plot_covariance_estimation.py:1:1: CPY001 Missing copyright notice at top of file
examples/covariance/plot_lw_vs_oas.py:1:1: CPY001 Missing copyright notice at top of file
examples/covariance/plot_mahalanobis_distances.py:1:1: CPY001 Missing copyright notice at top of file
examples/covariance/plot_robust_vs_empirical_covariance.py:1:1: CPY001 Missing copyright notice at top of file
examples/covariance/plot_sparse_cov.py:1:1: CPY001 Missing copyright notice at top of file
examples/cross_decomposition/plot_compare_cross_decomposition.py:1:1: CPY001 Missing copyright notice at top of file
examples/cross_decomposition/plot_pcr_vs_pls.py:1:1: CPY001 Missing copyright notice at top of file
examples/datasets/plot_random_multilabel_dataset.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_faces_decomposition.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_ica_blind_source_separation.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_ica_vs_pca.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_image_denoising.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_incremental_pca.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_kernel_pca.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_pca_iris.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_pca_vs_fa_model_selection.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_pca_vs_lda.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_sparse_coding.py:1:1: CPY001 Missing copyright notice at top of file
examples/decomposition/plot_varimax_fa.py:1:1: CPY001 Missing copyright notice at top of file
examples/developing_estimators/xlearn_is_fitted.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_adaboost_multiclass.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_adaboost_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_adaboost_twoclass.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_bias_variance.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_ensemble_oob.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_feature_transformation.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_forest_hist_grad_boosting_comparison.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_forest_importances.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_forest_iris.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_gradient_boosting_categorical.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_gradient_boosting_early_stopping.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_gradient_boosting_oob.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_gradient_boosting_quantile.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_gradient_boosting_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_gradient_boosting_regularization.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_hgbt_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_isolation_forest.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_monotonic_constraints.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_random_forest_embedding.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_random_forest_regression_multioutput.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_stack_predictors.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_voting_decision_regions.py:1:1: CPY001 Missing copyright notice at top of file
examples/ensemble/plot_voting_regressor.py:1:1: CPY001 Missing copyright notice at top of file
examples/feature_selection/plot_f_test_vs_mi.py:1:1: CPY001 Missing copyright notice at top of file
examples/feature_selection/plot_feature_selection.py:1:1: CPY001 Missing copyright notice at top of file
examples/feature_selection/plot_feature_selection_pipeline.py:1:1: CPY001 Missing copyright notice at top of file
examples/feature_selection/plot_rfe_digits.py:1:1: CPY001 Missing copyright notice at top of file
examples/feature_selection/plot_rfe_with_cross_validation.py:1:1: CPY001 Missing copyright notice at top of file
examples/feature_selection/plot_select_from_model_diabetes.py:1:1: CPY001 Missing copyright notice at top of file
examples/frozen/plot_frozen_examples.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_compare_gpr_krr.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpc.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpc_iris.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpc_isoprobability.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpc_xor.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpr_co2.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpr_noisy.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpr_noisy_targets.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpr_on_structured_data.py:1:1: CPY001 Missing copyright notice at top of file
examples/gaussian_process/plot_gpr_prior_posterior.py:1:1: CPY001 Missing copyright notice at top of file
examples/impute/plot_iterative_imputer_variants_comparison.py:1:1: CPY001 Missing copyright notice at top of file
examples/impute/plot_missing_values.py:1:1: CPY001 Missing copyright notice at top of file
examples/inspection/plot_causal_interpretation.py:1:1: CPY001 Missing copyright notice at top of file
examples/inspection/plot_linear_model_coefficient_interpretation.py:1:1: CPY001 Missing copyright notice at top of file
examples/inspection/plot_partial_dependence.py:1:1: CPY001 Missing copyright notice at top of file
examples/inspection/plot_permutation_importance.py:1:1: CPY001 Missing copyright notice at top of file
examples/inspection/plot_permutation_importance_multicollinear.py:1:1: CPY001 Missing copyright notice at top of file
examples/kernel_approximation/plot_scalable_poly_kernels.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_ard.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_bayesian_ridge_curvefit.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_elastic_net_precomputed_gram_matrix_with_weighted_samples.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_huber_vs_ridge.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_lasso_and_elasticnet.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_lasso_dense_vs_sparse_data.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_lasso_lars_ic.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_lasso_lasso_lars_elasticnet_path.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_lasso_model_selection.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_logistic.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_logistic_l1_l2_sparsity.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_logistic_multinomial.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_logistic_path.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_multi_task_lasso_support.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_nnls.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_ols_ridge.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_omp.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_poisson_regression_non_normal_loss.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_polynomial_interpolation.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_quantile_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_ransac.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_ridge_coeffs.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_ridge_path.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_robust_fit.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgd_early_stopping.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgd_iris.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgd_loss_functions.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgd_penalties.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgd_separating_hyperplane.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgd_weighted_samples.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sgdocsvm_vs_ocsvm.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sparse_logistic_regression_20newsgroups.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_sparse_logistic_regression_mnist.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_theilsen.py:1:1: CPY001 Missing copyright notice at top of file
examples/linear_model/plot_tweedie_regression_insurance_claims.py:1:1: CPY001 Missing copyright notice at top of file
examples/manifold/plot_compare_methods.py:1:1: CPY001 Missing copyright notice at top of file
examples/manifold/plot_lle_digits.py:1:1: CPY001 Missing copyright notice at top of file
examples/manifold/plot_manifold_sphere.py:1:1: CPY001 Missing copyright notice at top of file
examples/manifold/plot_mds.py:1:1: CPY001 Missing copyright notice at top of file
examples/manifold/plot_swissroll.py:1:1: CPY001 Missing copyright notice at top of file
examples/manifold/plot_t_sne_perplexity.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_anomaly_comparison.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_display_object_visualization.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_estimator_representation.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_isotonic_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_johnson_lindenstrauss_bound.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_kernel_approximation.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_kernel_ridge_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_metadata_routing.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_multilabel.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_multioutput_face_completion.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_outlier_detection_bench.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_partial_dependence_visualization_api.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_pipeline_display.py:1:1: CPY001 Missing copyright notice at top of file
examples/miscellaneous/plot_roc_curve_visualization_api.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_concentration_prior.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_gmm.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_gmm_covariances.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_gmm_init.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_gmm_pdf.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_gmm_selection.py:1:1: CPY001 Missing copyright notice at top of file
examples/mixture/plot_gmm_sin.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_confusion_matrix.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_cost_sensitive_learning.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_cv_indices.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_cv_predict.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_det.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_grid_search_digits.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_grid_search_refit_callable.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_grid_search_stats.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_grid_search_text_feature_extraction.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_learning_curve.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_likelihood_ratios.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_multi_metric_evaluation.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_nested_cross_validation_iris.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_permutation_tests_for_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_randomized_search.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_roc.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_roc_crossval.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_successive_halving_heatmap.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_successive_halving_iterations.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_train_error_vs_test_error.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_tuned_decision_threshold.py:1:1: CPY001 Missing copyright notice at top of file
examples/model_selection/plot_underfitting_overfitting.py:1:1: CPY001 Missing copyright notice at top of file
examples/multiclass/plot_multiclass_overview.py:1:1: CPY001 Missing copyright notice at top of file
examples/multioutput/plot_classifier_chain_yeast.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/approximate_nearest_neighbors.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_caching_nearest_neighbors.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_digits_kde_sampling.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_kde_1d.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_lof_novelty_detection.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_lof_outlier_detection.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_nca_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_nca_dim_reduction.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_nca_illustration.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_nearest_centroid.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/neighbors/plot_species_kde.py:1:1: CPY001 Missing copyright notice at top of file
examples/neural_networks/plot_mlp_alpha.py:1:1: CPY001 Missing copyright notice at top of file
examples/neural_networks/plot_mlp_training_curves.py:1:1: CPY001 Missing copyright notice at top of file
examples/neural_networks/plot_mnist_filters.py:1:1: CPY001 Missing copyright notice at top of file
examples/neural_networks/plot_rbm_logistic_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_all_scaling.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_discretization.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_discretization_classification.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_discretization_strategies.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_map_data_to_normal.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_scaling_importance.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_target_encoder.py:1:1: CPY001 Missing copyright notice at top of file
examples/preprocessing/plot_target_encoder_cross_val.py:1:1: CPY001 Missing copyright notice at top of file
examples/release_highlights/plot_release_highlights_0_22_0.py:1:1: CPY001 Missing copyright notice at top of file
examples/semi_supervised/plot_label_propagation_digits.py:1:1: CPY001 Missing copyright notice at top of file
examples/semi_supervised/plot_label_propagation_digits_active_learning.py:1:1: CPY001 Missing copyright notice at top of file
examples/semi_supervised/plot_label_propagation_structure.py:1:1: CPY001 Missing copyright notice at top of file
examples/semi_supervised/plot_self_training_varying_threshold.py:1:1: CPY001 Missing copyright notice at top of file
examples/semi_supervised/plot_semi_supervised_newsgroups.py:1:1: CPY001 Missing copyright notice at top of file
examples/semi_supervised/plot_semi_supervised_versus_svm_iris.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_custom_kernel.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_iris_svc.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_linearsvc_support_vectors.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_oneclass.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_separating_hyperplane.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_separating_hyperplane_unbalanced.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_svm_anova.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_svm_kernels.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_svm_margin.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_svm_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_svm_scale_c.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_svm_tie_breaking.py:1:1: CPY001 Missing copyright notice at top of file
examples/svm/plot_weighted_samples.py:1:1: CPY001 Missing copyright notice at top of file
examples/text/plot_document_classification_20newsgroups.py:1:1: CPY001 Missing copyright notice at top of file
examples/text/plot_document_clustering.py:1:1: CPY001 Missing copyright notice at top of file
examples/text/plot_hashing_vs_dict_vectorizer.py:1:1: CPY001 Missing copyright notice at top of file
examples/tree/plot_cost_complexity_pruning.py:1:1: CPY001 Missing copyright notice at top of file
examples/tree/plot_iris_dtc.py:1:1: CPY001 Missing copyright notice at top of file
examples/tree/plot_tree_regression.py:1:1: CPY001 Missing copyright notice at top of file
examples/tree/plot_unveil_tree_structure.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/__check_build/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/__init__.py:71:22: RUF100 [*] Unused `noqa` directive (unused: `E402`)
   |
69 | # Skip build check for development
70 | try:
71 |     from . import (  # noqa: F401 E402
   |                      ^^^^^^^^^^^^^^^^^ RUF100
72 |         __check_build,
73 |         _distributor_init,
   |
   = help: Remove unused `noqa` directive

xlearn/__init__.py:85:1: W293 [*] Blank line contains whitespace
   |
83 |     from . import _jax  # noqa: F401
84 |     _JAX_ENABLED = True
85 |     
   | ^^^^ W293
86 |     # Auto-enable JAX acceleration for all algorithms
87 |     from ._jax._proxy import create_intelligent_proxy
   |
   = help: Remove whitespace from blank line

xlearn/__init__.py:87:5: I001 [*] Import block is un-sorted or un-formatted
   |
86 |       # Auto-enable JAX acceleration for all algorithms
87 | /     from ._jax._proxy import create_intelligent_proxy
88 | |     from ._jax._accelerator import AcceleratorRegistry
   | |______________________________________________________^ I001
89 |       
90 |       # Create global registry
   |
   = help: Organize imports

xlearn/__init__.py:89:1: W293 [*] Blank line contains whitespace
   |
87 |     from ._jax._proxy import create_intelligent_proxy
88 |     from ._jax._accelerator import AcceleratorRegistry
89 |     
   | ^^^^ W293
90 |     # Create global registry
91 |     _jax_registry = AcceleratorRegistry()
   |
   = help: Remove whitespace from blank line

xlearn/__init__.py:92:1: W293 [*] Blank line contains whitespace
   |
90 |     # Create global registry
91 |     _jax_registry = AcceleratorRegistry()
92 |     
   | ^^^^ W293
93 |     def _auto_jax_accelerate_module(module_name):
94 |         """Automatically add JAX acceleration to all estimators in a module."""
   |
   = help: Remove whitespace from blank line

xlearn/__init__.py:97:1: W293 [*] Blank line contains whitespace
   |
95 |         try:
96 |             module = _importlib.import_module(f'.{module_name}', package=__name__)
97 |             
   | ^^^^^^^^^^^^ W293
98 |             # Get all classes that look like estimators
99 |             for attr_name in dir(module):
   |
   = help: Remove whitespace from blank line

xlearn/__init__.py:102:1: W293 [*] Blank line contains whitespace
    |
100 |                 if not attr_name.startswith('_'):
101 |                     attr = getattr(module, attr_name)
102 |                     
    | ^^^^^^^^^^^^^^^^^^^^ W293
103 |                     # Check if it's a class and looks like an estimator
104 |                     if (isinstance(attr, type) and 
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:104:51: W291 [*] Trailing whitespace
    |
103 |                     # Check if it's a class and looks like an estimator
104 |                     if (isinstance(attr, type) and 
    |                                                   ^ W291
105 |                         hasattr(attr, 'fit') and 
106 |                         attr.__module__.startswith('xlearn.')):
    |
    = help: Remove trailing whitespace

xlearn/__init__.py:105:49: W291 [*] Trailing whitespace
    |
103 |                     # Check if it's a class and looks like an estimator
104 |                     if (isinstance(attr, type) and 
105 |                         hasattr(attr, 'fit') and 
    |                                                 ^ W291
106 |                         attr.__module__.startswith('xlearn.')):
    |
    = help: Remove trailing whitespace

xlearn/__init__.py:107:1: W293 [*] Blank line contains whitespace
    |
105 |                         hasattr(attr, 'fit') and 
106 |                         attr.__module__.startswith('xlearn.')):
107 |                         
    | ^^^^^^^^^^^^^^^^^^^^^^^^ W293
108 |                         # Create intelligent proxy
109 |                         proxy_class = create_intelligent_proxy(attr)
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:110:1: W293 [*] Blank line contains whitespace
    |
108 |                         # Create intelligent proxy
109 |                         proxy_class = create_intelligent_proxy(attr)
110 |                         
    | ^^^^^^^^^^^^^^^^^^^^^^^^ W293
111 |                         # Replace in module
112 |                         setattr(module, attr_name, proxy_class)
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:113:1: W293 [*] Blank line contains whitespace
    |
111 |                         # Replace in module
112 |                         setattr(module, attr_name, proxy_class)
113 |                         
    | ^^^^^^^^^^^^^^^^^^^^^^^^ W293
114 |         except Exception as e:
115 |             logger.debug(f"Failed to auto-accelerate module {module_name}: {e}")
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:116:1: W293 [*] Blank line contains whitespace
    |
114 |         except Exception as e:
115 |             logger.debug(f"Failed to auto-accelerate module {module_name}: {e}")
116 |     
    | ^^^^ W293
117 | except ImportError:
118 |     _JAX_ENABLED = False
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:120:1: W293 [*] Blank line contains whitespace
    |
118 |     _JAX_ENABLED = False
119 |     _jax_registry = None
120 |     
    | ^^^^ W293
121 |     def _auto_jax_accelerate_module(module_name):
122 |         """No-op when JAX is not available."""
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:183:1: W293 [*] Blank line contains whitespace
    |
181 |     if name in _submodules:
182 |         module = _importlib.import_module(f"xlearn.{name}")
183 |         
    | ^^^^^^^^ W293
184 |         # Auto-apply JAX acceleration if enabled
185 |         if _JAX_ENABLED:
    |
    = help: Remove whitespace from blank line

xlearn/__init__.py:187:1: W293 [*] Blank line contains whitespace
    |
185 |         if _JAX_ENABLED:
186 |             _auto_jax_accelerate_module(name)
187 |             
    | ^^^^^^^^^^^^ W293
188 |         return module
189 |     else:
    |
    = help: Remove whitespace from blank line

xlearn/_config.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_distributor_init.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/__init__.py:9:1: W293 Blank line contains whitespace
   |
 7 |     import xlearn
 8 |     # JAX acceleration is automatically enabled if JAX is available
 9 |     
   | ^^^^ W293
10 |     # Or explicitly configure:
11 |     import xlearn._jax
   |
   = help: Remove whitespace from blank line

xlearn/_jax/__init__.py:20:30: F401 `typing.Union` imported but unused
   |
18 | import os
19 | import warnings
20 | from typing import Optional, Union
   |                              ^^^^^ F401
21 |
22 | # Try to import JAX
   |
   = help: Remove unused import: `typing.Union`
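This family of F401 findings is mechanical to fix: drop the unused name and keep the ones still referenced, e.g. for the line above:

```python
# Sketch: `Union` removed, `Optional` kept because the module still uses it.
from typing import Optional
```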

xlearn/_jax/__init__.py:32:1: I001 [*] Import block is un-sorted or un-formatted
   |
30 |       _JAX_AVAILABLE = False
31 |
32 | / from ._config import get_config, set_config, config_context
33 | | from ._accelerator import AcceleratorRegistry
   | |_____________________________________________^ I001
34 |
35 |   # Global accelerator registry
   |
   = help: Organize imports
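For this block, I001 is satisfied by ordering the first-party imports by module name and sorting the names within each `from` import. A sketch of the ordering as it would appear inside `xlearn/_jax/__init__.py`, under ruff's default isort settings:

```python
# Sketch: modules alphabetized, imported names alphabetized within each line.
from ._accelerator import AcceleratorRegistry
from ._config import config_context, get_config, set_config
```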

xlearn/_jax/__init__.py:53:1: W293 Blank line contains whitespace
   |
51 | def enable_jax_acceleration(platform: Optional[str] = None) -> bool:
52 |     """Enable JAX acceleration.
53 |     
   | ^^^^ W293
54 |     Parameters
55 |     ----------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/__init__.py:58:1: W293 Blank line contains whitespace
   |
56 |     platform : str, optional
57 |         JAX platform to use ('cpu', 'gpu', 'tpu'). If None, uses default.
58 |         
   | ^^^^^^^^ W293
59 |     Returns
60 |     -------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/__init__.py:71:1: W293 [*] Blank line contains whitespace
   |
69 |         )
70 |         return False
71 |     
   | ^^^^ W293
72 |     try:
73 |         if platform:
   |
   = help: Remove whitespace from blank line

xlearn/_jax/__init__.py:76:1: W293 [*] Blank line contains whitespace
   |
74 |             # Configure JAX platform if specified
75 |             os.environ['JAX_PLATFORM_NAME'] = platform.lower()
76 |         
   | ^^^^^^^^ W293
77 |         # Update configuration
78 |         set_config(enable_jax=True, jax_platform=platform or "auto")
   |
   = help: Remove whitespace from blank line

xlearn/_jax/__init__.py:80:1: W293 [*] Blank line contains whitespace
   |
78 |         set_config(enable_jax=True, jax_platform=platform or "auto")
79 |         return True
80 |         
   | ^^^^^^^^ W293
81 |     except Exception as e:
82 |         warnings.warn(f"Failed to enable JAX acceleration: {e}", UserWarning)
   |
   = help: Remove whitespace from blank line

xlearn/_jax/__init__.py:90:23: W291 [*] Trailing whitespace
   |
89 | # Auto-enable JAX if available and not explicitly disabled
90 | if (_JAX_AVAILABLE and 
   |                       ^ W291
91 |     os.environ.get('XLEARN_DISABLE_JAX', '').lower() not in ('1', 'true', 'yes')):
92 |     enable_jax_acceleration()
   |
   = help: Remove trailing whitespace

xlearn/_jax/__init__.py:94:11: RUF022 [*] `__all__` is not sorted
    |
 92 |       enable_jax_acceleration()
 93 |
 94 |   __all__ = [
    |  ___________^
 95 | |     'is_jax_available',
 96 | |     'get_jax_platform', 
 97 | |     'enable_jax_acceleration',
 98 | |     'disable_jax_acceleration',
 99 | |     'get_config',
100 | |     'set_config',
101 | |     'config_context',
102 | | ]
    | |_^ RUF022
    |
    = help: Apply an isort-style sorting to `__all__`
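RUF022 wants `__all__` in isort-style order (applying it also removes the W291 flagged just below). A sketch of the sorted list built from the same seven names:

```python
# Sketch: the same exports, sorted the way RUF022 expects.
__all__ = [
    'config_context',
    'disable_jax_acceleration',
    'enable_jax_acceleration',
    'get_config',
    'get_jax_platform',
    'is_jax_available',
    'set_config',
]
```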

xlearn/_jax/__init__.py:96:24: W291 [*] Trailing whitespace
   |
94 | __all__ = [
95 |     'is_jax_available',
96 |     'get_jax_platform', 
   |                        ^ W291
97 |     'enable_jax_acceleration',
98 |     'disable_jax_acceleration',
   |
   = help: Remove trailing whitespace

xlearn/_jax/_accelerator.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/_accelerator.py:8:20: F401 [*] `typing.Any` imported but unused
   |
 6 | import functools
 7 | import warnings
 8 | from typing import Any, Callable, Dict, Optional, Type, Union
   |                    ^^^ F401
 9 |
10 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_accelerator.py:8:57: F401 [*] `typing.Union` imported but unused
   |
 6 | import functools
 7 | import warnings
 8 | from typing import Any, Callable, Dict, Optional, Type, Union
   |                                                         ^^^^^ F401
 9 |
10 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_accelerator.py:15:1: W293 [*] Blank line contains whitespace
   |
13 | class AcceleratorRegistry:
14 |     """Registry for JAX-accelerated estimator implementations."""
15 |     
   | ^^^^ W293
16 |     def __init__(self):
17 |         self._accelerators: Dict[Type, Type] = {}
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:19:1: W293 [*] Blank line contains whitespace
   |
17 |         self._accelerators: Dict[Type, Type] = {}
18 |         self._enabled = True
19 |     
   | ^^^^ W293
20 |     def register(self, original_class: Type, accelerated_class: Type) -> None:
21 |         """Register a JAX-accelerated implementation.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:22:1: W293 Blank line contains whitespace
   |
20 |     def register(self, original_class: Type, accelerated_class: Type) -> None:
21 |         """Register a JAX-accelerated implementation.
22 |         
   | ^^^^^^^^ W293
23 |         Parameters
24 |         ----------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:31:1: W293 [*] Blank line contains whitespace
   |
29 |         """
30 |         self._accelerators[original_class] = accelerated_class
31 |     
   | ^^^^ W293
32 |     def get_accelerated(self, original_class: Type) -> Optional[Type]:
33 |         """Get the JAX-accelerated implementation for a class.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:34:1: W293 Blank line contains whitespace
   |
32 |     def get_accelerated(self, original_class: Type) -> Optional[Type]:
33 |         """Get the JAX-accelerated implementation for a class.
34 |         
   | ^^^^^^^^ W293
35 |         Parameters
36 |         ----------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:39:1: W293 Blank line contains whitespace
   |
37 |         original_class : type
38 |             The original jax-sklearn estimator class.
39 |             
   | ^^^^^^^^^^^^ W293
40 |         Returns
41 |         -------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:48:1: W293 [*] Blank line contains whitespace
   |
46 |             return None
47 |         return self._accelerators.get(original_class)
48 |     
   | ^^^^ W293
49 |     def is_registered(self, original_class: Type) -> bool:
50 |         """Check if a class has a registered JAX implementation."""
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:52:1: W293 [*] Blank line contains whitespace
   |
50 |         """Check if a class has a registered JAX implementation."""
51 |         return original_class in self._accelerators
52 |     
   | ^^^^ W293
53 |     def enable(self) -> None:
54 |         """Enable the accelerator registry."""
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:56:1: W293 [*] Blank line contains whitespace
   |
54 |         """Enable the accelerator registry."""
55 |         self._enabled = True
56 |     
   | ^^^^ W293
57 |     def disable(self) -> None:
58 |         """Disable the accelerator registry."""
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:60:1: W293 [*] Blank line contains whitespace
   |
58 |         """Disable the accelerator registry."""
59 |         self._enabled = False
60 |     
   | ^^^^ W293
61 |     def list_accelerated(self) -> Dict[str, str]:
62 |         """List all registered accelerated implementations.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:63:1: W293 Blank line contains whitespace
   |
61 |     def list_accelerated(self) -> Dict[str, str]:
62 |         """List all registered accelerated implementations.
63 |         
   | ^^^^^^^^ W293
64 |         Returns
65 |         -------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:77:1: W293 Blank line contains whitespace
   |
75 | def accelerated_estimator(original_class: Type):
76 |     """Decorator to register a JAX-accelerated estimator implementation.
77 |     
   | ^^^^ W293
78 |     Parameters
79 |     ----------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:82:1: W293 Blank line contains whitespace
   |
80 |     original_class : type
81 |         The original jax-sklearn estimator class to accelerate.
82 |         
   | ^^^^^^^^ W293
83 |     Examples
84 |     --------
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:100:1: W293 Blank line contains whitespace
    |
 98 | def create_accelerated_estimator(original_class: Type, *args, **kwargs):
 99 |     """Create an accelerated estimator instance if available.
100 |     
    | ^^^^ W293
101 |     Parameters
102 |     ----------
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:107:1: W293 Blank line contains whitespace
    |
105 |     *args, **kwargs
106 |         Arguments to pass to the estimator constructor.
107 |         
    | ^^^^^^^^ W293
108 |     Returns
109 |     -------
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:114:1: W293 [*] Blank line contains whitespace
    |
112 |     """
113 |     from . import _registry
114 |     
    | ^^^^ W293
115 |     config = get_config()
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:116:1: W293 [*] Blank line contains whitespace
    |
115 |     config = get_config()
116 |     
    | ^^^^ W293
117 |     # Check if JAX acceleration is enabled
118 |     if not config["enable_jax"]:
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:120:1: W293 [*] Blank line contains whitespace
    |
118 |     if not config["enable_jax"]:
119 |         return original_class(*args, **kwargs)
120 |     
    | ^^^^ W293
121 |     # Check if JAX is available
122 |     try:
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:123:16: F401 `jax` imported but unused; consider using `importlib.util.find_spec` to test for availability
    |
121 |     # Check if JAX is available
122 |     try:
123 |         import jax
    |                ^^^ F401
124 |     except ImportError:
125 |         if config["fallback_on_error"]:
    |
    = help: Remove unused import: `jax`
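Ruff's suggestion here is the usual pattern for an availability probe that does not need to bind a name. A minimal sketch of that approach; the error text is the one already used in this file, and the surrounding fallback logic is only hinted at:

```python
import importlib.util

# Sketch: probe for JAX without importing it just to discard the binding.
if importlib.util.find_spec("jax") is None:
    raise ImportError(
        "JAX acceleration requested but JAX is not installed. "
        "Install JAX: pip install jax jaxlib"
    )
```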

xlearn/_jax/_accelerator.py:132:1: W293 [*] Blank line contains whitespace
    |
130 |                 "Install JAX: pip install jax jaxlib"
131 |             )
132 |     
    | ^^^^ W293
133 |     # Get accelerated implementation
134 |     accelerated_class = _registry.get_accelerated(original_class)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:145:1: W293 [*] Blank line contains whitespace
    |
143 |             )
144 |             return original_class(*args, **kwargs)
145 |     
    | ^^^^ W293
146 |     # Try to create accelerated instance
147 |     try:
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:163:1: W293 Blank line contains whitespace
    |
161 | def jax_accelerate(func: Callable) -> Callable:
162 |     """Decorator to apply JAX acceleration to a function.
163 |     
    | ^^^^ W293
164 |     Parameters
165 |     ----------
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:168:1: W293 Blank line contains whitespace
    |
166 |     func : callable
167 |         Function to accelerate with JAX.
168 |         
    | ^^^^^^^^ W293
169 |     Returns
170 |     -------
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:177:1: W293 [*] Blank line contains whitespace
    |
175 |     def wrapper(*args, **kwargs):
176 |         config = get_config()
177 |         
    | ^^^^^^^^ W293
178 |         if not config["enable_jax"]:
179 |             return func(*args, **kwargs)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:180:1: W293 [*] Blank line contains whitespace
    |
178 |         if not config["enable_jax"]:
179 |             return func(*args, **kwargs)
180 |         
    | ^^^^^^^^ W293
181 |         try:
182 |             import jax
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:183:33: F401 [*] `jax.numpy` imported but unused
    |
181 |         try:
182 |             import jax
183 |             import jax.numpy as jnp
    |                                 ^^^ F401
184 |             
185 |             # Apply JIT compilation if enabled
    |
    = help: Remove unused import: `jax.numpy`

xlearn/_jax/_accelerator.py:184:1: W293 [*] Blank line contains whitespace
    |
182 |             import jax
183 |             import jax.numpy as jnp
184 |             
    | ^^^^^^^^^^^^ W293
185 |             # Apply JIT compilation if enabled
186 |             if config["jit_compilation"]:
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:191:1: W293 [*] Blank line contains whitespace
    |
189 |             else:
190 |                 return func(*args, **kwargs)
191 |                 
    | ^^^^^^^^^^^^^^^^ W293
192 |         except Exception as e:
193 |             if config["fallback_on_error"]:
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_accelerator.py:202:1: W293 [*] Blank line contains whitespace
    |
200 |             else:
201 |                 raise
202 |     
    | ^^^^ W293
203 |     return wrapper
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_config.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/_config.py:3:37: W291 [*] Trailing whitespace
  |
1 | """Configuration management for JAX acceleration."""
2 |
3 | # Authors: The JAX-xlearn developers  
  |                                     ^^ W291
4 | # SPDX-License-Identifier: BSD-3-Clause
  |
  = help: Remove trailing whitespace

xlearn/_jax/_config.py:8:41: F401 [*] `typing.Union` imported but unused
   |
 6 | import threading
 7 | from contextlib import contextmanager
 8 | from typing import Any, Dict, Optional, Union
   |                                         ^^^^^ F401
 9 |
10 | # Global configuration for JAX acceleration
   |
   = help: Remove unused import: `typing.Union`

xlearn/_jax/_config.py:26:1: W293 Blank line contains whitespace
   |
24 | def _get_threadlocal_config() -> Dict[str, Any]:
25 |     """Get a threadlocal **mutable** configuration.
26 |     
   | ^^^^ W293
27 |     If the configuration does not exist, copy the default global configuration.
28 |     """
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:67:1: W293 Blank line contains whitespace
   |
65 |     enable_jax : bool, default=None
66 |         Enable or disable JAX acceleration.
67 |         
   | ^^^^^^^^ W293
68 |     jax_platform : str, default=None
69 |         JAX platform to use ('auto', 'cpu', 'gpu', 'tpu').
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:70:1: W293 Blank line contains whitespace
   |
68 |     jax_platform : str, default=None
69 |         JAX platform to use ('auto', 'cpu', 'gpu', 'tpu').
70 |         
   | ^^^^^^^^ W293
71 |     fallback_on_error : bool, default=None
72 |         Whether to fallback to CPU implementation on JAX errors.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:73:1: W293 Blank line contains whitespace
   |
71 |     fallback_on_error : bool, default=None
72 |         Whether to fallback to CPU implementation on JAX errors.
73 |         
   | ^^^^^^^^ W293
74 |     memory_limit_gpu : int, default=None
75 |         GPU memory limit in MB. None for auto-detection.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:76:1: W293 Blank line contains whitespace
   |
74 |     memory_limit_gpu : int, default=None
75 |         GPU memory limit in MB. None for auto-detection.
76 |         
   | ^^^^^^^^ W293
77 |     jit_compilation : bool, default=None
78 |         Enable JIT compilation of JAX functions.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:79:1: W293 Blank line contains whitespace
   |
77 |     jit_compilation : bool, default=None
78 |         Enable JIT compilation of JAX functions.
79 |         
   | ^^^^^^^^ W293
80 |     precision : str, default=None
81 |         Numerical precision ('float32', 'float64').
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:82:1: W293 Blank line contains whitespace
   |
80 |     precision : str, default=None
81 |         Numerical precision ('float32', 'float64').
82 |         
   | ^^^^^^^^ W293
83 |     debug_mode : bool, default=None
84 |         Enable debug mode with additional checks and logging.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:85:1: W293 Blank line contains whitespace
   |
83 |     debug_mode : bool, default=None
84 |         Enable debug mode with additional checks and logging.
85 |         
   | ^^^^^^^^ W293
86 |     cache_compiled_functions : bool, default=None
87 |         Cache compiled JAX functions for reuse.
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:95:1: W293 [*] Blank line contains whitespace
   |
93 |     """
94 |     local_config = _get_threadlocal_config()
95 |     
   | ^^^^ W293
96 |     if enable_jax is not None:
97 |         local_config["enable_jax"] = enable_jax
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_config.py:143:1: W293 [*] Blank line contains whitespace
    |
141 |     old_config = get_config()
142 |     set_config(**kwargs)
143 |     
    | ^^^^ W293
144 |     try:
145 |         yield
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_config.py:152:1: W293 [*] Blank line contains whitespace
    |
150 |     """Validate current configuration."""
151 |     config = get_config()
152 |     
    | ^^^^ W293
153 |     # Check JAX availability if enabled
154 |     if config["enable_jax"]:
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_config.py:156:20: F401 `jax` imported but unused; consider using `importlib.util.find_spec` to test for availability
    |
154 |     if config["enable_jax"]:
155 |         try:
156 |             import jax
    |                    ^^^ F401
157 |         except ImportError:
158 |             raise ImportError(
    |
    = help: Remove unused import: `jax`

xlearn/_jax/_config.py:162:1: W293 [*] Blank line contains whitespace
    |
160 |                 "Install JAX: pip install jax jaxlib"
161 |             )
162 |     
    | ^^^^ W293
163 |     # Validate platform
164 |     platform = config["jax_platform"]
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_config.py:167:1: W293 [*] Blank line contains whitespace
    |
165 |     if platform not in ("auto", "cpu", "gpu", "tpu"):
166 |         raise ValueError(f"Invalid jax_platform: {platform}")
167 |     
    | ^^^^ W293
168 |     # Validate precision
169 |     precision = config["precision"]
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_data_conversion.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/_data_conversion.py:6:1: I001 [*] Import block is un-sorted or un-formatted
   |
 4 |   # SPDX-License-Identifier: BSD-3-Clause
 5 |
 6 | / import functools
 7 | | import numpy as np
 8 | | from typing import Any, Union, Tuple, Optional
 9 | |
10 | | from ._config import get_config
   | |_______________________________^ I001
11 |
12 |   # Type hints
   |
   = help: Organize imports

xlearn/_jax/_data_conversion.py:200:89: E501 Line too long (95 > 88)
    |
198 |                 converted_args = []
199 |                 for arg in args:
200 |                     if isinstance(arg, (np.ndarray, list, tuple)) and not isinstance(arg, str):
    |                                                                                         ^^^^^^^ E501
201 |                         try:
202 |                             converted_args.append(to_jax(arg))
    |
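Both E501 hits in `_data_conversion.py` come from the same long `isinstance(...)` test. One way to get under the 88-character limit, sketched with a hypothetical helper name:

```python
import numpy as np

# Sketch of a hypothetical helper that keeps the conversion loop under 88 chars.
def _is_array_like(value) -> bool:
    """True for inputs worth converting to JAX (arrays/sequences, not strings)."""
    return isinstance(value, (np.ndarray, list, tuple)) and not isinstance(value, str)
```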

xlearn/_jax/_data_conversion.py:211:89: E501 Line too long (99 > 88)
    |
209 |                 converted_kwargs = {}
210 |                 for key, value in kwargs.items():
211 |                     if isinstance(value, (np.ndarray, list, tuple)) and not isinstance(value, str):
    |                                                                                         ^^^^^^^^^^^ E501
212 |                         try:
213 |                             converted_kwargs[key] = to_jax(value)
    |

xlearn/_jax/_proxy.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/_proxy.py:6:1: I001 [*] Import block is un-sorted or un-formatted
   |
 4 |   # SPDX-License-Identifier: BSD-3-Clause
 5 |
 6 | / import functools
 7 | | import warnings
 8 | | from typing import Any, Type, Optional
 9 | |
10 | | from ._config import get_config
11 | | from ._accelerator import create_accelerated_estimator
12 | | from ._data_conversion import to_numpy, is_jax_array
13 | | from ._universal_jax import (
14 | |     JAXLinearModelMixin,
15 | |     JAXClusterMixin,
16 | |     JAXDecompositionMixin,
17 | |     create_jax_accelerated_class
18 | | )
   | |_^ I001
   |
   = help: Organize imports

xlearn/_jax/_proxy.py:8:20: F401 [*] `typing.Any` imported but unused
   |
 6 | import functools
 7 | import warnings
 8 | from typing import Any, Type, Optional
   |                    ^^^ F401
 9 |
10 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_proxy.py:8:31: F401 [*] `typing.Optional` imported but unused
   |
 6 | import functools
 7 | import warnings
 8 | from typing import Any, Type, Optional
   |                               ^^^^^^^^ F401
 9 |
10 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_proxy.py:22:89: E501 Line too long (91 > 88)
   |
21 | class EstimatorProxy:
22 |     """Proxy class that transparently switches between JAX and original implementations."""
   |                                                                                         ^^^ E501
23 |
24 |     def __init__(self, original_class: Type, *args, **kwargs):
   |

xlearn/_jax/_proxy.py:57:17: I001 [*] Import block is un-sorted or un-formatted
   |
56 |                   # Check if we got a JAX implementation
57 | /                 from ._accelerator import AcceleratorRegistry
58 | |                 from . import _registry
   | |_______________________________________^ I001
59 |                   accelerated_class = _registry.get_accelerated(self._original_class)
60 |                   self._using_jax = (accelerated_class is not None and
   |
   = help: Organize imports

xlearn/_jax/_proxy.py:57:43: F401 [*] `._accelerator.AcceleratorRegistry` imported but unused
   |
56 |                 # Check if we got a JAX implementation
57 |                 from ._accelerator import AcceleratorRegistry
   |                                           ^^^^^^^^^^^^^^^^^^^ F401
58 |                 from . import _registry
59 |                 accelerated_class = _registry.get_accelerated(self._original_class)
   |
   = help: Remove unused import: `._accelerator.AcceleratorRegistry`

xlearn/_jax/_proxy.py:70:89: E501 Line too long (92 > 88)
   |
68 |                         UserWarning
69 |                     )
70 |                     self._impl = self._original_class(*self._init_args, **self._init_kwargs)
   |                                                                                         ^^^^ E501
71 |                     self._using_jax = False
72 |                 else:
   |

xlearn/_jax/_proxy.py:94:89: E501 Line too long (93 > 88)
   |
92 |         if name.startswith('_'):
93 |             # Private attributes should be handled by this class
94 |             raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
   |                                                                                         ^^^^^ E501
95 |
96 |         attr = getattr(self._impl, name)
   |

xlearn/_jax/_proxy.py:103:30: F823 Local variable `attr` referenced before assignment
    |
101 |             def wrapper(*args, **kwargs):
102 |                 try:
103 |                     result = attr(*args, **kwargs)
    |                              ^^^^ F823
104 |
105 |                     # Convert JAX outputs to NumPy for compatibility
    |
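F823 here usually means `attr` is assigned again somewhere later inside `wrapper` (for example in a fallback branch), so Python treats it as a local of the closure and the flagged call to `attr(...)` would hit an unbound name at runtime. One common fix is to capture the value explicitly when the closure is created; a minimal sketch with illustrative names, not the file's actual structure:

```python
import functools

def _wrap_call(attr):
    # Sketch: bind `attr` once as a parameter so later rebinding in the caller
    # cannot change what the wrapper sees.
    @functools.wraps(attr)
    def wrapper(*args, **kwargs):
        return attr(*args, **kwargs)
    return wrapper
```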

xlearn/_jax/_proxy.py:124:89: E501 Line too long (96 > 88)
    |
122 |                         )
123 |                         # Recreate with original implementation
124 |                         self._impl = self._original_class(*self._init_args, **self._init_kwargs)
    |                                                                                         ^^^^^^^^ E501
125 |                         self._using_jax = False
    |

xlearn/_jax/_proxy.py:140:89: E501 Line too long (116 > 88)
    |
138 |     def __setattr__(self, name, value):
139 |         """Handle attribute setting."""
140 |         if name.startswith('_') or name in ('_original_class', '_init_args', '_init_kwargs', '_impl', '_using_jax'):
    |                                                                                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E501
141 |             # Private attributes of the proxy
142 |             super().__setattr__(name, value)
    |

xlearn/_jax/_proxy.py:221:89: E501 Line too long (110 > 88)
    |
219 |     # Select appropriate mixin
220 |     if 'linear_model' in module_name or any(keyword in class_name.lower() for keyword in
221 |                                            ['linear', 'regression', 'ridge', 'lasso', 'elastic', 'logistic']):
    |                                                                                         ^^^^^^^^^^^^^^^^^^^^^^ E501
222 |         mixin_class = JAXLinearModelMixin
223 |     elif 'cluster' in module_name or any(keyword in class_name.lower() for keyword in
    |

xlearn/_jax/_proxy.py:224:89: E501 Line too long (90 > 88)
    |
222 |         mixin_class = JAXLinearModelMixin
223 |     elif 'cluster' in module_name or any(keyword in class_name.lower() for keyword in
224 |                                         ['kmeans', 'cluster', 'dbscan', 'agglomerative']):
    |                                                                                         ^^ E501
225 |         mixin_class = JAXClusterMixin
226 |     elif 'decomposition' in module_name or any(keyword in class_name.lower() for keyword in
    |

xlearn/_jax/_proxy.py:226:89: E501 Line too long (91 > 88)
    |
224 |                                         ['kmeans', 'cluster', 'dbscan', 'agglomerative']):
225 |         mixin_class = JAXClusterMixin
226 |     elif 'decomposition' in module_name or any(keyword in class_name.lower() for keyword in
    |                                                                                         ^^^ E501
227 |                                               ['pca', 'svd', 'nmf', 'ica', 'decomposition']):
228 |         mixin_class = JAXDecompositionMixin
    |

xlearn/_jax/_proxy.py:227:89: E501 Line too long (93 > 88)
    |
225 |         mixin_class = JAXClusterMixin
226 |     elif 'decomposition' in module_name or any(keyword in class_name.lower() for keyword in
227 |                                               ['pca', 'svd', 'nmf', 'ica', 'decomposition']):
    |                                                                                         ^^^^^ E501
228 |         mixin_class = JAXDecompositionMixin
229 |     else:
    |

xlearn/_jax/_proxy.py:254:9: I001 [*] Import block is un-sorted or un-formatted
    |
253 |           # Register it with the accelerator system
254 | /         from ._accelerator import AcceleratorRegistry
255 | |         from . import _registry
    | |_______________________________^ I001
256 |           _registry.register(original_class, jax_class)
    |
    = help: Organize imports

xlearn/_jax/_proxy.py:254:35: F401 [*] `._accelerator.AcceleratorRegistry` imported but unused
    |
253 |         # Register it with the accelerator system
254 |         from ._accelerator import AcceleratorRegistry
    |                                   ^^^^^^^^^^^^^^^^^^^ F401
255 |         from . import _registry
256 |         _registry.register(original_class, jax_class)
    |
    = help: Remove unused import: `._accelerator.AcceleratorRegistry`

xlearn/_jax/_proxy.py:259:89: E501 Line too long (94 > 88)
    |
258 |     except Exception as e:
259 |         warnings.warn(f"Failed to create JAX acceleration for {original_class.__name__}: {e}")
    |                                                                                         ^^^^^^ E501
260 |
261 |     # Create and return proxy class
    |

xlearn/_jax/_universal_jax.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_jax/_universal_jax.py:6:1: I001 [*] Import block is un-sorted or un-formatted
   |
 4 |   # SPDX-License-Identifier: BSD-3-Clause
 5 |
 6 | / import jax
 7 | | import jax.numpy as jnp
 8 | | import numpy as np
 9 | | from jax.scipy import linalg
10 | | from typing import Any, Dict, Optional, Tuple, Union
11 | |
12 | | from ._config import get_config
13 | | from ._data_conversion import to_jax, to_numpy
   | |______________________________________________^ I001
   |
   = help: Organize imports

xlearn/_jax/_universal_jax.py:10:20: F401 [*] `typing.Any` imported but unused
   |
 8 | import numpy as np
 9 | from jax.scipy import linalg
10 | from typing import Any, Dict, Optional, Tuple, Union
   |                    ^^^ F401
11 |
12 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_universal_jax.py:10:31: F401 [*] `typing.Optional` imported but unused
   |
 8 | import numpy as np
 9 | from jax.scipy import linalg
10 | from typing import Any, Dict, Optional, Tuple, Union
   |                               ^^^^^^^^ F401
11 |
12 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_universal_jax.py:10:48: F401 [*] `typing.Union` imported but unused
   |
 8 | import numpy as np
 9 | from jax.scipy import linalg
10 | from typing import Any, Dict, Optional, Tuple, Union
   |                                                ^^^^^ F401
11 |
12 | from ._config import get_config
   |
   = help: Remove unused import

xlearn/_jax/_universal_jax.py:18:1: W293 [*] Blank line contains whitespace
   |
16 | class UniversalJAXMixin:
17 |     """Mixin class that provides universal JAX acceleration for common operations."""
18 |     
   | ^^^^ W293
19 |     def __init__(self):
20 |         self._jax_compiled_functions = {}
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:22:1: W293 [*] Blank line contains whitespace
   |
20 |         self._jax_compiled_functions = {}
21 |         self._performance_cache = {}
22 |     
   | ^^^^ W293
23 |     def _should_use_jax(self, X: np.ndarray, algorithm_name: str = None) -> bool:
24 |         """Determine if JAX should be used based on data characteristics."""
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:23:62: RUF013 PEP 484 prohibits implicit `Optional`
   |
21 |         self._performance_cache = {}
22 |     
23 |     def _should_use_jax(self, X: np.ndarray, algorithm_name: str = None) -> bool:
   |                                                              ^^^ RUF013
24 |         """Determine if JAX should be used based on data characteristics."""
25 |         config = get_config()
   |
   = help: Convert to `T | None`
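RUF013 asks for the `None` default to be reflected in the annotation. A sketch of the flagged signature made explicit (the `str | None` spelling needs Python 3.10+ or `from __future__ import annotations`; otherwise use `typing.Optional[str]`):

```python
from __future__ import annotations

import numpy as np

# Sketch: make the implicit Optional explicit, as RUF013 suggests.
def _should_use_jax(self, X: np.ndarray, algorithm_name: str | None = None) -> bool:
    ...
```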

xlearn/_jax/_universal_jax.py:28:1: W293 [*] Blank line contains whitespace
   |
26 |         if not config.get("enable_jax", True):
27 |             return False
28 |         
   | ^^^^^^^^ W293
29 |         n_samples, n_features = X.shape
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:30:1: W293 [*] Blank line contains whitespace
   |
29 |         n_samples, n_features = X.shape
30 |         
   | ^^^^^^^^ W293
31 |         # Cache key for performance decision
32 |         cache_key = (n_samples, n_features, algorithm_name or self.__class__.__name__)
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:35:1: W293 [*] Blank line contains whitespace
   |
33 |         if cache_key in self._performance_cache:
34 |             return self._performance_cache[cache_key]
35 |         
   | ^^^^^^^^ W293
36 |         # Heuristic decision based on data size and algorithm type
37 |         decision = self._performance_heuristic(n_samples, n_features, algorithm_name)
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:39:1: W293 [*] Blank line contains whitespace
   |
37 |         decision = self._performance_heuristic(n_samples, n_features, algorithm_name)
38 |         self._performance_cache[cache_key] = decision
39 |         
   | ^^^^^^^^ W293
40 |         return decision
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:41:1: W293 [*] Blank line contains whitespace
   |
40 |         return decision
41 |     
   | ^^^^ W293
42 |     def _performance_heuristic(self, n_samples: int, n_features: int, algorithm_name: str = None) -> bool:
43 |         """Heuristic to decide whether to use JAX based on problem characteristics."""
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:42:87: RUF013 PEP 484 prohibits implicit `Optional`
   |
40 |         return decision
41 |     
42 |     def _performance_heuristic(self, n_samples: int, n_features: int, algorithm_name: str = None) -> bool:
   |                                                                                       ^^^ RUF013
43 |         """Heuristic to decide whether to use JAX based on problem characteristics."""
44 |         complexity = n_samples * n_features
   |
   = help: Convert to `T | None`

xlearn/_jax/_universal_jax.py:42:89: E501 Line too long (106 > 88)
   |
40 |         return decision
41 |     
42 |     def _performance_heuristic(self, n_samples: int, n_features: int, algorithm_name: str = None) -> bool:
   |                                                                                         ^^^^^^^^^^^^^^^^^^ E501
43 |         """Heuristic to decide whether to use JAX based on problem characteristics."""
44 |         complexity = n_samples * n_features
   |

xlearn/_jax/_universal_jax.py:45:1: W293 [*] Blank line contains whitespace
   |
43 |         """Heuristic to decide whether to use JAX based on problem characteristics."""
44 |         complexity = n_samples * n_features
45 |         
   | ^^^^^^^^ W293
46 |         # Algorithm-specific thresholds based on our testing
47 |         thresholds = {
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:51:89: E501 Line too long (97 > 88)
   |
49 |             'LinearRegression': {'min_complexity': 1e8, 'min_samples': 10000},
50 |             'Ridge': {'min_complexity': 1e8, 'min_samples': 10000},
51 |             'Lasso': {'min_complexity': 5e7, 'min_samples': 5000},  # Iterative, benefits earlier
   |                                                                                         ^^^^^^^^^ E501
52 |             'LogisticRegression': {'min_complexity': 5e7, 'min_samples': 5000},
   |

xlearn/_jax/_universal_jax.py:53:1: W293 [*] Blank line contains whitespace
   |
51 |             'Lasso': {'min_complexity': 5e7, 'min_samples': 5000},  # Iterative, benefits earlier
52 |             'LogisticRegression': {'min_complexity': 5e7, 'min_samples': 5000},
53 |             
   | ^^^^^^^^^^^^ W293
54 |             # Clustering - benefit from vectorization
55 |             'KMeans': {'min_complexity': 1e6, 'min_samples': 5000},
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:57:1: W293 [*] Blank line contains whitespace
   |
55 |             'KMeans': {'min_complexity': 1e6, 'min_samples': 5000},
56 |             'DBSCAN': {'min_complexity': 1e6, 'min_samples': 1000},
57 |             
   | ^^^^^^^^^^^^ W293
58 |             # Decomposition - matrix operations benefit greatly
59 |             'PCA': {'min_complexity': 1e7, 'min_samples': 5000},
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:62:1: W293 [*] Blank line contains whitespace
   |
60 |             'TruncatedSVD': {'min_complexity': 1e7, 'min_samples': 5000},
61 |             'NMF': {'min_complexity': 5e6, 'min_samples': 2000},
62 |             
   | ^^^^^^^^^^^^ W293
63 |             # Tree-based - limited JAX benefit but some operations can be accelerated
64 |             'RandomForestClassifier': {'min_complexity': 1e5, 'min_samples': 1000},
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:66:1: W293 [*] Blank line contains whitespace
   |
64 |             'RandomForestClassifier': {'min_complexity': 1e5, 'min_samples': 1000},
65 |             'RandomForestRegressor': {'min_complexity': 1e5, 'min_samples': 1000},
66 |             
   | ^^^^^^^^^^^^ W293
67 |             # Default for unknown algorithms
68 |             'default': {'min_complexity': 1e7, 'min_samples': 10000}
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:70:1: W293 [*] Blank line contains whitespace
   |
68 |             'default': {'min_complexity': 1e7, 'min_samples': 10000}
69 |         }
70 |         
   | ^^^^^^^^ W293
71 |         # Get threshold for this algorithm
72 |         algo_name = algorithm_name or self.__class__.__name__
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:74:1: W293 [*] Blank line contains whitespace
   |
72 |         algo_name = algorithm_name or self.__class__.__name__
73 |         threshold = thresholds.get(algo_name, thresholds['default'])
74 |         
   | ^^^^^^^^ W293
75 |         return (complexity >= threshold['min_complexity'] and 
76 |                 n_samples >= threshold['min_samples'])
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:75:62: W291 [*] Trailing whitespace
   |
73 |         threshold = thresholds.get(algo_name, thresholds['default'])
74 |         
75 |         return (complexity >= threshold['min_complexity'] and 
   |                                                              ^ W291
76 |                 n_samples >= threshold['min_samples'])
   |
   = help: Remove trailing whitespace

xlearn/_jax/_universal_jax.py:77:1: W293 [*] Blank line contains whitespace
   |
75 |         return (complexity >= threshold['min_complexity'] and 
76 |                 n_samples >= threshold['min_samples'])
77 |     
   | ^^^^ W293
78 |     @staticmethod
79 |     @jax.jit
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:80:89: E501 Line too long (111 > 88)
   |
78 |     @staticmethod
79 |     @jax.jit
80 |     def _jax_solve_linear_system(A: jnp.ndarray, b: jnp.ndarray, regularization: float = 1e-10) -> jnp.ndarray:
   |                                                                                         ^^^^^^^^^^^^^^^^^^^^^^^ E501
81 |         """JAX-compiled function to solve linear system Ax = b."""
82 |         # Add regularization for numerical stability
   |

xlearn/_jax/_universal_jax.py:90:1: W293 [*] Blank line contains whitespace
   |
88 |             A_reg = AtA + regularization * jnp.eye(AtA.shape[0])
89 |             b = A.T @ b
90 |         
   | ^^^^^^^^ W293
91 |         return linalg.solve(A_reg, b)
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:92:1: W293 [*] Blank line contains whitespace
   |
91 |         return linalg.solve(A_reg, b)
92 |     
   | ^^^^ W293
93 |     @staticmethod
94 |     @jax.jit
   |
   = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:95:89: E501 Line too long (102 > 88)
   |
93 |     @staticmethod
94 |     @jax.jit
95 |     def _jax_linear_regression_fit(X: jnp.ndarray, y: jnp.ndarray) -> Tuple[jnp.ndarray, jnp.ndarray]:
   |                                                                                         ^^^^^^^^^^^^^^ E501
96 |         """JAX-compiled linear regression fitting."""
97 |         n_samples, n_features = X.shape
   |

xlearn/_jax/_universal_jax.py:98:1: W293 [*] Blank line contains whitespace
    |
 96 |         """JAX-compiled linear regression fitting."""
 97 |         n_samples, n_features = X.shape
 98 |         
    | ^^^^^^^^ W293
 99 |         # Center the data
100 |         X_mean = jnp.mean(X, axis=0)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:104:1: W293 [*] Blank line contains whitespace
    |
102 |         X_centered = X - X_mean
103 |         y_centered = y - y_mean
104 |         
    | ^^^^^^^^ W293
105 |         # Solve normal equations: (X^T X) coef = X^T y
106 |         XtX = X_centered.T @ X_centered
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:108:1: W293 [*] Blank line contains whitespace
    |
106 |         XtX = X_centered.T @ X_centered
107 |         Xty = X_centered.T @ y_centered
108 |         
    | ^^^^^^^^ W293
109 |         # Add small regularization for numerical stability
110 |         regularization = 1e-10 * jnp.trace(XtX) / n_features
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:112:1: W293 [*] Blank line contains whitespace
    |
110 |         regularization = 1e-10 * jnp.trace(XtX) / n_features
111 |         coef = linalg.solve(XtX + regularization * jnp.eye(n_features), Xty)
112 |         
    | ^^^^^^^^ W293
113 |         # Calculate intercept
114 |         intercept = y_mean - X_mean @ coef
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:115:1: W293 [*] Blank line contains whitespace
    |
113 |         # Calculate intercept
114 |         intercept = y_mean - X_mean @ coef
115 |         
    | ^^^^^^^^ W293
116 |         return coef, intercept
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:117:1: W293 [*] Blank line contains whitespace
    |
116 |         return coef, intercept
117 |     
    | ^^^^ W293
118 |     @staticmethod
119 |     @jax.jit
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:120:89: E501 Line too long (115 > 88)
    |
118 |     @staticmethod
119 |     @jax.jit
120 |     def _jax_ridge_regression_fit(X: jnp.ndarray, y: jnp.ndarray, alpha: float) -> Tuple[jnp.ndarray, jnp.ndarray]:
    |                                                                                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^ E501
121 |         """JAX-compiled Ridge regression fitting."""
122 |         n_samples, n_features = X.shape
    |

xlearn/_jax/_universal_jax.py:123:1: W293 [*] Blank line contains whitespace
    |
121 |         """JAX-compiled Ridge regression fitting."""
122 |         n_samples, n_features = X.shape
123 |         
    | ^^^^^^^^ W293
124 |         # Center the data
125 |         X_mean = jnp.mean(X, axis=0)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:129:1: W293 [*] Blank line contains whitespace
    |
127 |         X_centered = X - X_mean
128 |         y_centered = y - y_mean
129 |         
    | ^^^^^^^^ W293
130 |         # Solve regularized normal equations: (X^T X + alpha*I) coef = X^T y
131 |         XtX = X_centered.T @ X_centered
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:133:1: W293 [*] Blank line contains whitespace
    |
131 |         XtX = X_centered.T @ X_centered
132 |         Xty = X_centered.T @ y_centered
133 |         
    | ^^^^^^^^ W293
134 |         coef = linalg.solve(XtX + alpha * jnp.eye(n_features), Xty)
135 |         intercept = y_mean - X_mean @ coef
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:136:1: W293 [*] Blank line contains whitespace
    |
134 |         coef = linalg.solve(XtX + alpha * jnp.eye(n_features), Xty)
135 |         intercept = y_mean - X_mean @ coef
136 |         
    | ^^^^^^^^ W293
137 |         return coef, intercept
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:138:1: W293 [*] Blank line contains whitespace
    |
137 |         return coef, intercept
138 |     
    | ^^^^ W293
139 |     @staticmethod
140 |     @jax.jit
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:141:89: E501 Line too long (104 > 88)
    |
139 |     @staticmethod
140 |     @jax.jit
141 |     def _jax_pca_fit(X: jnp.ndarray, n_components: int) -> Tuple[jnp.ndarray, jnp.ndarray, jnp.ndarray]:
    |                                                                                         ^^^^^^^^^^^^^^^^ E501
142 |         """JAX-compiled PCA fitting."""
143 |         n_samples, n_features = X.shape
    |

xlearn/_jax/_universal_jax.py:144:1: W293 [*] Blank line contains whitespace
    |
142 |         """JAX-compiled PCA fitting."""
143 |         n_samples, n_features = X.shape
144 |         
    | ^^^^^^^^ W293
145 |         # Center the data
146 |         X_mean = jnp.mean(X, axis=0)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:148:1: W293 [*] Blank line contains whitespace
    |
146 |         X_mean = jnp.mean(X, axis=0)
147 |         X_centered = X - X_mean
148 |         
    | ^^^^^^^^ W293
149 |         # Compute SVD
150 |         U, s, Vt = jnp.linalg.svd(X_centered, full_matrices=False)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:151:1: W293 [*] Blank line contains whitespace
    |
149 |         # Compute SVD
150 |         U, s, Vt = jnp.linalg.svd(X_centered, full_matrices=False)
151 |         
    | ^^^^^^^^ W293
152 |         # Select top components
153 |         components = Vt[:n_components]
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:155:1: W293 [*] Blank line contains whitespace
    |
153 |         components = Vt[:n_components]
154 |         explained_variance = (s[:n_components] ** 2) / (n_samples - 1)
155 |         
    | ^^^^^^^^ W293
156 |         return components, explained_variance, X_mean
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:157:1: W293 [*] Blank line contains whitespace
    |
156 |         return components, explained_variance, X_mean
157 |     
    | ^^^^ W293
158 |     @staticmethod
159 |     @jax.jit
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:160:89: E501 Line too long (98 > 88)
    |
158 |     @staticmethod
159 |     @jax.jit
160 |     def _jax_kmeans_step(X: jnp.ndarray, centers: jnp.ndarray) -> Tuple[jnp.ndarray, jnp.ndarray]:
    |                                                                                         ^^^^^^^^^^ E501
161 |         """JAX-compiled K-means iteration step."""
162 |         # Compute distances to all centers
    |

xlearn/_jax/_universal_jax.py:164:1: W293 [*] Blank line contains whitespace
    |
162 |         # Compute distances to all centers
163 |         distances = jnp.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
164 |         
    | ^^^^^^^^ W293
165 |         # Assign points to closest centers
166 |         labels = jnp.argmin(distances, axis=1)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:167:1: W293 [*] Blank line contains whitespace
    |
165 |         # Assign points to closest centers
166 |         labels = jnp.argmin(distances, axis=1)
167 |         
    | ^^^^^^^^ W293
168 |         # Update centers
169 |         new_centers = jnp.array([
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:170:45: W291 [*] Trailing whitespace
    |
168 |         # Update centers
169 |         new_centers = jnp.array([
170 |             jnp.mean(X[labels == k], axis=0) 
    |                                             ^ W291
171 |             for k in range(centers.shape[0])
172 |         ])
    |
    = help: Remove trailing whitespace

xlearn/_jax/_universal_jax.py:173:1: W293 [*] Blank line contains whitespace
    |
171 |             for k in range(centers.shape[0])
172 |         ])
173 |         
    | ^^^^^^^^ W293
174 |         return new_centers, labels
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:175:1: W293 [*] Blank line contains whitespace
    |
174 |         return new_centers, labels
175 |     
    | ^^^^ W293
176 |     def _apply_jax_linear_regression(self, X: np.ndarray, y: np.ndarray) -> Dict[str, np.ndarray]:
177 |         """Apply JAX-accelerated linear regression."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:176:89: E501 Line too long (98 > 88)
    |
174 |         return new_centers, labels
175 |     
176 |     def _apply_jax_linear_regression(self, X: np.ndarray, y: np.ndarray) -> Dict[str, np.ndarray]:
    |                                                                                         ^^^^^^^^^^ E501
177 |         """Apply JAX-accelerated linear regression."""
178 |         X_jax = to_jax(X)
    |

xlearn/_jax/_universal_jax.py:180:1: W293 [*] Blank line contains whitespace
    |
178 |         X_jax = to_jax(X)
179 |         y_jax = to_jax(y)
180 |         
    | ^^^^^^^^ W293
181 |         coef_jax, intercept_jax = self._jax_linear_regression_fit(X_jax, y_jax)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:182:1: W293 [*] Blank line contains whitespace
    |
181 |         coef_jax, intercept_jax = self._jax_linear_regression_fit(X_jax, y_jax)
182 |         
    | ^^^^^^^^ W293
183 |         return {
184 |             'coef_': to_numpy(coef_jax),
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:187:1: W293 [*] Blank line contains whitespace
    |
185 |             'intercept_': to_numpy(intercept_jax)
186 |         }
187 |     
    | ^^^^ W293
188 |     def _apply_jax_ridge_regression(self, X: np.ndarray, y: np.ndarray, alpha: float = 1.0) -> Dict[str, np.ndarray]:
189 |         """Apply JAX-accelerated Ridge regression."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:188:89: E501 Line too long (117 > 88)
    |
186 |         }
187 |     
188 |     def _apply_jax_ridge_regression(self, X: np.ndarray, y: np.ndarray, alpha: float = 1.0) -> Dict[str, np.ndarray]:
    |                                                                                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E501
189 |         """Apply JAX-accelerated Ridge regression."""
190 |         X_jax = to_jax(X)
    |

xlearn/_jax/_universal_jax.py:192:1: W293 [*] Blank line contains whitespace
    |
190 |         X_jax = to_jax(X)
191 |         y_jax = to_jax(y)
192 |         
    | ^^^^^^^^ W293
193 |         coef_jax, intercept_jax = self._jax_ridge_regression_fit(X_jax, y_jax, alpha)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:194:1: W293 [*] Blank line contains whitespace
    |
193 |         coef_jax, intercept_jax = self._jax_ridge_regression_fit(X_jax, y_jax, alpha)
194 |         
    | ^^^^^^^^ W293
195 |         return {
196 |             'coef_': to_numpy(coef_jax),
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:199:1: W293 [*] Blank line contains whitespace
    |
197 |             'intercept_': to_numpy(intercept_jax)
198 |         }
199 |     
    | ^^^^ W293
200 |     def _apply_jax_pca(self, X: np.ndarray, n_components: int) -> Dict[str, np.ndarray]:
201 |         """Apply JAX-accelerated PCA."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:203:1: W293 [*] Blank line contains whitespace
    |
201 |         """Apply JAX-accelerated PCA."""
202 |         X_jax = to_jax(X)
203 |         
    | ^^^^^^^^ W293
204 |         components_jax, explained_variance_jax, mean_jax = self._jax_pca_fit(X_jax, n_components)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:204:89: E501 Line too long (97 > 88)
    |
202 |         X_jax = to_jax(X)
203 |         
204 |         components_jax, explained_variance_jax, mean_jax = self._jax_pca_fit(X_jax, n_components)
    |                                                                                         ^^^^^^^^^ E501
205 |         
206 |         return {
    |

xlearn/_jax/_universal_jax.py:205:1: W293 [*] Blank line contains whitespace
    |
204 |         components_jax, explained_variance_jax, mean_jax = self._jax_pca_fit(X_jax, n_components)
205 |         
    | ^^^^^^^^ W293
206 |         return {
207 |             'components_': to_numpy(components_jax),
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:211:1: W293 [*] Blank line contains whitespace
    |
209 |             'mean_': to_numpy(mean_jax)
210 |         }
211 |     
    | ^^^^ W293
212 |     def _apply_jax_kmeans_iteration(self, X: np.ndarray, centers: np.ndarray) -> Dict[str, np.ndarray]:
213 |         """Apply JAX-accelerated K-means iteration."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:212:89: E501 Line too long (103 > 88)
    |
210 |         }
211 |     
212 |     def _apply_jax_kmeans_iteration(self, X: np.ndarray, centers: np.ndarray) -> Dict[str, np.ndarray]:
    |                                                                                         ^^^^^^^^^^^^^^^ E501
213 |         """Apply JAX-accelerated K-means iteration."""
214 |         X_jax = to_jax(X)
    |

xlearn/_jax/_universal_jax.py:216:1: W293 [*] Blank line contains whitespace
    |
214 |         X_jax = to_jax(X)
215 |         centers_jax = to_jax(centers)
216 |         
    | ^^^^^^^^ W293
217 |         new_centers_jax, labels_jax = self._jax_kmeans_step(X_jax, centers_jax)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:218:1: W293 [*] Blank line contains whitespace
    |
217 |         new_centers_jax, labels_jax = self._jax_kmeans_step(X_jax, centers_jax)
218 |         
    | ^^^^^^^^ W293
219 |         return {
220 |             'cluster_centers_': to_numpy(new_centers_jax),
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:227:1: W293 [*] Blank line contains whitespace
    |
225 | class JAXLinearModelMixin(UniversalJAXMixin):
226 |     """Mixin for JAX-accelerated linear models."""
227 |     
    | ^^^^ W293
228 |     def jax_fit(self, X: np.ndarray, y: np.ndarray, algorithm: str = 'linear') -> 'JAXLinearModelMixin':
229 |         """JAX-accelerated fitting for linear models."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:228:89: E501 Line too long (104 > 88)
    |
226 |     """Mixin for JAX-accelerated linear models."""
227 |     
228 |     def jax_fit(self, X: np.ndarray, y: np.ndarray, algorithm: str = 'linear') -> 'JAXLinearModelMixin':
    |                                                                                         ^^^^^^^^^^^^^^^^ E501
229 |         """JAX-accelerated fitting for linear models."""
230 |         if not self._should_use_jax(X, algorithm):
    |

xlearn/_jax/_universal_jax.py:233:1: W293 [*] Blank line contains whitespace
    |
231 |             # Fallback to original implementation
232 |             return self._original_fit(X, y)
233 |         
    | ^^^^^^^^ W293
234 |         try:
235 |             if algorithm == 'linear':
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:243:1: W293 [*] Blank line contains whitespace
    |
241 |                 # Fallback for unsupported algorithms
242 |                 return self._original_fit(X, y)
243 |             
    | ^^^^^^^^^^^^ W293
244 |             # Set attributes
245 |             for attr_name, attr_value in results.items():
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:247:1: W293 [*] Blank line contains whitespace
    |
245 |             for attr_name, attr_value in results.items():
246 |                 setattr(self, attr_name, attr_value)
247 |             
    | ^^^^^^^^^^^^ W293
248 |             return self
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:249:1: W293 [*] Blank line contains whitespace
    |
248 |             return self
249 |             
    | ^^^^^^^^^^^^ W293
250 |         except Exception as e:
251 |             config = get_config()
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:254:89: E501 Line too long (89 > 88)
    |
252 |             if config.get("fallback_on_error", True):
253 |                 import warnings
254 |                 warnings.warn(f"JAX fitting failed: {e}. Using original implementation.")
    |                                                                                         ^ E501
255 |                 return self._original_fit(X, y)
256 |             else:
    |

xlearn/_jax/_universal_jax.py:262:1: W293 [*] Blank line contains whitespace
    |
260 | class JAXClusterMixin(UniversalJAXMixin):
261 |     """Mixin for JAX-accelerated clustering algorithms."""
262 |     
    | ^^^^ W293
263 |     def jax_fit(self, X: np.ndarray) -> 'JAXClusterMixin':
264 |         """JAX-accelerated fitting for clustering algorithms."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:267:1: W293 [*] Blank line contains whitespace
    |
265 |         if not self._should_use_jax(X, 'KMeans'):
266 |             return self._original_fit(X)
267 |         
    | ^^^^^^^^ W293
268 |         try:
269 |             # Initialize centers (this is algorithm-specific)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:272:1: W293 [*] Blank line contains whitespace
    |
270 |             n_clusters = getattr(self, 'n_clusters', 8)
271 |             centers = self._initialize_centers(X, n_clusters)
272 |             
    | ^^^^^^^^^^^^ W293
273 |             # Iterative K-means with JAX acceleration
274 |             max_iter = getattr(self, 'max_iter', 300)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:276:1: W293 [*] Blank line contains whitespace
    |
274 |             max_iter = getattr(self, 'max_iter', 300)
275 |             tol = getattr(self, 'tol', 1e-4)
276 |             
    | ^^^^^^^^^^^^ W293
277 |             for i in range(max_iter):
278 |                 results = self._apply_jax_kmeans_iteration(X, centers)
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:280:1: W293 [*] Blank line contains whitespace
    |
278 |                 results = self._apply_jax_kmeans_iteration(X, centers)
279 |                 new_centers = results['cluster_centers_']
280 |                 
    | ^^^^^^^^^^^^^^^^ W293
281 |                 # Check convergence
282 |                 if np.allclose(centers, new_centers, atol=tol):
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:284:1: W293 [*] Blank line contains whitespace
    |
282 |                 if np.allclose(centers, new_centers, atol=tol):
283 |                     break
284 |                     
    | ^^^^^^^^^^^^^^^^^^^^ W293
285 |                 centers = new_centers
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:286:1: W293 [*] Blank line contains whitespace
    |
285 |                 centers = new_centers
286 |             
    | ^^^^^^^^^^^^ W293
287 |             # Set final results
288 |             self.cluster_centers_ = centers
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:291:1: W293 [*] Blank line contains whitespace
    |
289 |             self.labels_ = results['labels_']
290 |             self.n_iter_ = i + 1
291 |             
    | ^^^^^^^^^^^^ W293
292 |             return self
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:293:1: W293 [*] Blank line contains whitespace
    |
292 |             return self
293 |             
    | ^^^^^^^^^^^^ W293
294 |         except Exception as e:
295 |             config = get_config()
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:298:89: E501 Line too long (92 > 88)
    |
296 |             if config.get("fallback_on_error", True):
297 |                 import warnings
298 |                 warnings.warn(f"JAX clustering failed: {e}. Using original implementation.")
    |                                                                                         ^^^^ E501
299 |                 return self._original_fit(X)
300 |             else:
    |

xlearn/_jax/_universal_jax.py:302:1: W293 [*] Blank line contains whitespace
    |
300 |             else:
301 |                 raise
302 |     
    | ^^^^ W293
303 |     def _initialize_centers(self, X: np.ndarray, n_clusters: int) -> np.ndarray:
304 |         """Initialize cluster centers."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:314:1: W293 [*] Blank line contains whitespace
    |
312 | class JAXDecompositionMixin(UniversalJAXMixin):
313 |     """Mixin for JAX-accelerated decomposition algorithms."""
314 |     
    | ^^^^ W293
315 |     def jax_fit(self, X: np.ndarray) -> 'JAXDecompositionMixin':
316 |         """JAX-accelerated fitting for decomposition algorithms."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:319:1: W293 [*] Blank line contains whitespace
    |
317 |         if not self._should_use_jax(X, 'PCA'):
318 |             return self._original_fit(X)
319 |         
    | ^^^^^^^^ W293
320 |         try:
321 |             n_components = getattr(self, 'n_components', min(X.shape))
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:323:1: W293 [*] Blank line contains whitespace
    |
321 |             n_components = getattr(self, 'n_components', min(X.shape))
322 |             results = self._apply_jax_pca(X, n_components)
323 |             
    | ^^^^^^^^^^^^ W293
324 |             # Set attributes
325 |             for attr_name, attr_value in results.items():
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:327:1: W293 [*] Blank line contains whitespace
    |
325 |             for attr_name, attr_value in results.items():
326 |                 setattr(self, attr_name, attr_value)
327 |             
    | ^^^^^^^^^^^^ W293
328 |             # Calculate explained variance ratio
329 |             total_var = np.sum(results['explained_variance_'])
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:331:1: W293 [*] Blank line contains whitespace
    |
329 |             total_var = np.sum(results['explained_variance_'])
330 |             self.explained_variance_ratio_ = results['explained_variance_'] / total_var
331 |             
    | ^^^^^^^^^^^^ W293
332 |             return self
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:333:1: W293 [*] Blank line contains whitespace
    |
332 |             return self
333 |             
    | ^^^^^^^^^^^^ W293
334 |         except Exception as e:
335 |             config = get_config()
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:338:89: E501 Line too long (95 > 88)
    |
336 |             if config.get("fallback_on_error", True):
337 |                 import warnings
338 |                 warnings.warn(f"JAX decomposition failed: {e}. Using original implementation.")
    |                                                                                         ^^^^^^^ E501
339 |                 return self._original_fit(X)
340 |             else:
    |

xlearn/_jax/_universal_jax.py:346:1: W293 Blank line contains whitespace
    |
344 | def create_jax_accelerated_class(original_class: type, mixin_class: type) -> type:
345 |     """Create a JAX-accelerated version of a class using a mixin.
346 |     
    | ^^^^ W293
347 |     Parameters
348 |     ----------
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:353:1: W293 Blank line contains whitespace
    |
351 |     mixin_class : type
352 |         The JAX mixin class to use
353 |     
    | ^^^^ W293
354 |     Returns
355 |     -------
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:363:1: W293 [*] Blank line contains whitespace
    |
361 |             original_class.__init__(self, *args, **kwargs)
362 |             mixin_class.__init__(self)
363 |             
    | ^^^^^^^^^^^^ W293
364 |             # Store original fit method
365 |             self._original_fit = original_class.fit
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:366:1: W293 [*] Blank line contains whitespace
    |
364 |             # Store original fit method
365 |             self._original_fit = original_class.fit
366 |         
    | ^^^^^^^^ W293
367 |         def fit(self, X, y=None, **kwargs):
368 |             """Override fit to use JAX acceleration when beneficial."""
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:369:89: E501 Line too long (95 > 88)
    |
367 |         def fit(self, X, y=None, **kwargs):
368 |             """Override fit to use JAX acceleration when beneficial."""
369 |             return self.jax_fit(X, y, **kwargs) if y is not None else self.jax_fit(X, **kwargs)
    |                                                                                         ^^^^^^^ E501
370 |     
371 |     # Copy metadata
    |

xlearn/_jax/_universal_jax.py:370:1: W293 [*] Blank line contains whitespace
    |
368 |             """Override fit to use JAX acceleration when beneficial."""
369 |             return self.jax_fit(X, y, **kwargs) if y is not None else self.jax_fit(X, **kwargs)
370 |     
    | ^^^^ W293
371 |     # Copy metadata
372 |     JAXAcceleratedClass.__name__ = f"JAX{original_class.__name__}"
    |
    = help: Remove whitespace from blank line

xlearn/_jax/_universal_jax.py:375:1: W293 [*] Blank line contains whitespace
    |
373 |     JAXAcceleratedClass.__qualname__ = f"JAX{original_class.__qualname__}"
374 |     JAXAcceleratedClass.__module__ = original_class.__module__
375 |     
    | ^^^^ W293
376 |     return JAXAcceleratedClass
    |
    = help: Remove whitespace from blank line

xlearn/_loss/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_loss/link.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_loss/loss.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/_min_dependencies.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/calibration.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_affinity_propagation.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_agglomerative.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_bicluster.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_birch.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_bisect_k_means.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_dbscan.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_feature_agglomeration.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_hdbscan/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_hdbscan/hdbscan.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_kmeans.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_mean_shift.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_optics.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cluster/_spectral.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/compose/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/compose/_column_transformer.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/compose/_target.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/conftest.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/covariance/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/covariance/_elliptic_envelope.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/covariance/_empirical_covariance.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/covariance/_graph_lasso.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/covariance/_robust_covariance.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/covariance/_shrunk_covariance.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cross_decomposition/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/cross_decomposition/_pls.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_arff_parser.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_california_housing.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_covtype.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_kddcup99.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_lfw.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_olivetti_faces.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_openml.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_rcv1.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_samples_generator.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_species_distributions.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_svmlight_format_io.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/_twenty_newsgroups.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/data/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/descr/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/datasets/images/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_dict_learning.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_factor_analysis.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_fastica.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_incremental_pca.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_kernel_pca.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_lda.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_nmf.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_pca.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_sparse_pca.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/decomposition/_truncated_svd.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/discriminant_analysis.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/dummy.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_bagging.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_forest.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_gb.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_hist_gradient_boosting/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_hist_gradient_boosting/binning.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_hist_gradient_boosting/gradient_boosting.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_hist_gradient_boosting/grower.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_hist_gradient_boosting/predictor.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_hist_gradient_boosting/utils.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_iforest.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_stacking.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_voting.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/ensemble/_weight_boosting.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/exceptions.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/experimental/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/experimental/enable_halving_search_cv.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/experimental/enable_hist_gradient_boosting.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/experimental/enable_iterative_imputer.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_extraction/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_extraction/_dict_vectorizer.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_extraction/_hash.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_extraction/_stop_words.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_extraction/image.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_extraction/text.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_from_model.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_mutual_info.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_rfe.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_sequential.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_univariate_selection.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/feature_selection/_variance_threshold.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/frozen/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/frozen/_frozen.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/gaussian_process/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/gaussian_process/_gpc.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/gaussian_process/_gpr.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/gaussian_process/kernels.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/impute/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/impute/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/impute/_iterative.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/impute/_knn.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/_partial_dependence.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/_pd_utils.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/_permutation_importance.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/_plot/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/_plot/decision_boundary.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/inspection/_plot/partial_dependence.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/isotonic.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/kernel_approximation.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/kernel_ridge.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_bayes.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_coordinate_descent.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_glm/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_glm/_newton_solver.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_glm/glm.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_huber.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_least_angle.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_linear_loss.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_logistic.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_omp.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_passive_aggressive.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_perceptron.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_quantile.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_ransac.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_ridge.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_sag.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_stochastic_gradient.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/linear_model/_theil_sen.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/manifold/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/manifold/_isomap.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/manifold/_locally_linear.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/manifold/_mds.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/manifold/_spectral_embedding.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/manifold/_t_sne.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_classification.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_pairwise_distances_reduction/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_pairwise_distances_reduction/_dispatcher.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_plot/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_plot/confusion_matrix.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_plot/det_curve.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_plot/precision_recall_curve.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_plot/regression.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_plot/roc_curve.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_ranking.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_regression.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/_scorer.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/cluster/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/cluster/_bicluster.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/cluster/_supervised.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/cluster/_unsupervised.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/metrics/pairwise.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/mixture/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/mixture/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/mixture/_bayesian_mixture.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/mixture/_gaussian_mixture.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/_classification_threshold.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/_plot.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/_search.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/_search_successive_halving.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/_split.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/model_selection/_validation.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/multiclass.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/multioutput.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/naive_bayes.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_classification.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_graph.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_kde.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_lof.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_nca.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_nearest_centroid.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_regression.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neighbors/_unsupervised.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neural_network/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neural_network/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neural_network/_multilayer_perceptron.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neural_network/_rbm.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/neural_network/_stochastic_optimizers.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/pipeline.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_data.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_discretization.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_encoders.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_function_transformer.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_label.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_polynomial.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/preprocessing/_target_encoder.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/random_projection.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/semi_supervised/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/semi_supervised/_label_propagation.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/semi_supervised/_self_training.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/svm/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/svm/_base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/svm/_bounds.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/svm/_classes.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/tree/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/tree/_classes.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/tree/_export.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/tree/_reingold_tilford.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_arpack.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_array_api.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_available_if.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_bunch.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_chunking.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_encode.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_estimator_html_repr.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_indexing.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_mask.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_missing.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_mocking.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_optional_dependencies.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_param_validation.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_plotting.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_pprint.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_repr_html/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_repr_html/base.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_repr_html/estimator.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_repr_html/params.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_response.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_set_output.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_show_versions.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_tags.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_test_common/__init__.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_test_common/instance_generator.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_testing.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_unique.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/_user_interface.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/class_weight.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/deprecation.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/discovery.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/estimator_checks.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/extmath.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/fixes.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/graph.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/metadata_routing.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/metaestimators.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/multiclass.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/optimize.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/parallel.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/random.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/sparsefuncs.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/stats.py:1:1: CPY001 Missing copyright notice at top of file
xlearn/utils/validation.py:1:1: CPY001 Missing copyright notice at top of file
Found 728 errors.
[*] 138 fixable with the `--fix` option (26 hidden fixes can be enabled with the `--unsafe-fixes` option).
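
Of the errors above, the [*]-marked whitespace issues (W293) are auto-fixable and the long-line (E501) ones are usually resolved by the formatter, but the CPY001 findings need a copyright notice added to each listed file by hand. A minimal sketch of such a header, assuming the project keeps the two-line Authors/SPDX convention that is visible in the xlearn/_jax/_config.py diff further down (the exact wording may differ per the project's own guidelines):

# Authors: The JAX-xlearn developers
# SPDX-License-Identifier: BSD-3-Clause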

ruff format

ruff detected issues. Please run ruff format locally and push the changes; the detected issues are listed below. Note that the installed ruff version is 0.11.7.
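
A local run reproducing both CI steps above might look like the following (a sketch: the pinned version comes from the note above, the rest is standard pip/ruff usage rather than anything this repository prescribes):

pip install ruff==0.11.7
ruff check --fix .
ruff format .

ruff check --fix applies the 138 auto-fixable lint errors reported above, and ruff format produces the whitespace and quoting changes shown in the diffs that follow.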


--- doc/conf.py
+++ doc/conf.py
@@ -25,9 +25,11 @@
 except ImportError:
     # Fallback for development environment
     from packaging.version import parse
+
     def turn_warnings_into_errors():
         pass
 
+
 # If extensions (or modules to document with autodoc) are in another
 # directory, add these directories to sys.path here. If the directory
 # is relative to the documentation root, use os.path.abspath to make it
@@ -268,7 +270,7 @@
     "logo": {
         "alt_text": "JAX-sklearn homepage",
         "image_relative": "logos/scikit-learn-logo-small.png",
-        "image_light": "logos/scikit-learn-logo-small.png", 
+        "image_light": "logos/scikit-learn-logo-small.png",
         "image_dark": "logos/scikit-learn-logo-small.png",
     },
     "surface_warnings": True,

--- xlearn/__init__.py
+++ xlearn/__init__.py
@@ -81,47 +81,50 @@
 # Initialize JAX acceleration if available
 try:
     from . import _jax  # noqa: F401
+
     _JAX_ENABLED = True
-    
+
     # Auto-enable JAX acceleration for all algorithms
     from ._jax._proxy import create_intelligent_proxy
     from ._jax._accelerator import AcceleratorRegistry
-    
+
     # Create global registry
     _jax_registry = AcceleratorRegistry()
-    
+
     def _auto_jax_accelerate_module(module_name):
         """Automatically add JAX acceleration to all estimators in a module."""
         try:
-            module = _importlib.import_module(f'.{module_name}', package=__name__)
-            
+            module = _importlib.import_module(f".{module_name}", package=__name__)
+
             # Get all classes that look like estimators
             for attr_name in dir(module):
-                if not attr_name.startswith('_'):
+                if not attr_name.startswith("_"):
                     attr = getattr(module, attr_name)
-                    
+
                     # Check if it's a class and looks like an estimator
-                    if (isinstance(attr, type) and 
-                        hasattr(attr, 'fit') and 
-                        attr.__module__.startswith('xlearn.')):
-                        
+                    if (
+                        isinstance(attr, type)
+                        and hasattr(attr, "fit")
+                        and attr.__module__.startswith("xlearn.")
+                    ):
                         # Create intelligent proxy
                         proxy_class = create_intelligent_proxy(attr)
-                        
+
                         # Replace in module
                         setattr(module, attr_name, proxy_class)
-                        
+
         except Exception as e:
             logger.debug(f"Failed to auto-accelerate module {module_name}: {e}")
-    
+
 except ImportError:
     _JAX_ENABLED = False
     _jax_registry = None
-    
+
     def _auto_jax_accelerate_module(module_name):
         """No-op when JAX is not available."""
         pass
 
+
 _submodules = [
     "calibration",
     "cluster",
@@ -180,11 +183,11 @@
 def __getattr__(name):
     if name in _submodules:
         module = _importlib.import_module(f"xlearn.{name}")
-        
+
         # Auto-apply JAX acceleration if enabled
         if _JAX_ENABLED:
             _auto_jax_accelerate_module(name)
-            
+
         return module
     else:
         try:

--- xlearn/_jax/__init__.py
+++ xlearn/_jax/__init__.py
@@ -6,7 +6,7 @@
 Usage:
     import xlearn
     # JAX acceleration is automatically enabled if JAX is available
-    
+
     # Or explicitly configure:
     import xlearn._jax
     xlearn._jax.set_config(enable_jax=True, platform="gpu")
@@ -23,6 +23,7 @@
 try:
     import jax
     import jax.numpy as jnp
+
     _JAX_AVAILABLE = True
 except ImportError:
     jax = None
@@ -35,10 +36,12 @@
 # Global accelerator registry
 _registry = AcceleratorRegistry()
 
+
 def is_jax_available() -> bool:
     """Check if JAX is available."""
     return _JAX_AVAILABLE
 
+
 def get_jax_platform() -> Optional[str]:
     """Get the current JAX platform."""
     if not _JAX_AVAILABLE:
@@ -48,14 +51,15 @@
     except Exception:
         return None
 
+
 def enable_jax_acceleration(platform: Optional[str] = None) -> bool:
     """Enable JAX acceleration.
-    
+
     Parameters
     ----------
     platform : str, optional
         JAX platform to use ('cpu', 'gpu', 'tpu'). If None, uses default.
-        
+
     Returns
     -------
     bool
@@ -65,38 +69,43 @@
         warnings.warn(
             "JAX is not available. Install JAX to enable acceleration: "
             "pip install jax jaxlib",
-            UserWarning
+            UserWarning,
         )
         return False
-    
+
     try:
         if platform:
             # Configure JAX platform if specified
-            os.environ['JAX_PLATFORM_NAME'] = platform.lower()
-        
+            os.environ["JAX_PLATFORM_NAME"] = platform.lower()
+
         # Update configuration
         set_config(enable_jax=True, jax_platform=platform or "auto")
         return True
-        
+
     except Exception as e:
         warnings.warn(f"Failed to enable JAX acceleration: {e}", UserWarning)
         return False
 
+
 def disable_jax_acceleration():
     """Disable JAX acceleration."""
     set_config(enable_jax=False)
 
+
 # Auto-enable JAX if available and not explicitly disabled
-if (_JAX_AVAILABLE and 
-    os.environ.get('XLEARN_DISABLE_JAX', '').lower() not in ('1', 'true', 'yes')):
+if _JAX_AVAILABLE and os.environ.get("XLEARN_DISABLE_JAX", "").lower() not in (
+    "1",
+    "true",
+    "yes",
+):
     enable_jax_acceleration()
 
 __all__ = [
-    'is_jax_available',
-    'get_jax_platform', 
-    'enable_jax_acceleration',
-    'disable_jax_acceleration',
-    'get_config',
-    'set_config',
-    'config_context',
+    "is_jax_available",
+    "get_jax_platform",
+    "enable_jax_acceleration",
+    "disable_jax_acceleration",
+    "get_config",
+    "set_config",
+    "config_context",
 ]

--- xlearn/_jax/_accelerator.py
+++ xlearn/_jax/_accelerator.py
@@ -12,14 +12,14 @@
 
 class AcceleratorRegistry:
     """Registry for JAX-accelerated estimator implementations."""
-    
+
     def __init__(self):
         self._accelerators: Dict[Type, Type] = {}
         self._enabled = True
-    
+
     def register(self, original_class: Type, accelerated_class: Type) -> None:
         """Register a JAX-accelerated implementation.
-        
+
         Parameters
         ----------
         original_class : type
@@ -28,15 +28,15 @@
             The JAX-accelerated implementation class.
         """
         self._accelerators[original_class] = accelerated_class
-    
+
     def get_accelerated(self, original_class: Type) -> Optional[Type]:
         """Get the JAX-accelerated implementation for a class.
-        
+
         Parameters
         ----------
         original_class : type
             The original jax-sklearn estimator class.
-            
+
         Returns
         -------
         accelerated_class : type or None
@@ -45,22 +45,22 @@
         if not self._enabled:
             return None
         return self._accelerators.get(original_class)
-    
+
     def is_registered(self, original_class: Type) -> bool:
         """Check if a class has a registered JAX implementation."""
         return original_class in self._accelerators
-    
+
     def enable(self) -> None:
         """Enable the accelerator registry."""
         self._enabled = True
-    
+
     def disable(self) -> None:
         """Disable the accelerator registry."""
         self._enabled = False
-    
+
     def list_accelerated(self) -> Dict[str, str]:
         """List all registered accelerated implementations.
-        
+
         Returns
         -------
         accelerated : dict
@@ -74,12 +74,12 @@
 
 def accelerated_estimator(original_class: Type):
     """Decorator to register a JAX-accelerated estimator implementation.
-    
+
     Parameters
     ----------
     original_class : type
         The original jax-sklearn estimator class to accelerate.
-        
+
     Examples
     --------
     >>> from xlearn.linear_model import LinearRegression
@@ -87,37 +87,40 @@
     ... class LinearRegressionJAX:
     ...     pass  # JAX implementation
     """
+
     def decorator(accelerated_class: Type) -> Type:
         # Import registry here to avoid circular imports
         from . import _registry
+
         _registry.register(original_class, accelerated_class)
         return accelerated_class
+
     return decorator
 
 
 def create_accelerated_estimator(original_class: Type, *args, **kwargs):
     """Create an accelerated estimator instance if available.
-    
+
     Parameters
     ----------
     original_class : type
         The original jax-sklearn estimator class.
     *args, **kwargs
         Arguments to pass to the estimator constructor.
-        
+
     Returns
     -------
     estimator
         Either a JAX-accelerated instance or the original instance.
     """
     from . import _registry
-    
+
     config = get_config()
-    
+
     # Check if JAX acceleration is enabled
     if not config["enable_jax"]:
         return original_class(*args, **kwargs)
-    
+
     # Check if JAX is available
     try:
         import jax
@@ -129,7 +132,7 @@
                 "JAX is not available but enable_jax=True. "
                 "Install JAX: pip install jax jaxlib"
             )
-    
+
     # Get accelerated implementation
     accelerated_class = _registry.get_accelerated(original_class)
     if accelerated_class is None:
@@ -139,10 +142,10 @@
             warnings.warn(
                 f"No JAX acceleration available for {original_class.__name__}. "
                 "Using original implementation.",
-                UserWarning
+                UserWarning,
             )
             return original_class(*args, **kwargs)
-    
+
     # Try to create accelerated instance
     try:
         return accelerated_class(*args, **kwargs)
@@ -151,7 +154,7 @@
             warnings.warn(
                 f"Failed to create JAX-accelerated {original_class.__name__}: {e}. "
                 "Falling back to original implementation.",
-                UserWarning
+                UserWarning,
             )
             return original_class(*args, **kwargs)
         else:
@@ -160,44 +163,45 @@
 
 def jax_accelerate(func: Callable) -> Callable:
     """Decorator to apply JAX acceleration to a function.
-    
+
     Parameters
     ----------
     func : callable
         Function to accelerate with JAX.
-        
+
     Returns
     -------
     accelerated_func : callable
         JAX-accelerated version of the function.
     """
+
     @functools.wraps(func)
     def wrapper(*args, **kwargs):
         config = get_config()
-        
+
         if not config["enable_jax"]:
             return func(*args, **kwargs)
-        
+
         try:
             import jax
             import jax.numpy as jnp
-            
+
             # Apply JIT compilation if enabled
             if config["jit_compilation"]:
                 jit_func = jax.jit(func)
                 return jit_func(*args, **kwargs)
             else:
                 return func(*args, **kwargs)
-                
+
         except Exception as e:
             if config["fallback_on_error"]:
                 warnings.warn(
                     f"JAX acceleration failed for {func.__name__}: {e}. "
                     "Using original implementation.",
-                    UserWarning
+                    UserWarning,
                 )
                 return func(*args, **kwargs)
             else:
                 raise
-    
+
     return wrapper

--- xlearn/_jax/_config.py
+++ xlearn/_jax/_config.py
@@ -1,6 +1,6 @@
 """Configuration management for JAX acceleration."""
 
-# Authors: The JAX-xlearn developers  
+# Authors: The JAX-xlearn developers
 # SPDX-License-Identifier: BSD-3-Clause
 
 import threading
@@ -21,15 +21,17 @@
 
 _threadlocal = threading.local()
 
+
 def _get_threadlocal_config() -> Dict[str, Any]:
     """Get a threadlocal **mutable** configuration.
-    
+
     If the configuration does not exist, copy the default global configuration.
     """
-    if not hasattr(_threadlocal, 'jax_config'):
+    if not hasattr(_threadlocal, "jax_config"):
         _threadlocal.jax_config = _global_config.copy()
     return _threadlocal.jax_config
 
+
 def get_config() -> Dict[str, Any]:
     """Retrieve current values for JAX acceleration configuration.
 
@@ -48,6 +50,7 @@
     # Return a copy so users can't modify the configuration directly
     return _get_threadlocal_config().copy()
 
+
 def set_config(
     enable_jax: Optional[bool] = None,
     jax_platform: Optional[str] = None,
@@ -64,25 +67,25 @@
     ----------
     enable_jax : bool, default=None
         Enable or disable JAX acceleration.
-        
+
     jax_platform : str, default=None
         JAX platform to use ('auto', 'cpu', 'gpu', 'tpu').
-        
+
     fallback_on_error : bool, default=None
         Whether to fallback to CPU implementation on JAX errors.
-        
+
     memory_limit_gpu : int, default=None
         GPU memory limit in MB. None for auto-detection.
-        
+
     jit_compilation : bool, default=None
         Enable JIT compilation of JAX functions.
-        
+
     precision : str, default=None
         Numerical precision ('float32', 'float64').
-        
+
     debug_mode : bool, default=None
         Enable debug mode with additional checks and logging.
-        
+
     cache_compiled_functions : bool, default=None
         Cache compiled JAX functions for reuse.
 
@@ -92,7 +95,7 @@
     >>> jax_xlearn.set_config(enable_jax=True, jax_platform="gpu")
     """
     local_config = _get_threadlocal_config()
-    
+
     if enable_jax is not None:
         local_config["enable_jax"] = enable_jax
     if jax_platform is not None:
@@ -113,8 +116,7 @@
     if precision is not None:
         if precision not in ("float32", "float64"):
             raise ValueError(
-                f"Invalid precision: {precision}. "
-                "Must be one of: 'float32', 'float64'"
+                f"Invalid precision: {precision}. Must be one of: 'float32', 'float64'"
             )
         local_config["precision"] = precision
     if debug_mode is not None:
@@ -122,6 +124,7 @@
     if cache_compiled_functions is not None:
         local_config["cache_compiled_functions"] = cache_compiled_functions
 
+
 @contextmanager
 def config_context(**kwargs):
     """Context manager for JAX acceleration configuration.
@@ -140,16 +143,17 @@
     """
     old_config = get_config()
     set_config(**kwargs)
-    
+
     try:
         yield
     finally:
         set_config(**old_config)
 
+
 def _validate_config() -> None:
     """Validate current configuration."""
     config = get_config()
-    
+
     # Check JAX availability if enabled
     if config["enable_jax"]:
         try:
@@ -159,12 +163,12 @@
                 "JAX is not available but enable_jax=True. "
                 "Install JAX: pip install jax jaxlib"
             )
-    
+
     # Validate platform
     platform = config["jax_platform"]
     if platform not in ("auto", "cpu", "gpu", "tpu"):
         raise ValueError(f"Invalid jax_platform: {platform}")
-    
+
     # Validate precision
     precision = config["precision"]
     if precision not in ("float32", "float64"):

--- xlearn/_jax/_data_conversion.py
+++ xlearn/_jax/_data_conversion.py
@@ -12,15 +12,18 @@
 # Type hints
 ArrayLike = Union[np.ndarray, Any]  # Any for JAX arrays when available
 
+
 def _get_jax_modules():
     """Get JAX modules if available."""
     try:
         import jax
         import jax.numpy as jnp
+
         return jax, jnp
     except ImportError:
         return None, None
 
+
 def is_jax_array(data: Any) -> bool:
     """Check if data is a JAX array."""
     jax, jnp = _get_jax_modules()
@@ -28,10 +31,12 @@
         return False
     return isinstance(data, jnp.ndarray)
 
+
 def is_numpy_array(data: Any) -> bool:
     """Check if data is a NumPy array."""
     return isinstance(data, np.ndarray)
 
+
 def to_jax(data: ArrayLike, dtype: Optional[str] = None) -> Any:
     """Convert data to JAX array.
 
@@ -64,7 +69,7 @@
     elif is_numpy_array(data):
         # NumPy array
         return jnp.asarray(data, dtype=dtype)
-    elif hasattr(data, '__array__'):
+    elif hasattr(data, "__array__"):
         # Array-like (pandas, etc.)
         np_array = np.asarray(data)
         return jnp.asarray(np_array, dtype=dtype)
@@ -76,6 +81,7 @@
         except Exception as e:
             raise ValueError(f"Cannot convert data to JAX array: {e}")
 
+
 def to_numpy(data: ArrayLike) -> np.ndarray:
     """Convert data to NumPy array.
 
@@ -99,6 +105,7 @@
         # Try to convert to NumPy
         return np.asarray(data)
 
+
 def ensure_2d(data: ArrayLike) -> ArrayLike:
     """Ensure data is 2D array.
 
@@ -123,6 +130,7 @@
             return data.reshape(1, -1)
         return data
 
+
 def get_array_module(data: ArrayLike):
     """Get the appropriate array module for the data.
 
@@ -142,6 +150,7 @@
     else:
         return np
 
+
 def convert_input_data(*arrays, **kwargs) -> Tuple[ArrayLike, ...]:
     """Convert input data arrays to appropriate format.
 
@@ -159,8 +168,8 @@
     converted_arrays : tuple
         Tuple of converted arrays.
     """
-    to_jax_flag = kwargs.get('to_jax', True)
-    dtype = kwargs.get('dtype', None)
+    to_jax_flag = kwargs.get("to_jax", True)
+    dtype = kwargs.get("dtype", None)
 
     config = get_config()
     if not config["enable_jax"] or not to_jax_flag:
@@ -176,6 +185,7 @@
         else:
             raise
 
+
 def auto_convert_arrays(to_jax_arrays: bool = True):
     """Decorator for automatic array conversion.
 
@@ -184,6 +194,7 @@
     to_jax_arrays : bool, default=True
         Whether to convert arrays to JAX format.
     """
+
     def decorator(func):
         @functools.wraps(func)
         def wrapper(*args, **kwargs):
@@ -197,7 +208,9 @@
                 # Convert array arguments
                 converted_args = []
                 for arg in args:
-                    if isinstance(arg, (np.ndarray, list, tuple)) and not isinstance(arg, str):
+                    if isinstance(arg, (np.ndarray, list, tuple)) and not isinstance(
+                        arg, str
+                    ):
                         try:
                             converted_args.append(to_jax(arg))
                         except Exception:
@@ -208,7 +221,9 @@
                 # Convert array keyword arguments
                 converted_kwargs = {}
                 for key, value in kwargs.items():
-                    if isinstance(value, (np.ndarray, list, tuple)) and not isinstance(value, str):
+                    if isinstance(value, (np.ndarray, list, tuple)) and not isinstance(
+                        value, str
+                    ):
                         try:
                             converted_kwargs[key] = to_jax(value)
                         except Exception:
@@ -238,8 +253,10 @@
                     raise
 
         return wrapper
+
     return decorator
 
+
 class DataConverter:
     """Utility class for data conversion operations."""
 

--- xlearn/_jax/_proxy.py
+++ xlearn/_jax/_proxy.py
@@ -14,7 +14,7 @@
     JAXLinearModelMixin,
     JAXClusterMixin,
     JAXDecompositionMixin,
-    create_jax_accelerated_class
+    create_jax_accelerated_class,
 )
 
 
@@ -48,26 +48,28 @@
             try:
                 # Try to create JAX-accelerated version
                 self._impl = create_accelerated_estimator(
-                    self._original_class,
-                    *self._init_args,
-                    **self._init_kwargs
+                    self._original_class, *self._init_args, **self._init_kwargs
                 )
 
                 # Check if we got a JAX implementation
                 from ._accelerator import AcceleratorRegistry
                 from . import _registry
+
                 accelerated_class = _registry.get_accelerated(self._original_class)
-                self._using_jax = (accelerated_class is not None and
-                                 isinstance(self._impl, accelerated_class))
+                self._using_jax = accelerated_class is not None and isinstance(
+                    self._impl, accelerated_class
+                )
 
             except Exception as e:
                 if config["fallback_on_error"]:
                     warnings.warn(
                         f"Failed to create JAX implementation: {e}. "
                         "Using original implementation.",
-                        UserWarning
+                        UserWarning,
+                    )
+                    self._impl = self._original_class(
+                        *self._init_args, **self._init_kwargs
                     )
-                    self._impl = self._original_class(*self._init_args, **self._init_kwargs)
                     self._using_jax = False
                 else:
                     raise
@@ -82,21 +84,23 @@
             return to_numpy(result)
         elif isinstance(result, (list, tuple)):
             return type(result)(
-                to_numpy(item) if is_jax_array(item) else item
-                for item in result
+                to_numpy(item) if is_jax_array(item) else item for item in result
             )
         return result
 
     def __getattr__(self, name):
         """Delegate attribute access to the underlying implementation."""
-        if name.startswith('_'):
+        if name.startswith("_"):
             # Private attributes should be handled by this class
-            raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
+            raise AttributeError(
+                f"'{type(self).__name__}' object has no attribute '{name}'"
+            )
 
         attr = getattr(self._impl, name)
 
         # If it's a method, wrap it to handle data conversion
         if callable(attr):
+
             @functools.wraps(attr)
             def wrapper(*args, **kwargs):
                 try:
@@ -118,10 +122,12 @@
                         warnings.warn(
                             f"JAX method {name} failed: {e}. "
                             "Recreating with original implementation.",
-                            UserWarning
+                            UserWarning,
                         )
                         # Recreate with original implementation
-                        self._impl = self._original_class(*self._init_args, **self._init_kwargs)
+                        self._impl = self._original_class(
+                            *self._init_args, **self._init_kwargs
+                        )
                         self._using_jax = False
 
                         # Retry the method call
@@ -137,12 +143,18 @@
 
     def __setattr__(self, name, value):
         """Handle attribute setting."""
-        if name.startswith('_') or name in ('_original_class', '_init_args', '_init_kwargs', '_impl', '_using_jax'):
+        if name.startswith("_") or name in (
+            "_original_class",
+            "_init_args",
+            "_init_kwargs",
+            "_impl",
+            "_using_jax",
+        ):
             # Private attributes of the proxy
             super().__setattr__(name, value)
         else:
             # Delegate to the underlying implementation
-            if hasattr(self, '_impl') and self._impl is not None:
+            if hasattr(self, "_impl") and self._impl is not None:
                 setattr(self._impl, name, value)
             else:
                 super().__setattr__(name, value)
@@ -186,6 +198,7 @@
     proxy_class : type
         A proxy class that transparently handles JAX acceleration.
     """
+
     class ProxyClass(EstimatorProxy):
         def __init__(self, *args, **kwargs):
             super().__init__(original_class, *args, **kwargs)
@@ -217,14 +230,20 @@
     module_name = original_class.__module__
 
     # Select appropriate mixin
-    if 'linear_model' in module_name or any(keyword in class_name.lower() for keyword in
-                                           ['linear', 'regression', 'ridge', 'lasso', 'elastic', 'logistic']):
+    if "linear_model" in module_name or any(
+        keyword in class_name.lower()
+        for keyword in ["linear", "regression", "ridge", "lasso", "elastic", "logistic"]
+    ):
         mixin_class = JAXLinearModelMixin
-    elif 'cluster' in module_name or any(keyword in class_name.lower() for keyword in
-                                        ['kmeans', 'cluster', 'dbscan', 'agglomerative']):
+    elif "cluster" in module_name or any(
+        keyword in class_name.lower()
+        for keyword in ["kmeans", "cluster", "dbscan", "agglomerative"]
+    ):
         mixin_class = JAXClusterMixin
-    elif 'decomposition' in module_name or any(keyword in class_name.lower() for keyword in
-                                              ['pca', 'svd', 'nmf', 'ica', 'decomposition']):
+    elif "decomposition" in module_name or any(
+        keyword in class_name.lower()
+        for keyword in ["pca", "svd", "nmf", "ica", "decomposition"]
+    ):
         mixin_class = JAXDecompositionMixin
     else:
         # For other algorithms, use the base mixin with minimal acceleration
@@ -253,10 +272,13 @@
         # Register it with the accelerator system
         from ._accelerator import AcceleratorRegistry
         from . import _registry
+
         _registry.register(original_class, jax_class)
 
     except Exception as e:
-        warnings.warn(f"Failed to create JAX acceleration for {original_class.__name__}: {e}")
+        warnings.warn(
+            f"Failed to create JAX acceleration for {original_class.__name__}: {e}"
+        )
 
     # Create and return proxy class
     return create_proxy_class(original_class)

--- xlearn/_jax/_universal_jax.py
+++ xlearn/_jax/_universal_jax.py
@@ -15,69 +15,74 @@
 
 class UniversalJAXMixin:
     """Mixin class that provides universal JAX acceleration for common operations."""
-    
+
     def __init__(self):
         self._jax_compiled_functions = {}
         self._performance_cache = {}
-    
+
     def _should_use_jax(self, X: np.ndarray, algorithm_name: str = None) -> bool:
         """Determine if JAX should be used based on data characteristics."""
         config = get_config()
         if not config.get("enable_jax", True):
             return False
-        
+
         n_samples, n_features = X.shape
-        
+
         # Cache key for performance decision
         cache_key = (n_samples, n_features, algorithm_name or self.__class__.__name__)
         if cache_key in self._performance_cache:
             return self._performance_cache[cache_key]
-        
+
         # Heuristic decision based on data size and algorithm type
         decision = self._performance_heuristic(n_samples, n_features, algorithm_name)
         self._performance_cache[cache_key] = decision
-        
+
         return decision
-    
-    def _performance_heuristic(self, n_samples: int, n_features: int, algorithm_name: str = None) -> bool:
+
+    def _performance_heuristic(
+        self, n_samples: int, n_features: int, algorithm_name: str = None
+    ) -> bool:
         """Heuristic to decide whether to use JAX based on problem characteristics."""
         complexity = n_samples * n_features
-        
+
         # Algorithm-specific thresholds based on our testing
         thresholds = {
             # Linear models - benefit from JAX on large problems
-            'LinearRegression': {'min_complexity': 1e8, 'min_samples': 10000},
-            'Ridge': {'min_complexity': 1e8, 'min_samples': 10000},
-            'Lasso': {'min_complexity': 5e7, 'min_samples': 5000},  # Iterative, benefits earlier
-            'LogisticRegression': {'min_complexity': 5e7, 'min_samples': 5000},
-            
+            "LinearRegression": {"min_complexity": 1e8, "min_samples": 10000},
+            "Ridge": {"min_complexity": 1e8, "min_samples": 10000},
+            "Lasso": {
+                "min_complexity": 5e7,
+                "min_samples": 5000,
+            },  # Iterative, benefits earlier
+            "LogisticRegression": {"min_complexity": 5e7, "min_samples": 5000},
             # Clustering - benefit from vectorization
-            'KMeans': {'min_complexity': 1e6, 'min_samples': 5000},
-            'DBSCAN': {'min_complexity': 1e6, 'min_samples': 1000},
-            
+            "KMeans": {"min_complexity": 1e6, "min_samples": 5000},
+            "DBSCAN": {"min_complexity": 1e6, "min_samples": 1000},
             # Decomposition - matrix operations benefit greatly
-            'PCA': {'min_complexity': 1e7, 'min_samples': 5000},
-            'TruncatedSVD': {'min_complexity': 1e7, 'min_samples': 5000},
-            'NMF': {'min_complexity': 5e6, 'min_samples': 2000},
-            
+            "PCA": {"min_complexity": 1e7, "min_samples": 5000},
+            "TruncatedSVD": {"min_complexity": 1e7, "min_samples": 5000},
+            "NMF": {"min_complexity": 5e6, "min_samples": 2000},
             # Tree-based - limited JAX benefit but some operations can be accelerated
-            'RandomForestClassifier': {'min_complexity': 1e5, 'min_samples': 1000},
-            'RandomForestRegressor': {'min_complexity': 1e5, 'min_samples': 1000},
-            
+            "RandomForestClassifier": {"min_complexity": 1e5, "min_samples": 1000},
+            "RandomForestRegressor": {"min_complexity": 1e5, "min_samples": 1000},
             # Default for unknown algorithms
-            'default': {'min_complexity': 1e7, 'min_samples': 10000}
+            "default": {"min_complexity": 1e7, "min_samples": 10000},
         }
-        
+
         # Get threshold for this algorithm
         algo_name = algorithm_name or self.__class__.__name__
-        threshold = thresholds.get(algo_name, thresholds['default'])
-        
-        return (complexity >= threshold['min_complexity'] and 
-                n_samples >= threshold['min_samples'])
-    
+        threshold = thresholds.get(algo_name, thresholds["default"])
+
+        return (
+            complexity >= threshold["min_complexity"]
+            and n_samples >= threshold["min_samples"]
+        )
+
     @staticmethod
     @jax.jit
-    def _jax_solve_linear_system(A: jnp.ndarray, b: jnp.ndarray, regularization: float = 1e-10) -> jnp.ndarray:
+    def _jax_solve_linear_system(
+        A: jnp.ndarray, b: jnp.ndarray, regularization: float = 1e-10
+    ) -> jnp.ndarray:
         """JAX-compiled function to solve linear system Ax = b."""
         # Add regularization for numerical stability
         if A.ndim == 2 and A.shape[0] == A.shape[1]:
@@ -87,171 +92,185 @@
             AtA = A.T @ A
             A_reg = AtA + regularization * jnp.eye(AtA.shape[0])
             b = A.T @ b
-        
+
         return linalg.solve(A_reg, b)
-    
+
     @staticmethod
     @jax.jit
-    def _jax_linear_regression_fit(X: jnp.ndarray, y: jnp.ndarray) -> Tuple[jnp.ndarray, jnp.ndarray]:
+    def _jax_linear_regression_fit(
+        X: jnp.ndarray, y: jnp.ndarray
+    ) -> Tuple[jnp.ndarray, jnp.ndarray]:
         """JAX-compiled linear regression fitting."""
         n_samples, n_features = X.shape
-        
+
         # Center the data
         X_mean = jnp.mean(X, axis=0)
         y_mean = jnp.mean(y)
         X_centered = X - X_mean
         y_centered = y - y_mean
-        
+
         # Solve normal equations: (X^T X) coef = X^T y
         XtX = X_centered.T @ X_centered
         Xty = X_centered.T @ y_centered
-        
+
         # Add small regularization for numerical stability
         regularization = 1e-10 * jnp.trace(XtX) / n_features
         coef = linalg.solve(XtX + regularization * jnp.eye(n_features), Xty)
-        
+
         # Calculate intercept
         intercept = y_mean - X_mean @ coef
-        
+
         return coef, intercept
-    
+
     @staticmethod
     @jax.jit
-    def _jax_ridge_regression_fit(X: jnp.ndarray, y: jnp.ndarray, alpha: float) -> Tuple[jnp.ndarray, jnp.ndarray]:
+    def _jax_ridge_regression_fit(
+        X: jnp.ndarray, y: jnp.ndarray, alpha: float
+    ) -> Tuple[jnp.ndarray, jnp.ndarray]:
         """JAX-compiled Ridge regression fitting."""
         n_samples, n_features = X.shape
-        
+
         # Center the data
         X_mean = jnp.mean(X, axis=0)
         y_mean = jnp.mean(y)
         X_centered = X - X_mean
         y_centered = y - y_mean
-        
+
         # Solve regularized normal equations: (X^T X + alpha*I) coef = X^T y
         XtX = X_centered.T @ X_centered
         Xty = X_centered.T @ y_centered
-        
+
         coef = linalg.solve(XtX + alpha * jnp.eye(n_features), Xty)
         intercept = y_mean - X_mean @ coef
-        
+
         return coef, intercept
-    
+
     @staticmethod
     @jax.jit
-    def _jax_pca_fit(X: jnp.ndarray, n_components: int) -> Tuple[jnp.ndarray, jnp.ndarray, jnp.ndarray]:
+    def _jax_pca_fit(
+        X: jnp.ndarray, n_components: int
+    ) -> Tuple[jnp.ndarray, jnp.ndarray, jnp.ndarray]:
         """JAX-compiled PCA fitting."""
         n_samples, n_features = X.shape
-        
+
         # Center the data
         X_mean = jnp.mean(X, axis=0)
         X_centered = X - X_mean
-        
+
         # Compute SVD
         U, s, Vt = jnp.linalg.svd(X_centered, full_matrices=False)
-        
+
         # Select top components
         components = Vt[:n_components]
         explained_variance = (s[:n_components] ** 2) / (n_samples - 1)
-        
+
         return components, explained_variance, X_mean
-    
+
     @staticmethod
     @jax.jit
-    def _jax_kmeans_step(X: jnp.ndarray, centers: jnp.ndarray) -> Tuple[jnp.ndarray, jnp.ndarray]:
+    def _jax_kmeans_step(
+        X: jnp.ndarray, centers: jnp.ndarray
+    ) -> Tuple[jnp.ndarray, jnp.ndarray]:
         """JAX-compiled K-means iteration step."""
         # Compute distances to all centers
         distances = jnp.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
-        
+
         # Assign points to closest centers
         labels = jnp.argmin(distances, axis=1)
-        
+
         # Update centers
-        new_centers = jnp.array([
-            jnp.mean(X[labels == k], axis=0) 
-            for k in range(centers.shape[0])
-        ])
-        
+        new_centers = jnp.array(
+            [jnp.mean(X[labels == k], axis=0) for k in range(centers.shape[0])]
+        )
+
         return new_centers, labels
-    
-    def _apply_jax_linear_regression(self, X: np.ndarray, y: np.ndarray) -> Dict[str, np.ndarray]:
+
+    def _apply_jax_linear_regression(
+        self, X: np.ndarray, y: np.ndarray
+    ) -> Dict[str, np.ndarray]:
         """Apply JAX-accelerated linear regression."""
         X_jax = to_jax(X)
         y_jax = to_jax(y)
-        
+
         coef_jax, intercept_jax = self._jax_linear_regression_fit(X_jax, y_jax)
-        
-        return {
-            'coef_': to_numpy(coef_jax),
-            'intercept_': to_numpy(intercept_jax)
-        }
-    
-    def _apply_jax_ridge_regression(self, X: np.ndarray, y: np.ndarray, alpha: float = 1.0) -> Dict[str, np.ndarray]:
+
+        return {"coef_": to_numpy(coef_jax), "intercept_": to_numpy(intercept_jax)}
+
+    def _apply_jax_ridge_regression(
+        self, X: np.ndarray, y: np.ndarray, alpha: float = 1.0
+    ) -> Dict[str, np.ndarray]:
         """Apply JAX-accelerated Ridge regression."""
         X_jax = to_jax(X)
         y_jax = to_jax(y)
-        
+
         coef_jax, intercept_jax = self._jax_ridge_regression_fit(X_jax, y_jax, alpha)
-        
-        return {
-            'coef_': to_numpy(coef_jax),
-            'intercept_': to_numpy(intercept_jax)
-        }
-    
+
+        return {"coef_": to_numpy(coef_jax), "intercept_": to_numpy(intercept_jax)}
+
     def _apply_jax_pca(self, X: np.ndarray, n_components: int) -> Dict[str, np.ndarray]:
         """Apply JAX-accelerated PCA."""
         X_jax = to_jax(X)
-        
-        components_jax, explained_variance_jax, mean_jax = self._jax_pca_fit(X_jax, n_components)
-        
+
+        components_jax, explained_variance_jax, mean_jax = self._jax_pca_fit(
+            X_jax, n_components
+        )
+
         return {
-            'components_': to_numpy(components_jax),
-            'explained_variance_': to_numpy(explained_variance_jax),
-            'mean_': to_numpy(mean_jax)
+            "components_": to_numpy(components_jax),
+            "explained_variance_": to_numpy(explained_variance_jax),
+            "mean_": to_numpy(mean_jax),
         }
-    
-    def _apply_jax_kmeans_iteration(self, X: np.ndarray, centers: np.ndarray) -> Dict[str, np.ndarray]:
+
+    def _apply_jax_kmeans_iteration(
+        self, X: np.ndarray, centers: np.ndarray
+    ) -> Dict[str, np.ndarray]:
         """Apply JAX-accelerated K-means iteration."""
         X_jax = to_jax(X)
         centers_jax = to_jax(centers)
-        
+
         new_centers_jax, labels_jax = self._jax_kmeans_step(X_jax, centers_jax)
-        
+
         return {
-            'cluster_centers_': to_numpy(new_centers_jax),
-            'labels_': to_numpy(labels_jax)
+            "cluster_centers_": to_numpy(new_centers_jax),
+            "labels_": to_numpy(labels_jax),
         }
 
 
 class JAXLinearModelMixin(UniversalJAXMixin):
     """Mixin for JAX-accelerated linear models."""
-    
-    def jax_fit(self, X: np.ndarray, y: np.ndarray, algorithm: str = 'linear') -> 'JAXLinearModelMixin':
+
+    def jax_fit(
+        self, X: np.ndarray, y: np.ndarray, algorithm: str = "linear"
+    ) -> "JAXLinearModelMixin":
         """JAX-accelerated fitting for linear models."""
         if not self._should_use_jax(X, algorithm):
             # Fallback to original implementation
             return self._original_fit(X, y)
-        
+
         try:
-            if algorithm == 'linear':
+            if algorithm == "linear":
                 results = self._apply_jax_linear_regression(X, y)
-            elif algorithm == 'ridge':
-                alpha = getattr(self, 'alpha', 1.0)
+            elif algorithm == "ridge":
+                alpha = getattr(self, "alpha", 1.0)
                 results = self._apply_jax_ridge_regression(X, y, alpha)
             else:
                 # Fallback for unsupported algorithms
                 return self._original_fit(X, y)
-            
+
             # Set attributes
             for attr_name, attr_value in results.items():
                 setattr(self, attr_name, attr_value)
-            
+
             return self
-            
+
         except Exception as e:
             config = get_config()
             if config.get("fallback_on_error", True):
                 import warnings
-                warnings.warn(f"JAX fitting failed: {e}. Using original implementation.")
+
+                warnings.warn(
+                    f"JAX fitting failed: {e}. Using original implementation."
+                )
                 return self._original_fit(X, y)
             else:
                 raise
@@ -259,83 +278,89 @@
 
 class JAXClusterMixin(UniversalJAXMixin):
     """Mixin for JAX-accelerated clustering algorithms."""
-    
-    def jax_fit(self, X: np.ndarray) -> 'JAXClusterMixin':
+
+    def jax_fit(self, X: np.ndarray) -> "JAXClusterMixin":
         """JAX-accelerated fitting for clustering algorithms."""
-        if not self._should_use_jax(X, 'KMeans'):
+        if not self._should_use_jax(X, "KMeans"):
             return self._original_fit(X)
-        
+
         try:
             # Initialize centers (this is algorithm-specific)
-            n_clusters = getattr(self, 'n_clusters', 8)
+            n_clusters = getattr(self, "n_clusters", 8)
             centers = self._initialize_centers(X, n_clusters)
-            
+
             # Iterative K-means with JAX acceleration
-            max_iter = getattr(self, 'max_iter', 300)
-            tol = getattr(self, 'tol', 1e-4)
-            
+            max_iter = getattr(self, "max_iter", 300)
+            tol = getattr(self, "tol", 1e-4)
+
             for i in range(max_iter):
                 results = self._apply_jax_kmeans_iteration(X, centers)
-                new_centers = results['cluster_centers_']
-                
+                new_centers = results["cluster_centers_"]
+
                 # Check convergence
                 if np.allclose(centers, new_centers, atol=tol):
                     break
-                    
+
                 centers = new_centers
-            
+
             # Set final results
             self.cluster_centers_ = centers
-            self.labels_ = results['labels_']
+            self.labels_ = results["labels_"]
             self.n_iter_ = i + 1
-            
+
             return self
-            
+
         except Exception as e:
             config = get_config()
             if config.get("fallback_on_error", True):
                 import warnings
-                warnings.warn(f"JAX clustering failed: {e}. Using original implementation.")
+
+                warnings.warn(
+                    f"JAX clustering failed: {e}. Using original implementation."
+                )
                 return self._original_fit(X)
             else:
                 raise
-    
+
     def _initialize_centers(self, X: np.ndarray, n_clusters: int) -> np.ndarray:
         """Initialize cluster centers."""
         # Simple random initialization - can be improved
         n_samples, n_features = X.shape
-        rng = np.random.RandomState(getattr(self, 'random_state', None))
+        rng = np.random.RandomState(getattr(self, "random_state", None))
         indices = rng.choice(n_samples, n_clusters, replace=False)
         return X[indices].copy()
 
 
 class JAXDecompositionMixin(UniversalJAXMixin):
     """Mixin for JAX-accelerated decomposition algorithms."""
-    
-    def jax_fit(self, X: np.ndarray) -> 'JAXDecompositionMixin':
+
+    def jax_fit(self, X: np.ndarray) -> "JAXDecompositionMixin":
         """JAX-accelerated fitting for decomposition algorithms."""
-        if not self._should_use_jax(X, 'PCA'):
+        if not self._should_use_jax(X, "PCA"):
             return self._original_fit(X)
-        
+
         try:
-            n_components = getattr(self, 'n_components', min(X.shape))
+            n_components = getattr(self, "n_components", min(X.shape))
             results = self._apply_jax_pca(X, n_components)
-            
+
             # Set attributes
             for attr_name, attr_value in results.items():
                 setattr(self, attr_name, attr_value)
-            
+
             # Calculate explained variance ratio
-            total_var = np.sum(results['explained_variance_'])
-            self.explained_variance_ratio_ = results['explained_variance_'] / total_var
-            
+            total_var = np.sum(results["explained_variance_"])
+            self.explained_variance_ratio_ = results["explained_variance_"] / total_var
+
             return self
-            
+
         except Exception as e:
             config = get_config()
             if config.get("fallback_on_error", True):
                 import warnings
-                warnings.warn(f"JAX decomposition failed: {e}. Using original implementation.")
+
+                warnings.warn(
+                    f"JAX decomposition failed: {e}. Using original implementation."
+                )
                 return self._original_fit(X)
             else:
                 raise
@@ -343,34 +368,39 @@
 
 def create_jax_accelerated_class(original_class: type, mixin_class: type) -> type:
     """Create a JAX-accelerated version of a class using a mixin.
-    
+
     Parameters
     ----------
     original_class : type
         The original xlearn class
     mixin_class : type
         The JAX mixin class to use
-    
+
     Returns
     -------
     accelerated_class : type
         JAX-accelerated class
     """
+
     class JAXAcceleratedClass(mixin_class, original_class):
         def __init__(self, *args, **kwargs):
             original_class.__init__(self, *args, **kwargs)
             mixin_class.__init__(self)
-            
+
             # Store original fit method
             self._original_fit = original_class.fit
-        
+
         def fit(self, X, y=None, **kwargs):
             """Override fit to use JAX acceleration when beneficial."""
-            return self.jax_fit(X, y, **kwargs) if y is not None else self.jax_fit(X, **kwargs)
-    
+            return (
+                self.jax_fit(X, y, **kwargs)
+                if y is not None
+                else self.jax_fit(X, **kwargs)
+            )
+
     # Copy metadata
     JAXAcceleratedClass.__name__ = f"JAX{original_class.__name__}"
     JAXAcceleratedClass.__qualname__ = f"JAX{original_class.__qualname__}"
     JAXAcceleratedClass.__module__ = original_class.__module__
-    
+
     return JAXAcceleratedClass

--- xlearn/datasets/tests/test_openml.py
+++ xlearn/datasets/tests/test_openml.py
@@ -1527,9 +1527,7 @@
             url=None, code=404, msg="Simulated network error", hdrs=None, fp=BytesIO()
         )
 
-    monkeypatch.setattr(
-        xlearn.datasets._openml, "urlopen", _mock_urlopen_network_error
-    )
+    monkeypatch.setattr(xlearn.datasets._openml, "urlopen", _mock_urlopen_network_error)
 
     invalid_openml_url = "https://api.openml.org/invalid-url"
 

--- xlearn/ensemble/_hist_gradient_boosting/tests/test_compare_lightgbm.py
+++ xlearn/ensemble/_hist_gradient_boosting/tests/test_compare_lightgbm.py
@@ -108,8 +108,7 @@
         # cause is not clear, maybe algorithmic differences. One such example is the
         # poisson_max_delta_step parameter of LightGBM which does not exist in HGBT.
         assert (
-            np.mean(np.isclose(pred_lightgbm, pred_xlearn, rtol=1e-2, atol=1e-2))
-            > 0.65
+            np.mean(np.isclose(pred_lightgbm, pred_xlearn, rtol=1e-2, atol=1e-2)) > 0.65
         )
     else:
         # Less than 1% of the predictions may deviate more than 1e-3 in relative terms.

--- xlearn/utils/_testing.py
+++ xlearn/utils/_testing.py
@@ -503,9 +503,7 @@
     ignore = [] if ignore is None else ignore
 
     func_name = _get_func_name(func)
-    if not func_name.startswith("xlearn.") or func_name.startswith(
-        "xlearn.externals"
-    ):
+    if not func_name.startswith("xlearn.") or func_name.startswith("xlearn.externals"):
         return incorrect
     # Don't check docstring for property-functions
     if inspect.isdatadescriptor(func):

11 files would be reformatted, 921 files already formatted

mypy

mypy detected issues. Please fix them locally and push the changes. The detected issues are listed below. Note that the installed mypy version is mypy=1.15.0.


xlearn/_jax/_accelerator.py:17: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
xlearn/_jax/_universal_jax.py:23: error: Incompatible default for argument "algorithm_name" (default has type "None", argument has type "str")  [assignment]
xlearn/_jax/_universal_jax.py:23: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True
xlearn/_jax/_universal_jax.py:23: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
xlearn/_jax/_universal_jax.py:42: error: Incompatible default for argument "algorithm_name" (default has type "None", argument has type "str")  [assignment]
xlearn/_jax/_universal_jax.py:42: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True
xlearn/_jax/_universal_jax.py:42: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase
xlearn/_jax/_universal_jax.py:232: error: "JAXLinearModelMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_universal_jax.py:242: error: "JAXLinearModelMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_universal_jax.py:255: error: "JAXLinearModelMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_universal_jax.py:266: error: "JAXClusterMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_universal_jax.py:299: error: "JAXClusterMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_universal_jax.py:318: error: "JAXDecompositionMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_universal_jax.py:339: error: "JAXDecompositionMixin" has no attribute "_original_fit"  [attr-defined]
xlearn/_jax/_proxy.py:173: error: Incompatible return value type (got "None", expected "type[Any]")  [return-value]
xlearn/_jax/_proxy.py:225: error: Incompatible types in assignment (expression has type "type[JAXClusterMixin]", variable has type "type[JAXLinearModelMixin]")  [assignment]
xlearn/_jax/_proxy.py:228: error: Incompatible types in assignment (expression has type "type[JAXDecompositionMixin]", variable has type "type[JAXLinearModelMixin]")  [assignment]
xlearn/_jax/_proxy.py:284: error: No return value expected  [return-value]
xlearn/__init__.py:119: error: Incompatible types in assignment (expression has type "None", variable has type "AcceleratorRegistry")  [assignment]
Found 14 errors in 3 files (checked 571 source files)
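
For illustration only (not the fix applied in the repository), the two recurring error classes above could be addressed roughly as sketched below: an explicit Optional annotation resolves the implicit-Optional [assignment] errors, and declaring _original_fit on the mixin lets mypy see the attribute that the proxy assigns at runtime, resolving the [attr-defined] errors. The names mirror the _universal_jax.py diff above, but the exact signatures here are assumptions.

from typing import Any, Callable, Optional

import numpy as np


class UniversalJAXMixin:
    # Assumption: the proxy/accelerated class assigns the original estimator's
    # fit method here at runtime; declaring it lets mypy verify the attribute.
    _original_fit: Callable[..., Any]

    def _should_use_jax(
        self, X: np.ndarray, algorithm_name: Optional[str] = None
    ) -> bool:
        # Optional[str] makes the `= None` default valid under PEP 484
        # (no implicit Optional). The real heuristic body is elided here.
        return False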

cython-lint

cython-lint detected issues. Please fix them locally and push the changes. The detected issues are listed below. Note that the installed cython-lint version is cython-lint=0.18.1.


/home/runner/work/jax-sklearn/jax-sklearn/xlearn/cluster/_hdbscan/_tree.pyx:786:19: unnecessary set + generator (just use a set comprehension)
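
For reference, the pattern cython-lint is flagging and the comprehension it suggests look like this (a generic plain-Python illustration, not the actual code at _tree.pyx:786):

values = [1, 2, 2, 3]

# Pattern the linter flags: set() wrapping a generator expression.
flagged = set(v * 2 for v in values)

# Equivalent set comprehension, which the linter prefers.
preferred = {v * 2 for v in values}

assert flagged == preferred  # both evaluate to {2, 4, 6}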

Generated for commit: 95f6e97. Link to the linter CI: here

@ellipsis-dev ellipsis-dev bot left a comment

Important

Looks good to me! 👍

Reviewed everything up to 95f6e97 in 1 minute and 13 seconds.
  • Reviewed 288 lines of code in 11 files
  • Skipped 0 files when reviewing.
  • Skipped posting 13 draft comments. View those below.
  • Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. .github/workflows/arm-unit-tests.yml:22
  • Draft comment:
    Upgrade to actions/setup-python v6 looks good. Please verify that any caching behavior (parameter 'cache: pip') still works as expected with the new version.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to verify that caching behavior still works as expected after an upgrade. This falls under the rule of not asking the author to ensure behavior is intended or tested. Therefore, this comment should be removed.
2. .github/workflows/check-changelog.yml:33
  • Draft comment:
    Bump to scientific-python/action-towncrier-changelog v2 is applied. Ensure that any breaking changes or new configuration requirements in v2 are reflected in the workflow if needed.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is related to a dependency change, specifically a version bump of a GitHub Action. The comment asks the author to ensure that any breaking changes or new configuration requirements are reflected in the workflow. This falls under the rule of not commenting on dependency changes or asking the author to ensure compatibility, which is not allowed.
3. .github/workflows/check-sdist.yml:17
  • Draft comment:
    Updating actions/setup-python to v6 is consistent here. Verify that Python 3.10 still behaves as expected under the new version.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to verify behavior, which is against the rules. It doesn't provide a specific suggestion or point out a specific issue with the code.
4. .github/workflows/codeql.yml:44
  • Draft comment:
    CodeQL actions are updated to v4 (init, autobuild, analyze). It’s important to review the v4 release notes for any configuration changes that might affect analysis.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is purely informative, suggesting to review release notes for configuration changes. It doesn't provide a specific code suggestion or point out a potential issue in the code.
5. .github/workflows/cuda-ci.yml:21
  • Draft comment:
    The cibuildwheel action has been updated to a commit hash (9c00cb4...). Ensure that this pinned version is fully compatible with your CUDA GPU builds.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to ensure compatibility with their CUDA GPU builds, which is against the rules. It does not provide a specific suggestion or point out a specific issue with the code.
6. .github/workflows/emscripten.yml:70
  • Draft comment:
    The cibuildwheel action is now referenced by a new commit hash for building WASM wheels. Verify that this change works correctly with the Pyodide build configuration.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. The comment is asking the PR author to verify that a change works correctly, which violates the rule against asking for confirmation or verification. It doesn't provide a specific suggestion or point out a specific issue with the code.
7. .github/workflows/emscripten.yml:82
  • Draft comment:
    The upload-artifact action is upgraded to v5, ensuring Node v24 support. Confirm that artifact uploads (wheel files) behave correctly post-upgrade.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. The comment is asking the PR author to confirm the behavior of artifact uploads after upgrading an action. This falls under asking the author to ensure behavior is intended or tested, which violates the rules.
8. .github/workflows/labeler-title-regex.yml:19
  • Draft comment:
    The setup-python version is bumped to v6 here. This update is straightforward; just ensure any Python environment specifics remain compatible.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is related to a dependency change, specifically the version bump of setup-python to v6. It asks the author to ensure compatibility, which is against the rules for comments. The comment does not provide a specific code suggestion or ask for a specific test to be written.
9. .github/workflows/lint.yml:28
  • Draft comment:
    Actions/setup-python has been updated to v6 in the lint job. Validate that dependencies and environment variables are still correctly set up for linting.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to validate that dependencies and environment variables are correctly set up after an update to a GitHub Action. This falls under the category of asking the author to double-check things, which is against the rules.
10. .github/workflows/lint.yml:54
  • Draft comment:
    Artifact uploader updated to v5. Ensure that lint log artifacts are correctly uploaded and can be downloaded in subsequent steps.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to ensure that the lint log artifacts are correctly uploaded and can be downloaded, which violates the rule against asking the author to ensure behavior is intended or tested. It does not provide a specific code suggestion or ask for a specific test to be written.
11. .github/workflows/publish_pypi.yml:42
  • Draft comment:
    The gh-action-pypi-publish action is upgraded to v1.13.0, which includes important security fixes (e.g. GHSA-vxmw-7h4f-hqxh). Ensure that the new version's behavior is in line with your publishing process.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is about a dependency change, specifically an upgrade to a GitHub Action. The comment is informative and suggests ensuring the new version's behavior aligns with the publishing process, which is not specific enough to be actionable. It violates the rule against making purely informative comments and asking the author to ensure behavior is intended.
12. .github/workflows/update_tracking_issue.yml:33
  • Draft comment:
    Setup-python has been bumped to v6 in the update tracking issue workflow. Confirm that the tracking script interacts correctly with the updated environment.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to confirm that the tracking script interacts correctly with the updated environment, which violates the rule against asking the author to confirm or ensure behavior. It does not provide a specific suggestion or point out a specific issue.
13. .github/workflows/wheels.yml:167
  • Draft comment:
    Multiple action versions have been updated (setup-python, upload-artifact, download-artifact) in the wheels workflow. Please verify that the build matrix, conda integration, and artifact handling still perform as expected.
  • Reason this comment was not posted:
    Comment did not seem useful. Confidence is useful = 0% <= threshold 50%. This comment is asking the PR author to verify that certain aspects of the code still perform as expected after updating action versions. This falls under the rule of not asking the author to ensure behavior is intended or to double-check things. Therefore, this comment should be removed.

Workflow ID: wflow_VouIBcarwRsijcPi

You can customize Ellipsis by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.

