Compare commits


92 Commits

Author SHA1 Message Date
Jonas Eschle
b3d0b4443b
Merge pull request #473 from ucsd-salad/feat/ignore-errors-flag
Add `--ignore-errors` to skip files with errors
2025-04-10 10:14:53 +02:00
Jonas Eschle
2ef6a981d5
Merge pull request #482 from bndr/fix_ci_coverage
fix typo
2025-04-08 10:56:33 +02:00
Jonas Eschle
fcfe190bed
Merge pull request #471 from rasa/rasa-py-3.13-support
chore: Add python 3.13 support, drop 3.8
2025-04-08 10:56:15 +02:00
Jonas Eschle
38af24c1d1 fix typo 2025-04-08 10:49:39 +02:00
Jonas Eschle
645451aa39 fix typo 2025-04-08 10:47:41 +02:00
Jonas Eschle
2326604c81 fix typo 2025-04-08 10:43:26 +02:00
Jonas Eschle
6e0447755c
Update flake8.yml ci 2025-04-08 09:51:13 +02:00
Jonas Eschle
da63e0d71c
Merge branch 'master' into rasa-py-3.13-support 2025-04-08 09:49:42 +02:00
Jonas Eschle
a7a6b856b0
Merge pull request #481 from bndr/projecttoml_upgrade
upgrade Python version
2025-04-08 09:45:54 +02:00
Jonas Eschle
cdcf14bce6 fix typo 2025-04-08 09:44:41 +02:00
Jonas Eschle
1f462ab50a upgrade Python version 2025-04-08 09:43:46 +02:00
Jonas Eschle
287da35bc2
Merge pull request #480 from bndr/add_venv_ignore
Add .venv to ignored files
2025-04-07 19:46:26 +02:00
Jonas Eschle
58d62cb7b8
Update pipreqs.py 2025-04-07 18:02:59 +02:00
Jonas Eschle
36efbbe460
Merge pull request #478 from bndr/precommit
add pre-commit configuration file
2025-04-07 17:47:39 +02:00
Jonas Eschle
08c5eb09cc
Merge pull request #479 from bndr/projecttoml_upgrade
Update pyproject.toml and tests.yml config
2025-04-07 17:47:27 +02:00
Jonas Eschle
34163de8d0
Merge pull request #325 from klieret/no-pin-doc-improvement
Doc: no-pin instead of non-pin for scheme
2025-04-07 17:45:23 +02:00
Jonas Eschle
d6d06a2f1f
Merge pull request #378 from testpushpleaseignore/master
add pywintypes to mapping
2025-04-07 17:43:02 +02:00
Jonas Eschle
1c319d7106
Merge pull request #394 from xinshoutw/patch-1
Update mapping for SpeechRecognition
2025-04-07 17:40:42 +02:00
Jonas Eschle
d6725caee6 upgrade Python version 2025-04-07 17:25:37 +02:00
Jonas Eschle
ebfa1f4832 Upgrade dependencies in pyproject.toml and tests.yml 2025-04-07 17:19:23 +02:00
Jonas Eschle
9eedfb39db Upgrade dependencies in pyproject.toml and tests.yml 2025-04-07 17:18:54 +02:00
Jonas Eschle
e5336a446a Update pyproject.toml and tests.yml config 2025-04-07 17:05:51 +02:00
Jonas Eschle
648a43ffbe add pre-commit configuration file 2025-04-07 16:41:44 +02:00
Jonas Eschle
3d490ec251
Merge pull request #440 from Pwuts/patch-1
Add mapping `jose` -> `python-jose`
2025-04-07 16:30:58 +02:00
Jonas Eschle
4c65892517
Merge pull request #443 from bndr/dependabot/pip/idna-3.7
Bump idna from 3.6 to 3.7
2025-04-07 16:25:50 +02:00
Jonas Eschle
a14e8b4256
Merge pull request #447 from bndr/dependabot/pip/jinja2-3.1.4
Bump jinja2 from 3.1.3 to 3.1.4
2025-04-07 16:25:31 +02:00
Jonas Eschle
cc8545d530
Merge pull request #373 from mcp292/patch-1
Add OpenCV mapping
2025-04-07 16:19:22 +02:00
Jonas Eschle
aabe973eb1
Merge pull request #477 from bndr/next
update tests badge in readme
2025-04-07 16:15:30 +02:00
Jonas Eschle
5cdc9019d7
Merge pull request #430 from Borda/readme/badge
update tests badge in readme [stale]
2025-04-07 16:11:17 +02:00
Shun Kashiwa
08eead345a
Add --ignore-errors to skip files with syntax errors and attempt to find requirements on a best-effort basis 2025-03-14 09:49:03 -07:00
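The best-effort behavior this `--ignore-errors` commit describes can be sketched as follows. This is a hypothetical standalone sketch, not pipreqs's actual implementation: walk a tree, `ast.parse` each `.py` file, and either skip or re-raise on files that fail to parse depending on the flag.

```python
import ast
import os


def collect_imports(path, ignore_errors=False):
    """Collect top-level imported module names under ``path``.

    With ignore_errors=True, files that fail to parse are skipped
    (best effort); otherwise the SyntaxError propagates.
    """
    found = set()
    for root, _dirs, files in os.walk(path):
        for fn in files:
            if not fn.endswith(".py"):
                continue
            full = os.path.join(root, fn)
            try:
                with open(full, encoding="utf-8") as f:
                    tree = ast.parse(f.read())
            except SyntaxError:
                if ignore_errors:
                    continue  # skip unparseable files on a best-effort basis
                raise
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    found.update(a.name.partition(".")[0] for a in node.names)
                elif isinstance(node, ast.ImportFrom) and node.module:
                    found.add(node.module.partition(".")[0])
    return found
```

The helper name `collect_imports` is invented for illustration; the real flag is wired through `get_all_imports` in `pipreqs/pipreqs.py`.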
Ross Smith II
967a6688cb chore: Add python 3.13 support, drop 3.8 2025-02-27 16:29:50 -08:00
Jirka Borovec
eb37d03ff7
Merge branch 'next' into readme/badge 2024-11-12 21:57:56 +01:00
dependabot[bot]
1e9cc81f8e
Bump jinja2 from 3.1.3 to 3.1.4
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.3 to 3.1.4.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.3...3.1.4)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-05-06 21:09:57 +00:00
dependabot[bot]
75e7892310
Bump idna from 3.6 to 3.7
Bumps [idna](https://github.com/kjd/idna) from 3.6 to 3.7.
- [Release notes](https://github.com/kjd/idna/releases)
- [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
- [Commits](https://github.com/kjd/idna/compare/v3.6...v3.7)

---
updated-dependencies:
- dependency-name: idna
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-12 04:43:35 +00:00
Reinier van der Leer
4767b6444e
Add mapping jose -> python-jose 2024-03-19 16:46:50 +01:00
Alan Barzilay
cd3f437689 release pipreqs v0.5 2024-02-18 14:35:24 -03:00
Alan Barzilay
64fc5a2972 ci: fix 3.12 python tests
tests were failing due to too restrictive python version requirements
(<=3.12 when 3.12.X exist).
switching to <3.13 allows every python 3.12 patch version to work normally
2024-02-18 14:08:22 -03:00
Alan Barzilay
3a9bc86108 Merge branch 'next' 2024-02-18 14:04:54 -03:00
Alan Barzilay
769f3e501e Revert "Bump wheel from 0.23.0 to 0.38.1"
this is causing too many weird conflicts to merge next into master. Also,
requirements file was deprecated in next branch in lieu of
pyproject.toml

This reverts commit 68f9b2859d45a60e699839f87b1b9558cd36a329.
2024-02-18 14:00:26 -03:00
Jirka Borovec
e4faec2c1e
update tests badge in readme 2024-01-16 13:34:00 +01:00
mateuslatrova
4a9176b39a add support for .pyw files
Now, pipreqs will also scan imports in .pyw files by default.
2023-12-06 20:11:57 +00:00
mateuslatrova
de68691438 fix flake8 environment in tox.ini file
The way the tox.ini file was configured meant flake8 was never run: tox only ran the python tests for the flake8 environment.

With this PR, tox will run flake8 for the pipreqs and tests folders as desired.
2023-12-06 18:03:41 +00:00
fernandocrz
b50b4a76eb Add support for jupyter notebooks
Credits to @pakio and @mateuslatrova for the contributions
2023-12-05 18:15:54 +00:00
fernandocrz
03c92488de suppress errors and warnings in unit tests
Credits to @mateuslatrova for the contribution.
2023-12-05 18:15:54 +00:00
fernandocrz
aa283ada00 define pipreqs default encoding to utf-8 2023-12-05 18:15:54 +00:00
Lucas de Sousa Rosa
f041de9bdb Bump python 3.12 support 2023-11-08 18:15:21 -03:00
Lucas de Sousa Rosa
fb4560c740 Migrating the packaging system to poetry with pyproject.toml
- Deleted old setup files `requirements.txt`, `setup.cfg`, `setup.py`, `MANIFEST.in`
- Added poetry files `poetry.toml`, `pyproject.toml`, `poetry.lock`
- Added `.pyenv-version` and `.tool-versions` for `pyenv` and `asdf`
- Updated `Makefile`, `CONTRIBUTING.rst`, `tox.ini`
2023-11-08 10:49:10 -03:00
mateuslatrova
368e9ae7e7 handle FileNotFoundError in parse_requirements function 2023-10-20 08:59:50 -03:00
darwish
55eee298ec Specify pypy version
- pypy v7.3.13, used by GitHub Actions, was failing, so pypy v7.3.12, which was passing, is forced instead.
- The original PR was done by @EwoutH at #334 and modified by @willianrocha at #398.
2023-10-12 14:08:39 -03:00
Willian Rocha
8af7d85a74 CI: Add Python 3.10 and 3.11, update actions, run on pushes and manually
A bit of CI maintenance:
- Add Python 3.10 and 3.11 runs, and update the PyPy run to PyPy 3.9
- Removed Python 3.7 as it is deprecated
- Update the used actions (checkout and setup-python) to the latest versions
- Also run on pushes, when manually triggered (workflow_dispatch)

The original PR was done by @EwoutH at #334
2023-10-12 14:08:39 -03:00
dependabot[bot]
eb65254646 Bump wheel from 0.23.0 to 0.38.1
Bumps [wheel](https://github.com/pypa/wheel) from 0.23.0 to 0.38.1.
- [Release notes](https://github.com/pypa/wheel/releases)
- [Changelog](https://github.com/pypa/wheel/blob/main/docs/news.rst)
- [Commits](https://github.com/pypa/wheel/compare/0.23.0...0.38.1)

---
updated-dependencies:
- dependency-name: wheel
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-11 15:42:04 -03:00
Alan Barzilay
6f232bd080
Merge pull request #389 from mateuslatrova/compare_modules_test
Add unit test for "compare_modules" function
2023-10-10 15:41:00 -03:00
Alan Barzilay
2ebfc4645a
Merge branch 'next' into compare_modules_test 2023-10-10 15:36:35 -03:00
fernandocrz
12cc1e5b74 add test for parse_requirements function 2023-10-10 15:30:38 -03:00
Mateus Latrova
ed46d270e9 add test for "compare_modules" function 2023-10-09 16:12:42 -03:00
dependabot[bot]
68f9b2859d Bump wheel from 0.23.0 to 0.38.1
Bumps [wheel](https://github.com/pypa/wheel) from 0.23.0 to 0.38.1.
- [Release notes](https://github.com/pypa/wheel/releases)
- [Changelog](https://github.com/pypa/wheel/blob/main/docs/news.rst)
- [Commits](https://github.com/pypa/wheel/compare/0.23.0...0.38.1)

---
updated-dependencies:
- dependency-name: wheel
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-08 00:21:32 -03:00
Mateus Latrova
3c786e3537 add "copy to clipboard icon" in installation section
- Based on PR #369 and #396
- Authors Muhammad-Aadil and mateuslatrova
2023-10-08 00:10:47 -03:00
Alan Barzilay
acc41cc9bc
Merge pull request #395 from fredgrub/output_requirements_next
create test for output_requirements
2023-10-04 14:26:29 -03:00
Lucas de Sousa Rosa
ccf097b02f create test for output_requirements 2023-10-02 18:00:23 -03:00
mateuslatrova
6ac4357cf4 fix return type on docs 2023-09-28 20:36:24 -03:00
mateuslatrova
d56d5feb29 delete unused function "filter_line" 2023-09-28 20:36:24 -03:00
Alan Barzilay
cebc3f7e24
Merge pull request #393 from willianrocha/flake8_next
applying pep8 rules
2023-09-27 14:12:40 -03:00
XinShou
ab492859f4
Update mapping for SpeechRecognition 2023-09-22 06:44:24 +08:00
Willian Rocha
f2e745256e applying pep8 rules 2023-09-21 19:37:37 -03:00
testpushpleaseignore
c3d0f169d3
add pywintypes to mapping
mapped pywintypes to pywin32
2023-06-19 16:30:09 -05:00
mcp292
c0d6b293b1
Add OpenCV mapping
Fixes #307.
2023-05-17 15:07:42 -07:00
Vadim Kravcenko
a1f83d27d9 Bump Version to 0.4.13 2023-04-14 09:08:49 +02:00
Vadim Kravcenko
181525cdea
Merge pull request #304 from Marocco2/patch-1
Mapping Telegram to python-telegram-bot
2023-04-13 20:39:56 +02:00
Vadim Kravcenko
f9f9d53620
Merge pull request #335 from mmngreco/patch-1
enh: include airflow mapping
2023-04-13 20:38:32 +02:00
Vadim Kravcenko
717d4926bc
Merge pull request #364 from adeadfed/master
Mitigation for dependency confusion in pipreqs
2023-04-13 20:38:09 +02:00
adeadfed
2103371746 add whitespaces to pep8 formatted strings 2023-04-13 16:05:50 +02:00
adeadfed
5537bcd217 fix pep8 linting issues 2023-04-13 15:59:50 +02:00
Vadim Kravcenko
96440ac2ff
Merge pull request #343 from Alexander-Wilms/master
Add mapping for python-constraint
2023-04-13 14:32:32 +02:00
Vadim Kravcenko
4c380a9661
Merge pull request #349 from hansvana/patch-1
Update mapping socketio to python-socketio
2023-04-13 14:27:26 +02:00
Vadim Kravcenko
ed13177c81
Merge pull request #354 from bskiefer/patch-1
Update mapping for python-slugify
2023-04-13 14:27:06 +02:00
adeadfed
bb61883079 revert changes to .gitignore 2023-04-08 17:10:02 +02:00
adeadfed
3ae82087f9 remove invalid comment 2023-03-29 00:32:00 +02:00
adeadfed
001ead91ed delete vscode related files from repo 2023-03-29 00:28:17 +02:00
adeadfed
50498d3cd4 change warning messages 2023-03-29 00:27:13 +02:00
adeadfed
da442e7a2a add warnings to for remote resolving mode 2023-03-29 00:25:45 +02:00
adeadfed
6cd9925a31 improved version of local package resolving 2023-03-29 00:17:49 +02:00
adeadfed
3f5964fcb9 fix name resolution for local packages 2023-03-14 21:14:19 +01:00
bskiefer
175524ec24
Update mapping for python-slugify 2023-02-06 10:51:09 -06:00
Alan Barzilay
e5493d922e
Merge pull request #344 from clvnkhr/Check-if-requirements.txt-exist-before-proceeding-further-inspection-#285
resolve issue #285
2023-01-14 11:49:35 -03:00
hansvana
5cdc966b2e
Update mapping socketio to python-socketio
Originally mapped to gevent-socketio, which is outdated and hasn't been updated since 2016. Anyone importing socketio is much more likely to mean python-socketio now.
2023-01-13 11:15:15 +01:00
clvnkhr
f97d8b9ba4 resolve issue #285
simply moved the check in init for requirements.txt ahead of searching for packages. Closes #285. Passes tox (3.7, 3.8, 3.9, pypy3, flake8)
2023-01-05 15:06:39 +08:00
Alexander Wilms
775681975a
Add mapping for python-constraint 2023-01-02 22:42:27 +01:00
Alan Barzilay
80bfad8285
Merge pull request #339 from EwoutH/py37
Require Python 3.7 or higher
2022-12-02 17:02:21 -03:00
Ewout ter Hoeven
80503cd3fd
Require Python 3.7 or higher 2022-12-02 17:36:42 +01:00
Maximiliano Greco
0aa3f80b84
enh: include airflow mapping
closes https://github.com/bndr/pipreqs/issues/179
2022-11-17 13:25:49 +01:00
Kilian Lieret
de13f174c3
Doc: no-pin instead of non-pin for scheme 2022-08-09 12:16:57 -04:00
Marocco2
2a26eadc91
Mapping Telegram to python-telegram-bot
telegram is a hollow pypi package, so pipreqs will try to install that instead of python-telegram-bot, which has a module named telegram.
I'm asking for this change to be approved, as I'm blocked in my Pipedream workflow.
2022-04-10 09:49:12 +00:00
31 changed files with 3271 additions and 368 deletions


@@ -1,18 +1,32 @@
name: flake8
on: pull_request
concurrency:
group: ${{ github.ref }}
cancel-in-progress: true
on:
workflow_dispatch:
push:
tags:
- "*"
branches:
- main
- master
- develop
- "release/*"
pull_request:
jobs:
flake8-lint:
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
name: Lint
steps:
- name: Check out source repository
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Set up Python environment
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: "3.9"
python-version: "3.13"
- name: flake8 Lint
uses: reviewdog/action-flake8@v3
with:


@@ -1,50 +1,65 @@
name: Tests and Codecov
on: pull_request
on:
push:
branches:
- master
- main
- "release/*"
pull_request:
workflow_dispatch:
jobs:
run_tests:
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
strategy:
fail-fast: false
matrix:
python-version: [3.6, 3.7, 3.8, 3.9, pypy-3.7]
python-version: ['3.9', '3.10', '3.11', '3.12', '3.13', 'pypy-3.10']
steps:
- name: Checkout repository
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
python -m pip install uv
uv pip install --system tox tox-gh-actions
- name: Test with tox
run: tox
coverage_report:
needs: run_tests
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
steps:
- name: Checkout repository
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Set up Python 3.13
uses: actions/setup-python@v5
with:
python-version: 3.13
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install coverage docopt yarg requests
python -m pip install uv
uv pip install --system poetry
uv pip install --system .[dev]
- name: Calculate coverage
run: coverage run --source=pipreqs -m unittest discover
run: poetry run coverage run --source=pipreqs -m unittest discover
- name: Create XML report
run: coverage xml
run: poetry run coverage xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v2
uses: codecov/codecov-action@v5
with:
files: coverage.xml
fail_ci_if_error: true
token: ${{ secrets.CODECOV_TOKEN }}
fail_ci_if_error: false

.pre-commit-config.yaml (new file, 96 lines)

@@ -0,0 +1,96 @@
ci:
autoupdate_commit_msg: "chore: update pre-commit hooks"
autofix_commit_msg: "style: pre-commit fixes"
autoupdate_schedule: quarterly
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: check-added-large-files
args: [ '--maxkb=1000' ]
- id: check-case-conflict
- id: check-merge-conflict
- id: check-symlinks
- id: check-yaml
- id: check-toml
- id: check-json
- id: debug-statements
- id: end-of-file-fixer
- id: mixed-line-ending
- id: requirements-txt-fixer
- id: trailing-whitespace
files: ".*\\.(?:tex|py)$"
args: [ --markdown-linebreak-ext=md ]
exclude: (^notebooks/|^tests/truth/)
- id: detect-private-key
- id: fix-byte-order-marker
- id: check-ast
- id: check-docstring-first
- id: debug-statements
- repo: https://github.com/pre-commit/pygrep-hooks
rev: v1.10.0
hooks:
- id: python-use-type-annotations
- id: python-check-mock-methods
- id: python-no-eval
- id: rst-backticks
- id: rst-directive-colons
- repo: https://github.com/asottile/pyupgrade
rev: v3.3.1
hooks:
- id: pyupgrade
args: [ --py38-plus ]
# Notebook formatting
- repo: https://github.com/nbQA-dev/nbQA
rev: 1.9.1
hooks:
- id: nbqa-isort
additional_dependencies: [ isort ]
- id: nbqa-pyupgrade
additional_dependencies: [ pyupgrade ]
args: [ --py38-plus ]
- repo: https://github.com/kynan/nbstripout
rev: 0.8.1
hooks:
- id: nbstripout
- repo: https://github.com/sondrelg/pep585-upgrade
rev: 'v1.0'
hooks:
- id: upgrade-type-hints
args: [ '--futures=true' ]
- repo: https://github.com/MarcoGorelli/auto-walrus
rev: 0.3.4
hooks:
- id: auto-walrus
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: 0.30.0
hooks:
- id: check-github-workflows
- id: check-github-actions
- id: check-dependabot
- id: check-readthedocs
- repo: https://github.com/dannysepler/rm_unneeded_f_str
rev: v0.2.0
hooks:
- id: rm-unneeded-f-str
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: "v0.8.6"
hooks:
- id: ruff
types_or: [ python, pyi, jupyter ]
args: [ --fix, --show-fixes , --line-length=120 ] # --unsafe-fixes,
# Run the formatter.
- id: ruff-format
types_or: [ python, pyi, jupyter ]

.python-version (new file, 7 lines)

@@ -0,0 +1,7 @@
3.13
3.12
3.11
3.10
3.9
3.8
pypy3.9-7.3.12

.tool-versions (new file, 1 line)

@@ -0,0 +1 @@
python 3.13 3.12 3.11 3.10 3.9 3.8 pypy3.9-7.3.12


@@ -61,12 +61,11 @@ Ready to contribute? Here's how to set up `pipreqs` for local development.
2. Clone your fork locally::
$ git clone git@github.com:your_name_here/pipreqs.git
3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::
$ mkvirtualenv pipreqs
$ cd pipreqs/
$ python setup.py develop
3. Pipreqs is developed using Poetry. Refer to the `documentation <https://python-poetry.org/docs/>`_ to install Poetry in your local environment. Next, you should install pipreqs's dependencies::
$ poetry install --with dev
4. Create a branch for local development::
@@ -76,11 +75,11 @@ Ready to contribute? Here's how to set up `pipreqs` for local development.
5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::
$ flake8 pipreqs tests
$ python setup.py test
$ tox
$ poetry run flake8 pipreqs tests
$ poetry run python -m unittest discover
$ poetry run tox
To get flake8 and tox, just pip install them into your virtualenv.
To test all versions of python using tox you need to have them installed and for this two options are recommended: `pyenv` or `asdf`.
6. Commit your changes and push your branch to GitHub::
@@ -99,7 +98,7 @@ Before you submit a pull request, check that it meets these guidelines:
2. If the pull request adds functionality, the docs should be updated. Put
your new functionality into a function with a docstring, and add the
feature to the list in README.rst.
3. The pull request should work for Python 2.7, 3.4, 3.5, 3.6, and PyPy. Check
3. The pull request should work for currently supported Python and PyPy versions. Check
https://travis-ci.org/bndr/pipreqs/pull_requests and make sure that the
tests pass for all supported Python versions.
@@ -108,4 +107,4 @@ Tips
To run a subset of tests::
$ python -m unittest tests.test_pipreqs
$ poetry run python -m unittest tests.test_pipreqs


@@ -1,13 +0,0 @@
include AUTHORS.rst
include CONTRIBUTING.rst
include HISTORY.rst
include LICENSE
include README.rst
include pipreqs/stdlib
include pipreqs/mapping
recursive-include tests *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]
recursive-include docs *.rst conf.py Makefile make.bat stdlib mapping


@@ -6,13 +6,14 @@ help:
@echo "clean-pyc - remove Python file artifacts"
@echo "clean-test - remove test and coverage artifacts"
@echo "lint - check style with flake8"
@echo "test - run tests quickly with the default Python"
@echo "test - run tests quickly using the default Python"
@echo "test-all - run tests on every Python version with tox"
@echo "coverage - check code coverage quickly with the default Python"
@echo "docs - generate Sphinx HTML documentation, including API docs"
@echo "release - package and upload a release"
@echo "dist - package"
@echo "install - install the package to the active Python's site-packages"
@echo "publish - package and upload a release"
@echo "publish-to-test - package and upload a release to test-pypi"
@echo "build - build the package"
@echo "install - install the dependencies into the Poetry virtual environment"
clean: clean-build clean-pyc clean-test
@@ -35,14 +36,13 @@ clean-test:
rm -fr htmlcov/
lint:
flake8 pipreqs tests
poetry run flake8 pipreqs tests
test:
pip install -r requirements.txt
python setup.py test
poetry run python -m unittest discover
test-all:
tox
poetry run tox
coverage:
coverage run --source pipreqs setup.py test
@@ -58,13 +58,14 @@ docs:
$(MAKE) -C docs html
open docs/_build/html/index.html
release: clean
python setup.py sdist bdist_wheel upload -r pypi
publish: build
poetry publish
dist: clean
python setup.py sdist
python setup.py bdist_wheel
ls -l dist
publish-to-test: build
poetry publish --repository test-pypi
build: clean
poetry build
install: clean
python setup.py install
poetry install --with dev


@@ -2,8 +2,8 @@
``pipreqs`` - Generate requirements.txt file for any project based on imports
=============================================================================
.. image:: https://img.shields.io/travis/bndr/pipreqs.svg
:target: https://travis-ci.org/bndr/pipreqs
.. image:: https://github.com/bndr/pipreqs/actions/workflows/tests.yml/badge.svg
:target: https://github.com/bndr/pipreqs/actions/workflows/tests.yml
.. image:: https://img.shields.io/pypi/v/pipreqs.svg
@@ -21,10 +21,18 @@
Installation
------------
::
.. code-block:: sh
pip install pipreqs
Obs.: if you don't want support for jupyter notebooks, you can install pipreqs without the dependencies that give support to it.
To do so, run:
.. code-block:: sh
pip install --no-deps pipreqs
pip install yarg==0.1.9 docopt==0.6.2
Usage
-----
@@ -47,6 +55,7 @@ Usage
--debug Print debug information
--ignore <dirs>... Ignore extra directories, each separated by a comma
--no-follow-links Do not follow symbolic links in the project
--ignore-errors Ignore errors while scanning files
--encoding <charset> Use encoding parameter for file open
--savepath <file> Save the list of requirements in the given file
--print Output the list of requirements in the standard output
@@ -57,6 +66,7 @@ Usage
<compat> | e.g. Flask~=1.1.2
<gt> | e.g. Flask>=1.1.2
<no-pin> | e.g. Flask
--scan-notebooks Look for imports in jupyter notebook files.
Example
-------


@@ -1,3 +1,3 @@
__author__ = 'Vadim Kravcenko'
__email__ = 'vadim.kravcenko@gmail.com'
__version__ = '0.4.11'
__version__ = '0.4.13'


@@ -10,6 +10,7 @@ BeautifulSoupTests:BeautifulSoup
BioSQL:biopython
BuildbotStatusShields:BuildbotEightStatusShields
ComputedAttribute:ExtensionClass
constraint:python-constraint
Crypto:pycryptodome
Cryptodome:pycryptodomex
FSM:pexpect
@@ -35,6 +36,7 @@ Pyxides:astro_pyxis
QtCore:PySide
S3:s3cmd
SCons:pystick
speech_recognition:SpeechRecognition
Shared:Zope2
Signals:Zope2
Stemmer:PyStemmer
@@ -129,6 +131,7 @@ aios3:aio_s3
airbrake:airbrake_flask
airship:airship_icloud
airship:airship_steamcloud
airflow:apache-airflow
akamai:edgegrid_python
alation:alation_api
alba_client:alba_client_python
@@ -580,6 +583,7 @@ ctff:tff
cups:pycups
curator:elasticsearch_curator
curl:pycurl
cv2:opencv-python
daemon:python_daemon
dare:DARE
dateutil:python_dateutil
@@ -718,6 +722,7 @@ jaraco:jaraco.util
jinja2:Jinja2
jiracli:jira_cli
johnny:johnny_cache
jose:python_jose
jpgrid:python_geohash
jpiarea:python_geohash
jpype:JPype1
@@ -973,6 +978,7 @@ pysynth_samp:PySynth
pythongettext:python_gettext
pythonjsonlogger:python_json_logger
pyutilib:PyUtilib
pywintypes:pywin32
pyximport:Cython
qs:qserve
quadtree:python_geohash
@@ -1030,9 +1036,10 @@ skbio:scikit_bio
sklearn:scikit_learn
slack:slackclient
slugify:unicode_slugify
slugify:python-slugify
smarkets:smk_python_sdk
snappy:ctypes_snappy
socketio:gevent_socketio
socketio:python-socketio
socketserver:pies2overrides
sockjs:sockjs_tornado
socks:SocksiPy_branch
@@ -1061,6 +1068,7 @@ tasksitter:cerebrod
tastypie:django_tastypie
teamcity:teamcity_messages
telebot:pyTelegramBotAPI
telegram:python-telegram-bot
tempita:Tempita
tenjin:Tenjin
termstyle:python_termstyle
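The mapping file edited in the hunks above uses a simple `import_name:pypi_package` line format, where one import name may map to several packages. A minimal sketch of parsing it (`parse_mapping` is a hypothetical helper for illustration, not pipreqs's code):

```python
def parse_mapping(lines):
    """Parse `import_name:package_name` lines into a dict.

    An import name can appear on several lines (e.g. airship),
    so values are collected into lists.
    """
    mapping = {}
    for line in lines:
        line = line.strip()
        if not line or ":" not in line:
            continue  # skip blanks and malformed lines
        import_name, _, package = line.partition(":")
        mapping.setdefault(import_name, []).append(package)
    return mapping


# Sample entries taken from the diff above
sample = [
    "cv2:opencv-python",
    "telegram:python-telegram-bot",
    "airship:airship_icloud",
    "airship:airship_steamcloud",
]
```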

View File

@@ -20,6 +20,7 @@ Options:
$ export HTTPS_PROXY="https://10.10.1.10:1080"
--debug Print debug information
--ignore <dirs>... Ignore extra directories, each separated by a comma
--ignore-errors Ignore errors while scanning files
--no-follow-links Do not follow symbolic links in the project
--encoding <charset> Use encoding parameter for file open
--savepath <file> Save the list of requirements in the given file
@@ -31,10 +32,11 @@ Options:
--clean <file> Clean up requirements.txt by removing modules
that are not imported in project
--mode <scheme> Enables dynamic versioning with <compat>,
<gt> or <non-pin> schemes.
<gt> or <no-pin> schemes.
<compat> | e.g. Flask~=1.1.2
<gt> | e.g. Flask>=1.1.2
<no-pin> | e.g. Flask
--scan-notebooks Look for imports in jupyter notebook files.
"""
from contextlib import contextmanager
import os
@@ -50,14 +52,23 @@ from yarg.exceptions import HTTPError
from pipreqs import __version__
REGEXP = [
re.compile(r'^import (.+)$'),
re.compile(r'^from ((?!\.+).*?) import (?:.*)$')
]
REGEXP = [re.compile(r"^import (.+)$"), re.compile(r"^from ((?!\.+).*?) import (?:.*)$")]
DEFAULT_EXTENSIONS = [".py", ".pyw"]
scan_noteboooks = False
class NbconvertNotInstalled(ImportError):
default_message = (
"In order to scan jupyter notebooks, please install the nbconvert and ipython libraries"
)
def __init__(self, message=default_message):
super().__init__(message)
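The two `REGEXP` patterns reformatted in this hunk are what pipreqs matches against each source line; a quick illustration of how they behave (the patterns are copied from the diff, the `match_import` wrapper is invented for the example):

```python
import re

# Patterns as they appear in the diff above
REGEXP = [
    re.compile(r"^import (.+)$"),
    re.compile(r"^from ((?!\.+).*?) import (?:.*)$"),
]


def match_import(line):
    """Return the captured module text for an import line, else None.

    The negative lookahead (?!\.+) excludes relative imports
    such as `from . import utils`.
    """
    for pattern in REGEXP:
        m = pattern.match(line)
        if m:
            return m.group(1)
    return None
```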
@contextmanager
def _open(filename=None, mode='r'):
def _open(filename=None, mode="r"):
"""Open a file or ``sys.stdout`` depending on the provided filename.
Args:
@@ -70,13 +81,13 @@ def _open(filename=None, mode='r'):
A file handle.
"""
if not filename or filename == '-':
if not mode or 'r' in mode:
if not filename or filename == "-":
if not mode or "r" in mode:
file = sys.stdin
elif 'w' in mode:
elif "w" in mode:
file = sys.stdout
else:
raise ValueError('Invalid mode for file: {}'.format(mode))
raise ValueError("Invalid mode for file: {}".format(mode))
else:
file = open(filename, mode)
@@ -87,13 +98,21 @@ def _open(filename=None, mode='r'):
file.close()
def get_all_imports(
path, encoding=None, extra_ignore_dirs=None, follow_links=True):
def get_all_imports(path, encoding="utf-8", extra_ignore_dirs=None, follow_links=True, ignore_errors=False):
imports = set()
raw_imports = set()
candidates = []
ignore_errors = False
ignore_dirs = [".hg", ".svn", ".git", ".tox", "__pycache__", "env", "venv"]
ignore_dirs = [
".hg",
".svn",
".git",
".tox",
"__pycache__",
"env",
"venv",
".venv",
".ipynb_checkpoints",
]
if extra_ignore_dirs:
ignore_dirs_parsed = []
@@ -101,19 +120,23 @@ def get_all_imports(
ignore_dirs_parsed.append(os.path.basename(os.path.realpath(e)))
ignore_dirs.extend(ignore_dirs_parsed)
extensions = get_file_extensions()
walk = os.walk(path, followlinks=follow_links)
for root, dirs, files in walk:
dirs[:] = [d for d in dirs if d not in ignore_dirs]
candidates.append(os.path.basename(root))
files = [fn for fn in files if os.path.splitext(fn)[1] == ".py"]
py_files = [file for file in files if file_ext_is_allowed(file, DEFAULT_EXTENSIONS)]
candidates.extend([os.path.splitext(filename)[0] for filename in py_files])
files = [fn for fn in files if file_ext_is_allowed(fn, extensions)]
candidates += [os.path.splitext(fn)[0] for fn in files]
for file_name in files:
file_name = os.path.join(root, file_name)
with open(file_name, "r", encoding=encoding) as f:
contents = f.read()
try:
contents = read_file_content(file_name, encoding)
tree = ast.parse(contents)
for node in ast.walk(tree):
if isinstance(node, ast.Import):
@@ -123,7 +146,7 @@ def get_all_imports(
raw_imports.add(node.module)
except Exception as exc:
if ignore_errors:
traceback.print_exc(exc)
traceback.print_exc()
logging.warn("Failed on file: %s" % file_name)
continue
else:
@@ -137,11 +160,11 @@ def get_all_imports(
# Cleanup: We only want to first part of the import.
# Ex: from django.conf --> django.conf. But we only want django
# as an import.
cleaned_name, _, _ = name.partition('.')
cleaned_name, _, _ = name.partition(".")
imports.add(cleaned_name)
packages = imports - (set(candidates) & imports)
logging.debug('Found packages: {0}'.format(packages))
logging.debug("Found packages: {0}".format(packages))
with open(join("stdlib"), "r") as f:
data = {x.strip() for x in f}
@@ -149,53 +172,96 @@ def get_all_imports(
return list(packages - data)
def filter_line(line):
return len(line) > 0 and line[0] != "#"
def get_file_extensions():
return DEFAULT_EXTENSIONS + [".ipynb"] if scan_noteboooks else DEFAULT_EXTENSIONS
def read_file_content(file_name: str, encoding="utf-8"):
if file_ext_is_allowed(file_name, DEFAULT_EXTENSIONS):
with open(file_name, "r", encoding=encoding) as f:
contents = f.read()
elif file_ext_is_allowed(file_name, [".ipynb"]) and scan_noteboooks:
contents = ipynb_2_py(file_name, encoding=encoding)
return contents
def file_ext_is_allowed(file_name, acceptable):
return os.path.splitext(file_name)[1] in acceptable
def ipynb_2_py(file_name, encoding="utf-8"):
"""
Args:
file_name (str): notebook file path to parse as python script
encoding (str): encoding of file
Returns:
str: parsed string
"""
exporter = PythonExporter()
(body, _) = exporter.from_filename(file_name)
return body.encode(encoding)
def generate_requirements_file(path, imports, symbol):
with _open(path, "w") as out_file:
logging.debug('Writing {num} requirements: {imports} to {file}'.format(
num=len(imports),
file=path,
imports=", ".join([x['name'] for x in imports])
))
fmt = '{name}' + symbol + '{version}'
out_file.write('\n'.join(
fmt.format(**item) if item['version'] else '{name}'.format(**item)
for item in imports) + '\n')
logging.debug(
"Writing {num} requirements: {imports} to {file}".format(
num=len(imports), file=path, imports=", ".join([x["name"] for x in imports])
)
)
fmt = "{name}" + symbol + "{version}"
out_file.write(
"\n".join(
fmt.format(**item) if item["version"] else "{name}".format(**item)
for item in imports
)
+ "\n"
)
def output_requirements(imports, symbol):
generate_requirements_file('-', imports, symbol)
generate_requirements_file("-", imports, symbol)
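The `fmt = "{name}" + symbol + "{version}"` line in the hunk above is what implements the `--mode` schemes (`<compat>`, `<gt>`, `<no-pin>`): the symbol is spliced between name and version, and entries without a known version fall back to the bare name. A self-contained sketch of that formatting logic, with hypothetical input data:

```python
def format_requirements(imports, symbol):
    """Render requirement lines the way the diff's fmt string does:
    name + symbol + version, or the bare name when version is None."""
    fmt = "{name}" + symbol + "{version}"
    return "\n".join(
        fmt.format(**item) if item["version"] else "{name}".format(**item)
        for item in imports
    ) + "\n"


# Hypothetical resolved imports, as produced upstream of this function
imports = [
    {"name": "Flask", "version": "1.1.2"},
    {"name": "requests", "version": None},
]
```

With `symbol="~="` this yields the `<compat>` scheme shown in the usage docs (`Flask~=1.1.2`); `>=` gives `<gt>`, and an empty symbol with no version gives `<no-pin>`.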
def get_imports_info(
imports, pypi_server="https://pypi.python.org/pypi/", proxy=None):
def get_imports_info(imports, pypi_server="https://pypi.python.org/pypi/", proxy=None):
result = []
for item in imports:
try:
response = requests.get(
"{0}{1}/json".format(pypi_server, item), proxies=proxy)
logging.warning(
'Import named "%s" not found locally. ' "Trying to resolve it at the PyPI server.",
item,
)
response = requests.get("{0}{1}/json".format(pypi_server, item), proxies=proxy)
if response.status_code == 200:
if hasattr(response.content, 'decode'):
if hasattr(response.content, "decode"):
data = json2package(response.content.decode())
else:
data = json2package(response.content)
elif response.status_code >= 300:
raise HTTPError(status_code=response.status_code,
reason=response.reason)
raise HTTPError(status_code=response.status_code, reason=response.reason)
except HTTPError:
logging.debug(
'Package %s does not exist or network problems', item)
logging.warning('Package "%s" does not exist or network problems', item)
continue
result.append({'name': item, 'version': data.latest_release_id})
logging.warning(
'Import named "%s" was resolved to "%s:%s" package (%s).\n'
"Please, verify manually the final list of requirements.txt "
"to avoid possible dependency confusions.",
item,
data.name,
data.latest_release_id,
data.pypi_url,
)
result.append({"name": item, "version": data.latest_release_id})
return result
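The lookup loop above can be exercised without the network by injecting the fetch step. This is a minimal sketch of the skip-on-failure behavior only; the `fetch` callable and its `(status, version)` return shape are assumptions for illustration, not the pipreqs API:

```python
def resolve_versions(imports, fetch):
    # fetch(name) -> (status_code, latest_version) is injected so the
    # resolution logic can be tested without touching PyPI.
    result = []
    for name in imports:
        status, version = fetch(name)
        if status != 200:
            # Mirrors the loop above: unresolved packages are skipped, not fatal.
            continue
        result.append({"name": name, "version": version})
    return result

fake_index = {"requests": "2.31.0"}

def fake_fetch(name):
    # Stand-in for the PyPI JSON endpoint request.
    if name in fake_index:
        return 200, fake_index[name]
    return 404, None

print(resolve_versions(["requests", "nonexistendmodule"], fake_fetch))
# prints: [{'name': 'requests', 'version': '2.31.0'}]
```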
def get_locally_installed_packages(encoding=None):
packages = {}
def get_locally_installed_packages(encoding="utf-8"):
packages = []
ignore = ["tests", "_tests", "egg", "EGG", "info"]
for path in sys.path:
for root, dirs, files in os.walk(path):
@ -205,39 +271,53 @@ def get_locally_installed_packages(encoding=None):
with open(item, "r", encoding=encoding) as f:
package = root.split(os.sep)[-1].split("-")
try:
package_import = f.read().strip().split("\n")
top_level_modules = f.read().strip().split("\n")
except: # NOQA
# TODO: What errors do we intend to suppress here?
continue
for i_item in package_import:
if ((i_item not in ignore) and
(package[0] not in ignore)):
version = None
if len(package) > 1:
version = package[1].replace(
".dist", "").replace(".egg", "")
packages[i_item] = {
'version': version,
'name': package[0]
}
# filter off explicitly ignored top-level modules
# such as test, egg, etc.
filtered_top_level_modules = list()
for module in top_level_modules:
if (module not in ignore) and (package[0] not in ignore):
# append exported top level modules to the list
filtered_top_level_modules.append(module)
version = None
if len(package) > 1:
version = package[1].replace(".dist", "").replace(".egg", "")
# append package: top_level_modules pairs
# instead of top_level_module: package pairs
packages.append(
{
"name": package[0],
"version": version,
"exports": filtered_top_level_modules,
}
)
return packages
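The rewritten walk above now emits one entry per distribution, keyed by package name with its exported top-level modules. The grouping can be sketched with in-memory data; the directory names and `top_level.txt` contents below are hypothetical stand-ins for the `os.walk` pass:

```python
def collect_packages(dist_infos, ignore=("tests", "egg", "info")):
    # dist_infos maps a dist directory name (name-version) to the
    # lines of its top_level.txt, standing in for the filesystem walk.
    packages = []
    for dirname, top_level in dist_infos.items():
        parts = dirname.split("-")
        name = parts[0]
        if name in ignore:
            continue
        # Filter off explicitly ignored top-level modules.
        exports = [m for m in top_level if m not in ignore]
        version = parts[1].replace(".dist", "") if len(parts) > 1 else None
        packages.append({"name": name, "version": version, "exports": exports})
    return packages

print(collect_packages({"PyYAML-6.0.dist": ["yaml", "_yaml"]}))
# prints: [{'name': 'PyYAML', 'version': '6.0', 'exports': ['yaml', '_yaml']}]
```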
def get_import_local(imports, encoding=None):
def get_import_local(imports, encoding="utf-8"):
local = get_locally_installed_packages()
result = []
for item in imports:
if item.lower() in local:
result.append(local[item.lower()])
# search through local packages
for package in local:
# if candidate import name matches export name
# or candidate import name equals to the package name
# append it to the result
if item in package["exports"] or item == package["name"]:
result.append(package)
# removing duplicates of package/version
result_unique = [
dict(t)
for t in set([
tuple(d.items()) for d in result
])
]
# had to use second method instead of the previous one,
# because we have a list in the 'exports' field
# https://stackoverflow.com/questions/9427163/remove-duplicate-dict-in-list-in-python
result_unique = [i for n, i in enumerate(result) if i not in result[n + 1:]]
return result_unique
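Because each entry now carries a list in its `exports` field, the dicts cannot be hashed into a `set`, hence the linear-scan deduplication above. A minimal sketch of that comprehension, which keeps the *last* occurrence of each duplicate:

```python
def dedupe(items):
    # Keep an item only if an equal one does not appear later in the list,
    # so the last occurrence of each duplicate survives. Works for
    # unhashable elements (e.g. dicts containing lists).
    return [it for n, it in enumerate(items) if it not in items[n + 1:]]

pkgs = [
    {"name": "docopt", "version": "0.6.2", "exports": ["docopt"]},
    {"name": "yarg", "version": "0.1.9", "exports": ["yarg"]},
    {"name": "docopt", "version": "0.6.2", "exports": ["docopt"]},
]
print(len(dedupe(pkgs)))
# prints: 2
```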
@ -268,7 +348,7 @@ def get_name_without_alias(name):
match = REGEXP[0].match(name.strip())
if match:
name = match.groups(0)[0]
return name.partition(' as ')[0].partition('.')[0].strip()
return name.partition(" as ")[0].partition(".")[0].strip()
def join(f):
@ -282,6 +362,9 @@ def parse_requirements(file_):
delimiter, get module name by element index, create a dict consisting of
module:version, and add dict to list of parsed modules.
If file `file_` is not found in the system, the program will print a
helpful message and end its execution immediately.
Args:
file_: File to parse.
@ -289,7 +372,7 @@ def parse_requirements(file_):
OSError: If there are any issues accessing the file.
Returns:
tuple: The contents of the file, excluding comments.
list: The contents of the file, excluding comments.
"""
modules = []
# For the dependency identifier specification, see
@ -298,9 +381,12 @@ def parse_requirements(file_):
try:
f = open(file_, "r")
except OSError:
logging.error("Failed on file: {}".format(file_))
raise
except FileNotFoundError:
print(f"File {file_} was not found. Please, fix it and run again.")
sys.exit(1)
except OSError as error:
logging.error(f"There was an error opening the file {file_}: {str(error)}")
raise error
else:
try:
data = [x.strip() for x in f.readlines() if x != "\n"]
@ -336,8 +422,8 @@ def compare_modules(file_, imports):
imports (tuple): Modules being imported in the project.
Returns:
tuple: The modules not imported in the project, but do exist in the
specified file.
set: The modules not imported in the project, but do exist in the
specified file.
"""
modules = parse_requirements(file_)
@ -354,7 +440,8 @@ def diff(file_, imports):
logging.info(
"The following modules are in {} but do not seem to be imported: "
"{}".format(file_, ", ".join(x for x in modules_not_imported)))
"{}".format(file_, ", ".join(x for x in modules_not_imported))
)
def clean(file_, imports):
@ -401,21 +488,57 @@ def dynamic_versioning(scheme, imports):
return imports, symbol
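`dynamic_versioning` maps each `--mode` to a comparison operator. A minimal sketch of that mapping, assuming the compat/gt/no-pin semantics the tests further down verify (`~=`, `>=`, and unpinned, respectively); `apply_scheme` is an illustrative stand-in, not the pipreqs function:

```python
SCHEME_SYMBOLS = {"compat": "~=", "gt": ">=", "no-pin": ""}

def apply_scheme(scheme, imports):
    # no-pin drops versions entirely; the other schemes keep them and
    # only change the comparison operator used when writing them out.
    if scheme == "no-pin":
        imports = [{**item, "version": None} for item in imports]
    return imports, SCHEME_SYMBOLS[scheme]

imports = [{"name": "yarg", "version": "0.1.9"}]
print(apply_scheme("compat", imports))
print(apply_scheme("no-pin", imports))
```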
def handle_scan_noteboooks():
if not scan_noteboooks:
logging.info("Not scanning for jupyter notebooks.")
return
try:
global PythonExporter
from nbconvert import PythonExporter
except ImportError:
raise NbconvertNotInstalled()
def init(args):
encoding = args.get('--encoding')
extra_ignore_dirs = args.get('--ignore')
follow_links = not args.get('--no-follow-links')
input_path = args['<path>']
global scan_noteboooks
encoding = args.get("--encoding")
extra_ignore_dirs = args.get("--ignore")
follow_links = not args.get("--no-follow-links")
ignore_errors = args.get("--ignore-errors")
scan_noteboooks = args.get("--scan-notebooks", False)
handle_scan_noteboooks()
input_path = args["<path>"]
if encoding is None:
encoding = "utf-8"
if input_path is None:
input_path = os.path.abspath(os.curdir)
if extra_ignore_dirs:
extra_ignore_dirs = extra_ignore_dirs.split(',')
extra_ignore_dirs = extra_ignore_dirs.split(",")
candidates = get_all_imports(input_path,
encoding=encoding,
extra_ignore_dirs=extra_ignore_dirs,
follow_links=follow_links)
path = (
args["--savepath"] if args["--savepath"] else os.path.join(input_path, "requirements.txt")
)
if (
not args["--print"]
and not args["--savepath"]
and not args["--force"]
and os.path.exists(path)
):
logging.warning("requirements.txt already exists, " "use --force to overwrite it")
return
candidates = get_all_imports(
input_path,
encoding=encoding,
extra_ignore_dirs=extra_ignore_dirs,
follow_links=follow_links,
ignore_errors=ignore_errors,
)
candidates = get_pkg_names(candidates)
logging.debug("Found imports: " + ", ".join(candidates))
pypi_server = "https://pypi.python.org/pypi/"
@ -424,26 +547,34 @@ def init(args):
pypi_server = args["--pypi-server"]
if args["--proxy"]:
proxy = {'http': args["--proxy"], 'https': args["--proxy"]}
proxy = {"http": args["--proxy"], "https": args["--proxy"]}
if args["--use-local"]:
logging.debug(
"Getting package information ONLY from local installation.")
logging.debug("Getting package information ONLY from local installation.")
imports = get_import_local(candidates, encoding=encoding)
else:
logging.debug("Getting packages information from Local/PyPI")
local = get_import_local(candidates, encoding=encoding)
# Get packages that were not found locally
difference = [x for x in candidates
if x.lower() not in [z['name'].lower() for z in local]]
imports = local + get_imports_info(difference,
proxy=proxy,
pypi_server=pypi_server)
# sort imports based on lowercase name of package, similar to `pip freeze`.
imports = sorted(imports, key=lambda x: x['name'].lower())
path = (args["--savepath"] if args["--savepath"] else
os.path.join(input_path, "requirements.txt"))
# check if candidate name is found in
# the list of exported modules, installed locally
# and the package name is not in the list of local module names
# it add to difference
difference = [
x
for x in candidates
if
# aggregate all export lists into one
# flatten the list
# check if candidate is in exports
x.lower() not in [y for x in local for y in x["exports"]] and
# check if candidate is package names
x.lower() not in [x["name"] for x in local]
]
imports = local + get_imports_info(difference, proxy=proxy, pypi_server=pypi_server)
# sort imports based on lowercase name of package, similar to `pip freeze`.
imports = sorted(imports, key=lambda x: x["name"].lower())
if args["--diff"]:
diff(args["--diff"], imports)
@ -453,21 +584,14 @@ def init(args):
clean(args["--clean"], imports)
return
if (not args["--print"]
and not args["--savepath"]
and not args["--force"]
and os.path.exists(path)):
logging.warning("requirements.txt already exists, "
"use --force to overwrite it")
return
if args["--mode"]:
scheme = args.get("--mode")
if scheme in ["compat", "gt", "no-pin"]:
imports, symbol = dynamic_versioning(scheme, imports)
else:
raise ValueError("Invalid argument for mode flag, "
"use 'compat', 'gt' or 'no-pin' instead")
raise ValueError(
"Invalid argument for mode flag, " "use 'compat', 'gt' or 'no-pin' instead"
)
else:
symbol = "=="
@ -481,8 +605,8 @@ def init(args):
def main(): # pragma: no cover
args = docopt(__doc__, version=__version__)
log_level = logging.DEBUG if args['--debug'] else logging.INFO
logging.basicConfig(level=log_level, format='%(levelname)s: %(message)s')
log_level = logging.DEBUG if args["--debug"] else logging.INFO
logging.basicConfig(level=log_level, format="%(levelname)s: %(message)s")
try:
init(args)
@ -490,5 +614,5 @@ def main(): # pragma: no cover
sys.exit(0)
if __name__ == '__main__':
if __name__ == "__main__":
main() # pragma: no cover

poetry.lock generated Normal file

File diff suppressed because it is too large

poetry.toml Normal file

@ -0,0 +1,2 @@
[virtualenvs]
prefer-active-python = true

pyproject.toml Normal file

@ -0,0 +1,53 @@
[project]
name = "pipreqs"
version = "0.5.0"
description = "Pip requirements.txt generator based on imports in project"
authors = [
{ name = "Vadim Kravcenko", email = "vadim.kravcenko@gmail.com" }
]
maintainers = [
{name = "Jonas Eschle", email = "jonas.eschle@gmail.com"}
]
license = "Apache-2.0"
readme = "README.rst"
packages = [{ include = "pipreqs" }]
repository = "https://github.com/bndr/pipreqs"
keywords = ["pip", "requirements", "imports"]
classifiers = [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Natural Language :: English",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
]
requires-python = ">=3.9, <3.14"
dependencies = [
"yarg>=0.1.9",
"docopt>=0.6.2",
"nbconvert>=7.11.0",
"ipython>=8.12.3",
]
[project.optional-dependencies]
dev = [
"flake8>=6.1.0",
"tox>=4.11.3",
"coverage>=7.3.2",
"sphinx>=7.2.6;python_version>='3.9'",
]
[tool.poetry.group.dev.dependencies] # for legacy usage
flake8 = "^6.1.0"
tox = "^4.11.3"
coverage = "^7.3.2"
sphinx = { version = "^7.2.6", python = ">=3.9" }
[project.scripts]
pipreqs = "pipreqs.pipreqs:main"
[build-system]
requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"


@ -1,3 +0,0 @@
wheel==0.23.0
Yarg==0.1.9
docopt==0.6.2


@ -1,2 +0,0 @@
[wheel]
universal = 1


@ -1,57 +0,0 @@
#!/usr/bin/env python
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
from pipreqs import __version__
with open('README.rst') as readme_file:
readme = readme_file.read()
with open('HISTORY.rst') as history_file:
history = history_file.read().replace('.. :changelog:', '')
requirements = [
'docopt', 'yarg'
]
setup(
name='pipreqs',
version=__version__,
description='Pip requirements.txt generator based on imports in project',
long_description=readme + '\n\n' + history,
author='Vadim Kravcenko',
author_email='vadim.kravcenko@gmail.com',
url='https://github.com/bndr/pipreqs',
packages=[
'pipreqs',
],
package_dir={'pipreqs':
'pipreqs'},
include_package_data=True,
package_data={'': ['stdlib', 'mapping']},
install_requires=requirements,
license='Apache License',
zip_safe=False,
keywords='pip requirements imports',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Natural Language :: English',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
],
test_suite='tests',
entry_points={
'console_scripts': [
'pipreqs=pipreqs.pipreqs:main',
],
},
)

tests/_data/empty.txt Normal file

tests/_data/imports.txt Normal file

@ -0,0 +1,3 @@
pandas==2.0.0
numpy>=1.2.3
torch<4.0.0


@ -0,0 +1,4 @@
numpy
pandas==2.0.0
tensorflow
torch<4.0.0


@ -0,0 +1,3 @@
pandas
tensorflow
torch


@ -0,0 +1,65 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Magic test"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%automagic true"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ls -la\n",
"logstate"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ls -la"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%automagic false"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ls -la"
]
}
],
"metadata": {
"language_info": {
"name": "python"
},
"orig_nbformat": 4
},
"nbformat": 4,
"nbformat_minor": 2
}


@ -0,0 +1,37 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Markdown test\n",
"import sklearn\n",
"\n",
"```python\n",
"import FastAPI\n",
"```"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.1"
}
},
"nbformat": 4,
"nbformat_minor": 4
}


@ -0,0 +1,102 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"\"\"\"unused import\"\"\"\n",
"# pylint: disable=undefined-all-variable, import-error, no-absolute-import, too-few-public-methods, missing-docstring\n",
"import xml.etree # [unused-import]\n",
"import xml.sax # [unused-import]\n",
"import os.path as test # [unused-import]\n",
"from sys import argv as test2 # [unused-import]\n",
"from sys import flags # [unused-import]\n",
"# +1:[unused-import,unused-import]\n",
"from collections import deque, OrderedDict, Counter\n",
"# All imports above should be ignored\n",
"import requests # [unused-import]\n",
"\n",
"# setuptools\n",
"import zipimport # command/easy_install.py\n",
"\n",
"# twisted\n",
"from importlib import invalidate_caches # python/test/test_deprecate.py\n",
"\n",
"# astroid\n",
"import zipimport # manager.py\n",
"# IPython\n",
"from importlib.machinery import all_suffixes # core/completerlib.py\n",
"import importlib # html/notebookapp.py\n",
"\n",
"from IPython.utils.importstring import import_item # Many files\n",
"\n",
"# pyflakes\n",
"# test/test_doctests.py\n",
"from pyflakes.test.test_imports import Test as TestImports\n",
"\n",
"# Nose\n",
"from nose.importer import Importer, add_path, remove_path # loader.py\n",
"\n",
"import atexit\n",
"from __future__ import print_function\n",
"from docopt import docopt\n",
"import curses, logging, sqlite3\n",
"import logging\n",
"import os\n",
"import sqlite3\n",
"import time\n",
"import sys\n",
"import signal\n",
"import bs4\n",
"import nonexistendmodule\n",
"import boto as b, peewee as p\n",
"# import django\n",
"import flask.ext.somext # # #\n",
"from sqlalchemy import model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" import ujson as json\n",
"except ImportError:\n",
" import json\n",
"\n",
"import models\n",
"\n",
"\n",
"def main():\n",
" pass\n",
"\n",
"import after_method_is_valid_even_if_not_pep8"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.1"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

tests/_data_pyw/py.py Normal file

@ -0,0 +1,5 @@
import airflow
import numpy
airflow
numpy

tests/_data_pyw/pyw.pyw Normal file

@ -0,0 +1,3 @@
import matplotlib
import pandas
import tensorflow


@ -0,0 +1,34 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"cd ."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.4"
}
},
"nbformat": 4,
"nbformat_minor": 4
}


@ -8,53 +8,93 @@ test_pipreqs
Tests for `pipreqs` module.
"""
from io import StringIO
import logging
from unittest.mock import patch, Mock
import unittest
import os
import requests
import sys
import warnings
from pipreqs import pipreqs
class TestPipreqs(unittest.TestCase):
def setUp(self):
self.modules = [
'flask', 'requests', 'sqlalchemy', 'docopt', 'boto', 'ipython',
'pyflakes', 'nose', 'analytics', 'flask_seasurf', 'peewee',
'ujson', 'nonexistendmodule', 'bs4',
'after_method_is_valid_even_if_not_pep8'
]
self.modules2 = ['beautifulsoup4']
self.local = ["docopt", "requests", "nose", 'pyflakes']
self.project = os.path.join(os.path.dirname(__file__), "_data")
self.project_clean = os.path.join(
os.path.dirname(__file__),
"_data_clean"
)
self.project_invalid = os.path.join(
os.path.dirname(__file__),
"_invalid_data"
)
self.project_with_ignore_directory = os.path.join(
os.path.dirname(__file__),
"_data_ignore"
)
self.project_with_duplicated_deps = os.path.join(
os.path.dirname(__file__),
"_data_duplicated_deps"
)
self.requirements_path = os.path.join(self.project, "requirements.txt")
self.alt_requirement_path = os.path.join(
self.project,
"requirements2.txt"
)
@classmethod
def setUpClass(cls):
# Disable all logs to avoid spamming the terminal when running tests.
logging.disable(logging.CRITICAL)
# Specific warning not covered by the above command:
warnings.filterwarnings("ignore", category=DeprecationWarning, module="jupyter_client")
cls.modules = [
"flask",
"requests",
"sqlalchemy",
"docopt",
"boto",
"ipython",
"pyflakes",
"nose",
"analytics",
"flask_seasurf",
"peewee",
"ujson",
"nonexistendmodule",
"bs4",
"after_method_is_valid_even_if_not_pep8",
]
cls.modules2 = ["beautifulsoup4"]
cls.local = ["docopt", "requests", "nose", "pyflakes", "ipython"]
cls.project = os.path.join(os.path.dirname(__file__), "_data")
cls.empty_filepath = os.path.join(cls.project, "empty.txt")
cls.imports_filepath = os.path.join(cls.project, "imports.txt")
cls.imports_no_version_filepath = os.path.join(cls.project, "imports_no_version.txt")
cls.imports_any_version_filepath = os.path.join(cls.project, "imports_any_version.txt")
cls.non_existent_filepath = os.path.join(cls.project, "non_existent_file.txt")
cls.parsed_packages = [
{"name": "pandas", "version": "2.0.0"},
{"name": "numpy", "version": "1.2.3"},
{"name": "torch", "version": "4.0.0"},
]
cls.parsed_packages_no_version = [
{"name": "pandas", "version": None},
{"name": "tensorflow", "version": None},
{"name": "torch", "version": None},
]
cls.parsed_packages_any_version = [
{"name": "numpy", "version": None},
{"name": "pandas", "version": "2.0.0"},
{"name": "tensorflow", "version": None},
{"name": "torch", "version": "4.0.0"},
]
cls.project_clean = os.path.join(os.path.dirname(__file__), "_data_clean")
cls.project_invalid = os.path.join(os.path.dirname(__file__), "_invalid_data")
cls.project_with_ignore_directory = os.path.join(os.path.dirname(__file__), "_data_ignore")
cls.project_with_duplicated_deps = os.path.join(os.path.dirname(__file__), "_data_duplicated_deps")
cls.requirements_path = os.path.join(cls.project, "requirements.txt")
cls.alt_requirement_path = os.path.join(cls.project, "requirements2.txt")
cls.non_existing_filepath = "xpto"
cls.project_with_notebooks = os.path.join(os.path.dirname(__file__), "_data_notebook")
cls.project_with_invalid_notebooks = os.path.join(os.path.dirname(__file__), "_invalid_data_notebook")
cls.python_path_same_imports = os.path.join(os.path.dirname(__file__), "_data/test.py")
cls.notebook_path_same_imports = os.path.join(os.path.dirname(__file__), "_data_notebook/test.ipynb")
def test_get_all_imports(self):
imports = pipreqs.get_all_imports(self.project)
self.assertEqual(len(imports), 15)
for item in imports:
self.assertTrue(
item.lower() in self.modules, "Import is missing: " + item)
self.assertTrue(item.lower() in self.modules, "Import is missing: " + item)
self.assertFalse("time" in imports)
self.assertFalse("logging" in imports)
self.assertFalse("curses" in imports)
@ -72,8 +112,14 @@ class TestPipreqs(unittest.TestCase):
"""
Test that invalid python files cannot be imported.
"""
self.assertRaises(
SyntaxError, pipreqs.get_all_imports, self.project_invalid)
self.assertRaises(SyntaxError, pipreqs.get_all_imports, self.project_invalid)
def test_ignore_errors(self):
"""
Test that invalid python files do not raise an exception when ignore_errors is True.
"""
imports = pipreqs.get_all_imports(self.project_invalid, ignore_errors=True)
self.assertEqual(len(imports), 0)
def test_get_imports_info(self):
"""
@ -86,13 +132,14 @@ class TestPipreqs(unittest.TestCase):
self.assertEqual(len(with_info), 13)
for item in with_info:
self.assertTrue(
item['name'].lower() in self.modules,
"Import item appears to be missing " + item['name'])
item["name"].lower() in self.modules,
"Import item appears to be missing " + item["name"],
)
def test_get_pkg_names(self):
pkgs = ['jury', 'Japan', 'camel', 'Caroline']
pkgs = ["jury", "Japan", "camel", "Caroline"]
actual_output = pipreqs.get_pkg_names(pkgs)
expected_output = ['camel', 'Caroline', 'Japan', 'jury']
expected_output = ["camel", "Caroline", "Japan", "jury"]
self.assertEqual(actual_output, expected_output)
def test_get_use_local_only(self):
@ -107,22 +154,33 @@ class TestPipreqs(unittest.TestCase):
# should find only docopt and requests
imports_with_info = pipreqs.get_import_local(self.modules)
for item in imports_with_info:
self.assertTrue(item['name'].lower() in self.local)
self.assertTrue(item["name"].lower() in self.local)
def test_init(self):
"""
Test that all modules we will test upon are in requirements file
"""
pipreqs.init({'<path>': self.project, '--savepath': None, '--print': False,
'--use-local': None, '--force': True, '--proxy':None, '--pypi-server':None,
'--diff': None, '--clean': None, '--mode': None})
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
with open(self.requirements_path, "r") as f:
data = f.read().lower()
for item in self.modules[:-3]:
self.assertTrue(item.lower() in data)
# It should be sorted based on names.
data = data.strip().split('\n')
data = data.strip().split("\n")
self.assertEqual(data, sorted(data))
def test_init_local_only(self):
@ -130,9 +188,20 @@ class TestPipreqs(unittest.TestCase):
Test that items listed in requirements.txt are the same
as locals expected
"""
pipreqs.init({'<path>': self.project, '--savepath': None, '--print': False,
'--use-local': True, '--force': True, '--proxy':None, '--pypi-server':None,
'--diff': None, '--clean': None, '--mode': None})
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": True,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
with open(self.requirements_path, "r") as f:
data = f.readlines()
@ -145,9 +214,19 @@ class TestPipreqs(unittest.TestCase):
Test that we can save requirements.txt correctly
to a different path
"""
pipreqs.init({'<path>': self.project, '--savepath': self.alt_requirement_path,
'--use-local': None, '--proxy':None, '--pypi-server':None, '--print': False,
'--diff': None, '--clean': None, '--mode': None})
pipreqs.init(
{
"<path>": self.project,
"--savepath": self.alt_requirement_path,
"--use-local": None,
"--proxy": None,
"--pypi-server": None,
"--print": False,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.alt_requirement_path) == 1
with open(self.alt_requirement_path, "r") as f:
data = f.read().lower()
@ -163,9 +242,20 @@ class TestPipreqs(unittest.TestCase):
"""
with open(self.requirements_path, "w") as f:
f.write("should_not_be_overwritten")
pipreqs.init({'<path>': self.project, '--savepath': None, '--use-local': None,
'--force': None, '--proxy':None, '--pypi-server':None, '--print': False,
'--diff': None, '--clean': None, '--mode': None})
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--print": False,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
with open(self.requirements_path, "r") as f:
data = f.read().lower()
@ -180,41 +270,49 @@ class TestPipreqs(unittest.TestCase):
"""
import_name_with_alias = "requests as R"
expected_import_name_without_alias = "requests"
import_name_without_aliases = pipreqs.get_name_without_alias(
import_name_with_alias)
self.assertEqual(
import_name_without_aliases,
expected_import_name_without_alias
)
import_name_without_aliases = pipreqs.get_name_without_alias(import_name_with_alias)
self.assertEqual(import_name_without_aliases, expected_import_name_without_alias)
def test_custom_pypi_server(self):
"""
Test that trying to get a custom pypi server fails correctly
"""
self.assertRaises(
requests.exceptions.MissingSchema, pipreqs.init,
{'<path>': self.project, '--savepath': None, '--print': False,
'--use-local': None, '--force': True, '--proxy': None,
'--pypi-server': 'nonexistent'}
)
requests.exceptions.MissingSchema,
pipreqs.init,
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": "nonexistent",
},
)
def test_ignored_directory(self):
"""
Test --ignore parameter
"""
pipreqs.init(
{'<path>': self.project_with_ignore_directory, '--savepath': None,
'--print': False, '--use-local': None, '--force': True,
'--proxy':None, '--pypi-server':None,
'--ignore':'.ignored_dir,.ignore_second',
'--diff': None,
'--clean': None,
'--mode': None
}
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--ignore": ".ignored_dir,.ignore_second",
"--diff": None,
"--clean": None,
"--mode": None,
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.read().lower()
for item in ['click', 'getpass']:
for item in ["click", "getpass"]:
self.assertFalse(item.lower() in data)
def test_dynamic_version_no_pin_scheme(self):
@ -222,17 +320,22 @@ class TestPipreqs(unittest.TestCase):
Test --mode=no-pin
"""
pipreqs.init(
{'<path>': self.project_with_ignore_directory, '--savepath': None,
'--print': False, '--use-local': None, '--force': True,
'--proxy': None, '--pypi-server': None,
'--diff': None,
'--clean': None,
'--mode': 'no-pin'
}
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": "no-pin",
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.read().lower()
for item in ['beautifulsoup4', 'boto']:
for item in ["beautifulsoup4", "boto"]:
self.assertTrue(item.lower() in data)
def test_dynamic_version_gt_scheme(self):
@ -240,20 +343,24 @@ class TestPipreqs(unittest.TestCase):
Test --mode=gt
"""
pipreqs.init(
{'<path>': self.project_with_ignore_directory, '--savepath': None, '--print': False,
'--use-local': None, '--force': True,
'--proxy': None,
'--pypi-server': None,
'--diff': None,
'--clean': None,
'--mode': 'gt'
}
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": "gt",
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.readlines()
for item in data:
symbol = '>='
message = 'symbol is not in item'
symbol = ">="
message = "symbol is not in item"
self.assertIn(symbol, item, message)
def test_dynamic_version_compat_scheme(self):
@ -261,20 +368,24 @@ class TestPipreqs(unittest.TestCase):
Test --mode=compat
"""
pipreqs.init(
{'<path>': self.project_with_ignore_directory, '--savepath': None, '--print': False,
'--use-local': None, '--force': True,
'--proxy': None,
'--pypi-server': None,
'--diff': None,
'--clean': None,
'--mode': 'compat'
}
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": "compat",
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.readlines()
for item in data:
symbol = '~='
message = 'symbol is not in item'
symbol = "~="
message = "symbol is not in item"
self.assertIn(symbol, item, message)
def test_clean(self):
@ -282,18 +393,34 @@ class TestPipreqs(unittest.TestCase):
Test --clean parameter
"""
pipreqs.init(
{'<path>': self.project, '--savepath': None, '--print': False,
'--use-local': None, '--force': True, '--proxy': None,
'--pypi-server': None, '--diff': None, '--clean': None,
'--mode': None}
)
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
pipreqs.init(
{'<path>': self.project, '--savepath': None, '--print': False,
'--use-local': None, '--force': None, '--proxy': None,
'--pypi-server': None, '--diff': None,
'--clean': self.requirements_path, '--mode': 'non-pin'}
)
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": self.requirements_path,
"--mode": "non-pin",
}
)
with open(self.requirements_path, "r") as f:
data = f.read().lower()
for item in self.modules[:-3]:
@ -303,25 +430,255 @@ class TestPipreqs(unittest.TestCase):
"""
Test --clean parameter when there are imports to clean
"""
cleaned_module = 'sqlalchemy'
cleaned_module = "sqlalchemy"
pipreqs.init(
{'<path>': self.project, '--savepath': None, '--print': False,
'--use-local': None, '--force': True, '--proxy': None,
'--pypi-server': None, '--diff': None, '--clean': None,
'--mode': None}
)
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
modules_clean = [m for m in self.modules if m != cleaned_module]
pipreqs.init(
{'<path>': self.project_clean, '--savepath': None,
'--print': False, '--use-local': None, '--force': None,
'--proxy': None, '--pypi-server': None, '--diff': None,
'--clean': self.requirements_path, '--mode': 'non-pin'}
)
{
"<path>": self.project_clean,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": self.requirements_path,
"--mode": "non-pin",
}
)
with open(self.requirements_path, "r") as f:
data = f.read().lower()
self.assertTrue(cleaned_module not in data)
def test_compare_modules(self):
test_cases = [
(self.empty_filepath, [], set()), # both empty
(self.empty_filepath, self.parsed_packages, set()), # only file empty
(
self.imports_filepath,
[],
set(package["name"] for package in self.parsed_packages),
), # only imports empty
(self.imports_filepath, self.parsed_packages, set()), # no difference
(
self.imports_filepath,
self.parsed_packages[1:],
set([self.parsed_packages[0]["name"]]),
), # common case
]
for test_case in test_cases:
with self.subTest(test_case):
filename, imports, expected_modules_not_imported = test_case
modules_not_imported = pipreqs.compare_modules(filename, imports)
self.assertSetEqual(modules_not_imported, expected_modules_not_imported)
def test_output_requirements(self):
"""
Test --print parameter
It should print to stdout the same content as requirements.txt
"""
capturedOutput = StringIO()
sys.stdout = capturedOutput
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": True,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
with open(self.requirements_path, "r") as f:
file_content = f.read().lower()
stdout_content = capturedOutput.getvalue().lower()
self.assertTrue(file_content == stdout_content)
def test_import_notebooks(self):
"""
Test the function get_all_imports() using .ipynb file
"""
self.mock_scan_notebooks()
imports = pipreqs.get_all_imports(self.project_with_notebooks)
for item in imports:
self.assertTrue(item.lower() in self.modules, "Import is missing: " + item)
not_desired_imports = ["time", "logging", "curses", "__future__", "django", "models", "FastAPI", "sklearn"]
for not_desired_import in not_desired_imports:
self.assertFalse(
not_desired_import in imports,
f"{not_desired_import} was imported, but it should not have been."
)
def test_invalid_notebook(self):
"""
Test that invalid notebook files cannot be imported.
"""
self.mock_scan_notebooks()
self.assertRaises(SyntaxError, pipreqs.get_all_imports, self.project_with_invalid_notebooks)
def test_ipynb_2_py(self):
"""
Test the function ipynb_2_py() which converts .ipynb file to .py format
"""
python_imports = pipreqs.get_all_imports(self.python_path_same_imports)
notebook_imports = pipreqs.get_all_imports(self.notebook_path_same_imports)
self.assertEqual(python_imports, notebook_imports)
def test_file_ext_is_allowed(self):
"""
Test the function file_ext_is_allowed()
"""
self.assertTrue(pipreqs.file_ext_is_allowed("main.py", [".py"]))
self.assertTrue(pipreqs.file_ext_is_allowed("main.py", [".py", ".ipynb"]))
self.assertFalse(pipreqs.file_ext_is_allowed("main.py", [".ipynb"]))
def test_parse_requirements(self):
"""
Test parse_requirements function
"""
test_cases = [
(self.empty_filepath, []), # empty file
(self.imports_filepath, self.parsed_packages), # imports with versions
(
self.imports_no_version_filepath,
self.parsed_packages_no_version,
), # imports without versions
(
self.imports_any_version_filepath,
self.parsed_packages_any_version,
), # imports with and without versions
]
for test in test_cases:
with self.subTest(test):
filename, expected_parsed_requirements = test
parsed_requirements = pipreqs.parse_requirements(filename)
self.assertListEqual(parsed_requirements, expected_parsed_requirements)
@patch("sys.exit")
def test_parse_requirements_handles_file_not_found(self, exit_mock):
captured_output = StringIO()
sys.stdout = captured_output
# This assertion is needed, because since "sys.exit" is mocked, the program won't end,
# and the code that is after the except block will be run
with self.assertRaises(UnboundLocalError):
pipreqs.parse_requirements(self.non_existing_filepath)
exit_mock.assert_called_once_with(1)
printed_text = captured_output.getvalue().strip()
sys.stdout = sys.__stdout__
self.assertEqual(printed_text, "File xpto was not found. Please, fix it and run again.")
def test_ignore_notebooks(self):
"""
Test if notebooks are ignored when the scan-notebooks parameter is False
"""
notebook_requirement_path = os.path.join(self.project_with_notebooks, "requirements.txt")
pipreqs.init(
{
"<path>": self.project_with_notebooks,
"--savepath": None,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--print": False,
"--diff": None,
"--clean": None,
"--mode": None,
"--scan-notebooks": False,
}
)
assert os.path.exists(notebook_requirement_path) == 1
assert os.path.getsize(notebook_requirement_path) == 1 # file only has a "\n", meaning it's empty
def test_pipreqs_get_imports_from_pyw_file(self):
pyw_test_dirpath = os.path.join(os.path.dirname(__file__), "_data_pyw")
requirements_path = os.path.join(pyw_test_dirpath, "requirements.txt")
pipreqs.init(
{
"<path>": pyw_test_dirpath,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
self.assertTrue(os.path.exists(requirements_path))
expected_imports = [
"airflow",
"matplotlib",
"numpy",
"pandas",
"tensorflow",
]
with open(requirements_path, "r") as f:
imports_data = f.read().lower()
for _import in expected_imports:
self.assertTrue(
_import.lower() in imports_data,
f"'{_import}' import was expected but not found.",
)
os.remove(requirements_path)
def mock_scan_notebooks(self):
pipreqs.scan_noteboooks = Mock(return_value=True)
pipreqs.handle_scan_noteboooks()
def tearDown(self):
"""
Remove requirements.txt files that were written
@@ -336,5 +693,5 @@ class TestPipreqs(unittest.TestCase):
pass
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()
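The `test_compare_modules` cases above exercise a set difference between the packages listed in a requirements file and the names actually imported. A minimal self-contained sketch of that logic (hypothetical helper name, not the actual pipreqs implementation) looks like:

```python
# Minimal sketch of the set-difference check exercised by test_compare_modules.
# Hypothetical helper -- not the actual pipreqs implementation.
def modules_not_imported(required_packages, imported_names):
    """Return package names listed in requirements but absent from the imports."""
    required = {pkg["name"] for pkg in required_packages}
    return required - set(imported_names)

packages = [{"name": "numpy"}, {"name": "pandas"}]
print(modules_not_imported(packages, ["pandas"]))  # -> {'numpy'}
```

An empty requirements file yields an empty set regardless of the imports, which matches the first two test cases above.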

tox.ini

@@ -1,17 +1,31 @@
[tox]
envlist = py36, py37, py38, py39, pypy3, flake8
isolated_build = true
envlist = py39, py310, py311, py312, py313, pypy3, flake8
[gh-actions]
python =
3.6: py36
3.7: py37
3.8: py38
3.9: py39
pypy-3.7: pypy3
3.10: py310
3.11: py311
3.12: py312
3.13: py313
pypy-3.10: pypy3
[testenv]
setenv =
PYTHONPATH = {toxinidir}:{toxinidir}/pipreqs
commands = python setup.py test
deps =
-r{toxinidir}/requirements.txt
commands =
python -m unittest discover
[testenv:flake8]
deps = flake8
commands = flake8 pipreqs tests
[flake8]
exclude =
tests/_data/
tests/_data_clean/
tests/_data_duplicated_deps/
tests/_data_ignore/
tests/_invalid_data/
max-line-length = 120