Compare commits


252 Commits

Author SHA1 Message Date
Jonas Eschle
b3d0b4443b
Merge pull request #473 from ucsd-salad/feat/ignore-errors-flag
Add `--ignore-errors` to skip files with errors
2025-04-10 10:14:53 +02:00
Jonas Eschle
2ef6a981d5
Merge pull request #482 from bndr/fix_ci_coverage
fix typo
2025-04-08 10:56:33 +02:00
Jonas Eschle
fcfe190bed
Merge pull request #471 from rasa/rasa-py-3.13-support
chore: Add python 3.13 support, drop 3.8
2025-04-08 10:56:15 +02:00
Jonas Eschle
38af24c1d1 fix typo 2025-04-08 10:49:39 +02:00
Jonas Eschle
645451aa39 fix typo 2025-04-08 10:47:41 +02:00
Jonas Eschle
2326604c81 fix typo 2025-04-08 10:43:26 +02:00
Jonas Eschle
6e0447755c
Update flake8.yml ci 2025-04-08 09:51:13 +02:00
Jonas Eschle
da63e0d71c
Merge branch 'master' into rasa-py-3.13-support 2025-04-08 09:49:42 +02:00
Jonas Eschle
a7a6b856b0
Merge pull request #481 from bndr/projecttoml_upgrade
upgrade Python version
2025-04-08 09:45:54 +02:00
Jonas Eschle
cdcf14bce6 fix typo 2025-04-08 09:44:41 +02:00
Jonas Eschle
1f462ab50a upgrade Python version 2025-04-08 09:43:46 +02:00
Jonas Eschle
287da35bc2
Merge pull request #480 from bndr/add_venv_ignore
Add .venv to ignored files
2025-04-07 19:46:26 +02:00
Jonas Eschle
58d62cb7b8
Update pipreqs.py 2025-04-07 18:02:59 +02:00
Jonas Eschle
36efbbe460
Merge pull request #478 from bndr/precommit
add pre-commit configuration file
2025-04-07 17:47:39 +02:00
Jonas Eschle
08c5eb09cc
Merge pull request #479 from bndr/projecttoml_upgrade
Update pyproject.toml and tests.yml config
2025-04-07 17:47:27 +02:00
Jonas Eschle
34163de8d0
Merge pull request #325 from klieret/no-pin-doc-improvement
Doc: no-pin instead of non-pin for scheme
2025-04-07 17:45:23 +02:00
Jonas Eschle
d6d06a2f1f
Merge pull request #378 from testpushpleaseignore/master
add pywintypes to mapping
2025-04-07 17:43:02 +02:00
Jonas Eschle
1c319d7106
Merge pull request #394 from xinshoutw/patch-1
Update mapping for SpeechRecognition
2025-04-07 17:40:42 +02:00
Jonas Eschle
d6725caee6 upgrade Python version 2025-04-07 17:25:37 +02:00
Jonas Eschle
ebfa1f4832 Upgrade dependencies in pyproject.toml and tests.yml 2025-04-07 17:19:23 +02:00
Jonas Eschle
9eedfb39db Upgrade dependencies in pyproject.toml and tests.yml 2025-04-07 17:18:54 +02:00
Jonas Eschle
e5336a446a Update pyproject.toml and tests.yml config 2025-04-07 17:05:51 +02:00
Jonas Eschle
648a43ffbe add pre-commit configuration file 2025-04-07 16:41:44 +02:00
Jonas Eschle
3d490ec251
Merge pull request #440 from Pwuts/patch-1
Add mapping `jose` -> `python-jose`
2025-04-07 16:30:58 +02:00
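Several commits in this range add entries to pipreqs' import-name to PyPI-name mapping file. A minimal sketch of how such a lookup works (the dict entries below are illustrative examples drawn from these PRs, not the real mapping file):

```python
# Illustrative entries of the kind these PRs add: the name you import on
# the left, the distribution name to install on the right.
IMPORT_TO_PYPI = {
    "jose": "python-jose",
    "cv2": "opencv-python",
    "telegram": "python-telegram-bot",
}

def to_pypi_name(import_name):
    """Translate an import name into the distribution name for requirements.txt.

    Names without a mapping entry are assumed to match their distribution.
    """
    return IMPORT_TO_PYPI.get(import_name, import_name)
```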
Jonas Eschle
4c65892517
Merge pull request #443 from bndr/dependabot/pip/idna-3.7
Bump idna from 3.6 to 3.7
2025-04-07 16:25:50 +02:00
Jonas Eschle
a14e8b4256
Merge pull request #447 from bndr/dependabot/pip/jinja2-3.1.4
Bump jinja2 from 3.1.3 to 3.1.4
2025-04-07 16:25:31 +02:00
Jonas Eschle
cc8545d530
Merge pull request #373 from mcp292/patch-1
Add OpenCV mapping
2025-04-07 16:19:22 +02:00
Jonas Eschle
aabe973eb1
Merge pull request #477 from bndr/next
update tests badge in readme
2025-04-07 16:15:30 +02:00
Jonas Eschle
5cdc9019d7
Merge pull request #430 from Borda/readme/badge
update tests badge in readme [stale]
2025-04-07 16:11:17 +02:00
Shun Kashiwa
08eead345a
Add --ignore-errors to skip files with syntax errors and attempt to find requirements on a best-effort basis 2025-03-14 09:49:03 -07:00
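The commit above describes the flag's behavior: skip files that fail to parse and gather requirements from the rest on a best-effort basis. A minimal sketch of that idea (hypothetical helper name, not pipreqs' actual implementation):

```python
import ast

def collect_imports(sources, ignore_errors=False):
    """Collect top-level imported module names from a {filename: code} dict.

    With ignore_errors=True, files that raise SyntaxError are skipped so
    the remaining files can still contribute requirements (best-effort).
    """
    found = set()
    for filename, code in sources.items():
        try:
            tree = ast.parse(code)
        except SyntaxError:
            if ignore_errors:
                continue  # skip the broken file, keep scanning the rest
            raise
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                found.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                found.add(node.module.split(".")[0])
    return found
```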
Ross Smith II
967a6688cb chore: Add python 3.13 support, drop 3.8 2025-02-27 16:29:50 -08:00
Jirka Borovec
eb37d03ff7
Merge branch 'next' into readme/badge 2024-11-12 21:57:56 +01:00
dependabot[bot]
1e9cc81f8e
Bump jinja2 from 3.1.3 to 3.1.4
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.3 to 3.1.4.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.3...3.1.4)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-05-06 21:09:57 +00:00
dependabot[bot]
75e7892310
Bump idna from 3.6 to 3.7
Bumps [idna](https://github.com/kjd/idna) from 3.6 to 3.7.
- [Release notes](https://github.com/kjd/idna/releases)
- [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
- [Commits](https://github.com/kjd/idna/compare/v3.6...v3.7)

---
updated-dependencies:
- dependency-name: idna
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-12 04:43:35 +00:00
Reinier van der Leer
4767b6444e
Add mapping jose -> python-jose 2024-03-19 16:46:50 +01:00
Alan Barzilay
cd3f437689 release pipreqs v0.5 2024-02-18 14:35:24 -03:00
Alan Barzilay
64fc5a2972 ci: fix 3.12 python tests
Tests were failing due to too-restrictive Python version requirements
(<=3.12 when 3.12.x patch versions exist).
Switching to <3.13 allows every Python 3.12 patch version to work normally.
2024-02-18 14:08:22 -03:00
Alan Barzilay
3a9bc86108 Merge branch 'next' 2024-02-18 14:04:54 -03:00
Alan Barzilay
769f3e501e Revert "Bump wheel from 0.23.0 to 0.38.1"
This was causing too many weird conflicts when merging next into master.
Also, the requirements file was deprecated in the next branch in favor of
pyproject.toml.

This reverts commit 68f9b2859d45a60e699839f87b1b9558cd36a329.
2024-02-18 14:00:26 -03:00
Jirka Borovec
e4faec2c1e
update tests badge in readme 2024-01-16 13:34:00 +01:00
mateuslatrova
4a9176b39a add support for .pyw files
Now, pipreqs will also scan imports in .pyw files by default.
2023-12-06 20:11:57 +00:00
mateuslatrova
de68691438 fix flake8 environment in tox.ini file
The way the tox.ini file was configured prevented flake8 from running; tox was only running the Python tests for the flake8 environment.

With this PR, tox will run flake8 for the pipreqs and tests folders as desired.
2023-12-06 18:03:41 +00:00
fernandocrz
b50b4a76eb Add support for jupyter notebooks
Credits to @pakio and @mateuslatrova for the contributions
2023-12-05 18:15:54 +00:00
fernandocrz
03c92488de suppress errors and warnings in unit tests
Credits to @mateuslatrova for the contribution.
2023-12-05 18:15:54 +00:00
fernandocrz
aa283ada00 define pipreqs default encoding to utf-8 2023-12-05 18:15:54 +00:00
Lucas de Sousa Rosa
f041de9bdb Bump python 3.12 support 2023-11-08 18:15:21 -03:00
Lucas de Sousa Rosa
fb4560c740 Migrating the packaging system to poetry with pyproject.toml
- Deleted old setup files `requirements.txt`, `setup.cfg`, `setup.py`, `MANIFEST.in`
- Added poetry files `poetry.toml`, `pyproject.toml`, `poetry.lock`
- Added `.pyenv-version` and `.tool-versions` for `pyenv` and `asdf`
- Updated `Makefile`, `CONTRIBUTING.rst`, `tox.ini`
2023-11-08 10:49:10 -03:00
mateuslatrova
368e9ae7e7 handle FileNotFoundError in parse_requirements function 2023-10-20 08:59:50 -03:00
darwish
55eee298ec Specify pypy version
- pypy v7.3.13, used by GitHub Actions, was failing, so pypy v7.3.12, which was passing, is forced instead.
- The original PR was done by @EwoutH at #334 and modified by @willianrocha at #398.
2023-10-12 14:08:39 -03:00
Willian Rocha
8af7d85a74 CI: Add Python 3.10 and 3.11, update actions, run on pushes and manually
A bit of CI maintenance:
- Add Python 3.10 and 3.11 runs, and update the PyPy run to PyPy 3.9
- Removed Python 3.7 as it is deprecated
- Update the used actions (checkout and setup-python) to the latest versions
- Also run on pushes, when manually triggered (workflow_dispatch)

The original PR was done by @EwoutH at #334
2023-10-12 14:08:39 -03:00
dependabot[bot]
eb65254646 Bump wheel from 0.23.0 to 0.38.1
Bumps [wheel](https://github.com/pypa/wheel) from 0.23.0 to 0.38.1.
- [Release notes](https://github.com/pypa/wheel/releases)
- [Changelog](https://github.com/pypa/wheel/blob/main/docs/news.rst)
- [Commits](https://github.com/pypa/wheel/compare/0.23.0...0.38.1)

---
updated-dependencies:
- dependency-name: wheel
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-11 15:42:04 -03:00
Alan Barzilay
6f232bd080
Merge pull request #389 from mateuslatrova/compare_modules_test
Add unit test for "compare_modules" function
2023-10-10 15:41:00 -03:00
Alan Barzilay
2ebfc4645a
Merge branch 'next' into compare_modules_test 2023-10-10 15:36:35 -03:00
fernandocrz
12cc1e5b74 add test for parse_requirements function 2023-10-10 15:30:38 -03:00
Mateus Latrova
ed46d270e9 add test for "compare_modules" function 2023-10-09 16:12:42 -03:00
dependabot[bot]
68f9b2859d Bump wheel from 0.23.0 to 0.38.1
Bumps [wheel](https://github.com/pypa/wheel) from 0.23.0 to 0.38.1.
- [Release notes](https://github.com/pypa/wheel/releases)
- [Changelog](https://github.com/pypa/wheel/blob/main/docs/news.rst)
- [Commits](https://github.com/pypa/wheel/compare/0.23.0...0.38.1)

---
updated-dependencies:
- dependency-name: wheel
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-08 00:21:32 -03:00
Mateus Latrova
3c786e3537 add "copy to clipboard icon" in installation section
- Based on PR #369 and #396
- Authors Muhammad-Aadil and mateuslatrova
2023-10-08 00:10:47 -03:00
Alan Barzilay
acc41cc9bc
Merge pull request #395 from fredgrub/output_requirements_next
create test for output_requirements
2023-10-04 14:26:29 -03:00
Lucas de Sousa Rosa
ccf097b02f create test for output_requirements 2023-10-02 18:00:23 -03:00
mateuslatrova
6ac4357cf4 fix return type on docs 2023-09-28 20:36:24 -03:00
mateuslatrova
d56d5feb29 delete unused function "filter_line" 2023-09-28 20:36:24 -03:00
Alan Barzilay
cebc3f7e24
Merge pull request #393 from willianrocha/flake8_next
applying pep8 rules
2023-09-27 14:12:40 -03:00
XinShou
ab492859f4
Update mapping for SpeechRecognition 2023-09-22 06:44:24 +08:00
Willian Rocha
f2e745256e applying pep8 rules 2023-09-21 19:37:37 -03:00
testpushpleaseignore
c3d0f169d3
add pywintypes to mapping
mapped pywintypes to pywin32
2023-06-19 16:30:09 -05:00
mcp292
c0d6b293b1
Add OpenCV mapping
Fixes #307.
2023-05-17 15:07:42 -07:00
Vadim Kravcenko
a1f83d27d9 Bump Version to 0.4.13 2023-04-14 09:08:49 +02:00
Vadim Kravcenko
181525cdea
Merge pull request #304 from Marocco2/patch-1
Mapping Telegram to python-telegram-bot
2023-04-13 20:39:56 +02:00
Vadim Kravcenko
f9f9d53620
Merge pull request #335 from mmngreco/patch-1
enh: include airflow mapping
2023-04-13 20:38:32 +02:00
Vadim Kravcenko
717d4926bc
Merge pull request #364 from adeadfed/master
Mitigation for dependency confusion in pipreqs
2023-04-13 20:38:09 +02:00
adeadfed
2103371746 add whitespaces to pep8 formatted strings 2023-04-13 16:05:50 +02:00
adeadfed
5537bcd217 fix pep8 linting issues 2023-04-13 15:59:50 +02:00
Vadim Kravcenko
96440ac2ff
Merge pull request #343 from Alexander-Wilms/master
Add mapping for python-constraint
2023-04-13 14:32:32 +02:00
Vadim Kravcenko
4c380a9661
Merge pull request #349 from hansvana/patch-1
Update mapping socketio to python-socketio
2023-04-13 14:27:26 +02:00
Vadim Kravcenko
ed13177c81
Merge pull request #354 from bskiefer/patch-1
Update mapping for python-slugify
2023-04-13 14:27:06 +02:00
adeadfed
bb61883079 revert changes to .gitignore 2023-04-08 17:10:02 +02:00
adeadfed
3ae82087f9 remove invalid comment 2023-03-29 00:32:00 +02:00
adeadfed
001ead91ed delete vscode related files from repo 2023-03-29 00:28:17 +02:00
adeadfed
50498d3cd4 change warning messages 2023-03-29 00:27:13 +02:00
adeadfed
da442e7a2a add warnings for remote resolving mode 2023-03-29 00:25:45 +02:00
adeadfed
6cd9925a31 improved version of local package resolving 2023-03-29 00:17:49 +02:00
adeadfed
3f5964fcb9 fix name resolution for local packages 2023-03-14 21:14:19 +01:00
bskiefer
175524ec24
Update mapping for python-slugify 2023-02-06 10:51:09 -06:00
Alan Barzilay
e5493d922e
Merge pull request #344 from clvnkhr/Check-if-requirements.txt-exist-before-proceeding-further-inspection-#285
resolve issue #285
2023-01-14 11:49:35 -03:00
hansvana
5cdc966b2e
Update mapping socketio to python-socketio
Originally mapped to gevent-socketio, which is outdated and hasn't been updated since 2016. Anyone importing socketio is much more likely to mean python-socketio now.
2023-01-13 11:15:15 +01:00
clvnkhr
f97d8b9ba4 resolve issue #285
Simply moved the check in init for requirements.txt ahead of searching for packages. Closes #285. Passes tox (3.7, 3.8, 3.9, pypy3, flake8)
2023-01-05 15:06:39 +08:00
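The fix above reorders the logic so the existence check runs before the (potentially slow) package scan. A sketch of the idea (hypothetical function and return values, not pipreqs' actual signatures):

```python
import os

def init(path, savepath="requirements.txt", force=False):
    """Abort early if the output file already exists and --force was not given.

    Sketch of the reordering in this fix: the existence check happens
    *before* any scanning of `path` for imports.
    """
    if not force and os.path.exists(savepath):
        return "requirements.txt already exists, use --force to overwrite it"
    # ... only now scan `path` for imports ...
    return "scanned"
```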
Alexander Wilms
775681975a
Add mapping for python-constraint 2023-01-02 22:42:27 +01:00
Alan Barzilay
80bfad8285
Merge pull request #339 from EwoutH/py37
Require Python 3.7 or higher
2022-12-02 17:02:21 -03:00
Ewout ter Hoeven
80503cd3fd
Require Python 3.7 or higher 2022-12-02 17:36:42 +01:00
Maximiliano Greco
0aa3f80b84
enh: include airflow mapping
closes https://github.com/bndr/pipreqs/issues/179
2022-11-17 13:25:49 +01:00
Kilian Lieret
de13f174c3
Doc: no-pin instead of non-pin for scheme 2022-08-09 12:16:57 -04:00
Marocco2
2a26eadc91
Mapping Telegram to python-telegram-bot
telegram is a hollow PyPI package, so pipreqs will try to install it instead of python-telegram-bot, which has a module named telegram.
I'd ask for this change to be approved, as I'm blocked in my Pipedream workflow.
2022-04-10 09:49:12 +00:00
Alan Barzilay
a593d27e3d readme: fix rst syntax
While trying to upload release 0.4.11 to pypi I got a few errors due to
the rst syntax of the readme. This commit fixes the errors but not the
warnings (mostly undeclared long format description type in setup.py)
2021-10-23 06:17:21 -03:00
Alan Barzilay
23097b102c Version bump to 0.4.11 2021-10-23 05:54:58 -03:00
Alan Barzilay
dea950dd07
Merge pull request #275 from bndr/next
Merge changes for version 0.4.11
2021-09-04 15:25:17 -03:00
Alan Barzilay
e7571ea033 Add tox tests & codecov github action
This commit essentially adds back tests to our CI pipeline. They were
previously dropped due to Travis pricing policy change.

This workflow utilizes a few interesting projects to make this action
easier to maintain such as the codecov github action
and the tox-gh-actions project
(https://github.com/ymyzk/tox-gh-actions)

This commit uses codecov instead of coveralls because using coveralls
directly inside GH-actions is buggy and the official coveralls action
only supports lcov reports which we can't seem to be able to generate at
the moment. For more information see the pull request that introduced
this commit
2021-09-04 15:14:48 -03:00
Alan Barzilay
aadc9c8de1 Add flake8 github action with review dog
Review dog is an incredible project that makes linting and formatting
review a breeze by commenting inline what is wrong in a pull request.
This makes the review process easier for the maintainer and also
provides a clearer feedback to the contributor
2021-09-04 15:14:48 -03:00
Alan Barzilay
682a0e2fcf Drop Travis as our CI provider
Since travis changed its pricing policy it has become more limited in
what we can accomplish with it. It now uses a limited credit model for
open source projects which besides being cumbersome puts the project in
a precarious position where we may be unable to run tests if we don't
have any more credits.

For this reason we will be moving to github actions since they seem to
be the best alternative at the moment
2021-09-04 15:14:48 -03:00
Rahul Kumaresan
e1d4a92910 refine markdown 2021-09-04 15:14:48 -03:00
Rahul Kumaresan
9e514a4caa update README and pipreqs docstring to help prevent discrepancies 2021-09-04 15:14:48 -03:00
alan-barzilay
30a944cdda Bump python version
Bump tests version and supported versions in setup.py

stdlib: update packages for python 3.9

By utilizing the packages listed in
https://github.com/jackmaney/python-stdlib-list
for python 3.8 and 3.9, we were able to drop all stdlib packages that
existed solely in python 2 and add the missing stdlib python 3 packages
2021-09-04 15:14:48 -03:00
alan-barzilay
65ccd7eca3 delete everything related to python2
We are dropping support for python 2, so we are dropping the
verification of which python version is currently running and we also stop
checking for python2 specific packages.
We also drop the encoding definition since python3 uses utf-8 as
default.
The helper open_func function is also substituted by open.
2021-09-04 15:14:48 -03:00
Alan Barzilay
8001001e9c Revert "Revert all commits since last release"
This reverts commit 90102acdbb23c09574d27df8bd1f568d34e0cfd3.
Now that we are ready to make a new release we can revert the revert and
hopefully never have to solve a mess like this again to keep master
synchronized with the latest release
2021-09-04 15:12:27 -03:00
alan-barzilay
90102acdbb Revert all commits since last release
By reverting all commits done since release v0.4.10 we will have the
master branch synchronized with the latest release available on PyPI.

All commits done since the latest release will be moved to another
branch called `next` where we will centralize development. Once we are ready
for a new release of pipreqs, the `next` branch will be merged back on to
master and a new release will be made.

This change will make development more organized and will avoid new
issues from users complaining about features only present in master not working
on their installation of pipreqs.

I would also like to thank @pedroteosousa for his help on reverting and
squashing all commits
2021-05-05 02:08:31 -03:00
Alan Barzilay
2a299cc7bc
Merge pull request #245 from komodovaran/patch-1
Update stdlib with dataclasses
2021-05-03 21:36:36 -03:00
Alan Barzilay
e432bb1999
Merge pull request #249 from LionyxML/patch-1
De-capitalize "Requirements.txt" in log msg
2021-05-03 21:35:48 -03:00
Alan Barzilay
df8aab7609
Merge pull request #247 from sxooler/sxooler/patch-hydra
Add hydra-core mapping
2021-04-30 23:45:52 -03:00
Alan Barzilay
a80bf1d05a
Merge pull request #251 from berrysauce/patch-1
Added secrets to standard libraries file
2021-04-30 23:45:26 -03:00
Paul
4dd2c0916d
Added secrets to standard libraries file
Is new in Python 3.6+
2021-04-27 20:40:26 +02:00
Rahul M. Juliato
b188c8ba83
De-capitalized file name in log msg
When trying to generate requirements.txt and the file already exists, the message shown to the user refers to "Requirements.txt" instead of "requirements.txt".
This may cause trouble in case-sensitive environments, as well as in automated scripts.
2021-04-26 12:05:41 -03:00
Simon Ondracek
537458f2df Add hydra-core mapping
Maps `hydra` to `hydra-core`. Fixes #244.
2021-04-21 13:22:46 +02:00
Johannes Thomsen
bab012b491
Update stdlib with dataclasses
Part of the standard lib since 3.7: https://docs.python.org/3.7/library/dataclasses.html
2021-04-20 16:32:54 +02:00
Alan Barzilay
cb4add28ef
Merge pull request #239 from mapattacker/mode
New Option --mode for "gt", "compat", and "non-pin"
2021-03-29 15:34:49 -03:00
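The `--mode` option merged above selects how versions are written to requirements.txt. A sketch of the three schemes (hypothetical helper name; note a later doc fix in this log settles on "no-pin" as the scheme name):

```python
def format_requirement(name, version, mode="compat"):
    """Render one requirements.txt line under the scheme chosen by --mode.

    'gt'     -> name>=version   (allow anything newer)
    'compat' -> name~=version   (compatible-release pin)
    'no-pin' -> name            (bare name, no version)
    Anything else falls back to an exact pin.
    """
    if version is None or mode == "no-pin":
        return name
    if mode == "gt":
        return f"{name}>={version}"
    if mode == "compat":
        return f"{name}~={version}"
    return f"{name}=={version}"  # default: exact pin
```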
Jake
5dc02aa2fa remove obsolete '==' 2021-03-29 22:16:45 +08:00
Jake
e5924d14b3 resolve conflicts; update new features 2021-03-29 22:13:19 +08:00
Jake
22fefca900 merged from origin 2021-03-29 21:48:44 +08:00
Siyang
69a884a4c4 changes based on discussions w maintainer 2021-03-29 21:35:34 +08:00
Alan Barzilay
a26ce943c1
Merge pull request #238 from ryan-rozario/master
Mapping for github3
2021-03-26 17:24:45 -03:00
ryan-rozario
c7c91fcabe Mapping for github3 2021-03-26 23:56:20 +04:00
Alan Barzilay
1149429ab6
Merge pull request #195 from AlexPHorta/bugfix/issue88
Bugfix/issue88
2021-03-26 14:19:07 -03:00
Jake Teo
4eae4794a0
New Option for Dynamic Versioning (#1)
* added new option for dynamic versioning

* added quotes for dynamic options

Co-authored-by: Siyang <teo_siyang@imda.gov.sg>
2021-03-25 17:37:00 +08:00
Alan Barzilay
1aff63c049
Merge pull request #236 from bndr/pypy
Fix Pypy Build
2021-03-24 19:00:27 -03:00
alan-barzilay
b7e061f73a Remove unnecessary pip upgrade step 2021-03-24 18:40:30 -03:00
alan-barzilay
b1725c7409 Upgrading pypy to pypy3
Maybe forcing pypy to use python 3 will solve the issue (although it works fine with python 2.7 at the moment)
2021-03-24 18:33:15 -03:00
alan-barzilay
7b69881e76 Upgrade pip before running tests on travis
This should ensure that all tests use the same version of pip and this should hopefully fix the pypy build that is using pip 19
2021-03-24 16:21:56 -03:00
Alan Barzilay
787c202901
Merge pull request #191 from HariSekhon/mysql-python
fixed MySQL + krbV mappings to be reversed
2021-03-23 21:50:29 -03:00
Alan Barzilay
0449fa8341
Merge branch 'master' into mysql-python 2021-03-23 21:50:03 -03:00
Alan Barzilay
b56bbe976f
Merge pull request #201 from PatMyron/patch-1
mapping aws-sam-translator
2021-03-23 21:40:24 -03:00
Alan Barzilay
cdb30bcf33
Merge pull request #212 from bbodenmiller/patch-1
Tweak formatting
2021-03-23 21:38:33 -03:00
Alan Barzilay
27f9ca7ee8
Merge pull request #205 from toanant/patch-1
Imports are now sorted based on lowercase package's name, similar to pip freeze.
2021-03-23 21:33:05 -03:00
Alan Barzilay
e06a850883
Merge pull request #220 from 36000/36000-pyAFQ-1
Add pyAFQ Mapping
2021-03-23 21:24:55 -03:00
Alan Barzilay
112e2b6bdc
Merge pull request #224 from pkalemba/patch-1
Add discord mapping/ Sort mapping file
2021-03-23 21:22:52 -03:00
Alan Barzilay
1e830de783
Merge pull request #234 from SwiftWinds/patch-1
Add PyFunctional mapping
2021-03-23 21:09:40 -03:00
SwiftWinds
bcd0be47d0
Add PyFunctional mapping
Maps `functional` to `pyfunctional`. Fixes #232.
2021-03-13 14:46:02 -08:00
Paweł Kalemba
469baefc1e
Add discord mapping
Map discord to discord.py
2020-12-10 10:04:17 +01:00
John Kruper
30d8fab76c
Add pyAFQ Mapping 2020-11-10 10:27:02 -08:00
Ben Bodenmiller
bd88dd85cd
Tweak formatting 2020-07-09 16:11:56 -07:00
Abhishek Kumar Singh
fc720f18bb Fixed #133
Sorted `imports` based on lowercase package name, similar to `pip freeze`.
2020-06-14 00:20:26 +05:30
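The fix above makes pipreqs sort case-insensitively, matching `pip freeze` output order. The essence of the change, as a sketch (hypothetical function name):

```python
def sort_imports(imports):
    """Sort package names case-insensitively, like `pip freeze` output."""
    return sorted(imports, key=lambda name: name.lower())
```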
Jon Banafato
060f52f597
Merge pull request #203 from jonafato/fix-flake8
Fix flake8 error for latest release
2020-05-18 19:46:19 -04:00
Jon Banafato
3f06f4375a Fix flake8 error for latest release 2020-05-18 19:21:05 -04:00
Pat Myron
7b2b5e9e58
mapping aws-sam-translator
https://pypi.org/project/aws-sam-translator/
https://github.com/awslabs/serverless-application-model
2020-04-21 11:17:49 -07:00
AlexPHorta
21d3907e96 Fixed #88 2020-02-27 21:24:31 -03:00
AlexPHorta
386e677d80 More --clean tests. 2020-02-27 07:54:44 -03:00
AlexPHorta
2022f25ae9 Working on issue #88 2020-02-27 00:59:02 -03:00
Hari Sekhon
a4bd5b552a fixed MySQL + krbV mappings to be reversed 2020-02-18 13:59:50 +00:00
Vadim Kravcenko
6ca1f42d4e
Merge pull request #111 from yonatanp/master
bugfix: f.close() only required if open succeeded
2019-11-14 11:43:11 +01:00
Vadim Kravcenko
05a28a461f
Merge pull request #143 from andrew-vant/patch-1
Add patricia-trie mapping
2019-11-14 11:42:32 +01:00
Vadim Kravcenko
2d3fd405e4
Merge pull request #166 from raxod502/rr-mpv-and-portalocker
Fix mappings for python-mpv and portalocker
2019-11-14 11:39:17 +01:00
Vadim Kravcenko
d37bfbccce
Merge pull request #175 from PatMyron/patch-1
mapping cfn-lint
2019-11-14 11:38:29 +01:00
Vadim Kravcenko
e9731a9632
Merge pull request #173 from HariSekhon/mysql-python
added MySQL-python and krbV mappings
2019-11-14 11:37:06 +01:00
Vadim Kravcenko
591c907b62
Merge pull request #153 from invious/master
deduped, sorted, and added typing package to standard library file
2019-11-14 11:35:10 +01:00
Vadim Kravcenko
6a49451471
Merge pull request #167 from raxod502/feat/setuptools
Add setuptools to stdlib
2019-11-14 11:34:13 +01:00
Vadim Kravcenko
e76ad4dde4
Merge pull request #165 from raxod502/feat/pysynth-and-slack
Add mappings for PySynth and slackclient
2019-11-14 11:33:54 +01:00
Vadim Kravcenko
480ec3ab9a
Merge pull request #171 from kanoonsantikul/feature/omit-version
support omit package output version
2019-11-14 11:33:28 +01:00
Pat Myron
a11aa924b3
mapping cfn-lint
https://github.com/aws-cloudformation/cfn-python-lint/
2019-11-03 11:22:32 -08:00
Hari Sekhon
0cad380111 added krbV -> krbv mapping 2019-10-21 15:33:59 +01:00
Hari Sekhon
deaf895b1e added MySQL-python mapping 2019-10-21 15:14:24 +01:00
Niti Santikul
71fc2dc90c support omit package output version 2019-10-21 00:53:03 +07:00
Radon Rosborough
d0d9fe58e2 Also add pkg_resources 2019-08-08 14:51:27 -07:00
Radon Rosborough
443d9e595b Add setuptools to stdlib 2019-08-08 11:24:17 -07:00
Radon Rosborough
1753e1ff41 Fix mappings for python-mpv and portalocker 2019-08-08 11:16:10 -07:00
Radon Rosborough
21cac5723b Add mappings for PySynth and slackclient 2019-08-02 14:30:48 -07:00
Aymon Fournier
8cc70869b0 deduped, sorted, and added typing package to standard library file 2019-01-28 17:51:33 -05:00
Andrew Vant
3daaebfa17
Add patricia-trie mapping
For this package: https://pypi.org/project/patricia-trie/. Old, but I think it's the
only pure-python trie implementation on pypi.
2018-11-20 15:20:12 -05:00
Jon Banafato
15208540da
Merge pull request #129 from answerquest/patch-1
Add Cryptodome:pycryptodomex
2018-11-08 22:52:24 -05:00
Jon Banafato
529a5e0eea
Merge pull request #137 from tyhunt99/master
Add prefetch:django-prefetch
2018-10-19 14:40:57 -04:00
Tyler Hunt
ea731aab13 added a mapping for django-prefetch -> prefetch 2018-10-19 10:36:52 -06:00
Nikhil VJ
e4817bc68d
Add Cryptodome:pycryptodomex
Ref: https://github.com/bndr/pipreqs/issues/66#issuecomment-415642021  This is a wholly separate package from pycryptodome (which replaced pycrypto in #124 and uses the namespace Crypto in import statements). This uses the namespace `Cryptodome` (is that what we call the parent name of a module when importing?)
2018-08-24 09:20:30 +05:30
Vadim Kravcenko
0b8c38ce41
Merge pull request #124 from answerquest/patch-1
Replace pycrypto with pycryptodome
2018-08-23 17:25:26 +02:00
Vadim Kravcenko
f0b8593809
Merge pull request #116 from ch-nickgustafson/fix-oauth-and-sort-bugs
Fix oauth and sort bugs
2018-08-23 17:25:13 +02:00
Nikhil VJ
abd57602fe
Replace pycrypto with pycryptodome
Replace pycrypto with pycryptodome. See https://github.com/bndr/pipreqs/issues/66 for details.
2018-04-04 13:56:19 +05:30
Nick Gustafson
4571740919 add test for get_pkg_names change 2018-02-08 11:05:47 -08:00
Nick Gustafson
712879a7be fix sorting bug in get_pkg_names for consistency with pip freeze 2018-02-08 10:11:16 -08:00
Nick Gustafson
638e2f1046 fix oauth2client bug in mapping 2018-02-08 10:10:54 -08:00
yonatanp
a7ad636dab
bugfix: f.close() only required if open succeeded
when open fails, instead of raising the original error, the `f.close()` in the finally clause was raising its own error.
2018-01-05 11:33:47 +02:00
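The bug above: when `open()` itself raises, there is no file object, so an unconditional `f.close()` in a finally clause raises `NameError` and masks the real error. A sketch of the corrected pattern (hypothetical function name):

```python
def read_first_line(path):
    """Only close the file if open() actually succeeded.

    Opening *outside* the try block means a failed open() propagates its
    own error (e.g. FileNotFoundError) and the finally clause, which only
    runs once `f` exists, can safely call f.close(). A `with` statement
    achieves the same thing.
    """
    f = open(path, encoding="utf-8")
    try:
        return f.readline()
    finally:
        f.close()
```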
Jon Banafato
5707a39df6
Merge pull request #101 from jonafato/unified-output-function
Consolidate logic for writing to a file and to stdout
2017-11-16 13:47:03 -05:00
Jon Banafato
51446ac2be
Merge pull request #104 from jonafato/changelog-dates
Add dates to recent changelog entries
2017-10-31 15:26:40 -04:00
Jon Banafato
52505ce32c Add dates to recent changelog entries
Via https://pypi.python.org/pypi/pipreqs/json
2017-10-31 15:18:45 -04:00
Jon Banafato
800abc43e3 Merge pull request #95 from jonafato/cleanup-get_all_imports
Clean up set and file usage in get_all_imports
2017-10-26 17:03:17 -04:00
Jon Banafato
aae6c61f09 Clean up set and file usage in get_all_imports
- Move logic that doesn't need to be inside of file context managers
  outside
- Remove redundant `set` and `list` function calls
2017-10-26 16:36:12 -04:00
Jon Banafato
7afbcb4b16 Merge pull request #102 from jonafato/cleanup-get_pkg_names
Simplify get_pkg_names function
2017-10-26 16:33:31 -04:00
Jon Banafato
84b2e37707 Simplify get_pkg_names function
- Hoist non-file-reading logic outside of the file context manager
- Use a dict instead of a list for faster / more Pythonic lookups
- Use a set to simplify the add / append logic
- Move import sorting from `get_all_imports` to `get_pkg_names`
  to account for set ordering. This change may also affect #89.
- Add a docstring
2017-10-26 12:16:32 -04:00
Jon Banafato
4b2ad2dc41 [WIP] Consolidate logic for writing to a file and to stdout 2017-10-24 16:58:35 -04:00
Jon Banafato
e0f9ae8c6a Merge pull request #99 from jonafato/flake8
Flake8
2017-10-24 15:17:53 -04:00
Jon Banafato
e88b5ab19c Fix flake8 errors 2017-10-24 15:14:58 -04:00
Jon Banafato
d1a7eda5e8 Enable flake8 linting in tox.ini and .travis.yml
Currently, flake8 is accessible via `make lint`, but it does not run
alongside the rest of the test suite. This change adds flake8 checks to
the tox.ini file to enable linting as a routine part of running tests.

Additionally, drop the changes made in #100.
2017-10-24 15:14:52 -04:00
Jon Banafato
1d48345cb0 Merge pull request #100 from jonafato/tox-on-travis
Run Travis-CI tests inside of tox
2017-10-24 14:20:31 -04:00
Jon Banafato
0a9845d87d Run Travis-CI tests inside of tox
By using tox instead of the default Travis-CI Python environments, we
ensure that we have a single entrypoint to testing both locally and in
CI. This reduces redundant code and makes it clear when test
environments don't match up on different platforms.

[tox-travis](https://tox-travis.readthedocs.io/en/stable/) is introduced
here to automatically run tox jobs under the proper Travis-CI
environments. Additionally, the coveralls step is moved to a [build
stage](https://docs.travis-ci.com/user/build-stages) to run once after
all other Travis-CI tests complete.
2017-10-24 14:08:18 -04:00
Jon Banafato
08160bdf95 Merge pull request #97 from jonafato/optional-path
Support optional <path> argument
2017-10-23 23:34:07 -04:00
Jon Banafato
c80d06593d Support optional <path> argument
This change makes the <path> argument optional, defaulting to the
current working directory if omitted.

---

Ideally, this would have been accomplished via docopt, but optional
positional arguments with defaults are not supported at the moment [1, 2].

[1] https://github.com/docopt/docopt/issues/214
[2] https://github.com/docopt/docopt/issues/329
2017-10-22 23:45:16 -04:00
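As the commit notes, docopt cannot default an optional positional argument, so the fallback to the current working directory happens after parsing. A sketch of that post-parse fallback (hypothetical helper name):

```python
import os

def resolve_path(args):
    """Fall back to the current working directory when <path> is omitted.

    `args` is a docopt-style dict; a missing or None "<path>" entry means
    the user gave no positional argument.
    """
    return args.get("<path>") or os.getcwd()
```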
Jon Banafato
755e20196a Merge pull request #85 from zmwangx/correct-arrow-mapping
fix(pipreqs/mapping): correct arrow mapping
2017-10-21 14:13:38 -04:00
Zhiming Wang
d8c94d1690
fix(pipreqs/mapping): correct arrow mapping
https://pypi.org/project/arrow/ is https://github.com/crsmithdev/arrow/, the
real arrow.

https://pypi.org/project/arrow-fatisar/ is https://github.com/fatisar/arrow,
a completely random, outdated fork.
2017-10-21 14:05:47 -04:00
Jon Banafato
4d35819bbd Merge pull request #93 from jonafato/setup-py-quotes
Use single quotes consistently in setup.py
2017-10-20 16:16:16 -04:00
Jon Banafato
62e234ca01 Use single quotes consistently in setup.py 2017-10-20 16:08:56 -04:00
Jon Banafato
ab04f1276e Merge pull request #84 from rspencer01/master
Follow symbolic linked directories
2017-10-20 15:41:27 -04:00
Jon Banafato
2e65861ebc Merge pull request #72 from jonafato/python3.6
Declare support for Python 3.6
2017-10-20 14:38:34 -04:00
Jon Banafato
3b8419ca92 Declare support for Python 3.6
All tests pass under Python 3.6, so declare official support for it.
Closes #68.
2017-10-20 14:35:06 -04:00
Jon Banafato
dca65c3f42 Merge pull request #83 from jonafato/python-decouple-mapping
Add mapping for python-decouple
2017-10-20 14:33:04 -04:00
Jon Banafato
49a3d50c3d Merge pull request #87 from philfreo/patch-1
Minor README formatting consistency tweak
2017-10-20 14:31:37 -04:00
Jon Banafato
fc85d87fa2 Merge pull request #92 from jonafato/drop-eol-pythons
Drop support for end-of-lifed Python versions
2017-10-20 14:30:57 -04:00
Jon Banafato
5008bda188 Drop support for end-of-lifed Python versions
The following versions of Python are no longer supported by the core
developers of Python and pip:

- Python 2.6
    - End of life on 2013-10-29 [1]
    - Dropped from pip on 2017-03-18 [2]
- Python 3.3
    - End of life on 2017-09-29 [3]
    - Dropped from pip on 2017-03-22 [4]

Developers should migrate off of these versions ASAP, as they may be
missing critical security fixes.

[1] https://www.python.org/dev/peps/pep-0361/#release-lifespan
[2] https://github.com/pypa/pip/pull/4343
[3] https://www.python.org/dev/peps/pep-0398/#lifespan
[4] https://github.com/pypa/pip/pull/4355
2017-10-20 13:30:48 -04:00
Phil Freo
0555d849d6 Minor README formatting consistency tweak 2017-09-23 19:32:18 -04:00
Robert Spencer
77c865253c Follow symbolic linked directories
This makes pipreqs dive into directories that are symlinks as Python
recognises these.  If symlinks are not followed, then the requirements
will be incorrect.

We also add a command line option to disable following of symlinks if
need be.
2017-08-31 23:17:31 +02:00
Jon Banafato
42d9f03c79 Add mapping for python-decouple 2017-08-01 11:53:37 -04:00
Vadim Kravcenko
22e80c27c2 Merge pull request #80 from kxrd/fix-issue-74
Exclude concurrent{,.futures} from stdlib if py2
2017-06-30 17:37:40 +02:00
Vadim Kravcenko
9d02b40bc8 Version bump 2017-06-30 14:27:07 +02:00
Vadim Kravcenko
3601802ce2 Update README.rst 2017-06-30 14:22:24 +02:00
Vadim Kravcenko
ac4749681c Merge pull request #77 from kxrd/issue-18
Implement '--clean' and '--diff'
2017-06-30 14:20:54 +02:00
Vadim Kravcenko
2e796183fa Merge pull request #78 from jeepkd/add-mapping-dotenv
Add dotenv to mapping list
2017-06-22 20:47:53 +02:00
Jeep Kiddee
8106376b5e Add dotenv to mapping list 2017-06-22 18:19:05 +07:00
kxrd
254e1cedcb Improve variable module_version 2017-06-13 23:10:14 +02:00
kxrd
06e933ef3c Add new options 2017-06-13 22:55:16 +02:00
kxrd
e7b8ddf72d Replace with statement with a try/except/else/finally block to narrow down problem in the future 2017-06-13 22:22:27 +02:00
kxrd
54be2d1c24 Complete function clean 2017-06-13 22:03:21 +02:00
kxrd
d3efb942d5 Stop exctracting parameters in file, just return list of modules 2017-06-13 21:55:12 +02:00
kxrd
ac6ce860d0 Improve function diff docstring, begin function clean 2017-06-13 01:10:47 +02:00
kxrd
7549b1c416 Add and implement function diff, improve inline comments in function parse_requirements 2017-06-10 21:24:39 +02:00
kxrd
a78203dc2b Improve function parse_requirements docstring 2017-06-10 21:22:18 +02:00
kxrd
882a0d3ec3 Add docstring to function parse_requirements 2017-06-10 20:52:17 +02:00
kxrd
d8b497ea91 Rename function clean to compare_modules, have it return a tuple of modules not imported, add docstring 2017-06-10 20:43:54 +02:00
kxrd
1055a7a28b Add colon to error output for consistency 2017-06-10 20:02:43 +02:00
kxrd
888537a898 Improve except block in function parse_requirements 2017-06-10 20:01:42 +02:00
kxrd
b048c55fe6 Add clean function 2017-06-02 22:30:17 +02:00
kxrd
b6c53c6b02 Begin function clean 2017-06-02 21:16:17 +02:00
kxrd
d463b92810 Return tuple consisting of modules,parameters instead of a concatenated list 2017-06-02 21:13:47 +02:00
kxrd
cdb82c6d9b Switch out print statement against logging.error 2017-06-02 20:13:51 +02:00
kxrd
3d87f21392 Add file_ parameter to function parse_requirements, implement a try-except block in function parse_requirements 2017-06-02 19:59:59 +02:00
kxrd
0886898d8d Fix dict key:value in function parse_requirements() 2017-06-02 18:04:02 +02:00
kxrd
270f0e9933 Add function parse_requirements 2017-06-02 18:00:30 +02:00
kxrd
0b0f54a3de Exclude concurrent{,.futures} from stdlib if py2 2017-06-01 08:22:36 +02:00
Vadim Kravcenko
bddbfc70af remove pypi downloads per month badge
Pypi doesn't provide these stats anymore.
2017-04-29 18:32:00 +02:00
Vadim Kravcenko
3e6c1547dc Version bump 2017-04-20 13:52:57 +02:00
Vadim Kravcenko
7f791ea93d Merge pull request #67 from ryuzakyl/hotfix/remove-package-duplicates
BUG: remove package/version duplicates
2017-04-20 13:49:39 +02:00
L
243e6e6833 BUG: remove package/version duplicates 2017-04-17 11:33:06 -04:00
Vadim Kravcenko
1a990b52d5 refactor(pipreqs): pep8 2017-03-20 20:51:34 +01:00
Vadim Kravcenko
74392c7bc6 Merge pull request #52 from jonafato/python3.5
Explicitly support Python 3.5
2017-03-19 17:56:08 +01:00
Vadim Kravcenko
92cbc175e1 Merge pull request #54 from utgwkk/fix-readme
Add --print to README
2017-03-19 17:55:52 +01:00
Vadim Kravcenko
714393fcea missing parameter for proxy 2016-12-13 15:28:18 +01:00
Vadim Kravcenko
0f70dcc33b Version bump 2016-12-13 15:10:26 +01:00
Vadim Kravcenko
797b6f808b Merge pull request #57 from akaihola/issue-56-pypi-server
Fixes #56: The --pypi-server option now accepts an URL
2016-12-13 15:09:45 +01:00
Antti Kaihola
c69067819a Fixes #56: The --pypi-server option now accepts an URL 2016-12-13 16:06:06 +02:00
UTAGAWA Kiki
dbe1f8b980 Add --print to README 2016-11-22 12:36:44 +09:00
Vadim Kravcenko
a195c35233 Merge pull request #53 from utgwkk/print-stdout
Implement --print to output list of requirements to stdout
2016-11-21 16:50:35 +01:00
UTAGAWA Kiki
ce68e5f98e Fix tests to pass 2016-11-22 00:17:46 +09:00
UTAGAWA Kiki
f31ad89350 Implement --print to output list of requirements to stdout 2016-11-22 00:10:10 +09:00
Jon Banafato
980c92e6fe Explicitly support Python 3.5
All tests pass on Python 3.5, so declare official support for it.
2016-11-03 10:56:41 -04:00
Vadim Kravcenko
345a275b3a Bump Version 2016-07-14 19:09:00 +02:00
Vadim Kravcenko
987eb7af67 fix(tests): fix failing tests 2016-07-14 19:01:18 +02:00
Vadim Kravcenko
63be6bb253 fix(pipres): Add dependency even if version was not found 2016-07-14 18:57:43 +02:00
Vadim Kravcenko
d93319ccf1 fix(pipres): ignore .tox directory #44 2016-07-14 18:55:13 +02:00
Vadim Kravcenko
0799301dee Bump Version 2016-06-02 22:23:43 +02:00
Vadim Kravcenko
71e5b31b23 fix(manifest): add stdlib, mapping to tarball 2016-06-02 22:22:50 +02:00
38 changed files with 5285 additions and 1100 deletions


@ -1 +0,0 @@
service_name: "travis-ci"

.github/workflows/flake8.yml (new file, 34 lines)

@ -0,0 +1,34 @@
name: flake8
concurrency:
  group: ${{ github.ref }}
  cancel-in-progress: true
on:
  workflow_dispatch:
  push:
    tags:
      - "*"
    branches:
      - main
      - master
      - develop
      - "release/*"
  pull_request:
jobs:
  flake8-lint:
    runs-on: ubuntu-24.04
    name: Lint
    steps:
      - name: Check out source repository
        uses: actions/checkout@v4
      - name: Set up Python environment
        uses: actions/setup-python@v5
        with:
          python-version: "3.13"
      - name: flake8 Lint
        uses: reviewdog/action-flake8@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          reporter: github-pr-review

.github/workflows/tests.yml (new file, 65 lines)

@ -0,0 +1,65 @@
name: Tests and Codecov
on:
  push:
    branches:
      - master
      - main
      - "release/*"
  pull_request:
  workflow_dispatch:
jobs:
  run_tests:
    runs-on: ubuntu-24.04
    strategy:
      fail-fast: false
      matrix:
        python-version: ['3.9', '3.10', '3.11', '3.12', '3.13', 'pypy-3.10']
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install uv
          uv pip install --system tox tox-gh-actions
      - name: Test with tox
        run: tox
  coverage_report:
    needs: run_tests
    runs-on: ubuntu-24.04
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Python 3.13
        uses: actions/setup-python@v5
        with:
          python-version: 3.13
      - name: Install dependencies
        run: |
          python -m pip install uv
          uv pip install --system poetry
          uv pip install --system .[dev]
      - name: Calculate coverage
        run: poetry run coverage run --source=pipreqs -m unittest discover
      - name: Create XML report
        run: poetry run coverage xml
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v5
        with:
          files: coverage.xml
          token: ${{ secrets.CODECOV_TOKEN }}
          fail_ci_if_error: false

.pre-commit-config.yaml (new file, 96 lines)

@ -0,0 +1,96 @@
ci:
  autoupdate_commit_msg: "chore: update pre-commit hooks"
  autofix_commit_msg: "style: pre-commit fixes"
  autoupdate_schedule: quarterly
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: check-added-large-files
        args: [ '--maxkb=1000' ]
      - id: check-case-conflict
      - id: check-merge-conflict
      - id: check-symlinks
      - id: check-yaml
      - id: check-toml
      - id: check-json
      - id: debug-statements
      - id: end-of-file-fixer
      - id: mixed-line-ending
      - id: requirements-txt-fixer
      - id: trailing-whitespace
        files: ".*\\.(?:tex|py)$"
        args: [ --markdown-linebreak-ext=md ]
        exclude: (^notebooks/|^tests/truth/)
      - id: detect-private-key
      - id: fix-byte-order-marker
      - id: check-ast
      - id: check-docstring-first
      - id: debug-statements
  - repo: https://github.com/pre-commit/pygrep-hooks
    rev: v1.10.0
    hooks:
      - id: python-use-type-annotations
      - id: python-check-mock-methods
      - id: python-no-eval
      - id: rst-backticks
      - id: rst-directive-colons
  - repo: https://github.com/asottile/pyupgrade
    rev: v3.3.1
    hooks:
      - id: pyupgrade
        args: [ --py38-plus ]
  # Notebook formatting
  - repo: https://github.com/nbQA-dev/nbQA
    rev: 1.9.1
    hooks:
      - id: nbqa-isort
        additional_dependencies: [ isort ]
      - id: nbqa-pyupgrade
        additional_dependencies: [ pyupgrade ]
        args: [ --py38-plus ]
  - repo: https://github.com/kynan/nbstripout
    rev: 0.8.1
    hooks:
      - id: nbstripout
  - repo: https://github.com/sondrelg/pep585-upgrade
    rev: 'v1.0'
    hooks:
      - id: upgrade-type-hints
        args: [ '--futures=true' ]
  - repo: https://github.com/MarcoGorelli/auto-walrus
    rev: 0.3.4
    hooks:
      - id: auto-walrus
  - repo: https://github.com/python-jsonschema/check-jsonschema
    rev: 0.30.0
    hooks:
      - id: check-github-workflows
      - id: check-github-actions
      - id: check-dependabot
      - id: check-readthedocs
  - repo: https://github.com/dannysepler/rm_unneeded_f_str
    rev: v0.2.0
    hooks:
      - id: rm-unneeded-f-str
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: "v0.8.6"
    hooks:
      - id: ruff
        types_or: [ python, pyi, jupyter ]
        args: [ --fix, --show-fixes, --line-length=120 ]  # --unsafe-fixes,
      # Run the formatter.
      - id: ruff-format
        types_or: [ python, pyi, jupyter ]

.python-version (new file, 7 lines)

@ -0,0 +1,7 @@
3.13
3.12
3.11
3.10
3.9
3.8
pypy3.9-7.3.12

.tool-versions (new file, 1 line)

@ -0,0 +1 @@
python 3.13 3.12 3.11 3.10 3.9 3.8 pypy3.9-7.3.12


@ -1,21 +0,0 @@
# Config file for automatic testing at travis-ci.org
language: python
python:
- "3.4"
- "3.3"
- "2.7"
- "2.6"
- "pypy"
# command to install dependencies, e.g. pip install -r requirements.txt --use-mirrors
install:
- "pip install -r requirements.txt"
- "pip install coverage"
- "pip install coveralls"
# command to run tests, e.g. python setup.py test
script: coverage run --source=pipreqs setup.py test
after_success:
coveralls


@ -10,4 +10,5 @@ Development Lead
 Contributors
 ------------
-None yet. Why not be the first?
+* Jake Teo <mapattacker@gmail.com>
+* Jerome Chan <cjerome94@gmail.com>


@ -61,12 +61,11 @@ Ready to contribute? Here's how to set up `pipreqs` for local development.
 2. Clone your fork locally::
     $ git clone git@github.com:your_name_here/pipreqs.git
-3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::
-    $ mkvirtualenv pipreqs
     $ cd pipreqs/
-    $ python setup.py develop
+3. Pipreqs is developed using Poetry. Refer to the `documentation <https://python-poetry.org/docs/>`_ to install Poetry in your local environment. Next, you should install pipreqs's dependencies::
+    $ poetry install --with dev
 4. Create a branch for local development::
@ -76,11 +75,11 @@ Ready to contribute? Here's how to set up `pipreqs` for local development.
 5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::
-    $ flake8 pipreqs tests
-    $ python setup.py test
-    $ tox
+    $ poetry run flake8 pipreqs tests
+    $ poetry run python -m unittest discover
+    $ poetry run tox
-To get flake8 and tox, just pip install them into your virtualenv.
+To test all versions of python using tox you need to have them installed and for this two options are recommended: `pyenv` or `asdf`.
 6. Commit your changes and push your branch to GitHub::
@ -99,13 +98,13 @@ Before you submit a pull request, check that it meets these guidelines:
 2. If the pull request adds functionality, the docs should be updated. Put
    your new functionality into a function with a docstring, and add the
    feature to the list in README.rst.
-3. The pull request should work for Python 2.6, 2.7, 3.3, and 3.4, and for PyPy. Check
-   https://travis-ci.org/bndr/pipreqs/pull_requests
-   and make sure that the tests pass for all supported Python versions.
+3. The pull request should work for currently supported Python and PyPy versions. Check
+   https://travis-ci.org/bndr/pipreqs/pull_requests and make sure that the
+   tests pass for all supported Python versions.
 Tips
 ----
 To run a subset of tests::
-    $ python -m unittest tests.test_pipreqs
+    $ poetry run python -m unittest tests.test_pipreqs


@ -3,6 +3,34 @@
 History
 -------
+0.4.11 (2020-03-29)
+--------------------
+* Implement '--mode' (Jake Teo, Jerome Chan)
+
+0.4.8 (2017-06-30)
+--------------------
+* Implement '--clean' and '--diff' (kxrd)
+* Exclude concurrent{,.futures} from stdlib if py2 (kxrd)
+
+0.4.7 (2017-04-20)
+--------------------
+* BUG: remove package/version duplicates
+* Style: pep8
+
+0.4.5 (2016-12-13)
+---------------------
+* Fixed the --pypi-server option
+
+0.4.4 (2016-07-14)
+---------------------
+* Remove Spaces in output
+* Add package to output even without version
+
 0.4.2 (2016-02-10)
 ---------------------


@ -1,11 +0,0 @@
include AUTHORS.rst
include CONTRIBUTING.rst
include HISTORY.rst
include LICENSE
include README.rst
recursive-include tests *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]
recursive-include docs *.rst conf.py Makefile make.bat stdlib mapping


@ -6,13 +6,14 @@ help:
 	@echo "clean-pyc - remove Python file artifacts"
 	@echo "clean-test - remove test and coverage artifacts"
 	@echo "lint - check style with flake8"
-	@echo "test - run tests quickly with the default Python"
+	@echo "test - run tests quickly using the default Python"
 	@echo "test-all - run tests on every Python version with tox"
 	@echo "coverage - check code coverage quickly with the default Python"
 	@echo "docs - generate Sphinx HTML documentation, including API docs"
-	@echo "release - package and upload a release"
-	@echo "dist - package"
-	@echo "install - install the package to the active Python's site-packages"
+	@echo "publish - package and upload a release"
+	@echo "publish-to-test - package and upload a release to test-pypi"
+	@echo "build - build the package"
+	@echo "install - install the dependencies into the Poetry virtual environment"
 clean: clean-build clean-pyc clean-test
@ -35,14 +36,13 @@ clean-test:
 	rm -fr htmlcov/
 lint:
-	flake8 pipreqs tests
+	poetry run flake8 pipreqs tests
 test:
-	pip install -r requirements.txt
-	python setup.py test
+	poetry run python -m unittest discover
 test-all:
-	tox
+	poetry run tox
 coverage:
 	coverage run --source pipreqs setup.py test
@ -58,13 +58,14 @@ docs:
 	$(MAKE) -C docs html
 	open docs/_build/html/index.html
-release: clean
-	python setup.py sdist bdist_wheel upload -r pypi
+publish: build
+	poetry publish
-dist: clean
-	python setup.py sdist
-	python setup.py bdist_wheel
-	ls -l dist
+publish-to-test: build
+	poetry publish --repository test-pypi
+build: clean
+	poetry build
 install: clean
-	python setup.py install
+	poetry install --with dev


@ -1,20 +1,17 @@
-===============================
+=============================================================================
 ``pipreqs`` - Generate requirements.txt file for any project based on imports
-===============================
+=============================================================================
-.. image:: https://img.shields.io/travis/bndr/pipreqs.svg
-    :target: https://travis-ci.org/bndr/pipreqs
+.. image:: https://github.com/bndr/pipreqs/actions/workflows/tests.yml/badge.svg
+    :target: https://github.com/bndr/pipreqs/actions/workflows/tests.yml
 .. image:: https://img.shields.io/pypi/v/pipreqs.svg
    :target: https://pypi.python.org/pypi/pipreqs
-.. image:: https://img.shields.io/pypi/dm/pipreqs.svg
-    :target: https://pypi.python.org/pypi/pipreqs
-.. image:: https://img.shields.io/coveralls/bndr/pipreqs.svg
-    :target: https://coveralls.io/r/bndr/pipreqs
+.. image:: https://codecov.io/gh/bndr/pipreqs/branch/master/graph/badge.svg?token=0rfPfUZEAX
+    :target: https://codecov.io/gh/bndr/pipreqs
 .. image:: https://img.shields.io/pypi/l/pipreqs.svg
    :target: https://pypi.python.org/pypi/pipreqs
@ -24,30 +21,52 @@
 Installation
 ------------
-::
+.. code-block:: sh
     pip install pipreqs
+Obs.: if you don't want support for jupyter notebooks, you can install pipreqs without the dependencies that give support to it.
+To do so, run:
+
+.. code-block:: sh
+
+    pip install --no-deps pipreqs
+    pip install yarg==0.1.9 docopt==0.6.2
 Usage
 -----
 ::
     Usage:
-        pipreqs [options] <path>
+        pipreqs [options] [<path>]
+
+    Arguments:
+        <path>                The path to the directory containing the application files for which a requirements file
+                              should be generated (defaults to the current working directory)
     Options:
         --use-local           Use ONLY local package info instead of querying PyPI
-        --pypi-server         Use custom PyPi server
-        --proxy               Use Proxy, parameter will be passed to requests library. You can also just set the
+        --pypi-server <url>   Use custom PyPi server
+        --proxy <url>         Use Proxy, parameter will be passed to requests library. You can also just set the
                               environments parameter in your terminal:
                               $ export HTTP_PROXY="http://10.10.1.10:3128"
                               $ export HTTPS_PROXY="https://10.10.1.10:1080"
         --debug               Print debug information
-        --ignore <dirs>...    Ignore extra directories
+        --ignore <dirs>...    Ignore extra directories, each separated by a comma
+        --no-follow-links     Do not follow symbolic links in the project
+        --ignore-errors       Ignore errors while scanning files
         --encoding <charset>  Use encoding parameter for file open
         --savepath <file>     Save the list of requirements in the given file
+        --print               Output the list of requirements in the standard output
         --force               Overwrite existing requirements.txt
+        --diff <file>         Compare modules in requirements.txt to project imports
+        --clean <file>        Clean up requirements.txt by removing modules that are not imported in project
+        --mode <scheme>       Enables dynamic versioning with <compat>, <gt> or <non-pin> schemes
+                              <compat> | e.g. Flask~=1.1.2
+                              <gt>     | e.g. Flask>=1.1.2
+                              <no-pin> | e.g. Flask
+        --scan-notebooks      Look for imports in jupyter notebook files.
Example Example
------- -------
@ -69,5 +88,5 @@ Why not pip freeze?
------------------- -------------------
- ``pip freeze`` only saves the packages that are installed with ``pip install`` in your environment. - ``pip freeze`` only saves the packages that are installed with ``pip install`` in your environment.
- pip freeze saves all packages in the environment including those that you don't use in your current project. (if you don't have virtualenv) - ``pip freeze`` saves all packages in the environment including those that you don't use in your current project (if you don't have ``virtualenv``).
- and sometimes you just need to create requirements.txt for a new project without installing modules. - and sometimes you just need to create ``requirements.txt`` for a new project without installing modules.
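For illustration, here is what the three `--mode` schemes documented above produce for a resolved package version. This is a hypothetical sketch, not pipreqs' internal API; `pin` is an illustrative helper name.

```python
# Hypothetical helper (not pipreqs' API) showing what each --mode
# scheme renders for a package and its resolved version.
def pin(package, version, mode="compat"):
    """Render one requirements.txt line under the given scheme."""
    schemes = {
        "compat": "{}~={}".format(package, version),  # e.g. Flask~=1.1.2
        "gt": "{}>={}".format(package, version),      # e.g. Flask>=1.1.2
        "no-pin": package,                            # e.g. Flask
    }
    return schemes[mode]

print(pin("Flask", "1.1.2", "compat"))  # Flask~=1.1.2
```

The example versions mirror the ones shown in the README's option help text.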


@ -1,5 +1,3 @@
-# -*- coding: utf-8 -*-
 __author__ = 'Vadim Kravcenko'
 __email__ = 'vadim.kravcenko@gmail.com'
-__version__ = '0.4.2'
+__version__ = '0.4.13'


@ -1,3 +1,4 @@
AFQ:pyAFQ
AG_fft_tools:agpy AG_fft_tools:agpy
ANSI:pexpect ANSI:pexpect
Adafruit:Adafruit_Libraries Adafruit:Adafruit_Libraries
@ -9,9 +10,12 @@ BeautifulSoupTests:BeautifulSoup
BioSQL:biopython BioSQL:biopython
BuildbotStatusShields:BuildbotEightStatusShields BuildbotStatusShields:BuildbotEightStatusShields
ComputedAttribute:ExtensionClass ComputedAttribute:ExtensionClass
Crypto:pycrypto constraint:python-constraint
Crypto:pycryptodome
Cryptodome:pycryptodomex
FSM:pexpect FSM:pexpect
FiftyOneDegrees:51degrees_mobile_detector_v3_wrapper FiftyOneDegrees:51degrees_mobile_detector_v3_wrapper
functional:pyfunctional
GeoBaseMain:GeoBasesDev GeoBaseMain:GeoBasesDev
GeoBases:GeoBasesDev GeoBases:GeoBasesDev
Globals:Zope2 Globals:Zope2
@ -21,6 +25,7 @@ Kittens:astro_kittens
Levenshtein:python_Levenshtein Levenshtein:python_Levenshtein
Lifetime:Zope2 Lifetime:Zope2
MethodObject:ExtensionClass MethodObject:ExtensionClass
MySQLdb:MySQL-python
OFS:Zope2 OFS:Zope2
OpenGL:PyOpenGL OpenGL:PyOpenGL
OpenSSL:pyOpenSSL OpenSSL:pyOpenSSL
@ -31,6 +36,7 @@ Pyxides:astro_pyxis
QtCore:PySide QtCore:PySide
S3:s3cmd S3:s3cmd
SCons:pystick SCons:pystick
speech_recognition:SpeechRecognition
Shared:Zope2 Shared:Zope2
Signals:Zope2 Signals:Zope2
Stemmer:PyStemmer Stemmer:PyStemmer
@ -125,6 +131,7 @@ aios3:aio_s3
airbrake:airbrake_flask airbrake:airbrake_flask
airship:airship_icloud airship:airship_icloud
airship:airship_steamcloud airship:airship_steamcloud
airflow:apache-airflow
akamai:edgegrid_python akamai:edgegrid_python
alation:alation_api alation:alation_api
alba_client:alba_client_python alba_client:alba_client_python
@ -263,7 +270,6 @@ armstrong:armstrong.hatband
armstrong:armstrong.templates.standard armstrong:armstrong.templates.standard
armstrong:armstrong.utils.backends armstrong:armstrong.utils.backends
armstrong:armstrong.utils.celery armstrong:armstrong.utils.celery
arrow:arrow_fatisar
arstecnica:arstecnica.raccoon.autobahn arstecnica:arstecnica.raccoon.autobahn
arstecnica:arstecnica.sqlalchemy.async arstecnica:arstecnica.sqlalchemy.async
article-downloader:article_downloader article-downloader:article_downloader
@ -536,6 +542,7 @@ cassandra:cassandra_driver
cassandralauncher:CassandraLauncher cassandralauncher:CassandraLauncher
cc42:42qucc cc42:42qucc
cerberus:Cerberus cerberus:Cerberus
cfnlint:cfn-lint
chameleon:Chameleon chameleon:Chameleon
charmtools:charm_tools charmtools:charm_tools
chef:PyChef chef:PyChef
@ -576,19 +583,23 @@ ctff:tff
cups:pycups cups:pycups
curator:elasticsearch_curator curator:elasticsearch_curator
curl:pycurl curl:pycurl
cv2:opencv-python
daemon:python_daemon daemon:python_daemon
dare:DARE dare:DARE
dateutil:python_dateutil dateutil:python_dateutil
dawg:DAWG dawg:DAWG
deb822:python_debian deb822:python_debian
debian:python_debian debian:python_debian
decouple:python-decouple
demo:webunit demo:webunit
demosongs:PySynth
deployer:juju_deployer deployer:juju_deployer
depot:filedepot depot:filedepot
devtools:tg.devtools devtools:tg.devtools
dgis:2gis dgis:2gis
dhtmlparser:pyDHTMLParser dhtmlparser:pyDHTMLParser
digitalocean:python_digitalocean digitalocean:python_digitalocean
discord:discord.py
distribute_setup:ez_setup distribute_setup:ez_setup
distutils2:Distutils2 distutils2:Distutils2
django:Django django:Django
@ -606,6 +617,7 @@ dogshell:dogapi
dot_parser:pydot dot_parser:pydot
dot_parser:pydot2 dot_parser:pydot2
dot_parser:pydot3k dot_parser:pydot3k
dotenv:python-dotenv
dpkt:dpkt_fix dpkt:dpkt_fix
dsml:python_ldap dsml:python_ldap
durationfield:django_durationfield durationfield:django_durationfield
@ -671,6 +683,7 @@ geventwebsocket:gevent_websocket
gflags:python_gflags gflags:python_gflags
git:GitPython git:GitPython
github:PyGithub github:PyGithub
github3:github3.py
gitpy:git_py gitpy:git_py
globusonline:globusonline_transfer_api_client globusonline:globusonline_transfer_api_client
google:protobuf google:protobuf
@ -697,6 +710,7 @@ html:pies2overrides
htmloutput:nosehtmloutput htmloutput:nosehtmloutput
http:pies2overrides http:pies2overrides
hvad:django_hvad hvad:django_hvad
hydra:hydra-core
i99fix:199Fix i99fix:199Fix
igraph:python_igraph igraph:python_igraph
imdb:IMDbPY imdb:IMDbPY
@ -708,6 +722,7 @@ jaraco:jaraco.util
jinja2:Jinja2 jinja2:Jinja2
jiracli:jira_cli jiracli:jira_cli
johnny:johnny_cache johnny:johnny_cache
jose:python_jose
jpgrid:python_geohash jpgrid:python_geohash
jpiarea:python_geohash jpiarea:python_geohash
jpype:JPype1 jpype:JPype1
@ -722,6 +737,7 @@ keyczar:python_keyczar
keyedcache:django_keyedcache keyedcache:django_keyedcache
keystoneclient:python_keystoneclient keystoneclient:python_keystoneclient
kickstarter:kickstart kickstarter:kickstart
krbv:krbV
kss:kss.core kss:kss.core
kuyruk:Kuyruk kuyruk:Kuyruk
langconv:AdvancedLangConv langconv:AdvancedLangConv
@ -770,6 +786,8 @@ mimeparse:python_mimeparse
minitage:minitage.paste minitage:minitage.paste
minitage:minitage.recipe.common minitage:minitage.recipe.common
missingdrawables:android_missingdrawables missingdrawables:android_missingdrawables
mixfiles:PySynth
mkfreq:PySynth
mkrst_themes:2lazy2rest mkrst_themes:2lazy2rest
mockredis:mockredispy mockredis:mockredispy
modargs:python_modargs modargs:python_modargs
@ -785,6 +803,7 @@ monthdelta:MonthDelta
mopidy:Mopidy mopidy:Mopidy
mopytools:MoPyTools mopytools:MoPyTools
mptt:django_mptt mptt:django_mptt
mpv:python-mpv
mrbob:mr.bob mrbob:mr.bob
msgpack:msgpack_python msgpack:msgpack_python
mutations:aino_mutations mutations:aino_mutations
@ -799,7 +818,7 @@ nester:abofly
nester:bssm_pythonSig nester:bssm_pythonSig
novaclient:python_novaclient novaclient:python_novaclient
oauth2_provider:alauda_django_oauth oauth2_provider:alauda_django_oauth
oauth2client:google_api_python_client oauth2client:oauth2client
odf:odfpy odf:odfpy
ometa:Parsley ometa:Parsley
openid:python_openid openid:python_openid
@ -820,12 +839,14 @@ past:future
paste:PasteScript paste:PasteScript
path:forked_path path:forked_path
path:path.py path:path.py
patricia:patricia-trie
paver:Paver paver:Paver
peak:ProxyTypes peak:ProxyTypes
picasso:anderson.picasso picasso:anderson.picasso
picklefield:django-picklefield picklefield:django-picklefield
pilot:BigJob pilot:BigJob
pivotal:pivotal_py pivotal:pivotal_py
play_wav:PySynth
playhouse:peewee playhouse:peewee
plivoxml:plivo plivoxml:plivo
plone:plone.alterego plone:plone.alterego
@ -909,9 +930,9 @@ plone:plone.z3cform
plonetheme:plonetheme.barceloneta plonetheme:plonetheme.barceloneta
png:pypng png:pypng
polymorphic:django_polymorphic polymorphic:django_polymorphic
portalocker:ConcurrentLogHandler
postmark:python_postmark postmark:python_postmark
powerprompt:bash_powerprompt powerprompt:bash_powerprompt
prefetch:django-prefetch
printList:AndrewList printList:AndrewList
progressbar:progressbar2 progressbar:progressbar2
progressbar:progressbar33 progressbar:progressbar33
@ -946,9 +967,18 @@ pyrimaa:AEI
pysideuic:PySide pysideuic:PySide
pysqlite2:adhocracy_pysqlite pysqlite2:adhocracy_pysqlite
pysqlite2:pysqlite pysqlite2:pysqlite
pysynth_b:PySynth
pysynth_beeper:PySynth
pysynth_c:PySynth
pysynth_d:PySynth
pysynth_e:PySynth
pysynth_p:PySynth
pysynth_s:PySynth
pysynth_samp:PySynth
pythongettext:python_gettext pythongettext:python_gettext
pythonjsonlogger:python_json_logger pythonjsonlogger:python_json_logger
pyutilib:PyUtilib pyutilib:PyUtilib
pywintypes:pywin32
pyximport:Cython pyximport:Cython
qs:qserve qs:qserve
quadtree:python_geohash quadtree:python_geohash
@ -980,6 +1010,7 @@ ruamel:ruamel.base
s2repoze:pysaml2 s2repoze:pysaml2
saga:saga_python saga:saga_python
saml2:pysaml2 saml2:pysaml2
samtranslator:aws-sam-translator
sass:libsass sass:libsass
sassc:libsass sassc:libsass
sasstests:libsass sasstests:libsass
@ -1003,10 +1034,12 @@ singleton:pysingleton
sittercommon:cerebrod sittercommon:cerebrod
skbio:scikit_bio skbio:scikit_bio
sklearn:scikit_learn sklearn:scikit_learn
slack:slackclient
slugify:unicode_slugify slugify:unicode_slugify
slugify:python-slugify
smarkets:smk_python_sdk smarkets:smk_python_sdk
snappy:ctypes_snappy snappy:ctypes_snappy
socketio:gevent_socketio socketio:python-socketio
socketserver:pies2overrides socketserver:pies2overrides
sockjs:sockjs_tornado sockjs:sockjs_tornado
socks:SocksiPy_branch socks:SocksiPy_branch
@ -1035,6 +1068,7 @@ tasksitter:cerebrod
tastypie:django_tastypie tastypie:django_tastypie
teamcity:teamcity_messages teamcity:teamcity_messages
telebot:pyTelegramBotAPI telebot:pyTelegramBotAPI
telegram:python-telegram-bot
tempita:Tempita tempita:Tempita
tenjin:Tenjin tenjin:Tenjin
termstyle:python_termstyle termstyle:python_termstyle
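Each mapping entry above follows the form `import_name:package_name`, pairing an importable module with the PyPI distribution that provides it. A minimal sketch of parsing that format (this is illustrative, not pipreqs' actual loader; the sample entries are taken from the diff above):

```python
# Parse "import_name:package_name" mapping lines into a dict.
sample = """\
dotenv:python-dotenv
cv2:opencv-python
pywintypes:pywin32
"""

mapping = dict(
    line.split(":", 1)           # split only on the first colon
    for line in sample.splitlines()
    if line and not line.startswith("#")
)

print(mapping["cv2"])  # opencv-python
```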

pipreqs/pipreqs.py (mode changed from executable to normal file, 565 lines)

@ -1,29 +1,48 @@
 #!/usr/bin/env python
-# -*- coding: utf-8 -*-
 """pipreqs - Generate pip requirements.txt file based on imports
 Usage:
-    pipreqs [options] <path>
+    pipreqs [options] [<path>]
+
+Arguments:
+    <path>                The path to the directory containing the application
+                          files for which a requirements file should be
+                          generated (defaults to the current working
+                          directory).
 Options:
-    --use-local           Use ONLY local package info instead of querying PyPI
-    --pypi-server         Use custom PyPi server
-    --proxy               Use Proxy, parameter will be passed to requests library. You can also just set the
-                          environments parameter in your terminal:
+    --use-local           Use ONLY local package info instead of querying PyPI.
+    --pypi-server <url>   Use custom PyPi server.
+    --proxy <url>         Use Proxy, parameter will be passed to requests
+                          library. You can also just set the environments
+                          parameter in your terminal:
                           $ export HTTP_PROXY="http://10.10.1.10:3128"
                           $ export HTTPS_PROXY="https://10.10.1.10:1080"
     --debug               Print debug information
     --ignore <dirs>...    Ignore extra directories, each separated by a comma
+    --ignore-errors       Ignore errors while scanning files
+    --no-follow-links     Do not follow symbolic links in the project
     --encoding <charset>  Use encoding parameter for file open
     --savepath <file>     Save the list of requirements in the given file
+    --print               Output the list of requirements in the standard
+                          output
     --force               Overwrite existing requirements.txt
+    --diff <file>         Compare modules in requirements.txt to project
+                          imports
+    --clean <file>        Clean up requirements.txt by removing modules
+                          that are not imported in project
+    --mode <scheme>       Enables dynamic versioning with <compat>,
+                          <gt> or <no-pin> schemes.
+                          <compat> | e.g. Flask~=1.1.2
+                          <gt>     | e.g. Flask>=1.1.2
+                          <no-pin> | e.g. Flask
+    --scan-notebooks      Look for imports in jupyter notebook files.
 """
-from __future__ import print_function, absolute_import
+from contextlib import contextmanager
 import os
 import sys
 import re
 import logging
-import codecs
 import ast
 import traceback
 from docopt import docopt
@@ -33,23 +52,67 @@ from yarg.exceptions import HTTPError
from pipreqs import __version__

REGEXP = [re.compile(r"^import (.+)$"), re.compile(r"^from ((?!\.+).*?) import (?:.*)$")]
DEFAULT_EXTENSIONS = [".py", ".pyw"]

scan_noteboooks = False
class NbconvertNotInstalled(ImportError):
    default_message = (
        "In order to scan jupyter notebooks, please install the nbconvert and ipython libraries"
    )

    def __init__(self, message=default_message):
        super().__init__(message)
@contextmanager
def _open(filename=None, mode="r"):
    """Open a file or ``sys.stdout`` depending on the provided filename.

    Args:
        filename (str): The path to the file that should be opened. If
            ``None`` or ``'-'``, ``sys.stdout`` or ``sys.stdin`` is
            returned depending on the desired mode. Defaults to ``None``.
        mode (str): The mode that should be used to open the file.

    Yields:
        A file handle.
    """
    if not filename or filename == "-":
        if not mode or "r" in mode:
            file = sys.stdin
        elif "w" in mode:
            file = sys.stdout
        else:
            raise ValueError("Invalid mode for file: {}".format(mode))
    else:
        file = open(filename, mode)

    try:
        yield file
    finally:
        if file not in (sys.stdin, sys.stdout):
            file.close()
def get_all_imports(path, encoding="utf-8", extra_ignore_dirs=None, follow_links=True, ignore_errors=False):
    imports = set()
    raw_imports = set()
    candidates = []
    ignore_dirs = [
        ".hg",
        ".svn",
        ".git",
        ".tox",
        "__pycache__",
        "env",
        "venv",
        ".venv",
        ".ipynb_checkpoints",
    ]

    if extra_ignore_dirs:
        ignore_dirs_parsed = []
@@ -57,17 +120,23 @@ def get_all_imports(path, encoding=None, extra_ignore_dirs=None):
            ignore_dirs_parsed.append(os.path.basename(os.path.realpath(e)))
        ignore_dirs.extend(ignore_dirs_parsed)

    extensions = get_file_extensions()

    walk = os.walk(path, followlinks=follow_links)
    for root, dirs, files in walk:
        dirs[:] = [d for d in dirs if d not in ignore_dirs]

        candidates.append(os.path.basename(root))
        py_files = [file for file in files if file_ext_is_allowed(file, DEFAULT_EXTENSIONS)]
        candidates.extend([os.path.splitext(filename)[0] for filename in py_files])
        files = [fn for fn in files if file_ext_is_allowed(fn, extensions)]

        for file_name in files:
            file_name = os.path.join(root, file_name)
            try:
                contents = read_file_content(file_name, encoding)
                tree = ast.parse(contents)
                for node in ast.walk(tree):
                    if isinstance(node, ast.Import):
@@ -77,45 +146,85 @@
                        raw_imports.add(node.module)
            except Exception as exc:
                if ignore_errors:
                    traceback.print_exc()
                    logging.warn("Failed on file: %s" % file_name)
                    continue
                else:
                    logging.error("Failed on file: %s" % file_name)
                    raise exc

    # Clean up imports
    for name in [n for n in raw_imports if n]:
        # Sanity check: Name could have been None if the import
        # statement was as ``from . import X``
        # Cleanup: We only want the first part of the import.
        # Ex: from django.conf --> django.conf. But we only want django
        # as an import.
        cleaned_name, _, _ = name.partition(".")
        imports.add(cleaned_name)

    packages = imports - (set(candidates) & imports)
    logging.debug("Found packages: {0}".format(packages))

    with open(join("stdlib"), "r") as f:
        data = {x.strip() for x in f}

    return list(packages - data)
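The heart of `get_all_imports` is the `ast` walk. A minimal standalone sketch (parsing a source string instead of walking a directory tree, with a hypothetical helper name) looks like this:

```python
import ast


def top_level_imports(source):
    """Collect root module names from import statements in `source`."""
    raw = set()
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for subnode in node.names:
                raw.add(subnode.name)
        elif isinstance(node, ast.ImportFrom) and node.module:
            # relative imports ("from . import x") have module=None
            raw.add(node.module)
    # keep only the first dotted component, e.g. django.conf -> django
    return {name.partition(".")[0] for name in raw}


code = "import os\nfrom django.conf import settings\nfrom . import sibling\n"
assert top_level_imports(code) == {"os", "django"}
```

The real function then subtracts local module candidates and the bundled `stdlib` list from this set.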
def get_file_extensions():
    return DEFAULT_EXTENSIONS + [".ipynb"] if scan_noteboooks else DEFAULT_EXTENSIONS
def read_file_content(file_name: str, encoding="utf-8"):
    if file_ext_is_allowed(file_name, DEFAULT_EXTENSIONS):
        with open(file_name, "r", encoding=encoding) as f:
            contents = f.read()
    elif file_ext_is_allowed(file_name, [".ipynb"]) and scan_noteboooks:
        contents = ipynb_2_py(file_name, encoding=encoding)
    return contents


def file_ext_is_allowed(file_name, acceptable):
    return os.path.splitext(file_name)[1] in acceptable
def ipynb_2_py(file_name, encoding="utf-8"):
    """
    Args:
        file_name (str): notebook file path to parse as python script
        encoding (str): encoding of file

    Returns:
        str: parsed string
    """
    exporter = PythonExporter()
    (body, _) = exporter.from_filename(file_name)

    return body.encode(encoding)
def generate_requirements_file(path, imports, symbol):
    with _open(path, "w") as out_file:
        logging.debug(
            "Writing {num} requirements: {imports} to {file}".format(
                num=len(imports), file=path, imports=", ".join([x["name"] for x in imports])
            )
        )
        fmt = "{name}" + symbol + "{version}"
        out_file.write(
            "\n".join(
                fmt.format(**item) if item["version"] else "{name}".format(**item)
                for item in imports
            )
            + "\n"
        )
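The version-formatting logic can be checked in isolation. This sketch (a hypothetical `format_requirements` helper, not part of pipreqs) assumes the same `{name}{symbol}{version}` scheme and the fallback to a bare name when no version is known.

```python
def format_requirements(imports, symbol="=="):
    # same scheme as generate_requirements_file above
    fmt = "{name}" + symbol + "{version}"
    return "\n".join(
        fmt.format(**item) if item["version"] else "{name}".format(**item)
        for item in imports
    ) + "\n"


imports = [
    {"name": "Flask", "version": "1.1.2"},
    {"name": "yarg", "version": None},  # no version known -> bare name
]
assert format_requirements(imports) == "Flask==1.1.2\nyarg\n"
```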
def output_requirements(imports, symbol):
    generate_requirements_file("-", imports, symbol)
def get_imports_info(imports, pypi_server="https://pypi.python.org/pypi/", proxy=None):
@@ -123,68 +232,115 @@ def get_imports_info(imports, pypi_server="https://pypi.python.org/pypi/", proxy
    for item in imports:
        try:
            logging.warning(
                'Import named "%s" not found locally. ' "Trying to resolve it at the PyPI server.",
                item,
            )
            response = requests.get("{0}{1}/json".format(pypi_server, item), proxies=proxy)
            if response.status_code == 200:
                if hasattr(response.content, "decode"):
                    data = json2package(response.content.decode())
                else:
                    data = json2package(response.content)
            elif response.status_code >= 300:
                raise HTTPError(status_code=response.status_code, reason=response.reason)
        except HTTPError:
            logging.warning('Package "%s" does not exist or network problems', item)
            continue
        logging.warning(
            'Import named "%s" was resolved to "%s:%s" package (%s).\n'
            "Please, verify manually the final list of requirements.txt "
            "to avoid possible dependency confusions.",
            item,
            data.name,
            data.latest_release_id,
            data.pypi_url,
        )
        result.append({"name": item, "version": data.latest_release_id})
    return result
def get_locally_installed_packages(encoding="utf-8"):
    packages = []
    ignore = ["tests", "_tests", "egg", "EGG", "info"]
    for path in sys.path:
        for root, dirs, files in os.walk(path):
            for item in files:
                if "top_level" in item:
                    item = os.path.join(root, item)
                    with open(item, "r", encoding=encoding) as f:
                        package = root.split(os.sep)[-1].split("-")
                        try:
                            top_level_modules = f.read().strip().split("\n")
                        except:  # NOQA
                            # TODO: What errors do we intend to suppress here?
                            continue

                    # filter off explicitly ignored top-level modules
                    # such as test, egg, etc.
                    filtered_top_level_modules = list()

                    for module in top_level_modules:
                        if (module not in ignore) and (package[0] not in ignore):
                            # append exported top level modules to the list
                            filtered_top_level_modules.append(module)

                    version = None
                    if len(package) > 1:
                        version = package[1].replace(".dist", "").replace(".egg", "")

                    # append package: top_level_modules pairs
                    # instead of top_level_module: package pairs
                    packages.append(
                        {
                            "name": package[0],
                            "version": version,
                            "exports": filtered_top_level_modules,
                        }
                    )
    return packages
def get_import_local(imports, encoding="utf-8"):
    local = get_locally_installed_packages()
    result = []
    for item in imports:
        # search through local packages
        for package in local:
            # if the candidate import name matches an export name
            # or equals the package name, append it to the result
            if item in package["exports"] or item == package["name"]:
                result.append(package)

    # removing duplicates of package/version
    # had to use second method instead of the previous one,
    # because we have a list in the 'exports' field
    # https://stackoverflow.com/questions/9427163/remove-duplicate-dict-in-list-in-python
    result_unique = [i for n, i in enumerate(result) if i not in result[n + 1:]]

    return result_unique
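The duplicate-removal idiom at the end is needed because the dicts carry a list in `exports` and are therefore unhashable, so `set()`-based deduplication is unavailable. It can be verified on its own (hypothetical package records):

```python
# hypothetical local-package records; the "exports" list makes the
# dicts unhashable, so set() cannot be used to deduplicate them
result = [
    {"name": "requests", "version": "2.31.0", "exports": ["requests"]},
    {"name": "docopt", "version": "0.6.2", "exports": ["docopt"]},
    {"name": "requests", "version": "2.31.0", "exports": ["requests"]},
]

# an element is kept only if no identical dict follows it, so the LAST
# copy of each duplicate survives (order of distinct items is preserved)
result_unique = [i for n, i in enumerate(result) if i not in result[n + 1:]]

assert [p["name"] for p in result_unique] == ["docopt", "requests"]
```

Note the quadratic cost of `i not in result[n + 1:]`; for the small lists involved here that is harmless.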
def get_pkg_names(pkgs):
    """Get PyPI package names from a list of imports.

    Args:
        pkgs (List[str]): List of import names.

    Returns:
        List[str]: The corresponding PyPI package names.
    """
    result = set()
    with open(join("mapping"), "r") as f:
        data = dict(x.strip().split(":") for x in f)
    for pkg in pkgs:
        # Look up the mapped requirement. If a mapping isn't found,
        # simply use the package name.
        result.add(data.get(pkg, pkg))
    # Return a sorted list for backward compatibility.
    return sorted(result, key=lambda s: s.lower())
def get_name_without_alias(name):
@@ -192,23 +348,197 @@
    match = REGEXP[0].match(name.strip())
    if match:
        name = match.groups(0)[0]
    return name.partition(" as ")[0].partition(".")[0].strip()
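A standalone sketch of the alias-stripping behavior; the `if "import " in name:` guard is assumed from context (the hunk above starts inside the function body), so treat this as an approximation rather than the packaged function:

```python
import re

REGEXP = [
    re.compile(r"^import (.+)$"),
    re.compile(r"^from ((?!\.+).*?) import (?:.*)$"),
]


def get_name_without_alias(name):
    # guard assumed from context: only statement-like strings are matched
    if "import " in name:
        match = REGEXP[0].match(name.strip())
        if match:
            name = match.groups(0)[0]
    # drop "as ..." aliases and keep the root dotted component
    return name.partition(" as ")[0].partition(".")[0].strip()


assert get_name_without_alias("import pandas as pd") == "pandas"
assert get_name_without_alias("numpy.linalg") == "numpy"
```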
def join(f):
    return os.path.join(os.path.dirname(__file__), f)
def parse_requirements(file_):
    """Parse a requirements formatted file.

    Traverse a string until a delimiter is detected, then split at said
    delimiter, get module name by element index, create a dict consisting of
    module:version, and add dict to list of parsed modules.

    If the file `file_` is not found in the system, the program will print a
    helpful message and end its execution immediately.

    Args:
        file_: File to parse.

    Raises:
        OSError: If there's any issue accessing the file.

    Returns:
        list: The contents of the file, excluding comments.
    """
    modules = []
    # For the dependency identifier specification, see
    # https://www.python.org/dev/peps/pep-0508/#complete-grammar
    delim = ["<", ">", "=", "!", "~"]

    try:
        f = open(file_, "r")
    except FileNotFoundError:
        print(f"File {file_} was not found. Please, fix it and run again.")
        sys.exit(1)
    except OSError as error:
        logging.error(f"There was an error opening the file {file_}: {str(error)}")
        raise error
    else:
        try:
            data = [x.strip() for x in f.readlines() if x != "\n"]
        finally:
            f.close()

    data = [x for x in data if x[0].isalpha()]

    for x in data:
        # Check for modules w/o a specifier.
        if not any([y in x for y in delim]):
            modules.append({"name": x, "version": None})
        for y in x:
            if y in delim:
                module = x.split(y)
                module_name = module[0]
                module_version = module[-1].replace("=", "")
                module = {"name": module_name, "version": module_version}

                if module not in modules:
                    modules.append(module)

                break

    return modules
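The parsing rule condenses to: split at the first PEP 508 comparison character and strip stray `=` from the remainder. A single-line sketch (hypothetical helper name) makes the behavior concrete:

```python
def parse_requirement_line(line):
    # same delimiter set as parse_requirements above
    delim = ["<", ">", "=", "!", "~"]
    if not any(ch in line for ch in delim):
        return {"name": line, "version": None}
    for ch in line:
        if ch in delim:
            parts = line.split(ch)
            # "numpy>=1.2.3" splits at ">" into ["numpy", "=1.2.3"];
            # the replace("=", "") cleans the leftover "="
            return {"name": parts[0], "version": parts[-1].replace("=", "")}


assert parse_requirement_line("pandas==2.0.0") == {"name": "pandas", "version": "2.0.0"}
assert parse_requirement_line("numpy>=1.2.3") == {"name": "numpy", "version": "1.2.3"}
assert parse_requirement_line("torch") == {"name": "torch", "version": None}
```

This matches the `tests/_data/imports.txt` fixture added below and the `parsed_packages` expectations in the test suite.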
def compare_modules(file_, imports):
    """Compare modules in a file to imported modules in a project.

    Args:
        file_ (str): File to parse for modules to be compared.
        imports (tuple): Modules being imported in the project.

    Returns:
        set: The modules not imported in the project, but do exist in the
            specified file.
    """
    modules = parse_requirements(file_)

    imports = [imports[i]["name"] for i in range(len(imports))]
    modules = [modules[i]["name"] for i in range(len(modules))]
    modules_not_imported = set(modules) - set(imports)

    return modules_not_imported
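The comparison is just a set difference over the `name` fields. For example (hypothetical requirement and import lists):

```python
# names parsed from a requirements file vs. names imported in the project
requirements = [{"name": "pandas"}, {"name": "tensorflow"}, {"name": "torch"}]
project_imports = [{"name": "pandas"}, {"name": "numpy"}]

modules_not_imported = (
    {m["name"] for m in requirements} - {i["name"] for i in project_imports}
)
assert modules_not_imported == {"tensorflow", "torch"}
```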
def diff(file_, imports):
    """Display the difference between modules in a file and imported modules."""  # NOQA
    modules_not_imported = compare_modules(file_, imports)

    logging.info(
        "The following modules are in {} but do not seem to be imported: "
        "{}".format(file_, ", ".join(x for x in modules_not_imported))
    )
def clean(file_, imports):
    """Remove modules that aren't imported in project from file."""
    modules_not_imported = compare_modules(file_, imports)

    if len(modules_not_imported) == 0:
        logging.info("Nothing to clean in " + file_)
        return

    re_remove = re.compile("|".join(modules_not_imported))
    to_write = []

    try:
        f = open(file_, "r+")
    except OSError:
        logging.error("Failed on file: {}".format(file_))
        raise
    else:
        try:
            for i in f.readlines():
                if re_remove.match(i) is None:
                    to_write.append(i)
            f.seek(0)
            f.truncate()

            for i in to_write:
                f.write(i)
        finally:
            f.close()

    logging.info("Successfully cleaned up requirements in " + file_)
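`clean` filters lines with a single alternation regex built from the unused names. The matching step alone, on hypothetical file contents:

```python
import re

modules_not_imported = {"tensorflow", "torch"}
re_remove = re.compile("|".join(modules_not_imported))

lines = ["pandas==2.0.0\n", "tensorflow\n", "torch<4.0.0\n"]
# re.match anchors at the start of the line, so only lines that BEGIN
# with one of the unused names are dropped
kept = [line for line in lines if re_remove.match(line) is None]
assert kept == ["pandas==2.0.0\n"]
```

One sharp edge worth noting: a bare prefix match means `torch` would also match a line starting with `torchvision`.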
def dynamic_versioning(scheme, imports):
    """Enables dynamic versioning with <compat>, <gt> or <no-pin> schemes."""
    if scheme == "no-pin":
        imports = [{"name": item["name"], "version": ""} for item in imports]
        symbol = ""
    elif scheme == "gt":
        symbol = ">="
    elif scheme == "compat":
        symbol = "~="
    return imports, symbol
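The three schemes map straight onto pin symbols, matching the examples in the docstring at the top (`Flask~=1.1.2`, `Flask>=1.1.2`, `Flask`). A standalone copy demonstrates the mapping:

```python
def dynamic_versioning(scheme, imports):
    # copy of the function above, runnable on its own
    if scheme == "no-pin":
        imports = [{"name": item["name"], "version": ""} for item in imports]
        symbol = ""
    elif scheme == "gt":
        symbol = ">="
    elif scheme == "compat":
        symbol = "~="
    return imports, symbol


imports = [{"name": "Flask", "version": "1.1.2"}]
assert dynamic_versioning("compat", imports)[1] == "~="
assert dynamic_versioning("gt", imports)[1] == ">="
assert dynamic_versioning("no-pin", imports) == ([{"name": "Flask", "version": ""}], "")
```

`init` below then feeds the returned `symbol` into `generate_requirements_file`, which is why "no-pin" also blanks out the versions.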
def handle_scan_noteboooks():
    if not scan_noteboooks:
        logging.info("Not scanning for jupyter notebooks.")
        return

    try:
        global PythonExporter
        from nbconvert import PythonExporter
    except ImportError:
        raise NbconvertNotInstalled()
def init(args):
    global scan_noteboooks
    encoding = args.get("--encoding")
    extra_ignore_dirs = args.get("--ignore")
    follow_links = not args.get("--no-follow-links")
    ignore_errors = args.get("--ignore-errors")
    scan_noteboooks = args.get("--scan-notebooks", False)
    handle_scan_noteboooks()

    input_path = args["<path>"]

    if encoding is None:
        encoding = "utf-8"
    if input_path is None:
        input_path = os.path.abspath(os.curdir)

    if extra_ignore_dirs:
        extra_ignore_dirs = extra_ignore_dirs.split(",")

    path = (
        args["--savepath"] if args["--savepath"] else os.path.join(input_path, "requirements.txt")
    )
    if (
        not args["--print"]
        and not args["--savepath"]
        and not args["--force"]
        and os.path.exists(path)
    ):
        logging.warning("requirements.txt already exists, use --force to overwrite it")
        return

    candidates = get_all_imports(
        input_path,
        encoding=encoding,
        extra_ignore_dirs=extra_ignore_dirs,
        follow_links=follow_links,
        ignore_errors=ignore_errors,
    )
    candidates = get_pkg_names(candidates)
    logging.debug("Found imports: " + ", ".join(candidates))
    pypi_server = "https://pypi.python.org/pypi/"
@@ -217,37 +547,66 @@ def init(args):
        pypi_server = args["--pypi-server"]

    if args["--proxy"]:
        proxy = {"http": args["--proxy"], "https": args["--proxy"]}

    if args["--use-local"]:
        logging.debug("Getting package information ONLY from local installation.")
        imports = get_import_local(candidates, encoding=encoding)
    else:
        logging.debug("Getting packages information from Local/PyPI")
        local = get_import_local(candidates, encoding=encoding)

        # a candidate goes into `difference` only if it is neither in
        # the list of modules exported by locally installed packages
        # nor equal to a local package name
        difference = [
            x
            for x in candidates
            if
            # aggregate all export lists into one flattened list
            # and check if the candidate is in the exports
            x.lower() not in [y for x in local for y in x["exports"]] and
            # check if the candidate is a local package name
            x.lower() not in [x["name"] for x in local]
        ]

        imports = local + get_imports_info(difference, proxy=proxy, pypi_server=pypi_server)
    # sort imports based on lowercase name of package, similar to `pip freeze`.
    imports = sorted(imports, key=lambda x: x["name"].lower())

    if args["--diff"]:
        diff(args["--diff"], imports)
        return

    if args["--clean"]:
        clean(args["--clean"], imports)
        return

    if args["--mode"]:
        scheme = args.get("--mode")
        if scheme in ["compat", "gt", "no-pin"]:
            imports, symbol = dynamic_versioning(scheme, imports)
        else:
            raise ValueError(
                "Invalid argument for mode flag, use 'compat', 'gt' or 'no-pin' instead"
            )
    else:
        symbol = "=="

    if args["--print"]:
        output_requirements(imports, symbol)
        logging.info("Successfully output requirements")
    else:
        generate_requirements_file(path, imports, symbol)
        logging.info("Successfully saved requirements file in " + path)
def main():  # pragma: no cover
    args = docopt(__doc__, version=__version__)
    log_level = logging.DEBUG if args["--debug"] else logging.INFO
    logging.basicConfig(level=log_level, format="%(levelname)s: %(message)s")

    try:
        init(args)
@@ -255,5 +614,5 @@ def main():  # pragma: no cover
        sys.exit(0)


if __name__ == "__main__":
    main()  # pragma: no cover

File diff suppressed because it is too large

2021  poetry.lock  (generated, new file)
File diff suppressed because it is too large

2  poetry.toml  (new file)

@@ -0,0 +1,2 @@
[virtualenvs]
prefer-active-python = true

53  pyproject.toml  (new file)
@@ -0,0 +1,53 @@
[project]
name = "pipreqs"
version = "0.5.0"
description = "Pip requirements.txt generator based on imports in project"
authors = [
{ name = "Vadim Kravcenko", email = "vadim.kravcenko@gmail.com" }
]
maintainers = [
{name = "Jonas Eschle", email = "jonas.eschle@gmail.com"}
]
license = "Apache-2.0"
readme = "README.rst"
packages = [{ include = "pipreqs" }]
repository = "https://github.com/bndr/pipreqs"
keywords = ["pip", "requirements", "imports"]
classifiers = [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
"Natural Language :: English",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
]
requires-python = ">=3.9, <3.14"
dependencies = [
"yarg>=0.1.9",
"docopt>=0.6.2",
"nbconvert>=7.11.0",
"ipython>=8.12.3",
]
[project.optional-dependencies]
dev = [
"flake8>=6.1.0",
"tox>=4.11.3",
"coverage>=7.3.2",
"sphinx>=7.2.6;python_version>='3.9'",
]
[tool.poetry.group.dev.dependencies] # for legacy usage
flake8 = "^6.1.0"
tox = "^4.11.3"
coverage = "^7.3.2"
sphinx = { version = "^7.2.6", python = ">=3.9" }
[project.scripts]
pipreqs = "pipreqs.pipreqs:main"
[build-system]
requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"

(deleted file)
@@ -1,3 +0,0 @@
wheel==0.23.0
Yarg==0.1.9
docopt==0.6.2

(deleted file)
@@ -1,2 +0,0 @@
[wheel]
universal = 1

(deleted file)
@@ -1,60 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

try:
    from setuptools import setup
except ImportError:
    from distutils.core import setup

from pipreqs import __version__

with open('README.rst') as readme_file:
    readme = readme_file.read()

with open('HISTORY.rst') as history_file:
    history = history_file.read().replace('.. :changelog:', '')

requirements = [
    'docopt', 'yarg'
]

setup(
    name='pipreqs',
    version=__version__,
    description="Pip requirements.txt generator based on imports in project",
    long_description=readme + '\n\n' + history,
    author="Vadim Kravcenko",
    author_email='vadim.kravcenko@gmail.com',
    url='https://github.com/bndr/pipreqs',
    packages=[
        'pipreqs',
    ],
    package_dir={'pipreqs': 'pipreqs'},
    include_package_data=True,
    package_data={'': ['stdlib', 'mapping']},
    install_requires=requirements,
    license="Apache License",
    zip_safe=False,
    keywords='pip requirements imports',
    classifiers=[
        'Development Status :: 4 - Beta',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
        'Natural Language :: English',
        'Programming Language :: Python :: 2',
        'Programming Language :: Python :: 2.6',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.3',
        'Programming Language :: Python :: 3.4',
    ],
    test_suite='tests',
    entry_points={
        'console_scripts': [
            'pipreqs=pipreqs.pipreqs:main',
        ],
    },
)

0  tests/_data/empty.txt  (new file)

3  tests/_data/imports.txt  (new file)
@@ -0,0 +1,3 @@
pandas==2.0.0
numpy>=1.2.3
torch<4.0.0

@@ -0,0 +1,4 @@
numpy
pandas==2.0.0
tensorflow
torch<4.0.0

@@ -0,0 +1,3 @@
pandas
tensorflow
torch

@@ -31,6 +31,10 @@ from pyflakes.test.test_imports import Test as TestImports
# Nose
from nose.importer import Importer, add_path, remove_path # loader.py

# see issue #88
import analytics
import flask_seasurf

import atexit
from __future__ import print_function
from docopt import docopt

65  tests/_data_clean/test.py  (new file)
@@ -0,0 +1,65 @@
"""unused import"""
# pylint: disable=undefined-all-variable, import-error, no-absolute-import, too-few-public-methods, missing-docstring
import xml.etree # [unused-import]
import xml.sax # [unused-import]
import os.path as test # [unused-import]
from sys import argv as test2 # [unused-import]
from sys import flags # [unused-import]
# +1:[unused-import,unused-import]
from collections import deque, OrderedDict, Counter
# All imports above should be ignored
import requests # [unused-import]
# setuptools
import zipimport # command/easy_install.py
# twisted
from importlib import invalidate_caches # python/test/test_deprecate.py
# astroid
import zipimport # manager.py
# IPython
from importlib.machinery import all_suffixes # core/completerlib.py
import importlib # html/notebookapp.py
from IPython.utils.importstring import import_item # Many files
# pyflakes
# test/test_doctests.py
from pyflakes.test.test_imports import Test as TestImports
# Nose
from nose.importer import Importer, add_path, remove_path # loader.py
# see issue #88
import analytics
import flask_seasurf
import atexit
from __future__ import print_function
from docopt import docopt
import curses, logging, sqlite3
import logging
import os
import sqlite3
import time
import sys
import signal
import bs4
import nonexistendmodule
import boto as b, peewee as p
# import django
import flask.ext.somext # # #
# from sqlalchemy import model
try:
    import ujson as json
except ImportError:
    import json
import models
def main():
    pass
import after_method_is_valid_even_if_not_pep8

@@ -0,0 +1,65 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Magic test"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%automagic true"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ls -la\n",
"logstate"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ls -la"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%automagic false"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ls -la"
]
}
],
"metadata": {
"language_info": {
"name": "python"
},
"orig_nbformat": 4
},
"nbformat": 4,
"nbformat_minor": 2
}

@@ -0,0 +1,37 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Markdown test\n",
"import sklearn\n",
"\n",
"```python\n",
"import FastAPI\n",
"```"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.1"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

@@ -0,0 +1,102 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"\"\"\"unused import\"\"\"\n",
"# pylint: disable=undefined-all-variable, import-error, no-absolute-import, too-few-public-methods, missing-docstring\n",
"import xml.etree # [unused-import]\n",
"import xml.sax # [unused-import]\n",
"import os.path as test # [unused-import]\n",
"from sys import argv as test2 # [unused-import]\n",
"from sys import flags # [unused-import]\n",
"# +1:[unused-import,unused-import]\n",
"from collections import deque, OrderedDict, Counter\n",
"# All imports above should be ignored\n",
"import requests # [unused-import]\n",
"\n",
"# setuptools\n",
"import zipimport # command/easy_install.py\n",
"\n",
"# twisted\n",
"from importlib import invalidate_caches # python/test/test_deprecate.py\n",
"\n",
"# astroid\n",
"import zipimport # manager.py\n",
"# IPython\n",
"from importlib.machinery import all_suffixes # core/completerlib.py\n",
"import importlib # html/notebookapp.py\n",
"\n",
"from IPython.utils.importstring import import_item # Many files\n",
"\n",
"# pyflakes\n",
"# test/test_doctests.py\n",
"from pyflakes.test.test_imports import Test as TestImports\n",
"\n",
"# Nose\n",
"from nose.importer import Importer, add_path, remove_path # loader.py\n",
"\n",
"import atexit\n",
"from __future__ import print_function\n",
"from docopt import docopt\n",
"import curses, logging, sqlite3\n",
"import logging\n",
"import os\n",
"import sqlite3\n",
"import time\n",
"import sys\n",
"import signal\n",
"import bs4\n",
"import nonexistendmodule\n",
"import boto as b, peewee as p\n",
"# import django\n",
"import flask.ext.somext # # #\n",
"from sqlalchemy import model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" import ujson as json\n",
"except ImportError:\n",
" import json\n",
"\n",
"import models\n",
"\n",
"\n",
"def main():\n",
" pass\n",
"\n",
"import after_method_is_valid_even_if_not_pep8"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.1"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

5  tests/_data_pyw/py.py  (new file)
@@ -0,0 +1,5 @@
import airflow
import numpy
airflow
numpy

3  tests/_data_pyw/pyw.pyw  (new file)
@@ -0,0 +1,3 @@
import matplotlib
import pandas
import tensorflow

@@ -0,0 +1,34 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"cd ."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.4"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

611  tests/test_pipreqs.py  (mode changed: executable → normal)
@@ -8,35 +8,93 @@ test_pipreqs
Tests for `pipreqs` module.
"""

from io import StringIO
import logging
from unittest.mock import patch, Mock
import unittest
import os
import requests
import sys
import warnings

from pipreqs import pipreqs
class TestPipreqs(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Disable all logs for not spamming the terminal when running tests.
        logging.disable(logging.CRITICAL)

        # Specific warning not covered by the above command:
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="jupyter_client")

        cls.modules = [
            "flask",
            "requests",
            "sqlalchemy",
            "docopt",
            "boto",
            "ipython",
            "pyflakes",
            "nose",
            "analytics",
            "flask_seasurf",
            "peewee",
            "ujson",
            "nonexistendmodule",
            "bs4",
            "after_method_is_valid_even_if_not_pep8",
        ]
        cls.modules2 = ["beautifulsoup4"]
        cls.local = ["docopt", "requests", "nose", "pyflakes", "ipython"]
        cls.project = os.path.join(os.path.dirname(__file__), "_data")
        cls.empty_filepath = os.path.join(cls.project, "empty.txt")
        cls.imports_filepath = os.path.join(cls.project, "imports.txt")
        cls.imports_no_version_filepath = os.path.join(cls.project, "imports_no_version.txt")
        cls.imports_any_version_filepath = os.path.join(cls.project, "imports_any_version.txt")
        cls.non_existent_filepath = os.path.join(cls.project, "non_existent_file.txt")

        cls.parsed_packages = [
            {"name": "pandas", "version": "2.0.0"},
            {"name": "numpy", "version": "1.2.3"},
            {"name": "torch", "version": "4.0.0"},
        ]
        cls.parsed_packages_no_version = [
            {"name": "pandas", "version": None},
            {"name": "tensorflow", "version": None},
            {"name": "torch", "version": None},
        ]
        cls.parsed_packages_any_version = [
            {"name": "numpy", "version": None},
            {"name": "pandas", "version": "2.0.0"},
            {"name": "tensorflow", "version": None},
            {"name": "torch", "version": "4.0.0"},
        ]

        cls.project_clean = os.path.join(os.path.dirname(__file__), "_data_clean")
        cls.project_invalid = os.path.join(os.path.dirname(__file__), "_invalid_data")
        cls.project_with_ignore_directory = os.path.join(os.path.dirname(__file__), "_data_ignore")
        cls.project_with_duplicated_deps = os.path.join(os.path.dirname(__file__), "_data_duplicated_deps")
cls.requirements_path = os.path.join(cls.project, "requirements.txt")
cls.alt_requirement_path = os.path.join(cls.project, "requirements2.txt")
cls.non_existing_filepath = "xpto"
cls.project_with_notebooks = os.path.join(os.path.dirname(__file__), "_data_notebook")
cls.project_with_invalid_notebooks = os.path.join(os.path.dirname(__file__), "_invalid_data_notebook")
cls.python_path_same_imports = os.path.join(os.path.dirname(__file__), "_data/test.py")
cls.notebook_path_same_imports = os.path.join(os.path.dirname(__file__), "_data_notebook/test.ipynb")
def test_get_all_imports(self): def test_get_all_imports(self):
imports = pipreqs.get_all_imports(self.project) imports = pipreqs.get_all_imports(self.project)
self.assertEqual(len(imports), 13) self.assertEqual(len(imports), 15)
for item in imports: for item in imports:
self.assertTrue( self.assertTrue(item.lower() in self.modules, "Import is missing: " + item)
item.lower() in self.modules, "Import is missing: " + item)
self.assertFalse("time" in imports) self.assertFalse("time" in imports)
self.assertFalse("logging" in imports) self.assertFalse("logging" in imports)
self.assertFalse("curses" in imports) self.assertFalse("curses" in imports)
@@ -56,62 +114,119 @@ class TestPipreqs(unittest.TestCase):
        """
        self.assertRaises(SyntaxError, pipreqs.get_all_imports, self.project_invalid)

    def test_ignore_errors(self):
        """
        Test that invalid python files do not raise an exception when ignore_errors is True.
        """
        imports = pipreqs.get_all_imports(self.project_invalid, ignore_errors=True)
        self.assertEqual(len(imports), 0)

    def test_get_imports_info(self):
        """
        Test to see that the right number of packages were found on PyPI
        """
        imports = pipreqs.get_all_imports(self.project)
        with_info = pipreqs.get_imports_info(imports)
        # Should contain 13 items, i.e. everything except "nonexistendmodule"
        # and "after_method_is_valid_even_if_not_pep8"
        self.assertEqual(len(with_info), 13)
        for item in with_info:
            self.assertTrue(
                item["name"].lower() in self.modules,
                "Import item appears to be missing " + item["name"],
            )

    def test_get_pkg_names(self):
        pkgs = ["jury", "Japan", "camel", "Caroline"]
        actual_output = pipreqs.get_pkg_names(pkgs)
        expected_output = ["camel", "Caroline", "Japan", "jury"]
        self.assertEqual(actual_output, expected_output)

    def test_get_use_local_only(self):
        """
        Test without checking PyPI, check to see if names of local
        imports match what we expect

        - Note even though pyflakes isn't in requirements.txt,
        it's added to locals since it is a development dependency
        for testing
        """
        # should find only docopt and requests
        imports_with_info = pipreqs.get_import_local(self.modules)
        for item in imports_with_info:
            self.assertTrue(item["name"].lower() in self.local)

    def test_init(self):
        """
        Test that all modules we will test upon are in requirements file
        """
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        with open(self.requirements_path, "r") as f:
            data = f.read().lower()
            for item in self.modules[:-3]:
                self.assertTrue(item.lower() in data)

        # It should be sorted based on names.
        data = data.strip().split("\n")
        self.assertEqual(data, sorted(data))

    def test_init_local_only(self):
        """
        Test that items listed in requirements.txt are the same
        as locals expected
        """
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": True,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        with open(self.requirements_path, "r") as f:
            data = f.readlines()
            for item in data:
                item = item.strip().split("==")
                self.assertTrue(item[0].lower() in self.local)

    def test_init_savepath(self):
        """
        Test that we can save requirements.txt correctly
        to a different path
        """
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": self.alt_requirement_path,
                "--use-local": None,
                "--proxy": None,
                "--pypi-server": None,
                "--print": False,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.alt_requirement_path) == 1
        with open(self.alt_requirement_path, "r") as f:
            data = f.read().lower()
@@ -122,12 +237,25 @@ class TestPipreqs(unittest.TestCase):
    def test_init_overwrite(self):
        """
        Test that if requirements.txt exists, it will not be
        automatically overwritten
        """
        with open(self.requirements_path, "w") as f:
            f.write("should_not_be_overwritten")
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--use-local": None,
                "--force": None,
                "--proxy": None,
                "--pypi-server": None,
                "--print": False,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        with open(self.requirements_path, "r") as f:
            data = f.read().lower()
@@ -135,40 +263,421 @@ class TestPipreqs(unittest.TestCase):
    def test_get_import_name_without_alias(self):
        """
        Test that function get_name_without_alias()
        will work on a string.
        - Note: This isn't truly needed when pipreqs is walking
        the AST to find imports
        """
        import_name_with_alias = "requests as R"
        expected_import_name_without_alias = "requests"
        import_name_without_aliases = pipreqs.get_name_without_alias(import_name_with_alias)
        self.assertEqual(import_name_without_aliases, expected_import_name_without_alias)

    def test_custom_pypi_server(self):
        """
        Test that trying to get a custom pypi server fails correctly
        """
        self.assertRaises(
            requests.exceptions.MissingSchema,
            pipreqs.init,
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": "nonexistent",
            },
        )

    def test_ignored_directory(self):
        """
        Test --ignore parameter
        """
        pipreqs.init(
            {
                "<path>": self.project_with_ignore_directory,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--ignore": ".ignored_dir,.ignore_second",
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
            data = f.read().lower()
            for item in ["click", "getpass"]:
                self.assertFalse(item.lower() in data)
def test_dynamic_version_no_pin_scheme(self):
"""
Test --mode=no-pin
"""
pipreqs.init(
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": "no-pin",
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.read().lower()
for item in ["beautifulsoup4", "boto"]:
self.assertTrue(item.lower() in data)
def test_dynamic_version_gt_scheme(self):
"""
Test --mode=gt
"""
pipreqs.init(
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": "gt",
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.readlines()
for item in data:
symbol = ">="
message = "symbol is not in item"
self.assertIn(symbol, item, message)
def test_dynamic_version_compat_scheme(self):
"""
Test --mode=compat
"""
pipreqs.init(
{
"<path>": self.project_with_ignore_directory,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": "compat",
}
)
with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
data = f.readlines()
for item in data:
symbol = "~="
message = "symbol is not in item"
self.assertIn(symbol, item, message)
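The three `--mode` schemes exercised by the tests above differ only in the operator emitted when a version is known. A minimal sketch of that mapping (a hypothetical helper for illustration, not pipreqs' actual implementation):

```python
# Hypothetical helper, not pipreqs' actual code: renders one requirement
# line under each of the version schemes tested above.
def format_requirement(name, version, mode):
    if mode == "no-pin" or version is None:
        return name                       # no version constraint at all
    if mode == "gt":
        return name + ">=" + version      # lower bound only
    if mode == "compat":
        return name + "~=" + version      # compatible-release operator
    return name + "==" + version          # default: exact pin

print(format_requirement("boto", "2.49.0", "compat"))  # boto~=2.49.0
```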
def test_clean(self):
"""
Test --clean parameter
"""
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": self.requirements_path,
"--mode": "non-pin",
}
)
with open(self.requirements_path, "r") as f:
data = f.read().lower()
for item in self.modules[:-3]:
self.assertTrue(item.lower() in data)
def test_clean_with_imports_to_clean(self):
"""
Test --clean parameter when there are imports to clean
"""
cleaned_module = "sqlalchemy"
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
assert os.path.exists(self.requirements_path) == 1
pipreqs.init(
{
"<path>": self.project_clean,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": self.requirements_path,
"--mode": "non-pin",
}
)
with open(self.requirements_path, "r") as f:
data = f.read().lower()
self.assertTrue(cleaned_module not in data)
def test_compare_modules(self):
test_cases = [
(self.empty_filepath, [], set()), # both empty
(self.empty_filepath, self.parsed_packages, set()), # only file empty
(
self.imports_filepath,
[],
set(package["name"] for package in self.parsed_packages),
), # only imports empty
(self.imports_filepath, self.parsed_packages, set()), # no difference
(
self.imports_filepath,
self.parsed_packages[1:],
set([self.parsed_packages[0]["name"]]),
), # common case
]
for test_case in test_cases:
with self.subTest(test_case):
filename, imports, expected_modules_not_imported = test_case
modules_not_imported = pipreqs.compare_modules(filename, imports)
self.assertSetEqual(modules_not_imported, expected_modules_not_imported)
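The contract these sub-tests pin down: `compare_modules` returns the set of package names listed in the requirements file that do not appear in the detected imports. A hedged sketch of that set difference, taking the file's package names directly instead of a file path for brevity:

```python
# Hedged sketch of the behavior test_compare_modules verifies: the result
# is the set of names in the requirements file that were never imported.
# Takes the file's package names as a list rather than reading a path.
def compare_modules_sketch(file_packages, imports):
    imported_names = {pkg["name"] for pkg in imports}
    return {name for name in file_packages if name not in imported_names}

imports = [{"name": "numpy", "version": "1.2.3"}, {"name": "torch", "version": "4.0.0"}]
print(compare_modules_sketch(["pandas", "numpy", "torch"], imports))  # {'pandas'}
```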
def test_output_requirements(self):
"""
Test --print parameter
It should print to stdout the same content as requirements.txt
"""
capturedOutput = StringIO()
sys.stdout = capturedOutput
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": True,
"--use-local": None,
"--force": None,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
pipreqs.init(
{
"<path>": self.project,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
with open(self.requirements_path, "r") as f:
file_content = f.read().lower()
stdout_content = capturedOutput.getvalue().lower()
self.assertTrue(file_content == stdout_content)
def test_import_notebooks(self):
"""
Test the function get_all_imports() using .ipynb file
"""
self.mock_scan_notebooks()
imports = pipreqs.get_all_imports(self.project_with_notebooks)
for item in imports:
self.assertTrue(item.lower() in self.modules, "Import is missing: " + item)
not_desired_imports = ["time", "logging", "curses", "__future__", "django", "models", "FastAPI", "sklearn"]
for not_desired_import in not_desired_imports:
self.assertFalse(
not_desired_import in imports,
f"{not_desired_import} was imported, but it should not have been."
)
def test_invalid_notebook(self):
"""
Test that invalid notebook files cannot be imported.
"""
self.mock_scan_notebooks()
self.assertRaises(SyntaxError, pipreqs.get_all_imports, self.project_with_invalid_notebooks)
def test_ipynb_2_py(self):
"""
Test the function ipynb_2_py() which converts .ipynb file to .py format
"""
python_imports = pipreqs.get_all_imports(self.python_path_same_imports)
notebook_imports = pipreqs.get_all_imports(self.notebook_path_same_imports)
self.assertEqual(python_imports, notebook_imports)
def test_file_ext_is_allowed(self):
"""
Test the function file_ext_is_allowed()
"""
self.assertTrue(pipreqs.file_ext_is_allowed("main.py", [".py"]))
self.assertTrue(pipreqs.file_ext_is_allowed("main.py", [".py", ".ipynb"]))
self.assertFalse(pipreqs.file_ext_is_allowed("main.py", [".ipynb"]))
def test_parse_requirements(self):
"""
Test parse_requirements function
"""
test_cases = [
(self.empty_filepath, []), # empty file
(self.imports_filepath, self.parsed_packages), # imports with versions
(
self.imports_no_version_filepath,
self.parsed_packages_no_version,
), # imports without versions
(
self.imports_any_version_filepath,
self.parsed_packages_any_version,
), # imports with and without versions
]
for test in test_cases:
with self.subTest(test):
filename, expected_parsed_requirements = test
parsed_requirements = pipreqs.parse_requirements(filename)
self.assertListEqual(parsed_requirements, expected_parsed_requirements)
@patch("sys.exit")
def test_parse_requirements_handles_file_not_found(self, exit_mock):
captured_output = StringIO()
sys.stdout = captured_output
# This assertion is needed, because since "sys.exit" is mocked, the program won't end,
# and the code that is after the except block will be run
with self.assertRaises(UnboundLocalError):
pipreqs.parse_requirements(self.non_existing_filepath)
exit_mock.assert_called_once_with(1)
printed_text = captured_output.getvalue().strip()
sys.stdout = sys.__stdout__
self.assertEqual(printed_text, "File xpto was not found. Please, fix it and run again.")
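The parsing contract the two tests above pin down is simple: a `name==version` line yields a version string, a bare name yields `None`, and a missing file exits via `sys.exit(1)`. An illustrative sketch of just the parsing step (a hypothetical function operating on lines rather than a file path):

```python
# Illustrative sketch only; the real pipreqs.parse_requirements reads a
# file and exits when it is missing. This version takes lines directly.
def parse_requirements_sketch(lines):
    parsed = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        parsed.append({"name": name.strip(), "version": version.strip() or None})
    return parsed

print(parse_requirements_sketch(["pandas==2.0.0", "tensorflow"]))
```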
def test_ignore_notebooks(self):
"""
Test if notebooks are ignored when the scan-notebooks parameter is False
"""
notebook_requirement_path = os.path.join(self.project_with_notebooks, "requirements.txt")
pipreqs.init(
{
"<path>": self.project_with_notebooks,
"--savepath": None,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--print": False,
"--diff": None,
"--clean": None,
"--mode": None,
"--scan-notebooks": False,
}
)
assert os.path.exists(notebook_requirement_path) == 1
assert os.path.getsize(notebook_requirement_path) == 1 # file only has a "\n", meaning it's empty
def test_pipreqs_get_imports_from_pyw_file(self):
pyw_test_dirpath = os.path.join(os.path.dirname(__file__), "_data_pyw")
requirements_path = os.path.join(pyw_test_dirpath, "requirements.txt")
pipreqs.init(
{
"<path>": pyw_test_dirpath,
"--savepath": None,
"--print": False,
"--use-local": None,
"--force": True,
"--proxy": None,
"--pypi-server": None,
"--diff": None,
"--clean": None,
"--mode": None,
}
)
self.assertTrue(os.path.exists(requirements_path))
expected_imports = [
"airflow",
"matplotlib",
"numpy",
"pandas",
"tensorflow",
]
with open(requirements_path, "r") as f:
imports_data = f.read().lower()
for _import in expected_imports:
self.assertTrue(
_import.lower() in imports_data,
f"'{_import}' import was expected but not found.",
)
os.remove(requirements_path)
def mock_scan_notebooks(self):
pipreqs.scan_noteboooks = Mock(return_value=True)
pipreqs.handle_scan_noteboooks()
    def tearDown(self):
        """
@@ -184,5 +693,5 @@ class TestPipreqs(unittest.TestCase):
        pass


if __name__ == "__main__":
    unittest.main()

30
tox.ini

@@ -1,9 +1,31 @@
[tox]
isolated_build = true
envlist = py39, py310, py311, py312, py313, pypy3, flake8

[gh-actions]
python =
    3.9: py39
    3.10: py310
    3.11: py311
    3.12: py312
    3.13: py313
    pypy-3.10: pypy3

[testenv]
setenv =
    PYTHONPATH = {toxinidir}:{toxinidir}/pipreqs
commands =
    python -m unittest discover

[testenv:flake8]
deps = flake8
commands = flake8 pipreqs tests

[flake8]
exclude =
    tests/_data/
    tests/_data_clean/
    tests/_data_duplicated_deps/
    tests/_data_ignore/
    tests/_invalid_data/
max-line-length = 120