Mirror of https://github.com/bndr/pipreqs.git
Synced 2025-06-03 01:50:11 +00:00

Merge branch 'next' into readme/badge

This commit is contained in: commit eb37d03ff7

.github/workflows/tests.yml (vendored, 23 lines changed)
@@ -1,19 +1,23 @@
 name: Tests and Codecov
-on: pull_request
+on:
+  push:
+  pull_request:
+  workflow_dispatch:

 jobs:
   run_tests:
     runs-on: ubuntu-latest
     strategy:
       fail-fast: false
       matrix:
-        python-version: [3.7, 3.8, 3.9, pypy-3.7]
+        python-version: ['3.8', '3.9', '3.10', '3.11', '3.12', 'pypy-3.9-7.3.12']

     steps:
       - name: Checkout repository
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3

       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
         with:
           python-version: ${{ matrix.python-version }}

@@ -30,21 +34,22 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3

       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
-          pip install coverage docopt yarg requests
+          pip install poetry
+          poetry install --with dev

       - name: Calculate coverage
-        run: coverage run --source=pipreqs -m unittest discover
+        run: poetry run coverage run --source=pipreqs -m unittest discover

       - name: Create XML report
-        run: coverage xml
+        run: poetry run coverage xml

       - name: Upload coverage to Codecov
-        uses: codecov/codecov-action@v2
+        uses: codecov/codecov-action@v3
         with:
           files: coverage.xml
           fail_ci_if_error: true
.python-version (new file, 6 lines)

@@ -0,0 +1,6 @@
+3.12
+3.11
+3.10
+3.9
+3.8
+pypy3.9-7.3.12
.tool-versions (new file, 1 line)

@@ -0,0 +1 @@
+python 3.12 3.11 3.10 3.9 3.8 pypy3.9-7.3.12
@@ -61,12 +61,11 @@ Ready to contribute? Here's how to set up `pipreqs` for local development.

 2. Clone your fork locally::

     $ git clone git@github.com:your_name_here/pipreqs.git

-3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development::
-
-    $ mkvirtualenv pipreqs
-    $ cd pipreqs/
-    $ python setup.py develop
+3. Pipreqs is developed using Poetry. Refer to the `documentation <https://python-poetry.org/docs/>`_ to install Poetry in your local environment. Next, you should install pipreqs's dependencies::
+
+    $ poetry install --with dev

 4. Create a branch for local development::

@@ -76,11 +75,11 @@ Ready to contribute? Here's how to set up `pipreqs` for local development.

 5. When you're done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox::

-    $ flake8 pipreqs tests
-    $ python setup.py test
-    $ tox
-
-To get flake8 and tox, just pip install them into your virtualenv.
+    $ poetry run flake8 pipreqs tests
+    $ poetry run python -m unittest discover
+    $ poetry run tox
+
+To test all versions of Python using tox, you need to have them installed; two options are recommended for this: `pyenv` or `asdf`.

 6. Commit your changes and push your branch to GitHub::

@@ -99,7 +98,7 @@ Before you submit a pull request, check that it meets these guidelines:

 2. If the pull request adds functionality, the docs should be updated. Put
    your new functionality into a function with a docstring, and add the
    feature to the list in README.rst.
-3. The pull request should work for Python 3.7 to 3.11, and PyPy. Check
+3. The pull request should work for currently supported Python and PyPy versions. Check
    https://travis-ci.org/bndr/pipreqs/pull_requests and make sure that the
    tests pass for all supported Python versions.

@@ -108,4 +107,4 @@ Tips

 To run a subset of tests::

-    $ python -m unittest tests.test_pipreqs
+    $ poetry run python -m unittest tests.test_pipreqs
MANIFEST.in (deleted, 13 lines)

@@ -1,13 +0,0 @@
-include AUTHORS.rst
-include CONTRIBUTING.rst
-include HISTORY.rst
-include LICENSE
-include README.rst
-include pipreqs/stdlib
-include pipreqs/mapping
-
-recursive-include tests *
-recursive-exclude * __pycache__
-recursive-exclude * *.py[co]
-
-recursive-include docs *.rst conf.py Makefile make.bat stdlib mapping
Makefile (31 lines changed)

@@ -6,13 +6,14 @@ help:
 	@echo "clean-pyc - remove Python file artifacts"
 	@echo "clean-test - remove test and coverage artifacts"
 	@echo "lint - check style with flake8"
-	@echo "test - run tests quickly with the default Python"
+	@echo "test - run tests quickly using the default Python"
 	@echo "test-all - run tests on every Python version with tox"
 	@echo "coverage - check code coverage quickly with the default Python"
 	@echo "docs - generate Sphinx HTML documentation, including API docs"
-	@echo "release - package and upload a release"
-	@echo "dist - package"
-	@echo "install - install the package to the active Python's site-packages"
+	@echo "publish - package and upload a release"
+	@echo "publish-to-test - package and upload a release to test-pypi"
+	@echo "build - build the package"
+	@echo "install - install the dependencies into the Poetry virtual environment"

 clean: clean-build clean-pyc clean-test

@@ -35,14 +36,13 @@ clean-test:
 	rm -fr htmlcov/

 lint:
-	flake8 pipreqs tests
+	poetry run flake8 pipreqs tests

 test:
-	pip install -r requirements.txt
-	python setup.py test
+	poetry run python -m unittest discover

 test-all:
-	tox
+	poetry run tox

 coverage:
 	coverage run --source pipreqs setup.py test

@@ -58,13 +58,14 @@ docs:
 	$(MAKE) -C docs html
 	open docs/_build/html/index.html

-release: clean
-	python setup.py sdist bdist_wheel upload -r pypi
+publish: build
+	poetry publish

-dist: clean
-	python setup.py sdist
-	python setup.py bdist_wheel
-	ls -l dist
+publish-to-test: build
+	poetry publish --repository test-pypi

+build: clean
+	poetry build

 install: clean
-	python setup.py install
+	poetry install --with dev
README.rst (11 lines changed)

@@ -21,10 +21,18 @@
 Installation
 ------------

-::
+.. code-block:: sh

     pip install pipreqs

+Obs.: if you don't want support for jupyter notebooks, you can install pipreqs without the dependencies that give support to it.
+To do so, run:
+
+.. code-block:: sh
+
+    pip install --no-deps pipreqs
+    pip install yarg==0.1.9 docopt==0.6.2

 Usage
 -----

@@ -57,6 +65,7 @@ Usage
            <compat> | e.g. Flask~=1.1.2
            <gt> | e.g. Flask>=1.1.2
            <no-pin> | e.g. Flask
+    --scan-notebooks      Look for imports in jupyter notebook files.

 Example
 -------
@@ -35,6 +35,7 @@ Options:
            <compat> | e.g. Flask~=1.1.2
            <gt> | e.g. Flask>=1.1.2
            <no-pin> | e.g. Flask
+    --scan-notebooks      Look for imports in jupyter notebook files.
 """
 from contextlib import contextmanager
 import os

@@ -50,14 +51,23 @@ from yarg.exceptions import HTTPError

 from pipreqs import __version__

-REGEXP = [
-    re.compile(r'^import (.+)$'),
-    re.compile(r'^from ((?!\.+).*?) import (?:.*)$')
-]
+REGEXP = [re.compile(r"^import (.+)$"), re.compile(r"^from ((?!\.+).*?) import (?:.*)$")]
+DEFAULT_EXTENSIONS = [".py", ".pyw"]
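The two regular expressions above can be exercised in isolation. A minimal sketch (the patterns are copied from the diff; the sample lines are made up):

```python
import re

# The two patterns from the diff: bare "import X" and "from X import Y".
# The (?!\.+) lookahead in the second pattern rejects relative imports.
REGEXP = [
    re.compile(r"^import (.+)$"),
    re.compile(r"^from ((?!\.+).*?) import (?:.*)$"),
]

lines = ["import os", "from django.conf import settings", "from . import utils"]
matches = []
for line in lines:
    for pattern in REGEXP:
        m = pattern.match(line)
        if m:
            matches.append(m.group(1))

print(matches)  # ['os', 'django.conf'] -- the relative import is skipped
```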
+scan_noteboooks = False
+
+
+class NbconvertNotInstalled(ImportError):
+    default_message = (
+        "In order to scan jupyter notebooks, please install the nbconvert and ipython libraries"
+    )
+
+    def __init__(self, message=default_message):
+        super().__init__(message)


 @contextmanager
-def _open(filename=None, mode='r'):
+def _open(filename=None, mode="r"):
     """Open a file or ``sys.stdout`` depending on the provided filename.

     Args:

@@ -70,13 +80,13 @@ def _open(filename=None, mode='r'):
     A file handle.

     """
-    if not filename or filename == '-':
-        if not mode or 'r' in mode:
+    if not filename or filename == "-":
+        if not mode or "r" in mode:
             file = sys.stdin
-        elif 'w' in mode:
+        elif "w" in mode:
             file = sys.stdout
         else:
-            raise ValueError('Invalid mode for file: {}'.format(mode))
+            raise ValueError("Invalid mode for file: {}".format(mode))
     else:
         file = open(filename, mode)

@@ -87,13 +97,21 @@ def _open(filename=None, mode='r'):
         file.close()
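The ``_open`` helper above follows a common pattern: yield ``sys.stdin``/``sys.stdout`` when the filename is ``-`` or missing, a real file handle otherwise, and close only what was actually opened. A self-contained sketch of the same idea (the function name is illustrative, not the library's API):

```python
import sys
from contextlib import contextmanager


@contextmanager
def open_or_std(filename=None, mode="r"):
    """Yield stdin/stdout for '-' (or no filename), else a real file handle."""
    if not filename or filename == "-":
        if "r" in mode:
            fh = sys.stdin
        elif "w" in mode:
            fh = sys.stdout
        else:
            raise ValueError("Invalid mode for file: {}".format(mode))
    else:
        fh = open(filename, mode)
    try:
        yield fh
    finally:
        # Never close the interpreter's own standard streams.
        if fh not in (sys.stdin, sys.stdout):
            fh.close()


# Usage: writing to "-" goes to stdout and leaves it open afterwards.
with open_or_std("-", "w") as out:
    out.write("hello\n")
```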
-def get_all_imports(
-        path, encoding=None, extra_ignore_dirs=None, follow_links=True):
+def get_all_imports(path, encoding="utf-8", extra_ignore_dirs=None, follow_links=True):
     imports = set()
     raw_imports = set()
     candidates = []
     ignore_errors = False
-    ignore_dirs = [".hg", ".svn", ".git", ".tox", "__pycache__", "env", "venv"]
+    ignore_dirs = [
+        ".hg",
+        ".svn",
+        ".git",
+        ".tox",
+        "__pycache__",
+        "env",
+        "venv",
+        ".ipynb_checkpoints",
+    ]

     if extra_ignore_dirs:
         ignore_dirs_parsed = []
@@ -101,18 +119,22 @@ def get_all_imports(
             ignore_dirs_parsed.append(os.path.basename(os.path.realpath(e)))
         ignore_dirs.extend(ignore_dirs_parsed)

+    extensions = get_file_extensions()

     walk = os.walk(path, followlinks=follow_links)
     for root, dirs, files in walk:
         dirs[:] = [d for d in dirs if d not in ignore_dirs]

         candidates.append(os.path.basename(root))
-        files = [fn for fn in files if os.path.splitext(fn)[1] == ".py"]
+        py_files = [file for file in files if file_ext_is_allowed(file, DEFAULT_EXTENSIONS)]
+        candidates.extend([os.path.splitext(filename)[0] for filename in py_files])

-        candidates += [os.path.splitext(fn)[0] for fn in files]
+        files = [fn for fn in files if file_ext_is_allowed(fn, extensions)]

         for file_name in files:
             file_name = os.path.join(root, file_name)
-            with open(file_name, "r", encoding=encoding) as f:
-                contents = f.read()
+            contents = read_file_content(file_name, encoding)

             try:
                 tree = ast.parse(contents)
                 for node in ast.walk(tree):

@@ -137,11 +159,11 @@ def get_all_imports(
                     # Cleanup: We only want to first part of the import.
                     # Ex: from django.conf --> django.conf. But we only want django
                     # as an import.
-                    cleaned_name, _, _ = name.partition('.')
+                    cleaned_name, _, _ = name.partition(".")
                     imports.add(cleaned_name)

     packages = imports - (set(candidates) & imports)
-    logging.debug('Found packages: {0}'.format(packages))
+    logging.debug("Found packages: {0}".format(packages))

     with open(join("stdlib"), "r") as f:
         data = {x.strip() for x in f}
@@ -149,66 +171,95 @@ def get_all_imports(
     return list(packages - data)
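`get_all_imports` ultimately relies on `ast.parse` plus `ast.walk` to pull `Import`/`ImportFrom` nodes out of each file. A minimal, self-contained sketch of that core step (the function name is illustrative):

```python
import ast


def top_level_imports(source: str) -> set:
    """Return the first dotted component of every absolute import in `source`."""
    found = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                found.add(alias.name.partition(".")[0])
        elif isinstance(node, ast.ImportFrom):
            # level > 0 means a relative import ("from . import x"): skip it.
            if node.module and node.level == 0:
                found.add(node.module.partition(".")[0])
    return found


code = "import os.path\nfrom django.conf import settings\nfrom . import utils\n"
print(sorted(top_level_imports(code)))  # ['django', 'os']
```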
-def filter_line(line):
-    return len(line) > 0 and line[0] != "#"
+def get_file_extensions():
+    return DEFAULT_EXTENSIONS + [".ipynb"] if scan_noteboooks else DEFAULT_EXTENSIONS
+
+
+def read_file_content(file_name: str, encoding="utf-8"):
+    if file_ext_is_allowed(file_name, DEFAULT_EXTENSIONS):
+        with open(file_name, "r", encoding=encoding) as f:
+            contents = f.read()
+    elif file_ext_is_allowed(file_name, [".ipynb"]) and scan_noteboooks:
+        contents = ipynb_2_py(file_name, encoding=encoding)
+    return contents
+
+
+def file_ext_is_allowed(file_name, acceptable):
+    return os.path.splitext(file_name)[1] in acceptable
+
+
+def ipynb_2_py(file_name, encoding="utf-8"):
+    """
+
+    Args:
+        file_name (str): notebook file path to parse as python script
+        encoding (str): encoding of file
+
+    Returns:
+        str: parsed string
+
+    """
+    exporter = PythonExporter()
+    (body, _) = exporter.from_filename(file_name)
+
+    return body.encode(encoding)
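`file_ext_is_allowed` is a one-line wrapper over `os.path.splitext`, and its behaviour is easy to check in isolation (the sample filenames are made up):

```python
import os


def file_ext_is_allowed(file_name, acceptable):
    # splitext("pkg/mod.py") -> ("pkg/mod", ".py"); compare only the extension.
    return os.path.splitext(file_name)[1] in acceptable


print(file_ext_is_allowed("analysis.ipynb", [".py", ".pyw"]))            # False
print(file_ext_is_allowed("analysis.ipynb", [".py", ".pyw", ".ipynb"]))  # True
print(file_ext_is_allowed("pkg/module.py", [".py", ".pyw"]))             # True
```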
 def generate_requirements_file(path, imports, symbol):
     with _open(path, "w") as out_file:
-        logging.debug('Writing {num} requirements: {imports} to {file}'.format(
-            num=len(imports),
-            file=path,
-            imports=", ".join([x['name'] for x in imports])
-        ))
-        fmt = '{name}' + symbol + '{version}'
-        out_file.write('\n'.join(
-            fmt.format(**item) if item['version'] else '{name}'.format(**item)
-            for item in imports) + '\n')
+        logging.debug(
+            "Writing {num} requirements: {imports} to {file}".format(
+                num=len(imports), file=path, imports=", ".join([x["name"] for x in imports])
+            )
+        )
+        fmt = "{name}" + symbol + "{version}"
+        out_file.write(
+            "\n".join(
+                fmt.format(**item) if item["version"] else "{name}".format(**item)
+                for item in imports
+            )
+            + "\n"
+        )


 def output_requirements(imports, symbol):
-    generate_requirements_file('-', imports, symbol)
+    generate_requirements_file("-", imports, symbol)
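The `fmt` string above controls how each requirement line is rendered: packages with a known version become `name<symbol>version`, while packages without one are emitted bare. A standalone demonstration (sample data invented):

```python
imports = [
    {"name": "flask", "version": "1.1.2"},
    {"name": "localpkg", "version": None},  # no version known: emit the bare name
]
symbol = "=="
fmt = "{name}" + symbol + "{version}"
lines = "\n".join(
    fmt.format(**item) if item["version"] else "{name}".format(**item)
    for item in imports
) + "\n"
print(lines)  # "flask==1.1.2\nlocalpkg\n"
```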
-def get_imports_info(
-        imports, pypi_server="https://pypi.python.org/pypi/", proxy=None):
+def get_imports_info(imports, pypi_server="https://pypi.python.org/pypi/", proxy=None):
     result = []

     for item in imports:
         try:
             logging.warning(
-                'Import named "%s" not found locally. '
-                'Trying to resolve it at the PyPI server.',
-                item
+                'Import named "%s" not found locally. ' "Trying to resolve it at the PyPI server.",
+                item,
             )
-            response = requests.get(
-                "{0}{1}/json".format(pypi_server, item), proxies=proxy)
+            response = requests.get("{0}{1}/json".format(pypi_server, item), proxies=proxy)
             if response.status_code == 200:
-                if hasattr(response.content, 'decode'):
+                if hasattr(response.content, "decode"):
                     data = json2package(response.content.decode())
                 else:
                     data = json2package(response.content)
             elif response.status_code >= 300:
-                raise HTTPError(status_code=response.status_code,
-                                reason=response.reason)
+                raise HTTPError(status_code=response.status_code, reason=response.reason)
         except HTTPError:
-            logging.warning(
-                'Package "%s" does not exist or network problems', item)
+            logging.warning('Package "%s" does not exist or network problems', item)
             continue
         logging.warning(
             'Import named "%s" was resolved to "%s:%s" package (%s).\n'
-            'Please, verify manually the final list of requirements.txt '
-            'to avoid possible dependency confusions.',
+            "Please, verify manually the final list of requirements.txt "
+            "to avoid possible dependency confusions.",
             item,
             data.name,
             data.latest_release_id,
-            data.pypi_url
+            data.pypi_url,
        )
-        result.append({'name': item, 'version': data.latest_release_id})
+        result.append({"name": item, "version": data.latest_release_id})
     return result
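For each import that can't be resolved locally, the function queries the server's JSON metadata endpoint, built as `"<server><package>/json"`. The URL construction alone, without the network call:

```python
# How the metadata URL is assembled for one unresolved import.
pypi_server = "https://pypi.python.org/pypi/"
package = "flask"
url = "{0}{1}/json".format(pypi_server, package)
print(url)  # https://pypi.python.org/pypi/flask/json
```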
-def get_locally_installed_packages(encoding=None):
+def get_locally_installed_packages(encoding="utf-8"):
     packages = []
     ignore = ["tests", "_tests", "egg", "EGG", "info"]
     for path in sys.path:

@@ -229,29 +280,27 @@ def get_locally_installed_packages(encoding=None):
                 filtered_top_level_modules = list()

                 for module in top_level_modules:
-                    if (
-                        (module not in ignore) and
-                        (package[0] not in ignore)
-                    ):
+                    if (module not in ignore) and (package[0] not in ignore):
                         # append exported top level modules to the list
                         filtered_top_level_modules.append(module)

                 version = None
                 if len(package) > 1:
-                    version = package[1].replace(
-                        ".dist", "").replace(".egg", "")
+                    version = package[1].replace(".dist", "").replace(".egg", "")

                 # append package: top_level_modules pairs
                 # instead of top_level_module: package pairs
-                packages.append({
-                    'name': package[0],
-                    'version': version,
-                    'exports': filtered_top_level_modules
-                })
+                packages.append(
+                    {
+                        "name": package[0],
+                        "version": version,
+                        "exports": filtered_top_level_modules,
+                    }
+                )
     return packages


-def get_import_local(imports, encoding=None):
+def get_import_local(imports, encoding="utf-8"):
     local = get_locally_installed_packages()
     result = []
     for item in imports:

@@ -260,14 +309,14 @@ def get_import_local(imports, encoding=None):
             # if candidate import name matches export name
             # or candidate import name equals to the package name
             # append it to the result
-            if item in package['exports'] or item == package['name']:
+            if item in package["exports"] or item == package["name"]:
                 result.append(package)

     # removing duplicates of package/version
     # had to use second method instead of the previous one,
     # because we have a list in the 'exports' field
     # https://stackoverflow.com/questions/9427163/remove-duplicate-dict-in-list-in-python
-    result_unique = [i for n, i in enumerate(result) if i not in result[n+1:]]
+    result_unique = [i for n, i in enumerate(result) if i not in result[n + 1:]]

     return result_unique
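The dedup comprehension at the end is needed because the dicts contain a list in the `exports` field, so they are unhashable and `set()` can't be used. It keeps an element only if it does not reappear later in the list, which means the last occurrence of each duplicate survives. With invented sample data:

```python
result = [
    {"name": "flask", "version": "1.1.2", "exports": ["flask"]},
    {"name": "yarg", "version": "0.1.9", "exports": ["yarg"]},
    {"name": "flask", "version": "1.1.2", "exports": ["flask"]},  # duplicate
]
# Keep an element only if it is not repeated anywhere after its position.
result_unique = [i for n, i in enumerate(result) if i not in result[n + 1:]]
print([p["name"] for p in result_unique])  # ['yarg', 'flask']
```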
@@ -298,7 +347,7 @@ def get_name_without_alias(name):
     match = REGEXP[0].match(name.strip())
     if match:
         name = match.groups(0)[0]
-    return name.partition(' as ')[0].partition('.')[0].strip()
+    return name.partition(" as ")[0].partition(".")[0].strip()
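The final line of `get_name_without_alias` strips both `as` aliases and submodule paths down to a top-level package name. A simplified sketch of just that line (the real function also runs `REGEXP[0]` over the input first, which is omitted here):

```python
def name_without_alias(name):
    # "numpy as np" -> "numpy"; "django.conf" -> "django".
    return name.partition(" as ")[0].partition(".")[0].strip()


print(name_without_alias("numpy as np"))        # numpy
print(name_without_alias("django.conf"))        # django
print(name_without_alias("pandas.core.frame"))  # pandas
```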
 def join(f):

@@ -312,6 +361,9 @@ def parse_requirements(file_):
     delimiter, get module name by element index, create a dict consisting of
     module:version, and add dict to list of parsed modules.

+    If file ´file_´ is not found in the system, the program will print a
+    helpful message and end its execution immediately.
+
     Args:
         file_: File to parse.

@@ -319,7 +371,7 @@ def parse_requirements(file_):
         OSerror: If there's any issues accessing the file.

     Returns:
-        tuple: The contents of the file, excluding comments.
+        list: The contents of the file, excluding comments.
     """
     modules = []
     # For the dependency identifier specification, see

@@ -328,9 +380,12 @@ def parse_requirements(file_):

     try:
         f = open(file_, "r")
-    except OSError:
-        logging.error("Failed on file: {}".format(file_))
-        raise
+    except FileNotFoundError:
+        print(f"File {file_} was not found. Please, fix it and run again.")
+        sys.exit(1)
+    except OSError as error:
+        logging.error(f"There was an error opening the file {file_}: {str(error)}")
+        raise error
     else:
         try:
             data = [x.strip() for x in f.readlines() if x != "\n"]
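As the docstring says, `parse_requirements` splits each requirement line on a version-specifier delimiter and builds `module: version` dicts. A simplified sketch of that splitting step, not the library's exact logic (only a few operators shown, two-character operators tried first so `>=` is not misread as `>`):

```python
DELIMITERS = ["==", ">=", "<=", "~=", ">", "<"]


def split_requirement(line):
    """Split one requirement line into a {"name": ..., "version": ...} dict."""
    for d in DELIMITERS:
        if d in line:
            name, version = line.split(d, 1)
            return {"name": name.strip(), "version": version.strip()}
    return {"name": line.strip(), "version": None}


data = ["pandas==2.0.0", "numpy>=1.2.3", "torch<4.0.0", "tensorflow"]
print([split_requirement(x) for x in data])
```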
@@ -366,8 +421,8 @@ def compare_modules(file_, imports):
         imports (tuple): Modules being imported in the project.

     Returns:
-        tuple: The modules not imported in the project, but do exist in the
-            specified file.
+        set: The modules not imported in the project, but do exist in the
+            specified file.
     """
     modules = parse_requirements(file_)

@@ -384,7 +439,8 @@ def diff(file_, imports):

     logging.info(
         "The following modules are in {} but do not seem to be imported: "
-        "{}".format(file_, ", ".join(x for x in modules_not_imported)))
+        "{}".format(file_, ", ".join(x for x in modules_not_imported))
+    )


 def clean(file_, imports):

@@ -431,31 +487,55 @@ def dynamic_versioning(scheme, imports):
     return imports, symbol
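`dynamic_versioning` maps the `--mode` scheme onto a pin symbol; the README usage block shows the intended output for each mode (`compat` gives `Flask~=1.1.2`, `gt` gives `Flask>=1.1.2`, `no-pin` gives bare `Flask`). A hedged sketch of that mapping (the real function's body is not shown in this diff; the helper name is made up):

```python
def pin_symbol(scheme):
    """Return the version-pin operator for a --mode scheme."""
    symbols = {"compat": "~=", "gt": ">="}
    if scheme == "no-pin":
        return None  # versions are dropped entirely in this mode
    return symbols[scheme]


for scheme, expected in [("compat", "Flask~=1.1.2"), ("gt", "Flask>=1.1.2")]:
    line = "Flask" + pin_symbol(scheme) + "1.1.2"
    assert line == expected
print("ok")
```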
+def handle_scan_noteboooks():
+    if not scan_noteboooks:
+        logging.info("Not scanning for jupyter notebooks.")
+        return
+
+    try:
+        global PythonExporter
+        from nbconvert import PythonExporter
+    except ImportError:
+        raise NbconvertNotInstalled()


 def init(args):
-    encoding = args.get('--encoding')
-    extra_ignore_dirs = args.get('--ignore')
-    follow_links = not args.get('--no-follow-links')
-    input_path = args['<path>']
+    global scan_noteboooks
+    encoding = args.get("--encoding")
+    extra_ignore_dirs = args.get("--ignore")
+    follow_links = not args.get("--no-follow-links")
+
+    scan_noteboooks = args.get("--scan-notebooks", False)
+    handle_scan_noteboooks()
+
+    input_path = args["<path>"]

+    if encoding is None:
+        encoding = "utf-8"
     if input_path is None:
         input_path = os.path.abspath(os.curdir)

     if extra_ignore_dirs:
-        extra_ignore_dirs = extra_ignore_dirs.split(',')
+        extra_ignore_dirs = extra_ignore_dirs.split(",")

-    path = (args["--savepath"] if args["--savepath"] else
-            os.path.join(input_path, "requirements.txt"))
-    if (not args["--print"]
-            and not args["--savepath"]
-            and not args["--force"]
-            and os.path.exists(path)):
-        logging.warning("requirements.txt already exists, "
-                        "use --force to overwrite it")
+    path = (
+        args["--savepath"] if args["--savepath"] else os.path.join(input_path, "requirements.txt")
+    )
+    if (
+        not args["--print"]
+        and not args["--savepath"]
+        and not args["--force"]
+        and os.path.exists(path)
+    ):
+        logging.warning("requirements.txt already exists, " "use --force to overwrite it")
         return

-    candidates = get_all_imports(input_path,
-                                 encoding=encoding,
-                                 extra_ignore_dirs=extra_ignore_dirs,
-                                 follow_links=follow_links)
+    candidates = get_all_imports(
+        input_path,
+        encoding=encoding,
+        extra_ignore_dirs=extra_ignore_dirs,
+        follow_links=follow_links,
+    )
     candidates = get_pkg_names(candidates)
     logging.debug("Found imports: " + ", ".join(candidates))
     pypi_server = "https://pypi.python.org/pypi/"

@@ -464,11 +544,10 @@ def init(args):
         pypi_server = args["--pypi-server"]

     if args["--proxy"]:
-        proxy = {'http': args["--proxy"], 'https': args["--proxy"]}
+        proxy = {"http": args["--proxy"], "https": args["--proxy"]}

     if args["--use-local"]:
-        logging.debug(
-            "Getting package information ONLY from local installation.")
+        logging.debug("Getting package information ONLY from local installation.")
         imports = get_import_local(candidates, encoding=encoding)
     else:
         logging.debug("Getting packages information from Local/PyPI")

@@ -478,20 +557,21 @@ def init(args):
         # the list of exported modules, installed locally
         # and the package name is not in the list of local module names
         # it add to difference
-        difference = [x for x in candidates if
-                      # aggregate all export lists into one
-                      # flatten the list
-                      # check if candidate is in exports
-                      x.lower() not in [y for x in local for y in x['exports']]
-                      and
-                      # check if candidate is package names
-                      x.lower() not in [x['name'] for x in local]]
+        difference = [
+            x
+            for x in candidates
+            if
+            # aggregate all export lists into one
+            # flatten the list
+            # check if candidate is in exports
+            x.lower() not in [y for x in local for y in x["exports"]] and
+            # check if candidate is package names
+            x.lower() not in [x["name"] for x in local]
+        ]

-        imports = local + get_imports_info(difference,
-                                           proxy=proxy,
-                                           pypi_server=pypi_server)
+        imports = local + get_imports_info(difference, proxy=proxy, pypi_server=pypi_server)
         # sort imports based on lowercase name of package, similar to `pip freeze`.
-        imports = sorted(imports, key=lambda x: x['name'].lower())
+        imports = sorted(imports, key=lambda x: x["name"].lower())

     if args["--diff"]:
         diff(args["--diff"], imports)

@@ -506,8 +586,9 @@ def init(args):
         if scheme in ["compat", "gt", "no-pin"]:
             imports, symbol = dynamic_versioning(scheme, imports)
         else:
-            raise ValueError("Invalid argument for mode flag, "
-                             "use 'compat', 'gt' or 'no-pin' instead")
+            raise ValueError(
+                "Invalid argument for mode flag, " "use 'compat', 'gt' or 'no-pin' instead"
+            )
     else:
         symbol = "=="

@@ -521,8 +602,8 @@ def init(args):

 def main():  # pragma: no cover
     args = docopt(__doc__, version=__version__)
-    log_level = logging.DEBUG if args['--debug'] else logging.INFO
-    logging.basicConfig(level=log_level, format='%(levelname)s: %(message)s')
+    log_level = logging.DEBUG if args["--debug"] else logging.INFO
+    logging.basicConfig(level=log_level, format="%(levelname)s: %(message)s")

     try:
         init(args)

@@ -530,5 +611,5 @@ def main():  # pragma: no cover
     sys.exit(0)


-if __name__ == '__main__':
+if __name__ == "__main__":
     main()  # pragma: no cover
poetry.lock (generated, new file, 1792 lines)

File diff suppressed because it is too large.
poetry.toml (new file, 2 lines)

@@ -0,0 +1,2 @@
+[virtualenvs]
+prefer-active-python = true
pyproject.toml (new file, 42 lines)

@@ -0,0 +1,42 @@
+[tool.poetry]
+name = "pipreqs"
+version = "0.5.0"
+description = "Pip requirements.txt generator based on imports in project"
+authors = ["Vadim Kravcenko <vadim.kravcenko@gmail.com>"]
+license = "Apache-2.0"
+readme = ["README.rst", "HISTORY.rst"]
+packages = [{include = "pipreqs"}]
+repository = "https://github.com/bndr/pipreqs"
+keywords = ["pip", "requirements", "imports"]
+classifiers = [
+    "Development Status :: 4 - Beta",
+    "Intended Audience :: Developers",
+    "License :: OSI Approved :: Apache Software License",
+    "Natural Language :: English",
+    "Programming Language :: Python :: 3",
+    "Programming Language :: Python :: 3.8",
+    "Programming Language :: Python :: 3.9",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12"
+]
+
+[tool.poetry.scripts]
+pipreqs = "pipreqs.pipreqs:main"
+
+[tool.poetry.dependencies]
+python = ">=3.8.1,<3.13"
+yarg = "0.1.9"
+docopt = "0.6.2"
+nbconvert = "^7.11.0"
+ipython = "8.12.3"
+
+[tool.poetry.group.dev.dependencies]
+flake8 = "^6.1.0"
+tox = "^4.11.3"
+coverage = "^7.3.2"
+sphinx = { version = "^7.2.6", python = ">=3.9" }
+
+[build-system]
+requires = ["poetry-core"]
+build-backend = "poetry.core.masonry.api"
@@ -1,3 +0,0 @@
-wheel==0.38.1
-Yarg==0.1.9
-docopt==0.6.2
setup.py (deleted, 59 lines)

@@ -1,59 +0,0 @@
-#!/usr/bin/env python
-
-try:
-    from setuptools import setup
-except ImportError:
-    from distutils.core import setup
-
-from pipreqs import __version__
-
-
-with open('README.rst') as readme_file:
-    readme = readme_file.read()
-
-with open('HISTORY.rst') as history_file:
-    history = history_file.read().replace('.. :changelog:', '')
-
-requirements = [
-    'docopt', 'yarg'
-]
-
-setup(
-    name='pipreqs',
-    version=__version__,
-    description='Pip requirements.txt generator based on imports in project',
-    long_description=readme + '\n\n' + history,
-    author='Vadim Kravcenko',
-    author_email='vadim.kravcenko@gmail.com',
-    url='https://github.com/bndr/pipreqs',
-    packages=[
-        'pipreqs',
-    ],
-    package_dir={'pipreqs': 'pipreqs'},
-    include_package_data=True,
-    package_data={'': ['stdlib', 'mapping']},
-    install_requires=requirements,
-    license='Apache License',
-    zip_safe=False,
-    keywords='pip requirements imports',
-    classifiers=[
-        'Development Status :: 4 - Beta',
-        'Intended Audience :: Developers',
-        'License :: OSI Approved :: Apache Software License',
-        'Natural Language :: English',
-        'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
-        'Programming Language :: Python :: 3.8',
-        'Programming Language :: Python :: 3.9',
-        'Programming Language :: Python :: 3.10',
-        'Programming Language :: Python :: 3.11',
-    ],
-    test_suite='tests',
-    entry_points={
-        'console_scripts': [
-            'pipreqs=pipreqs.pipreqs:main',
-        ],
-    },
-    python_requires='>=3.7',
-)
tests/_data/empty.txt (new file, 0 lines)

tests/_data/imports.txt (new file, 3 lines)

@@ -0,0 +1,3 @@
+pandas==2.0.0
+numpy>=1.2.3
+torch<4.0.0

tests/_data/imports_any_version.txt (new file, 4 lines)

@@ -0,0 +1,4 @@
+numpy
+pandas==2.0.0
+tensorflow
+torch<4.0.0

tests/_data/imports_no_version.txt (new file, 3 lines)

@@ -0,0 +1,3 @@
+pandas
+tensorflow
+torch
tests/_data_notebook/magic_commands.ipynb (new file, 65 lines)

@@ -0,0 +1,65 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Magic test"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%automagic true"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "ls -la\n",
+    "logstate"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "ls -la"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%automagic false"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "ls -la"
+   ]
+  }
+ ],
+ "metadata": {
+  "language_info": {
+   "name": "python"
+  },
+  "orig_nbformat": 4
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
37
tests/_data_notebook/markdown_test.ipynb
Normal file
@ -0,0 +1,37 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Markdown test\n",
    "import sklearn\n",
    "\n",
    "```python\n",
    "import FastAPI\n",
    "```"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.1"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
0
tests/_data_notebook/models.py
Normal file
102
tests/_data_notebook/test.ipynb
Normal file
@ -0,0 +1,102 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\"\"\"unused import\"\"\"\n",
    "# pylint: disable=undefined-all-variable, import-error, no-absolute-import, too-few-public-methods, missing-docstring\n",
    "import xml.etree # [unused-import]\n",
    "import xml.sax # [unused-import]\n",
    "import os.path as test # [unused-import]\n",
    "from sys import argv as test2 # [unused-import]\n",
    "from sys import flags # [unused-import]\n",
    "# +1:[unused-import,unused-import]\n",
    "from collections import deque, OrderedDict, Counter\n",
    "# All imports above should be ignored\n",
    "import requests # [unused-import]\n",
    "\n",
    "# setuptools\n",
    "import zipimport # command/easy_install.py\n",
    "\n",
    "# twisted\n",
    "from importlib import invalidate_caches # python/test/test_deprecate.py\n",
    "\n",
    "# astroid\n",
    "import zipimport # manager.py\n",
    "# IPython\n",
    "from importlib.machinery import all_suffixes # core/completerlib.py\n",
    "import importlib # html/notebookapp.py\n",
    "\n",
    "from IPython.utils.importstring import import_item # Many files\n",
    "\n",
    "# pyflakes\n",
    "# test/test_doctests.py\n",
    "from pyflakes.test.test_imports import Test as TestImports\n",
    "\n",
    "# Nose\n",
    "from nose.importer import Importer, add_path, remove_path # loader.py\n",
    "\n",
    "import atexit\n",
    "from __future__ import print_function\n",
    "from docopt import docopt\n",
    "import curses, logging, sqlite3\n",
    "import logging\n",
    "import os\n",
    "import sqlite3\n",
    "import time\n",
    "import sys\n",
    "import signal\n",
    "import bs4\n",
    "import nonexistendmodule\n",
    "import boto as b, peewee as p\n",
    "# import django\n",
    "import flask.ext.somext # # #\n",
    "from sqlalchemy import model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "try:\n",
    "    import ujson as json\n",
    "except ImportError:\n",
    "    import json\n",
    "\n",
    "import models\n",
    "\n",
    "\n",
    "def main():\n",
    "    pass\n",
    "\n",
    "import after_method_is_valid_even_if_not_pep8"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.1"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
5
tests/_data_pyw/py.py
Normal file
@ -0,0 +1,5 @@
import airflow
import numpy

airflow
numpy
3
tests/_data_pyw/pyw.pyw
Normal file
@ -0,0 +1,3 @@
import matplotlib
import pandas
import tensorflow
34
tests/_invalid_data_notebook/invalid.ipynb
Normal file
@ -0,0 +1,34 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "cd ."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
@ -8,53 +8,93 @@ test_pipreqs
Tests for `pipreqs` module.
"""

from io import StringIO
import logging
from unittest.mock import patch, Mock
import unittest
import os
import requests
import sys
import warnings

from pipreqs import pipreqs


class TestPipreqs(unittest.TestCase):

    def setUp(self):
        self.modules = [
            'flask', 'requests', 'sqlalchemy', 'docopt', 'boto', 'ipython',
            'pyflakes', 'nose', 'analytics', 'flask_seasurf', 'peewee',
            'ujson', 'nonexistendmodule', 'bs4',
            'after_method_is_valid_even_if_not_pep8'
        ]
        self.modules2 = ['beautifulsoup4']
        self.local = ["docopt", "requests", "nose", 'pyflakes']
        self.project = os.path.join(os.path.dirname(__file__), "_data")
        self.project_clean = os.path.join(
            os.path.dirname(__file__),
            "_data_clean"
        )
        self.project_invalid = os.path.join(
            os.path.dirname(__file__),
            "_invalid_data"
        )
        self.project_with_ignore_directory = os.path.join(
            os.path.dirname(__file__),
            "_data_ignore"
        )
        self.project_with_duplicated_deps = os.path.join(
            os.path.dirname(__file__),
            "_data_duplicated_deps"
        )
        self.requirements_path = os.path.join(self.project, "requirements.txt")
        self.alt_requirement_path = os.path.join(
            self.project,
            "requirements2.txt"
        )
    @classmethod
    def setUpClass(cls):
        # Disable all logs for not spamming the terminal when running tests.
        logging.disable(logging.CRITICAL)

        # Specific warning not covered by the above command:
        warnings.filterwarnings("ignore", category=DeprecationWarning, module="jupyter_client")

        cls.modules = [
            "flask",
            "requests",
            "sqlalchemy",
            "docopt",
            "boto",
            "ipython",
            "pyflakes",
            "nose",
            "analytics",
            "flask_seasurf",
            "peewee",
            "ujson",
            "nonexistendmodule",
            "bs4",
            "after_method_is_valid_even_if_not_pep8",
        ]
        cls.modules2 = ["beautifulsoup4"]
        cls.local = ["docopt", "requests", "nose", "pyflakes", "ipython"]
        cls.project = os.path.join(os.path.dirname(__file__), "_data")
        cls.empty_filepath = os.path.join(cls.project, "empty.txt")
        cls.imports_filepath = os.path.join(cls.project, "imports.txt")
        cls.imports_no_version_filepath = os.path.join(cls.project, "imports_no_version.txt")
        cls.imports_any_version_filepath = os.path.join(cls.project, "imports_any_version.txt")
        cls.non_existent_filepath = os.path.join(cls.project, "non_existent_file.txt")

        cls.parsed_packages = [
            {"name": "pandas", "version": "2.0.0"},
            {"name": "numpy", "version": "1.2.3"},
            {"name": "torch", "version": "4.0.0"},
        ]

        cls.parsed_packages_no_version = [
            {"name": "pandas", "version": None},
            {"name": "tensorflow", "version": None},
            {"name": "torch", "version": None},
        ]

        cls.parsed_packages_any_version = [
            {"name": "numpy", "version": None},
            {"name": "pandas", "version": "2.0.0"},
            {"name": "tensorflow", "version": None},
            {"name": "torch", "version": "4.0.0"},
        ]

        cls.project_clean = os.path.join(os.path.dirname(__file__), "_data_clean")
        cls.project_invalid = os.path.join(os.path.dirname(__file__), "_invalid_data")
        cls.project_with_ignore_directory = os.path.join(os.path.dirname(__file__), "_data_ignore")
        cls.project_with_duplicated_deps = os.path.join(os.path.dirname(__file__), "_data_duplicated_deps")

        cls.requirements_path = os.path.join(cls.project, "requirements.txt")
        cls.alt_requirement_path = os.path.join(cls.project, "requirements2.txt")
        cls.non_existing_filepath = "xpto"

        cls.project_with_notebooks = os.path.join(os.path.dirname(__file__), "_data_notebook")
        cls.project_with_invalid_notebooks = os.path.join(os.path.dirname(__file__), "_invalid_data_notebook")

        cls.python_path_same_imports = os.path.join(os.path.dirname(__file__), "_data/test.py")
        cls.notebook_path_same_imports = os.path.join(os.path.dirname(__file__), "_data_notebook/test.ipynb")

    def test_get_all_imports(self):
        imports = pipreqs.get_all_imports(self.project)
        self.assertEqual(len(imports), 15)
        for item in imports:
            self.assertTrue(
                item.lower() in self.modules, "Import is missing: " + item)
            self.assertTrue(item.lower() in self.modules, "Import is missing: " + item)
        self.assertFalse("time" in imports)
        self.assertFalse("logging" in imports)
        self.assertFalse("curses" in imports)
@ -72,8 +112,7 @@ class TestPipreqs(unittest.TestCase):
        """
        Test that invalid python files cannot be imported.
        """
        self.assertRaises(
            SyntaxError, pipreqs.get_all_imports, self.project_invalid)
        self.assertRaises(SyntaxError, pipreqs.get_all_imports, self.project_invalid)

    def test_get_imports_info(self):
        """
@ -86,13 +125,14 @@ class TestPipreqs(unittest.TestCase):
        self.assertEqual(len(with_info), 13)
        for item in with_info:
            self.assertTrue(
                item['name'].lower() in self.modules,
                "Import item appears to be missing " + item['name'])
                item["name"].lower() in self.modules,
                "Import item appears to be missing " + item["name"],
            )

    def test_get_pkg_names(self):
        pkgs = ['jury', 'Japan', 'camel', 'Caroline']
        pkgs = ["jury", "Japan", "camel", "Caroline"]
        actual_output = pipreqs.get_pkg_names(pkgs)
        expected_output = ['camel', 'Caroline', 'Japan', 'jury']
        expected_output = ["camel", "Caroline", "Japan", "jury"]
        self.assertEqual(actual_output, expected_output)

    def test_get_use_local_only(self):
@ -107,22 +147,33 @@ class TestPipreqs(unittest.TestCase):
        # should find only docopt and requests
        imports_with_info = pipreqs.get_import_local(self.modules)
        for item in imports_with_info:
            self.assertTrue(item['name'].lower() in self.local)
            self.assertTrue(item["name"].lower() in self.local)

    def test_init(self):
        """
        Test that all modules we will test upon are in requirements file
        """
        pipreqs.init({'<path>': self.project, '--savepath': None, '--print': False,
                      '--use-local': None, '--force': True, '--proxy':None, '--pypi-server':None,
                      '--diff': None, '--clean': None, '--mode': None})
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        with open(self.requirements_path, "r") as f:
            data = f.read().lower()
            for item in self.modules[:-3]:
                self.assertTrue(item.lower() in data)
            # It should be sorted based on names.
            data = data.strip().split('\n')
            data = data.strip().split("\n")
            self.assertEqual(data, sorted(data))

    def test_init_local_only(self):
@ -130,9 +181,20 @@ class TestPipreqs(unittest.TestCase):
        Test that items listed in requirements.text are the same
        as locals expected
        """
        pipreqs.init({'<path>': self.project, '--savepath': None, '--print': False,
                      '--use-local': True, '--force': True, '--proxy':None, '--pypi-server':None,
                      '--diff': None, '--clean': None, '--mode': None})
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": True,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        with open(self.requirements_path, "r") as f:
            data = f.readlines()
@ -145,9 +207,19 @@ class TestPipreqs(unittest.TestCase):
        Test that we can save requirements.txt correctly
        to a different path
        """
        pipreqs.init({'<path>': self.project, '--savepath': self.alt_requirement_path,
                      '--use-local': None, '--proxy':None, '--pypi-server':None, '--print': False,
                      '--diff': None, '--clean': None, '--mode': None})
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": self.alt_requirement_path,
                "--use-local": None,
                "--proxy": None,
                "--pypi-server": None,
                "--print": False,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.alt_requirement_path) == 1
        with open(self.alt_requirement_path, "r") as f:
            data = f.read().lower()
@ -163,9 +235,20 @@ class TestPipreqs(unittest.TestCase):
        """
        with open(self.requirements_path, "w") as f:
            f.write("should_not_be_overwritten")
        pipreqs.init({'<path>': self.project, '--savepath': None, '--use-local': None,
                      '--force': None, '--proxy':None, '--pypi-server':None, '--print': False,
                      '--diff': None, '--clean': None, '--mode': None})
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--use-local": None,
                "--force": None,
                "--proxy": None,
                "--pypi-server": None,
                "--print": False,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        with open(self.requirements_path, "r") as f:
            data = f.read().lower()
@ -180,41 +263,49 @@ class TestPipreqs(unittest.TestCase):
        """
        import_name_with_alias = "requests as R"
        expected_import_name_without_alias = "requests"
        import_name_without_aliases = pipreqs.get_name_without_alias(
            import_name_with_alias)
        self.assertEqual(
            import_name_without_aliases,
            expected_import_name_without_alias
        )
        import_name_without_aliases = pipreqs.get_name_without_alias(import_name_with_alias)
        self.assertEqual(import_name_without_aliases, expected_import_name_without_alias)

    def test_custom_pypi_server(self):
        """
        Test that trying to get a custom pypi sever fails correctly
        """
        self.assertRaises(
            requests.exceptions.MissingSchema, pipreqs.init,
            {'<path>': self.project, '--savepath': None, '--print': False,
             '--use-local': None, '--force': True, '--proxy': None,
             '--pypi-server': 'nonexistent'}
        )
            requests.exceptions.MissingSchema,
            pipreqs.init,
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": "nonexistent",
            },
        )

    def test_ignored_directory(self):
        """
        Test --ignore parameter
        """
        pipreqs.init(
            {'<path>': self.project_with_ignore_directory, '--savepath': None,
             '--print': False, '--use-local': None, '--force': True,
             '--proxy':None, '--pypi-server':None,
             '--ignore':'.ignored_dir,.ignore_second',
             '--diff': None,
             '--clean': None,
             '--mode': None
             }
            {
                "<path>": self.project_with_ignore_directory,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--ignore": ".ignored_dir,.ignore_second",
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
            data = f.read().lower()
            for item in ['click', 'getpass']:
            for item in ["click", "getpass"]:
                self.assertFalse(item.lower() in data)

    def test_dynamic_version_no_pin_scheme(self):
@ -222,17 +313,22 @@ class TestPipreqs(unittest.TestCase):
        Test --mode=no-pin
        """
        pipreqs.init(
            {'<path>': self.project_with_ignore_directory, '--savepath': None,
             '--print': False, '--use-local': None, '--force': True,
             '--proxy': None, '--pypi-server': None,
             '--diff': None,
             '--clean': None,
             '--mode': 'no-pin'
             }
            {
                "<path>": self.project_with_ignore_directory,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": "no-pin",
            }
        )
        with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
            data = f.read().lower()
            for item in ['beautifulsoup4', 'boto']:
            for item in ["beautifulsoup4", "boto"]:
                self.assertTrue(item.lower() in data)

    def test_dynamic_version_gt_scheme(self):
@ -240,20 +336,24 @@ class TestPipreqs(unittest.TestCase):
        Test --mode=gt
        """
        pipreqs.init(
            {'<path>': self.project_with_ignore_directory, '--savepath': None, '--print': False,
             '--use-local': None, '--force': True,
             '--proxy': None,
             '--pypi-server': None,
             '--diff': None,
             '--clean': None,
             '--mode': 'gt'
             }
            {
                "<path>": self.project_with_ignore_directory,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": "gt",
            }
        )
        with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
            data = f.readlines()
            for item in data:
                symbol = '>='
                message = 'symbol is not in item'
                symbol = ">="
                message = "symbol is not in item"
                self.assertIn(symbol, item, message)

    def test_dynamic_version_compat_scheme(self):
@ -261,20 +361,24 @@ class TestPipreqs(unittest.TestCase):
        Test --mode=compat
        """
        pipreqs.init(
            {'<path>': self.project_with_ignore_directory, '--savepath': None, '--print': False,
             '--use-local': None, '--force': True,
             '--proxy': None,
             '--pypi-server': None,
             '--diff': None,
             '--clean': None,
             '--mode': 'compat'
             }
            {
                "<path>": self.project_with_ignore_directory,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": "compat",
            }
        )
        with open(os.path.join(self.project_with_ignore_directory, "requirements.txt"), "r") as f:
            data = f.readlines()
            for item in data:
                symbol = '~='
                message = 'symbol is not in item'
                symbol = "~="
                message = "symbol is not in item"
                self.assertIn(symbol, item, message)

    def test_clean(self):
@ -282,18 +386,34 @@ class TestPipreqs(unittest.TestCase):
        Test --clean parameter
        """
        pipreqs.init(
            {'<path>': self.project, '--savepath': None, '--print': False,
             '--use-local': None, '--force': True, '--proxy': None,
             '--pypi-server': None, '--diff': None, '--clean': None,
             '--mode': None}
        )
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        pipreqs.init(
            {'<path>': self.project, '--savepath': None, '--print': False,
             '--use-local': None, '--force': None, '--proxy': None,
             '--pypi-server': None, '--diff': None,
             '--clean': self.requirements_path, '--mode': 'non-pin'}
        )
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": None,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": self.requirements_path,
                "--mode": "non-pin",
            }
        )
        with open(self.requirements_path, "r") as f:
            data = f.read().lower()
            for item in self.modules[:-3]:
@ -303,25 +423,255 @@ class TestPipreqs(unittest.TestCase):
        """
        Test --clean parameter when there are imports to clean
        """
        cleaned_module = 'sqlalchemy'
        cleaned_module = "sqlalchemy"
        pipreqs.init(
            {'<path>': self.project, '--savepath': None, '--print': False,
             '--use-local': None, '--force': True, '--proxy': None,
             '--pypi-server': None, '--diff': None, '--clean': None,
             '--mode': None}
        )
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        assert os.path.exists(self.requirements_path) == 1
        modules_clean = [m for m in self.modules if m != cleaned_module]
        pipreqs.init(
            {'<path>': self.project_clean, '--savepath': None,
             '--print': False, '--use-local': None, '--force': None,
             '--proxy': None, '--pypi-server': None, '--diff': None,
             '--clean': self.requirements_path, '--mode': 'non-pin'}
        )
            {
                "<path>": self.project_clean,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": None,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": self.requirements_path,
                "--mode": "non-pin",
            }
        )
        with open(self.requirements_path, "r") as f:
            data = f.read().lower()
            self.assertTrue(cleaned_module not in data)

    def test_compare_modules(self):
        test_cases = [
            (self.empty_filepath, [], set()),  # both empty
            (self.empty_filepath, self.parsed_packages, set()),  # only file empty
            (
                self.imports_filepath,
                [],
                set(package["name"] for package in self.parsed_packages),
            ),  # only imports empty
            (self.imports_filepath, self.parsed_packages, set()),  # no difference
            (
                self.imports_filepath,
                self.parsed_packages[1:],
                set([self.parsed_packages[0]["name"]]),
            ),  # common case
        ]

        for test_case in test_cases:
            with self.subTest(test_case):
                filename, imports, expected_modules_not_imported = test_case

                modules_not_imported = pipreqs.compare_modules(filename, imports)

                self.assertSetEqual(modules_not_imported, expected_modules_not_imported)

    def test_output_requirements(self):
        """
        Test --print parameter
        It should print to stdout the same content as requeriments.txt
        """

        capturedOutput = StringIO()
        sys.stdout = capturedOutput

        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": True,
                "--use-local": None,
                "--force": None,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )
        pipreqs.init(
            {
                "<path>": self.project,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )

        with open(self.requirements_path, "r") as f:
            file_content = f.read().lower()
            stdout_content = capturedOutput.getvalue().lower()
            self.assertTrue(file_content == stdout_content)

    def test_import_notebooks(self):
        """
        Test the function get_all_imports() using .ipynb file
        """
        self.mock_scan_notebooks()
        imports = pipreqs.get_all_imports(self.project_with_notebooks)
        for item in imports:
            self.assertTrue(item.lower() in self.modules, "Import is missing: " + item)
        not_desired_imports = ["time", "logging", "curses", "__future__", "django", "models", "FastAPI", "sklearn"]
        for not_desired_import in not_desired_imports:
            self.assertFalse(
                not_desired_import in imports,
                f"{not_desired_import} was imported, but it should not have been."
            )

    def test_invalid_notebook(self):
        """
        Test that invalid notebook files cannot be imported.
        """
        self.mock_scan_notebooks()
        self.assertRaises(SyntaxError, pipreqs.get_all_imports, self.project_with_invalid_notebooks)

    def test_ipynb_2_py(self):
        """
        Test the function ipynb_2_py() which converts .ipynb file to .py format
        """
        python_imports = pipreqs.get_all_imports(self.python_path_same_imports)
        notebook_imports = pipreqs.get_all_imports(self.notebook_path_same_imports)
        self.assertEqual(python_imports, notebook_imports)

    def test_file_ext_is_allowed(self):
        """
        Test the function file_ext_is_allowed()
        """
        self.assertTrue(pipreqs.file_ext_is_allowed("main.py", [".py"]))
        self.assertTrue(pipreqs.file_ext_is_allowed("main.py", [".py", ".ipynb"]))
        self.assertFalse(pipreqs.file_ext_is_allowed("main.py", [".ipynb"]))

    def test_parse_requirements(self):
        """
        Test parse_requirements function
        """
        test_cases = [
            (self.empty_filepath, []),  # empty file
            (self.imports_filepath, self.parsed_packages),  # imports with versions
            (
                self.imports_no_version_filepath,
                self.parsed_packages_no_version,
            ),  # imports without versions
            (
                self.imports_any_version_filepath,
                self.parsed_packages_any_version,
            ),  # imports with and without versions
        ]

        for test in test_cases:
            with self.subTest(test):
                filename, expected_parsed_requirements = test

                parsed_requirements = pipreqs.parse_requirements(filename)

                self.assertListEqual(parsed_requirements, expected_parsed_requirements)

    @patch("sys.exit")
    def test_parse_requirements_handles_file_not_found(self, exit_mock):
        captured_output = StringIO()
        sys.stdout = captured_output

        # This assertion is needed, because since "sys.exit" is mocked, the program won't end,
        # and the code that is after the except block will be run
        with self.assertRaises(UnboundLocalError):
            pipreqs.parse_requirements(self.non_existing_filepath)

        exit_mock.assert_called_once_with(1)

        printed_text = captured_output.getvalue().strip()
        sys.stdout = sys.__stdout__

        self.assertEqual(printed_text, "File xpto was not found. Please, fix it and run again.")

    def test_ignore_notebooks(self):
        """
        Test if notebooks are ignored when the scan-notebooks parameter is False
        """
        notebook_requirement_path = os.path.join(self.project_with_notebooks, "requirements.txt")

        pipreqs.init(
            {
                "<path>": self.project_with_notebooks,
                "--savepath": None,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--print": False,
                "--diff": None,
                "--clean": None,
                "--mode": None,
                "--scan-notebooks": False,
            }
        )
        assert os.path.exists(notebook_requirement_path) == 1
        assert os.path.getsize(notebook_requirement_path) == 1  # file only has a "\n", meaning it's empty

    def test_pipreqs_get_imports_from_pyw_file(self):
        pyw_test_dirpath = os.path.join(os.path.dirname(__file__), "_data_pyw")
        requirements_path = os.path.join(pyw_test_dirpath, "requirements.txt")

        pipreqs.init(
            {
                "<path>": pyw_test_dirpath,
                "--savepath": None,
                "--print": False,
                "--use-local": None,
                "--force": True,
                "--proxy": None,
                "--pypi-server": None,
                "--diff": None,
                "--clean": None,
                "--mode": None,
            }
        )

        self.assertTrue(os.path.exists(requirements_path))

        expected_imports = [
            "airflow",
            "matplotlib",
            "numpy",
            "pandas",
            "tensorflow",
        ]

        with open(requirements_path, "r") as f:
            imports_data = f.read().lower()
            for _import in expected_imports:
                self.assertTrue(
                    _import.lower() in imports_data,
                    f"'{_import}' import was expected but not found.",
                )

        os.remove(requirements_path)

    def mock_scan_notebooks(self):
        pipreqs.scan_noteboooks = Mock(return_value=True)
        pipreqs.handle_scan_noteboooks()

    def tearDown(self):
        """
        Remove requiremnts.txt files that were written
@ -336,5 +686,5 @@ class TestPipreqs(unittest.TestCase):
            pass


if __name__ == '__main__':
if __name__ == "__main__":
    unittest.main()
27
tox.ini
@ -1,16 +1,31 @@
[tox]
envlist = py37, py38, py39, pypy3, flake8
isolated_build = true
envlist = py38, py39, py310, py311, py312, pypy3, flake8

[gh-actions]
python =
    3.7: py37
    3.8: py38
    3.9: py39
    pypy-3.7: pypy3
    3.10: py310
    3.11: py311
    3.12: py312
    pypy-3.9-7.3.12: pypy3

[testenv]
setenv =
    PYTHONPATH = {toxinidir}:{toxinidir}/pipreqs
commands = python setup.py test
deps =
    -r{toxinidir}/requirements.txt
commands =
    python -m unittest discover

[testenv:flake8]
deps = flake8
commands = flake8 pipreqs tests

[flake8]
exclude =
    tests/_data/
    tests/_data_clean/
    tests/_data_duplicated_deps/
    tests/_data_ignore/
    tests/_invalid_data/
max-line-length = 120