How to install a Python namespace package from a private Bitbucket Git repository

I have several related projects that I think will be a good fit for Python's namespace-packages. I'm currently running python 3.8, and have created the following directory structure for testing.
├── namespace-package-test.package1
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg1
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg1_cli.py
│   │       └── __init__.py
│   └── tests
├── namespace-package-test.package2
│   ├── AUTHORS.rst
│   ├── CHANGELOG.rst
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg2
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg2_cli.py
│   │       └── __init__.py
│   └── tests
The entire project is in a private Bitbucket (cloud) repository at:
git@bitbucket.org:<my-company>/namespace-package-test.git
I would like to install, locally, only package 1. I've tried every iteration I can imagine of the following, but nothing seems to get me there. I either get a repository not found error or a setup.py not found error.
pip install git+ssh://git@bitbucket.org:<my-company>/namespace-package-test.package1.git
Is this possible?
Is my project structure correct for what I am doing?
What should the pip install command look like?
Bonus, what if I only want to install a specific spec using pipx?
pipx install "namespace-package-test.package1[cli] # git+ssh://git#bitbucket.org:<my-company>/namespace-package-test.package1.git"

I think I figured it out ... for posterity's sake:
Pip install (into virtual environment)
pip install git+ssh://git@bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1
pipx install - with spec
pipx install "namespace-package-test.package1[cli] # git+ssh://git#bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1"

Related

requirements.txt with actual dependencies

Is there a way to create a requirements.txt file that only contains the modules that my script actually needs?
I usually just do a pip freeze and then remove unused modules.
You can use pipreqs to analyze your project files, and automatically generate a requirements.txt for you.
First step is to install pipreqs. You can do so, by executing the following command in your console:
pip install pipreqs
Then, the only thing you need to do is to execute the command pipreqs, specifying the path where your files are located at. For example, to generate a requirements.txt, based on the modules on the current working directory, you could execute:
pipreqs .
For some other directory:
pipreqs "PATH/TO/YOUR/PROJECT/PYTHON/FILES"
Real World Example
Here's the tree-view of the folder we're going to use as example:
.
├── data
│   └── EXCEL_FILES
│       ├── DIVISION_MAP.xlsx
│       ├── INPUT_CONTAINER.xlsx
│       ├── KITS.xlsx
│       ├── LOADING.xlsx
│       ├── MOQ_CBM_SHIPPING_TYPE.xlsx
│       ├── OUTBOUND_OTM.XLSX
│       ├── PLANT_SOURCE_MAP.xlsx
│       ├── PORT_STATE.xlsx
│       └── SALABLE_STORAGE_LOCATION.xlsx
├── loading
│   ├── __init__.py
│   ├── configs
│   │   ├── Initialization.py
│   │   ├── __init__.py
│   │   ├── config.ini
│   │   └── logconfig.py
│   ├── constants.py
│   ├── read_files.py
│   ├── reports.py
│   └── utils
│       ├── __init__.py
│       ├── date_utils.py
│       ├── file_utils.py
│       └── utils.py
├── logs
│   └── po_opt.log
├── main.py
└── outputs
    ├── 2022-10-03
    ├── 2022-10-04
    ├── 2022-10-05
    ├── 2022-10-06
    ├── 2022-11-23
    ├── 2022-11-24
    └── 2022-11-28
Executing the command pipreqs . from the root path I get the following requirements.txt:
❯ pipreqs .
INFO: Successfully saved requirements file in ./requirements.txt
Output:
holidays==0.17.2
numpy==1.20.1
pandas==1.2.4
python_dateutil==2.8.2
six==1.16.0

Module not found error in pytest when tests folder contains __init__.py

My tests were working a few hours ago, but suddenly the tests located in root folder/tests are giving me ModuleNotFoundError: No module named 'tests.test_*'. I imported tests.test_a in a top-level Python file and it worked. Any guesses why pytest can't load them?
I am asking because it is strange that the tests folder located inside root folder/some folder/tests also contains an __init__.py, and those tests work fine. I read this question, deleted the __init__.py, and it worked. But it is confusing why the top-level tests would not work.
The folder structure is along the lines of:
.
├── authenticate
│   ├── exceptions.py
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── exceptions.cpython-38.pyc
│   │   ├── __init__.cpython-38.pyc
│   │   ├── repository.cpython-38.pyc
│   │   └── use_case.cpython-38.pyc
│   ├── Readme.md
│   ├── repository.py
│   ├── tests
│   │   ├── __init__.py
│   │   ├── __pycache__
│   │   └── test_authenticate.py
│   └── use_case.py
├── tests
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── test_authenticate.cpython-38-pytest-5.4.2.pyc
│   ├── test_authenticate.py
As weird as it sounds, I had forgotten to add __init__.py in one of my packages. I don't know why that was causing tests in my top-level directory to fail.
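For what it's worth, one way to sidestep this whole class of import-name collisions (assuming pytest 6 or newer) is to have pytest import test modules via importlib instead of inserting directories onto sys.path:
pytest --import-mode=importlib
In that mode, test files with the same basename in different folders should no longer clash, and the test directories do not need matching __init__.py files.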

What's the purpose of package.egg-info folder?

I'm developing a python package foo. My project structure looks like this:
.
├── foo
│   ├── foo
│   │   ├── bar.py
│   │   ├── foo.py
│   │   ├── __init__.py
│   ├── README.md
│   └── setup.py
├── footest
│   ├── test.py
test.py only has 1 line: import foo
In order for test.py to be able to import the package foo I install it with the command pip3 install -e foo.
Now a new folder called foo.egg-info is created under foo/
.
├── foo
│   ├── foo
│   │   ├── bar.py
│   │   ├── foo.py
│   │   ├── __init__.py
│   ├── foo.egg-info
│   │   ├── dependency_links.txt
│   │   ├── PKG-INFO.txt
│   │   ├── requires.txt
│   │   ├── SOURCES.txt
│   │   ├── top_level.txt
│   ├── README.md
│   └── setup.py
├── footest
│   ├── test.py
What's the purpose of this folder? I tried deleting it and test.py still ran properly. Is it just leftover garbage, similar to the .o files when compiling C projects? If so, is there a way to automatically remove it?
The package.egg-info folder stores metadata about your installed package, such as its name and version.
It is used, for example, when you uninstall the package, or when pip list looks up what is installed.
You can open the files inside and take a look.
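As a side note, if you want to read that metadata programmatically instead of opening the files, here is a minimal sketch using the standard library (importlib.metadata, available since Python 3.8; the package name foo follows the example above):

# prints the version and summary recorded in foo's installed metadata
from importlib.metadata import metadata, version

print(version("foo"))
print(metadata("foo")["Summary"])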

Unit test packages Maven style convention

I want to create a PyBuilder project with unit tests and packages. As an example, I modified the simple Python app example, moving "helloworld" into the package "hello".
My first instinct was to match the package structure with "main" and "unittest" sources:
+---src
    +---main
    |   \---python
    |       \---hello
    |               helloworld.py
    |               __init__.py
    |
    \---unittest
        \---python
            \---hello
                    helloworld_tests.py
                    __init__.py
This does not work because of the conflicting "hello" package:
BUILD FAILED - 'module' object has no attribute 'helloworld_tests'
I see PyBuilder itself just skips the top-level pybuilder package in its unit tests, but that won't work if there are multiple top-level packages.
My second guess would be to create extra top level package for unittests.
\---unittest
    \---python
        \---tests
            |   __init__.py
            |
            \---hello
                    helloworld_tests.py
                    __init__.py
Is there a better solution or established convention how to organize python tests in packages?
Probably nothing really new for the OP, but I just wanted to collect all options that I could come up with in one place:
1) Just append _tests to names of top-level packages
The easiest way to mirror the structure of src/main/python in src/test/python almost 1:1 is to simply append _tests to the names of the top-level packages. For example, if I have only one top-level package rootPkg, then I can add the corresponding rootPkg_tests to the test/ subdirectory:
src
├── main
│   ├── python
│   │   └── rootPkg
│   │       ├── __init__.py
│   │       ├── pkgA
│   │       │   ├── __init__.py
│   │       │   └── modA.py
│   │       └── pkgB
│   │           ├── __init__.py
│   │           └── modB.py
│   └── scripts
│       └── entryPointScript.py
└── test
    └── python
        └── rootPkg_tests
            ├── __init__.py
            ├── pkgA
            │   ├── __init__.py
            │   └── modA_tests.py
            └── pkgB
                ├── __init__.py
                └── modB_tests.py
This seems to work nicely with PyBuilder 0.11.15 and the unittest plugin (notice that I've deviated from PyBuilder's convention and put tests in test instead of unittest; you probably shouldn't do this if you intend to use multiple testing frameworks).
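For reference, here is a minimal build.py sketch for option 1. The property names follow the PyBuilder unittest plugin as I understand it, so treat them as assumptions to verify against your PyBuilder version:

# build.py -- minimal sketch for the layout above
from pybuilder.core import use_plugin, init

use_plugin("python.core")
use_plugin("python.unittest")

default_task = "publish"

@init
def initialize(project):
    # point the unittest plugin at the Maven-style test tree used above
    project.set_property("dir_source_unittest_python", "src/test/python")
    # discover test modules named *_tests.py (the plugin's usual default)
    project.set_property("unittest_module_glob", "*_tests")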
2) If there is only one package: do it like PyBuilder
PyBuilder is itself built with PyBuilder. This is what its source directory looks like (drastically reduced, unnecessary details omitted):
src
├── main
│   ├── python
│   │   └── pybuilder
│   │       ├── __init__.py
│   │       ├── cli.py
│   │       ├── core.py
│   │       └── plugins
│   │           ├── __init__.py
│   │           ├── core_plugin.py
│   │           └── exec_plugin.py
│   └── scripts
│       └── pyb
└── unittest
    └── python
        ├── cli_tests.py
        ├── core_tests.py
        └── plugins
            ├── exec_plugin_tests.py
            ├── __init__.py
            └── python
                ├── core_plugin_tests.py
                └── __init__.py
If I understand it correctly, the tree in unittest mirrors the tree in src, but the directory for the top-level package pybuilder is omitted. That's what you described in your question as the first workaround. The drawback is that it doesn't really work if there are multiple top-level packages.
3) Add one additional tests top-level package
That's what you have proposed as a workaround: mirror the tree in main, but wrap everything in an additional tests-package. This works with many top-level packages in /src/main/python and prevents any package name collisions.
I'm not aware of any established convention here.

Installation of MySQL-python in shared hosting

I'm using HostGator as a testing environment and I had a problem installing MySQL-python. After running:
pip install MySQL-python
the following error is raised:
unable to execute gcc: Permission denied
error: command 'gcc' failed with exit status 1
I asked technical support about this and they answered:
This script requires a compiler, which shared accounts do not have
access to. You would need to upload any Python scripts that you want
to use as a precompiled script. You should be able to compile it
elsewhere and then upload to the account to use it.
This is my first project using Python and I have no idea how to do this.
Thanks
======
UPDATE
As André proposed, on my Linux machine I created two virtual environments (using virtualenv), one with and one without MySQL-python installed.
Comparing the file structures, the missing files were:
.
├── MySQLdb
│   ├── connections.py
│   ├── connections.pyc
│   ├── constants
│   │   ├── CLIENT.py
│   │   ├── CLIENT.pyc
│   │   ├── CR.py
│   │   ├── CR.pyc
│   │   ├── ER.py
│   │   ├── ER.pyc
│   │   ├── FIELD_TYPE.py
│   │   ├── FIELD_TYPE.pyc
│   │   ├── FLAG.py
│   │   ├── FLAG.pyc
│   │   ├── __init__.py
│   │   ├── __init__.pyc
│   │   ├── REFRESH.py
│   │   └── REFRESH.pyc
│   ├── converters.py
│   ├── converters.pyc
│   ├── cursors.py
│   ├── cursors.pyc
│   ├── __init__.py
│   ├── __init__.pyc
│   ├── release.py
│   ├── release.pyc
│   ├── times.py
│   └── times.pyc
├── _mysql_exceptions.py
├── _mysql_exceptions.pyc
├── MySQL_python-1.2.5-py2.7.egg-info
│   ├── dependency_links.txt
│   ├── installed-files.txt
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   └── top_level.txt
└── _mysql.so
So I copied those files to:
/venv/lib/python2.7/site-packages/
Where /venv/ is the folder of the virtual environment created on the hosting account.
Thanks again
There is a really simple solution to this. If the root user on the shared hosting has the MySQLdb Python module installed, then you can create a user-specific virtual environment by using the --system-site-packages flag. This creates a virtual environment with all the modules of the root Python installation available in the local venv.
virtualenv --system-site-packages venv
You can look up: Make virtualenv inherit specific packages from your global site-packages
You don't have permission to compile things using gcc. You will have to install MySQL-python in another place and then move the files back onto your server.
See py_compile for compiling python scripts
If they mean precompiled, I assume they mean making an executable. If your HostGator account is using Windows, then you can use py2exe to create an executable. py2exe lets you run your script on other computers without having to install Python.
First create a setup.py that declares your script and all its dependencies, then run python setup.py py2exe and it will create two folders. You will just need the dist folder, with the executable located in there.
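A minimal py2exe setup.py sketch (assuming py2exe is installed and your entry script is called main.py; adjust the name for your project):

from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

# bundle main.py into a console executable under dist/
setup(console=["main.py"])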
There are many nice tutorials on how to do this, good luck!
