A few days ago I created a project called NumerMP, which will provide multi-threaded numerical methods (using the OpenMP library) and a Python API for end users (using f2py wrapper functions). I had to use f2py the "smart way", because I needed to customize the Fortran signature files (.pyf) before generating the extension modules. To do so I ran the commands below:
python -m numpy.f2py trial_file.f90 -m module_name -h trial_file.pyf
python -m numpy.f2py -c --f90flags='-fopenmp' -lgomp trial_file.pyf trial_file.f90
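For reference, the resulting extension module can be imported like any other Python module; a minimal sketch (module_name is whatever was passed to -m, and the routine names come from the .pyf file):
import module_name                # the .so extension built by f2py
print(module_name.__doc__)        # f2py modules list their wrapped routines in the docstring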
Then I tested the generated extension modules (.so files) inside the package itself and they worked correctly. Then I built the whole project, uploaded it to TestPyPI, and installed it using pip, but I got an ImportError.
The structure of the package is as follows:
When I run each sample wrapper function inside the package itself, they run correctly.
But I got the error below when I used the package inside another project:
Update
You have some sibling submodules:
module
|- first
| |- functions.py
|- second
| |- wrapper.py
Then in second/wrapper.py:
from ..first import functions
functions.function1()
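Note that a relative import like this only resolves when wrapper.py runs as part of the package (and, for regular packages, module, first and second each need an __init__.py). A sketch of the invocation, assuming that layout:
python -m module.second.wrapper
Running python second/wrapper.py directly would instead fail with an ImportError about relative imports.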
Related
I have an R script that can be executed in a terminal with Rscript app/myapp.R. The R script is stored in an R package project so that I can benefit from the documentation, check, and unit-testing tools.
Just for the sake of user-friendliness, I was wondering if there is a method to mimic Python's behaviour of a __main__.py inside a module.
EDIT 6/1/2020: I actually need other users, who are not accustomed to R, to use the script. So once the package is installed, finding the full path of the script is not really an option for them.
In a Python project, when I have the following package structure:
mypackage
├── mymodule
│ ├── __init__.py
│ └── __main__.py
└── setup.py
I can do python -m mymodule from any folder in the terminal provided the package was indeed installed in my python library. The command will execute mypackage/mymodule/__main__.py.
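For illustration, a minimal __main__.py is just ordinary module code that runs when the module is executed with -m; a one-line sketch:
# mypackage/mymodule/__main__.py
print("mymodule was executed with python -m mymodule")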
I'd like to have the same behaviour for an R package. Let's assume that the R package mypackage is already installed in my R user's library. I would like to be able to run mypackage/app/myapp.R from anywhere in the terminal (actually, I'd like others to be able to install the package and run the app without having to clone the repo).
I know that I can do
Rscript app/myapp.R
but that will only work if I cd into path/to/mypackage. I'd like to be able to do something like below from anywhere in the terminal provided the package is installed in the R user's library.
Rscript -m myapp
demo() seems to be made for interactive sessions and I need a non-interactive session.
As it turns out, there was a really simple solution using the R command rather than Rscript. Just wrap the script up in a function from the package, then use:
R -e "mypackage::myfunc()"
I just came across this line in the Python 3.6 unittest source (/usr/lib/python3.6/unittest/loader.py:286):
is_not_importable = not os.path.isfile(os.path.join(start_dir, '__init__.py'))
which caused unittest discovery to fail to run my tests. Why is this line still present in the Python 3.6 library (Ubuntu 17.10, in case it matters), if __init__.py is no longer required since Python 3.3?
I believe that's a bug, but I want a confirmation.
When there's no __init__.py in the foo directory, the following command runs fine ({PROJECT_HOME} being a placeholder):
python3.6 -m unittest discover tests.foo -t {PROJECT_HOME} -p "*.py"
while this fails (with ImportError: Start directory is not importable):
python3.6 -m unittest discover tests/foo -t {PROJECT_HOME} -p "*.py"
The only difference is the dot versus the slash (tests.foo vs tests/foo). When there is an __init__.py, both commands work the same.
Python package directories depend on __init__.py to control Python's behaviour when importing modules.
So you have to follow the guidelines in Packaging namespace packages.
Most cases will use the native namespace packages approach.
Another important thing is PYTHONPATH, an environment variable you can set to add additional directories where Python will look for modules and packages. For most installations you should not set this variable, since it is not needed for Python to run; Python knows where to find its standard library.
The only reason to set it is to maintain directories of custom Python libraries that you do not want to install in the global default location (i.e., the site-packages directory).
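Putting the two together, here is a minimal sketch of a native namespace package (all names hypothetical). Neither directory contains an __init__.py, yet both contribute to the same top-level package:
path1/mynamespace/subA/module.py
path2/mynamespace/subB/module.py
With both path1 and path2 on sys.path (for example via PYTHONPATH), Python 3.3+ merges them, so both imports work:
import mynamespace.subA.module
import mynamespace.subB.module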
I have a shared Python library that I use in multiple projects, so the structure looks like this:
Project1/
    main.py    <--- (One of the projects that uses the library)
    ...
sharedlib/
    __init__.py
    ps_lib.py
    another.py
Now in each project's main.py I use the following hack to make it work:
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import sharedlib.ps_lib
...
Is there a way to do it without using this hack? Or is there a better way to organize the projects structure?
I think the best way would be to make sharedlib a real package. That means changing the structure a bit:
sharedlib/
    sharedlib/
        __init__.py
        ps_lib.py
        another.py
    setup.py
And using something like this in the setup.py (taken partially from Python-packaging "Minimal Structure"):
from setuptools import setup

setup(name='sharedlib',
      version='0.1',
      description='...',
      license='...',
      packages=['sharedlib'],  # you might need to change this if you have subfolders.
      zip_safe=False)
Then install it with python setup.py develop or pip install -e . when in the root folder of the sharedlib package.
That way (using the develop or -e option), changes to the contents of the sharedlib/sharedlib/* files will be visible without re-installing the sharedlib package, although you may need to restart the interpreter if you're working in an interactive session. That's because the interpreter caches already-imported packages.
From the setuptools documentation:
Setuptools allows you to deploy your projects for use in a common directory or staging area, but without copying any files. Thus, you can edit each project’s code in its checkout directory, and only need to run build commands when you change a project’s C extensions or similarly compiled files. [...]
To do this, use the setup.py develop command.
(emphasis mine)
The most important thing is that you can import sharedlib everywhere now. There is no need to insert the sharedlib directory into PATH or PYTHONPATH anymore, because Python (or at least the Python environment in which you installed it) now treats sharedlib like any other installed package.
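Once sharedlib is installed this way, each project's main.py shrinks to a plain import with no sys.path manipulation:
# Project1/main.py
import sharedlib.ps_lib
# ... use sharedlib.ps_lib as before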
The way we do it is to use bash entry scripts for the Python scripts. Our directory structure would look similar to the following:
/opt/stackoverflow/
-> bin
-> conf
-> lib
-> log
Our lib folder then contains all of our sub-projects
/opt/stackoverflow/lib/
-> python_algorithms
-> python_data_structures
-> python_shared_libraries
and then when we want to execute a Python script, we execute it via a bash script within the bin directory
/opt/stackoverflow/bin/
-> quick_sort.sh
-> merge_sort.sh
and if we cat one of our entry scripts
cat merge_sort.sh
#!/bin/bash
export STACKOVERFLOW_HOME=/opt/stackoverflow
export STACKOVERFLOW_BIN=${STACKOVERFLOW_HOME}/bin
export STACKOVERFLOW_LIB=${STACKOVERFLOW_HOME}/lib
export STACKOVERFLOW_LOG=${STACKOVERFLOW_HOME}/log
export STACKOVERFLOW_CONF=${STACKOVERFLOW_HOME}/conf
# Do any pre-script server work here
export PYTHONPATH=${PYTHONPATH}:${STACKOVERFLOW_LIB}
/usr/bin/python "${STACKOVERFLOW_LIB}/python_algorithms/merge_sort.py" "$@" 2>&1
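Because the wrapper exports ${STACKOVERFLOW_LIB} onto PYTHONPATH before invoking Python, the script itself can use plain imports for the shared code (the module name below is hypothetical):
# python_algorithms/merge_sort.py
import python_shared_libraries.sort_helpers   # hypothetical module, resolved via the PYTHONPATH set by the wrapper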
I have a simple Python project with effectively one package (called forcelib) containing one module (also called forcelib):
- setup.py
- forcelib
|- __init__.py
|- forcelib.py
My setup.py is copied from the official example and has the obvious edits.
The problem is that I can install the forcelib package using pip, but when I import forcelib, only the "double-underscore" attributes are visible. That is, I cannot see the forcelib module.
Example to replicate:
git clone https://github.com/blokeley/forcelib
cd forcelib
pip install -e .
python
import forcelib
print(forcelib.__version__) # Correctly prints 0.1.2
dir(forcelib) # The only contents are the __version__, __path__ etc. double-underscore attributes. I had expected to see forcelib, example_read etc.
Perhaps I'm supposed to distribute just the module rather than bother with a package.
The (very small) project is on GitHub.
Any advice would be much appreciated.
It seems that there are 2 ways of doing it:
Keep the same directory structure but put the following in __init__.py
from .forcelib import *
Distribute a module, not a package. Follow the instructions to use the py_modules argument rather than the packages argument in setup.py. This would mean restructuring the project to:
setup.py
forcelib.py
Approach (1) can be seen here. It has the advantage of hiding the private functions and attributes (anything not in __all__), but the client can still see the module forcelib.forcelib, which I don't think it should be able to.
Approach (2) can be seen here. It is simpler, but has the disadvantage that it does not hide private functions and attributes.
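A middle ground between the two (a sketch, with hypothetical function names) is to re-export only the intended public names in __init__.py instead of using a bare import *:
# forcelib/__init__.py
from .forcelib import example_read   # hypothetical public function
__all__ = ['example_read']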
Download the zip file and extract it, then go to the forcelib-master directory.
Open a command prompt, change into the forcelib-master directory, and run the command
python setup.py install
It will install the package successfully.
Hi, I currently have a Python project that uses subprocess.Popen to run some batch files.
Is it possible to package the batch files as source? Then, when some of our other Python projects use setup.py to include the current project in install_requires, the other project could install and update those batch files and use them from source (i.e. run those scripts with subprocess.Popen as well).
Does anyone have an idea how I should do it?
Thanks in advance!
If you have bash scripts that are required to run your Python package, you can store them within your package folder and have them included when the package is installed with setuptools (you will need to declare them; see the setup.py sketch below). Here is an example of a possible folder structure:
/myproject
    /myproject
        __init__.py
        main.py
        batch.sh
    setup.py
In main.py you could access the batch file like this:
import os.path
import subprocess
current_dir = os.path.dirname(os.path.abspath(__file__))
batch_script = os.path.join(current_dir, 'batch.sh')
subprocess.call(batch_script)
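One caveat: setuptools only installs non-Python files such as batch.sh if you declare them. A minimal sketch of a setup.py for the layout above, using package_data:
from setuptools import setup

setup(name='myproject',
      version='0.1',
      packages=['myproject'],
      package_data={'myproject': ['batch.sh']})  # ship batch.sh inside the package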
UPDATE
Based on other comments, if you instead need a way to make batch scripts accessible to third-party packages, you could specify the scripts via the scripts key in setuptools. You can see this available option in the setuptools documentation.
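For example, a minimal sketch (paths hypothetical); files listed under scripts are installed into the environment's bin/ directory, so other packages can invoke them by name:
from setuptools import setup

setup(name='myproject',
      version='0.1',
      packages=['myproject'],
      scripts=['myproject/batch.sh'])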