Hi, I currently have a Python project that uses subprocess.Popen to run some batch files.
Is it possible to package the batch files as part of the source? That way, when one of our other Python projects lists this project in install_requires in its setup.py, the other project would install and update those batch files along with the package and could use them from source (i.e. run these scripts with subprocess.Popen as well).
Does anyone have an idea of how I should do this?
Thanks in advance!
If you have shell scripts that are required to run your Python package, you can store them within your package folder and have them included when the package is installed with setuptools. Here is an example of a possible folder structure:
/myproject
/myproject
__init__.py
main.py
batch.sh
setup.py
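Note that setuptools does not always include non-Python files automatically; a minimal setup.py sketch that declares the batch script as package data (the file and package names are taken from the layout above, the rest is illustrative) could look like this:
from setuptools import setup

setup(
    name='myproject',
    version='0.1',
    packages=['myproject'],
    # Ship batch.sh inside the installed package so it sits next to main.py
    # at runtime and can be located relative to __file__.
    package_data={'myproject': ['batch.sh']},
)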
In main.py you could then locate and run the batch file like this:
import os.path
import subprocess
current_dir = os.path.dirname(os.path.abspath(__file__))
batch_script = os.path.join(current_dir, 'batch.sh')
subprocess.call(batch_script)
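If you specifically need the subprocess.Popen interface from the question (for example, to capture the script's output), the same path works; a small sketch:
import os.path
import subprocess

# Build an absolute path to the script that sits next to this module
batch_script = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'batch.sh')

# Run the packaged script and capture its standard output
proc = subprocess.Popen([batch_script], stdout=subprocess.PIPE)
out, _ = proc.communicate()
print(out.decode())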
UPDATE
Based on other comments, if you instead need a way to make the batch scripts accessible to third-party packages, you can list them under the scripts keyword in setuptools. You can see this option in the setuptools documentation.
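A minimal sketch of what that could look like (the script path comes from the layout above; everything else is illustrative):
from setuptools import setup

setup(
    name='myproject',
    version='0.1',
    packages=['myproject'],
    # Install the batch file into the environment's bin/Scripts directory
    # so that other packages can invoke it by name.
    scripts=['myproject/batch.sh'],
)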
Related
When we run a Python script (.py file) directly, the interpreter looks for imports in the directory in which the running script is located, not in the current working directory.
When we run a Python module using the -m switch, it adds the current working directory to the path.
However, when we run a Python package using the -m switch, which directory is added to the path: the current directory, the directory containing the package, or the package directory itself?
Can someone shed some light on this?
The current directory.
Seems like the behavior is similar between a module and a package in this regard. The documentation on running with -m:
https://docs.python.org/3/using/cmdline.html#cmdoption-m
... the current directory will be added to the start of sys.path.
Running a package with -m will add the current directory to the path, not the package's directory or the directory containing the package.
From the -m docs:
As with the -c option, the current directory will be added to the start of sys.path.
This does not depend on whether the specified module is a package or a normal module. (It could even be a namespace package, which does not have a single directory for its contents or a single containing directory.)
Note that the current directory is added to the path before attempting to resolve the name of the package or module to run, which is why it is even possible to use -m to run modules and packages in the current directory.
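One way to see this for yourself: put a __main__.py in a throwaway package and print sys.path[0] (the package name mypkg below is just an illustration):
# mypkg/__main__.py  (mypkg is a hypothetical throwaway package)
import sys

# When invoked as "python -m mypkg" from some directory, this prints that
# directory (the current working directory; on some versions it may show up
# as an empty string), not the directory containing mypkg.
print(sys.path[0])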
From my last experience, it is the directory containing the package.
However, to check that, run this to see all the directories Python will look in:
import sys
print('\n'.join(sys.path))
Tip: If you would like to load custom .py files from other directories, do:
import sys
sys.path.insert(1, "<path>")
OR
import sys
sys.path.append("<path>")
My system administrator will not allow global installation of Python packages.
I'm writing a script that people will invoke to perform certain actions for them. The script needs certain libraries, like sqlalchemy and coloredlogs. I am, however, allowed to install Python libraries in any local folder, i.e. not site-packages.
How would I go about installing the libs in the same folder as the script so that the script has access to them?
My folder hierarchy is like so
script_to_invoke.py
scriptpack/
bin/
coloredlogs
coloredlogs.egg
...
utils/
util1.py
util2.py
(all the folders indicated have an __init__.py)
What I've tried so far:
Within script_to_invoke.py I use
from scriptpack.utils import util1  # no problem here
from scriptpack.bin import coloredlogs # fails to find the import
I've looked at some other SO answers but I'm not sure how to correlate them with my problem.
I figured it out!
Python had to be directed to find the .egg files.
This can be done by either:
Editing the PYTHONPATH variable BEFORE the interpreter is started, or
Appending the full paths to the eggs to sys.path
Code Below:
import sys
for entry in [<list of full path to egg files in bin dir>]:
sys.path.append(str(entry))
# Proceed with local imports
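For example, a sketch that collects every .egg under scriptpack/bin automatically (the directory names come from the layout above; adjust as needed):
import sys
from pathlib import Path

# Append every egg found in scriptpack/bin to sys.path so that the
# packages inside them become importable.
bin_dir = Path(__file__).resolve().parent / "scriptpack" / "bin"
for egg in bin_dir.glob("*.egg"):
    sys.path.append(str(egg))

import coloredlogs  # now resolvable from the egg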
You might want to try packaging everything up as a zipapp. Doing so makes a single zip file that acts as a Python script, but can contain a whole multitude of embedded packages. The steps to make it are:
Make a folder with the name of your program (testapp in my example)
Name your main script __main__.py and put it in that folder
Using pip, install the required packages to the folder with --target=/path/to/testapp
Run python3 -m zipapp testapp -p='/usr/bin/env python3' (providing the shebang line is optional; without it, users will need to run the package with python3 testapp.pyz, while with the shebang, they can just do ./testapp.pyz)
That creates a zip file with all your requirements embedded in it alongside your script, that doesn't even need to be unpacked to run (Python knows how to run zip apps natively). As a trivial example:
$ mkdir testapp
$ echo -e '#!/usr/bin/python3\nimport sqlalchemy\nprint(sqlalchemy)' > testapp/__main__.py
$ pip3 install --target=./testapp sqlalchemy
$ python3 -mzipapp testapp -p='/usr/bin/env python3'
$ ./testapp.pyz
<module 'sqlalchemy' from './testapp.pyz/sqlalchemy/__init__.py'>
showing how the simple main was able to access sqlalchemy from within the same zipapp. It's also smaller (thanks to the zipping) than distributing the uncompressed modules:
$ du -s -h testapp*
13M testapp
8.1M testapp.pyz
You can install these packages in a non-global location (generally in ~/.local/lib/python<x.y>) using the --user flag, e.g.:
pip install --user sqlalchemy coloredlogs
That way you don't have to worry about changing how imports work, and you're still compliant with your sysadmin's policies.
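If you want to confirm where those user installs end up (the path varies by platform and Python version), a quick check:
import site

# Directory that "pip install --user" targets for this interpreter
print(site.getusersitepackages())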
I have a shared python library that I use in multiple projects, so the structure looks like this:
Project1
main.py <--- (One of the projects that uses the library)
...
sharedlib
__init__.py
ps_lib.py
another.py
Now in each project's main.py I use the following hack to make it work:
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import sharedlib.ps_lib
...
Is there a way to do it without using this hack? Or is there a better way to organize the project structure?
I think the best way would be to make sharedlib a real package. That means changing the structure a bit:
sharedlib/
sharedlib/
__init__.py
ps_lib.py
another.py
setup.py
And using something like this in the setup.py (taken partially from Python-packaging "Minimal Structure"):
from setuptools import setup
setup(name='sharedlib',
      version='0.1',
      description='...',
      license='...',
      packages=['sharedlib'],  # you might need to change this if you have subfolders.
      zip_safe=False)
Then install it with python setup.py develop or pip install -e . when in the root folder of the sharedlib package.
That way (using the develop or -e option) changes to the contents of sharedlib/sharedlib/* files will be visible without re-installing the sharedlib package - although you may need to restart the interpreter if you're working in an interactive interpreter. That's because the interpreter caches already imported packages.
From the setuptools documentation:
Setuptools allows you to deploy your projects for use in a common directory or staging area, but without copying any files. Thus, you can edit each project’s code in its checkout directory, and only need to run build commands when you change a project’s C extensions or similarly compiled files. [...]
To do this, use the setup.py develop command.
(emphasis mine)
The most important thing is that you can import sharedlib everywhere now - no need to insert the sharedlib package in the PATH or PYTHONPATH anymore because Python (or at least the Python where you installed it) now treats sharedlib like any other installed package.
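After the develop/editable install, the hack from the question can be dropped; each project's main.py reduces to plain imports:
# Project1/main.py - no sys.path manipulation needed once sharedlib is
# installed with "python setup.py develop" or "pip install -e ."
import sharedlib.ps_lib
from sharedlib import another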
The way we do it is to use bash entry scripts for the Python scripts. Our directory structure would look similar to the following:
/opt/stackoverflow/
-> bin
-> conf
-> lib
-> log
Our lib folder then contains all of our sub-projects
/opt/stackoverflow/lib/
-> python_algorithms
-> python_data_structures
-> python_shared_libraries
and then when we want to execute a python script, we'll execute it via a bash script within the bin directory
/opt/stackoverflow/bin/
-> quick_sort.sh
-> merge_sort.sh
and if we cat one of our entry scripts
cat merge_sort.sh
#!/bin/bash
export STACKOVERFLOW_HOME=/opt/stackoverflow
export STACKOVERFLOW_BIN=${STACKOVERFLOW_HOME}/bin
export STACKOVERFLOW_LIB=${STACKOVERFLOW_HOME}/lib
export STACKOVERFLOW_LOG=${STACKOVERFLOW_HOME}/log
export STACKOVERFLOW_CONF=${STACKOVERFLOW_HOME}/conf
# Do any pre-script server work here
export PYTHONPATH=${PYTHONPATH}:${STACKOVERFLOW_LIB}
/usr/bin/python "${STACKOVERFLOW_LIB}/python_algorithms/merge_sort.py" "$@" 2>&1
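For context, here is a sketch of what one of the wrapped scripts might look like; the contents are illustrative, not the real script, and the point is that imports of sibling projects under lib/ only resolve because the wrapper exported PYTHONPATH with ${STACKOVERFLOW_LIB}:
# python_algorithms/merge_sort.py - illustrative sketch, not the real script
import sys

# Sibling projects such as python_shared_libraries are importable here
# because the bash wrapper put /opt/stackoverflow/lib on PYTHONPATH, e.g.:
# from python_shared_libraries import some_helpers  # hypothetical import

def main(argv):
    print("merge_sort called with:", argv)

if __name__ == '__main__':
    main(sys.argv[1:])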
I have just created a Python command line tool (xiber) called xiber.py that creates an iPad.xib file from an iPhone.xib. On my own computer I do:
alias xiber='python path_to/xiber.py'
Then I can use 'xiber' anywhere on my own computer. I want to share this tool with other developers.
My question is: how could they use it without doing the alias xiber='....' step, and just use the xiber command?
Thanks.
Set up your Git repo like this:
bin/
xiber # Formerly xiber.py
README.md
setup.py
In the setup.py file:
from distutils.core import setup
# Typing this from memory (not tested).
setup(
    name = 'Xiber',
    version = '0.1',
    scripts = ['bin/xiber'],
    # Other options if desired ...
)
People who want to install your script can use ordinary tools like pip or easy_install, which will put your script in the appropriate bin directory of the Python installation. For example:
pip install git+git://github.com/liaa/xiber
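As a side note, if you use setuptools instead of distutils, an entry_points console script is a common alternative to the scripts key; a sketch (it assumes xiber.py sits at the repo root and exposes a main() function, which may not match your code):
from setuptools import setup

setup(
    name='Xiber',
    version='0.1',
    py_modules=['xiber'],
    entry_points={
        'console_scripts': [
            # Creates a "xiber" command that calls xiber.main()
            'xiber = xiber:main',
        ],
    },
)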
Another way is to provide a xiber bash script and a xiber.bat batch file in a /bin/ folder that is distributed with the project. The end user can then add that bin folder to their PATH.
Create an executable with PyInstaller
Another option is to create a standalone executable file using PyInstaller.
If you check the tutorial, you will find an example that starts from exactly your scenario - turning a Python script into an executable (I used it and eventually switched to the "single file" method); it works.
I'm attempting to make a Pyinstaller build for Windows 7 using Pyinstaller 2.1. The module uses relative imports because the package is typically used in Linux as a 'regular' Python package. Is there a way to create a spec file to handle this type of setup? Ideally I would like to have a Python package that I can make a Pyinstaller exe with for Windows and have a 'regular' pip-installable Python package in Linux/OS X.
I was thinking of maybe using hidden imports or something to achieve this.
I've tried using the default Pyinstaller settings and pointing it at my 'main' python script. I get the following from the resulting exe:
'Attempted relative import in non-package'
This makes sense because I'm pointing Pyinstaller at my main.py file in the package and Pyinstaller is NOT picking up my entire package. main.py is just the starting point for using the module from the command line; however, you can import the package and use it in your own code as well.
Sidenote:
The reason is that this package requires numpy and scipy. Yes, I know there are good ways to get these working on Windows with Anaconda, etc. However, I'm stuck with an exe setup for now for legacy reasons.
I can't find a way to get Pyinstaller to do this. However, I don't think it's the fault of Pyinstaller. It's more of a problem with the way I structured my package.
I was passing a script to Pyinstaller that was a part of my package. The better way to do that is to provide a simple Python script outside of the package that serves as the CLI front-end to the package.
For example, consider a package layout like this (assume files use relative imports):
repo_dir/
setup.py
README.md
package_a/
main.py
support_module.py
__init__.py
My previous attempt was trying to build main.py by passing it to Pyinstaller. This resulted in the error mentioned in the above question.
However, I later added a cli.py script that does something like this:
from package_a.main import main

if __name__ == '__main__':
    main()
Now, I can pass cli.py to Pyinstaller and my explicit relative imports are only used inside of the package. So, it all works. Here's a sample directory layout that works just for reference:
repo_dir/
setup.py
cli.py
README.md
package_a/
main.py
support_module.py
__init__.py
In my case (macOS 12.x), I had some relative imports similar to this:
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from utils import *
and I had a structure like this
dirA
utils.py
dirB
my_app.py
and the created .app file didn't work. The problem was that I was running pyinstaller from within dirB, where the utils.py module couldn't be found by pyinstaller. The solution was to specify the paths from which my_app.py imports modules. This is possible either explicitly, with pyinstaller's -p option/flag, or implicitly, by running pyinstaller from dirA and adding an empty __init__.py in dirB. Adding the __init__.py "forces" pyinstaller to extend PYTHONPATH with dirA instead of dirB.