I have just created a Python command-line tool called xiber.py that creates an iPad .xib file from an iPhone .xib. On my own computer I do:
alias xiber='python path_to/xiber.py'
Then I can use 'xiber' anywhere on my own computer. I want to share this tool with other developers.
My question is: how can they use the xiber command directly, without doing the alias xiber='....' setup themselves?
Thanks.
Set up your Git repo like this:
bin/
    xiber        # Formerly xiber.py
README.md
setup.py
In the setup.py file (distutils shown here; setuptools.setup takes the same arguments and is the safer choice on modern Python, since distutils was removed in Python 3.12):

from distutils.core import setup

# Typing this from memory (not tested).
setup(
    name='Xiber',
    version='0.1',
    scripts=['bin/xiber'],
    # Other options if desired ...
)
People who want to install your script can use ordinary tools like pip or easy_install, which will put your script in the appropriate bin directory of the Python installation. For example:
pip install git+https://github.com/liaa/xiber

(GitHub no longer serves the unauthenticated git:// protocol, so use https.)
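For reference, bin/xiber itself can be an ordinary Python file with a shebang line; a minimal sketch (how the conversion logic is organized into main() is an assumption):

#!/usr/bin/env python
# bin/xiber -- thin launcher for the converter
import sys

def main():
    # hypothetical: read the iPhone .xib path from sys.argv
    # and write the iPad .xib next to it
    ...

if __name__ == '__main__':
    sys.exit(main())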
Another way is to provide a xiber bash script and a xiber.bat batch file in a bin/ folder distributed with the project. The end user then adds that bin folder to their PATH.
Create an executable with PyInstaller
Another option is to create a standalone executable with PyInstaller.
Its tutorial walks through exactly your scenario, turning a Python script into an executable (I used it and eventually switched to the single-file mode); it works.
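In short (these are the standard PyInstaller commands; the single-file build is what I ended up using):

$ pip install pyinstaller
$ pyinstaller --onefile xiber.py
$ ls dist/   # the standalone executable ends up here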
Related
My system administrator will not allow global installation of python packages.
I'm writing a script that people will invoke to perform certain actions for them. The script needs certain libraries like sqlalchemy and coloredlogs. I am, however, allowed to install Python libs in any local folder, i.e. not site-packages.
How would I go about installing the libs in the same folder as the script so that the script has access to them?
My folder hierarchy is like so:

script_to_invoke.py
scriptpack/
    bin/
        coloredlogs
        coloredlogs.egg
        ...
    utils/
        util1.py
        util2.py

(all the folders indicated have an __init__.py)
What I've tried so far:
Within script_to_invoke.py I use:

from scriptpack.utils import util1      # no problem here
from scriptpack.bin import coloredlogs  # fails to find the import
I've looked at some other SO answers but I'm not sure how to relate them to my problem.
I figured it out!
Python had to be directed to find the .egg files
This can be done by either:
editing the PYTHONPATH var BEFORE the interpreter is started, or
appending the full paths to the eggs to sys.path.
Code below:

import sys

for entry in [<list of full path to egg files in bin dir>]:
    sys.path.append(str(entry))

# Proceed with local imports
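If you don't want to hard-code that list, here is a sketch that collects every .egg under the bin directory at runtime (the relative location of scriptpack/bin is an assumption based on the layout above):

import glob
import os
import sys

# Assumed layout: this file sits next to scriptpack/bin (see above)
BIN_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                       'scriptpack', 'bin')

# Add every egg found in the bin directory to the import path
for entry in glob.glob(os.path.join(BIN_DIR, '*.egg')):
    sys.path.append(entry)

import coloredlogs  # now resolvable from the appended eggs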
You might want to try packaging everything up as a zipapp. Doing so makes a single zip file that acts as a Python script but can contain a whole multitude of embedded packages. The steps to make it are:
Make a folder with the name of your program (testapp in my example)
Name your main script __main__.py and put it in that folder
Using pip, install the required packages to the folder with --target=/path/to/testapp
Run python3 -m zipapp testapp -p '/usr/bin/env python3' (providing the shebang line is optional; without it, users will need to run the package with python3 testapp.pyz, while with the shebang, they can just do ./testapp.pyz)
That creates a zip file with all your requirements embedded in it alongside your script, that doesn't even need to be unpacked to run (Python knows how to run zip apps natively). As a trivial example:
$ mkdir testapp
$ echo -e '#!/usr/bin/python3\nimport sqlalchemy\nprint(sqlalchemy)' > testapp/__main__.py
$ pip3 install --target=./testapp sqlalchemy
$ python3 -m zipapp testapp -p '/usr/bin/env python3'
$ ./testapp.pyz
<module 'sqlalchemy' from './testapp.pyz/sqlalchemy/__init__.py'>
showing how the simple main was able to access sqlalchemy from within the same zipapp. It's also smaller (thanks to the zipping) than distributing the uncompressed modules:
$ du -s -h testapp*
13M testapp
8.1M testapp.pyz
You can install these packages in a non-global location (generally in ~/.local/lib/python<x.y>) using the --user flag, e.g.:
pip install --user sqlalchemy coloredlogs
That way you don't have to worry about changing how imports work, and you're still compliant with your sysadmin's policies.
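Note that scripts installed with --user go into the user base's bin directory (e.g. ~/.local/bin on Linux), which may not be on your PATH by default; you can locate the user base with (output shown is illustrative):

$ python -m site --user-base
/home/you/.local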
I have a shared python library that I use in multiple projects, so the structure looks like this:
Project1/
    main.py   <--- (one of the projects that uses the library)
    ...
sharedlib/
    __init__.py
    ps_lib.py
    another.py
Now in each project's main.py I use the following hack to make it work:
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import sharedlib.ps_lib
...
Is there a way to do it without using this hack? Or is there a better way to organize the projects structure?
I think the best way would be to make sharedlib a real package. That means changing the structure a bit:
sharedlib/
    sharedlib/
        __init__.py
        ps_lib.py
        another.py
    setup.py
And using something like this in the setup.py (taken partially from Python-packaging "Minimal Structure"):
from setuptools import setup

setup(name='sharedlib',
      version='0.1',
      description='...',
      license='...',
      packages=['sharedlib'],  # you might need to change this if you have subfolders
      zip_safe=False)
Then install it with python setup.py develop or pip install -e . when in the root folder of the sharedlib package.
That way (using the develop or -e option) changes to the contents of sharedlib/sharedlib/* files will be visible without re-installing the sharedlib package - although you may need to restart the interpreter if you're working in an interactive interpreter. That's because the interpreter caches already imported packages.
From the setuptools documentation:
Setuptools allows you to deploy your projects for use in a common directory or staging area, but without copying any files. Thus, you can edit each project’s code in its checkout directory, and only need to run build commands when you change a project’s C extensions or similarly compiled files. [...]
To do this, use the setup.py develop command.
(emphasis mine)
The most important thing is that you can import sharedlib everywhere now - no need to insert the sharedlib package in the PATH or PYTHONPATH anymore because Python (or at least the Python where you installed it) now treats sharedlib like any other installed package.
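Concretely, the hack in each project's main.py reduces to plain imports (the function call at the end is illustrative):

# Project1/main.py -- after `pip install -e .` in the sharedlib root
import sharedlib.ps_lib
from sharedlib import another

sharedlib.ps_lib.do_something()  # hypothetical function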
The way we do it is to use bash entry-scripts for the python scripts. Our directory structure would look similar to the following:
/opt/stackoverflow/
-> bin
-> conf
-> lib
-> log
Our lib folder then contains all of our sub-projects
/opt/stackoverflow/lib/
-> python_algorithms
-> python_data_structures
-> python_shared_libraries
and then when we want to execute a python script, we'll execute it via a bash script within the bin directory
/opt/stackoverflow/bin/
-> quick_sort.sh
-> merge_sort.sh
and if we cat one of our entry scripts
cat merge_sort.sh
#!/bin/bash

export STACKOVERFLOW_HOME=/opt/stackoverflow
export STACKOVERFLOW_BIN=${STACKOVERFLOW_HOME}/bin
export STACKOVERFLOW_LIB=${STACKOVERFLOW_HOME}/lib
export STACKOVERFLOW_LOG=${STACKOVERFLOW_HOME}/log
export STACKOVERFLOW_CONF=${STACKOVERFLOW_HOME}/conf

# Do any pre-script server work here
export PYTHONPATH=${PYTHONPATH}:${STACKOVERFLOW_LIB}

# "$@" (rather than $*) preserves quoted arguments
/usr/bin/python "${STACKOVERFLOW_LIB}/python_algorithms/merge_sort.py" "$@" 2>&1
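On the Python side, the script can then pick those locations up from the environment; a small sketch (how merge_sort.py actually uses them is an assumption):

import os

# Locations exported by the bash entry script above
conf_dir = os.environ.get('STACKOVERFLOW_CONF', '/opt/stackoverflow/conf')
log_dir = os.environ.get('STACKOVERFLOW_LOG', '/opt/stackoverflow/log')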
EDIT: Based on the discussion below, I think my question really applies to any language. Python naturally has a packaging system and installation procedure via pip, but suppose this were C code or a Perl script. Users would still download the program files and need a way to execute the code on the command line via capsall input.txt.
I have a python script that takes an input file and gives an output file.
For a concrete example, this script file1.py takes in an input text file and outputs a text file with all letters capitalized:
import sys

inFile = sys.argv[1]
outFile = sys.argv[2]

with open(inFile, 'r') as input_file:
    lines = input_file.readlines()

# Process the input file somehow.
# Here we simply capitalize the text,
# but naturally something more complex is possible.
capitalized_lines = []
for line in lines:
    capitalized_lines.append(line.upper())

with open(outFile, 'w') as output_file:
    for line in capitalized_lines:
        output_file.write(line)
The way users execute this code now is:
python file1.py input.txt output.txt
Let's say I wanted to distribute this code such that users would download a tarball and be able to execute the above in the command line with (for example)
capsall input.txt
which would run python file1.py and output the file output.txt. Does one write a bash script? If so, how do you distribute the code such that users will have this in their PATH?
Add a "hash bang" at the top of the script file to tell bash to invoke the Python interpreter. Also make your script executable:
#!/usr/bin/env python
import sys
inFile = sys.argv[1]
outFile = sys.argv[2]
...
Make the script file executable:
$ cp file1.py capsall
$ chmod +x capsall
Now you can run the script with:
$ ./capsall input.txt output.txt
Or if capsall is on your path:
$ capsall input.txt output.txt
I would recommend using Python packaging (e.g., pip) for this. You could use an OS-specific packaging method as well, something like apt, yum, or msiexec, but I wouldn't unless you have to. Your users already have Python installed, since they are used to explicitly passing your script to the interpreter. If you are interested in using Python packaging, then read on.
First, I would put your script into a package along with whatever other processing is necessary. Let's call it myproject. Your project should look something like:
myproject
|-- README.rst
`-- myproject
|-- __init__.py
`-- capsall.py
Keep __init__.py simple and make sure that you can import it without dependencies. I use something like:
version_info = (0, 0, 0)
version = '.'.join(str(v) for v in version_info)
Next, add a file named setup.py at the root. This is what will define your package, its metadata, and the scripts to install into your users' $PATH. The following will work nicely:
#!/usr/bin/env python
import setuptools

import myproject

setuptools.setup(
    name='myproject',
    version=myproject.version,
    description='Package of useful utilities.',
    long_description=open('README.rst').read(),
    url='https://github.com/me/myproject',
    author='Me',
    author_email='me@example.com',
    packages=['myproject'],
    entry_points={
        'console_scripts': ['capsall=myproject.capsall:main'],
    },
    license='BSD',
)
If you plan on uploading to pypi.python.org, then you will need at least that much metadata. I would recommend adding classifiers as well. The Python Packaging Authority docs on writing a setup.py are invaluable.
The part that you are most interested in is the entry_points keyword parameter. It defines various ways that your package can be invoked. You are looking for console_scripts since it creates shell scripts that are installed into the local path. See the setuptools documentation for more details.
The console_scripts definition that I gave above creates a script named capsall that invokes the myproject.capsall.main function. I would repackage your code into a project: bundle your script into the main function of the capsall.py module, then generate a source distribution and upload it to pypi.python.org (./setup.py sdist register upload). Once you have it uploaded, your users can install it using pip install myproject.
Chances are that you have code that you don't want to give to the outside world. If that is the case, then you can generate a source distribution and make the tarball available on an internal server -- ./setup.py sdist will generate a tarball in the dist directory. Then pip install whatever/myproject-0.0.0.tar.gz will install your script.
Hi, I currently have a Python project that uses subprocess.Popen to run some batch files.
Is it possible to package the batch files as source? Then, when some of our other Python projects use setup.py to include the current project in install_requires, those projects could install and update the batch files and run them from source (i.e. with subprocess.Popen as well).
Anyone have some idea how should I do it?
Thanks in advance!
If you have batch or bash scripts that are required to run your Python package, you can store them within your package folder and have setuptools include them when the package is installed (see the caveat below). Here is an example of a possible folder structure:
/myproject
    /myproject
        __init__.py
        main.py
        batch.sh
    setup.py
In main.py you can access the batch file like this:
import os.path
import subprocess
current_dir = os.path.dirname(os.path.abspath(__file__))
batch_script = os.path.join(current_dir, 'batch.sh')
subprocess.call(batch_script)
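One caveat: setuptools does not include non-Python files automatically, so declare the batch file explicitly via package_data. A minimal sketch of the matching setup.py (names follow the layout above; an sdist may additionally need a MANIFEST.in entry):

from setuptools import setup

setup(name='myproject',
      version='0.1',
      packages=['myproject'],
      # ship batch.sh inside the installed package
      package_data={'myproject': ['batch.sh']})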
UPDATE
Based on other comments: if you instead need a way to make batch scripts accessible to third-party packages, you can list them under the 'scripts' keyword in setuptools. This option is described in the setuptools documentation.
I can't figure out how to make setup.py add a script to the user's /bin or /usr/bin or whatever.
E.g., I'd like to add a myscript.py to /usr/bin so that the user can call myscript.py from any directory.
Consider using console_scripts:
from setuptools import setup

setup(name='some-name',
      ...
      entry_points={
          'console_scripts': [
              'command-name = package.module:main_func_name',
          ],
      },
)
Where main_func_name is a main function in your main module.
command-name is the name under which the command will be saved in /usr/local/bin/ (usually).
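The target function takes no arguments and typically reads sys.argv itself; a minimal sketch of package/module.py (names match the entry point above):

# package/module.py
import sys

def main_func_name():
    args = sys.argv[1:]  # arguments typed after `command-name`
    print('got args:', args)
    return 0             # becomes the process exit code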
The Python documentation explains it under the installing scripts section.
Scripts are files containing Python source code, intended to be started from the command line.
setup(...,
      scripts=['scripts/xmlproc_parse', 'scripts/xmlproc_val']
)
As mentioned here, beside scripts, there is an entry_points mechanism, which is more cross-platform.
With entry_points you connect a command line tool name with a function of your choice, whereas scripts could point to any file (e.g. a shell script).
There are two ways to get a working command-line tool from the setuptools and PyPI infrastructure:
The "scripts" Keyword Argument
This allows the command-line execution of everything you want, it can be a Python script, a shell script or something completely different.
The "console_scripts" Entry Point
This allows Python functions (not scripts!) to be directly registered as command-line accessible tools.
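A minimal sketch showing both mechanisms side by side in one setup.py (all names are illustrative):

from setuptools import setup

setup(
    name='mytool',
    version='0.1',
    packages=['mytool'],
    # "scripts": ships arbitrary executable files verbatim
    scripts=['bin/helper.sh'],
    # "console_scripts": generates a platform-appropriate wrapper
    # around the Python function mytool.cli:main
    entry_points={
        'console_scripts': ['mytool = mytool.cli:main'],
    },
)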
If you're willing to build and install the entire Python package, this is how I would go about it:
Edit the setup() function in setup.py to contain a parameter named scripts and set its argument to the location of the file(s) you wish to run from anywhere, e.g.:

setup(name='myproject',
      author='',
      author_email='',
      scripts=['bin/myscript.py'])
Within the directory that contains setup.py, create a bin directory by typing mkdir bin
Add myscript.py to this newly-created bin directory (and make sure it's executable!)
cd into the directory that contains setup.py again, and install the entire python package by typing python setup.py install
Once the package is installed, you should be able to run myscript.py from anywhere on the system!
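You can verify where the script landed (the exact path varies by platform and Python installation):

$ which myscript.py
/usr/local/bin/myscript.py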