setup.py and adding file to /bin/ - python

I can't figure out how to make setup.py add a script to the user's /bin or /usr/bin or whatever.
E.g., I'd like to add a myscript.py to /usr/bin so that the user can call myscript.py from any directory.

Consider using console_scripts:
from setuptools import setup
setup(name='some-name',
      ...
      entry_points={
          'console_scripts': [
              'command-name = package.module:main_func_name',
          ],
      },
)
Here main_func_name is the main function in your main module, and command-name is the name under which the script will be saved in /usr/local/bin/ (usually).
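For illustration only, a minimal sketch of the function such an entry point could reference; package, module, and main_func_name are the placeholder names from the example above, not code from the question:
# package/module.py -- hedged sketch of the function behind the entry point
import sys

def main_func_name():
    # whatever the tool should do when invoked as `command-name`
    print("hello from command-name")
    return 0

if __name__ == "__main__":
    sys.exit(main_func_name())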

The Python documentation explains it under the installing scripts section.
Scripts are files containing Python source code, intended to be started from the command line.
setup(...,
      scripts=['scripts/xmlproc_parse', 'scripts/xmlproc_val']
)
As mentioned here, besides scripts, there is an entry_points mechanism, which is more cross-platform.
With entry_points you connect a command line tool name with a function of your choice, whereas scripts could point to any file (e.g. a shell script).

There are two ways to get a working command line tool from the setuptools and PyPI infrastructure:
The "scripts" Keyword Argument
This allows command-line execution of anything you want; it can be a Python script, a shell script, or something completely different.
The "console_scripts" Entry Point
This allows Python functions (not scripts!) to be directly registered as command-line accessible tools.
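The two mechanisms can also coexist in one setup() call. A minimal sketch with made-up names (mytool, mypackage, bin/helper.sh), shown only to contrast the two keywords:
from setuptools import setup

setup(
    name='mytool',
    version='0.1',
    packages=['mypackage'],
    # "scripts": arbitrary executable files copied into the scripts directory as-is
    scripts=['bin/helper.sh'],
    # "console_scripts": setuptools generates a wrapper that calls mypackage.cli:main
    entry_points={
        'console_scripts': [
            'mytool = mypackage.cli:main',
        ],
    },
)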

If you're willing to build and install the entire python package, this is how I would go about it:
Edit the setup() call in setup.py to include a parameter named scripts and set its argument to the location of the file(s) you wish to run from anywhere, e.g.
setup(name='myproject', author='', author_email='', scripts=['bin/myscript.py'])
Within the directory that contains setup.py, create a bin directory by typing mkdir bin
Add myscript.py to this newly-created bin directory and make sure it's executable (a minimal sketch of such a script follows these steps)
cd into the directory that contains setup.py again, and install the entire python package by typing python setup.py install
Once the package is installed, you should be able to run myscript.py from anywhere on the system!
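For reference, a minimal sketch of what bin/myscript.py might look like; the shebang line matters because the scripts keyword copies the file verbatim and it must be directly executable:
#!/usr/bin/env python
# bin/myscript.py -- illustrative placeholder; replace the body with your real logic
import sys

def main():
    print("arguments:", sys.argv[1:])
    return 0

if __name__ == "__main__":
    sys.exit(main())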

Related

How to write an installable python package for CLI based application [duplicate]

one_file.py contains some general functions, classes, and a main().
I'd like to make it pip installable with a command line script that calls the main() function.
What's a directory structure and setup.py that will do this?
You can get away with just a setup.py and your module, with no additional directories. In your setup.py just use setup(..., py_modules=['one_file'], ...). To install the script you can use the console_scripts entry point:
from setuptools import setup

setup(
    name='one-file',
    version='1.0',
    py_modules=['one_file'],
    entry_points={'console_scripts': ['one-file = one_file:main']},
)
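The question says one_file.py already has a main(); for completeness, a hedged sketch of the shape that function needs, since console_scripts calls it with no arguments:
# one_file.py -- sketch only; the real module has its own functions and classes
import sys

def main():
    # read options from sys.argv here; the entry point calls main() with no arguments
    print("running one-file with", sys.argv[1:])
    return 0

if __name__ == "__main__":
    sys.exit(main())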

How to install package with pip that has access to an R script

The issue
I need to create a python package that will have access to an R script and make it pip installable
The setup:
My package structure is as follows
foo/
    setup.py
    foo/
        foo.R
The contents of setup.py are:
from setuptools import setup, find_packages

setup(
    name='foo',
    packages=find_packages(),
    scripts=['foo/foo.R'],
    zip_safe=False,
)
and the contents of foo.R are:
#!/usr/bin/env Rscript
R.version
I am installing this package with:
pip install -e .
When I look at the installed R script (cat $(which foo.R)) it is no longer an R script, so when it is called it is run as Python and therefore fails.
What I am expecting:
#!/usr/bin/env Rscript
R.version
What I get:
#!/Users/jc33/miniconda3/bin/python
# EASY-INSTALL-DEV-SCRIPT: 'foo==0.0.0','foo.R'
__requires__ = 'foo==0.0.0'
__import__('pkg_resources').require('foo==0.0.0')
__file__ = '/Users/jc33/Desktop/foo/foo/foo.R'
exec(compile(open(__file__).read(), __file__, 'exec'))
Additional scope
This is a very contrived example that was created for the sole purpose of finding a solution, not to debate the merits of locating an R script using setuptools. For a bit more information: I am creating a more complex python package, similar to this one, that will need to call R at some point; this is done with subprocess.run, but I believe that is out of the scope of this question.
Python=='3.5.2'
setuptools=='27.2.0'
pip=='10.0.1'
This is happening because your R script is insufficiently unlike Python.
If you were to put in some R code that causes a Python syntax error, such as a leading . or any usage of $ or foo[[0]], you would find that setuptools (but really distutils+easy_install) does not produce the develop wrapper.
The source that controls the script wrapper can be found here. It tests with is_python_script, which calls is_python. This in turn just compiles the file: a successful compile returns True and causes easy_install to make the wrapper script, and an unsuccessful one returns False, in which case the file is simply copied verbatim (which is what you want).
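Roughly, the check boils down to whether the file compiles as Python. The sketch below is a simplification for illustration, not the actual setuptools source:
# simplified idea behind is_python: does the file compile as Python source?
def looks_like_python(path):
    try:
        with open(path) as f:
            compile(f.read(), path, 'exec')
    except SyntaxError:
        return False  # e.g. R code using $ or [[ ]] -- the file is copied verbatim
    return True       # easy_install generates a Python wrapper script instead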
scripts are for Python scripts. If you want to install something that isn't Python code at all, you need to treat it as data. In setuptools it's called package_data:
from setuptools import setup, find_packages

setup(
    name='foo',
    packages=find_packages(),
    package_data={
        'foo': ['foo.R'],
    },
    zip_safe=False,
)
The file will be installed as foo/foo.R and you can find it from a module in foo/; let's say it's foo/foo.py. The first thing foo.py does is find its own directory and then call foo.R:
import os
import subprocess

foo_dir = os.path.dirname(__file__)
# 'Rscript' matches the shebang used in foo.R
subprocess.check_call(['Rscript', os.path.join(foo_dir, 'foo.R')])
PS. Please note the code is supposed to live in a module in the foo/ directory. From a different directory it would have to derive the path from __file__ differently.
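If the R call should also be reachable from the command line, one possible arrangement (an assumption, not something stated in the answer) is to wrap it in a main() and register that via console_scripts, e.g. entry_points={'console_scripts': ['foo = foo.foo:main']}:
# foo/foo.py -- sketch: forward command-line arguments to the bundled R script
import os
import subprocess
import sys

def main():
    foo_dir = os.path.dirname(__file__)
    script = os.path.join(foo_dir, 'foo.R')
    return subprocess.call(['Rscript', script] + sys.argv[1:])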
This is normal behavior for "editable" installs. From the setuptools docs:
In addition, the develop command creates wrapper scripts in the target script directory that will run your in-development scripts after ensuring that all your install_requires packages are available on sys.path.
pip install -e isn't quite the same as python setup.py develop, but it's intended to be very close. I believe the implementation is a thin wrapper around python setup.py develop.

Including a python library with my script

My system administrator will not allow global installation of python packages.
I'm writing a script that people will invoke to perform certain actions for them. The script I'm writing needs certain libraries like sqlalchemy and coloredlogs. I am, however, allowed to install python libs in any local folder, i.e. not site-packages.
How would I go about installing the libs in the same folder as the script so that the script has access to them?
My folder hierarchy is like so
script_to_invoke.py
scriptpack/
    bin/
        coloredlogs
        coloredlogs.egg
        ...
    utils/
        util1.py
        util2.py
(all the folders indicated have an __init__.py)
What I've tried so far:
within script_to_invoke.py I use
from scriptpack.utils import util1      # no problem here
from scriptpack.bin import coloredlogs  # fails to find the import
I've looked at some other SO answers but I'm not sure how to correlate them with my problem.
I figured it out!
Python had to be directed to find the .egg files.
This can be done by either:
Editing the PYTHONPATH var BEFORE the interpreter is started, or
Appending the full path to the eggs to sys.path
Code Below:
import sys
for entry in [<list of full path to egg files in bin dir>]:
    sys.path.append(str(entry))
# Proceed with local imports
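As a concrete (assumed) variant, the egg paths could be discovered at runtime rather than listed by hand; the scriptpack/bin location below is taken from the question's layout:
# sketch: find .egg files under scriptpack/bin and add them to sys.path
import sys
from pathlib import Path

bin_dir = Path(__file__).resolve().parent / "scriptpack" / "bin"
for egg in sorted(bin_dir.glob("*.egg")):
    sys.path.append(str(egg))
# Proceed with local imports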
You might want to try packaging everything up as a zipapp. Doing so makes a single zip file that acts as a Python script, but can contain a whole multitude of embedded packages. The steps to make it are:
Make a folder with the name of your program (testapp in my example)
Name your main script __main__.py and put it in that folder
Using pip, install the required packages to the folder with --target=/path/to/testapp
Run python3 -mzipapp testapp -p='/usr/bin/env python3' (providing the shebang line is optional; without it, users will need to run the package with python3 testapp.pyz, while with the shebang, they can just do ./testapp.pyz)
That creates a zip file with all your requirements embedded in it alongside your script, that doesn't even need to be unpacked to run (Python knows how to run zip apps natively). As a trivial example:
$ mkdir testapp
$ echo -e '#!/usr/bin/python3\nimport sqlalchemy\nprint(sqlalchemy)' > testapp/__main__.py
$ pip3 install --target=./testapp sqlalchemy
$ python3 -mzipapp testapp -p='/usr/bin/env python3'
$ ./testapp.pyz
<module 'sqlalchemy' from './testapp.pyz/sqlalchemy/__init__.py'>
showing how the simple main was able to access sqlalchemy from within the same zipapp. It's also smaller (thanks to the zipping) than distributing the uncompressed modules:
$ du -s -h testapp*
13M testapp
8.1M testapp.pyz
You can install these packages in a non-global location (generally in ~/.local/lib/python<x.y>) using the --user flag, e.g.:
pip install --user sqlalchemy coloredlogs
That way you don't have to worry about changing how imports work, and you're still compliant with your sysadmin's policies.

How to distribute code to be executed by an external command? Bash script?

EDIT: Based on discussions below, I think my question really could apply to any language. Python naturally has a packaging system and installation procedure via pip. But let's say this was C code or a perl script. Users would still download the program files, and have a way to execute the code in the command line via capsall input.txt.
I have a python script that takes an input file and gives an output file.
For a concrete example, this script file1.py takes in an input text file and outputs a text file with all letters capitalized:
import sys

inFile = sys.argv[1]
outFile = sys.argv[2]

with open(inFile, 'r') as input_file:
    lines = input_file.readlines()

# process the input file somehow
# here we simply capitalize the text,
# but naturally something more complex is possible
capitalized_lines = []
for line in lines:
    capitalized_lines.append(line.upper())

with open(outFile, 'w') as output_file:
    for line in capitalized_lines:
        output_file.write(line)
The way users execute this code now is:
python file1.py input.txt output.txt
Let's say I wanted to distribute this code such that users would download a tarball and be able to execute the above in the command line with (for example)
capsall input.txt
which would run python file1.py and output the file output.txt. Does one write a bash script? If so, how do you distribute the code such that users will have this in their PATH?
Add a "hash bang" at the top of the script file to tell bash to invoke the Python interpreter. Also make your script executable:
#!/usr/bin/env python
import sys
inFile = sys.argv[1]
outFile = sys.argv[2]
...
Make the script file executable:
$ cp file1.py capsall
$ chmod +x capsall
Now you can run the script with:
$ ./capsall input.txt output.txt
Or if capsall is on your path:
$ capsall input.txt output.txt
I would recommend using python packaging (e.g., pip) for this. You could use an OS-specific packaging method as well, something like apt, yum, msiexec, whatever, but I wouldn't unless you have to. Your users already have python installed, since they are used to explicitly passing your script to the interpreter. If you are interested in using python packaging, then read on.
First, I would put your script into a package along with whatever other processing is necessary. Let's call it mylibrary. Your project should look something like:
myproject
|-- README.rst
`-- myproject
    |-- __init__.py
    `-- capsall.py
Keep __init__.py simple and make sure that you can import it without dependencies. I use something like:
version_info = (0, 0, 0)
version = '.'.join(str(v) for v in version_info)
Next, add a file named setup.py at the root. This is what will define your package, its metadata, and the scripts to install into your user's $PATH. The following will work nicely:
#!/usr/bin/env python
import setuptools

import myproject

setuptools.setup(
    name='myproject',
    version=myproject.version,
    description='Package of useful utilities.',
    long_description=open('README.rst').read(),
    url='https://github.com/me/myproject',
    author='Me',
    author_email='me@example.com',
    packages=['myproject'],
    entry_points={
        'console_scripts': ['capsall=myproject.capsall:main'],
    },
    license='BSD',
)
If you plan on uploading to pypi.python.org, then you will need at least that much metadata. I would recommend adding classifiers as well. The Python Packaging Authority docs on writing a setup.py are invaluable.
The part that you are most interested in is the entry_points keyword parameter. It defines various ways that your package can be invoked. You are looking for console_scripts since it creates shell scripts that are installed into the local path. See the setuptools documentation for more details.
The console_scripts definition that I gave above creates a script named capsall that invokes the myproject.capsall.main function. I would repackage your code into a project like this. Bundle your script into the main function of the capsall.py module (see the sketch below). Then generate a source distribution and upload it to pypi.python.org (./setup.py sdist register upload). Once you have it uploaded, your users can install it using pip install myproject.
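For illustration, one way the original file1.py logic could be folded into a main() suitable for that entry point; this is a sketch, not verbatim code from the answer:
# myproject/capsall.py -- sketch of file1.py reworked as a console_scripts target
import sys

def main(argv=None):
    argv = sys.argv[1:] if argv is None else argv
    in_file, out_file = argv[0], argv[1]
    with open(in_file, 'r') as input_file:
        lines = input_file.readlines()
    with open(out_file, 'w') as output_file:
        for line in lines:
            output_file.write(line.upper())
    return 0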
Chances are that you have code that you don't want to give to the outside world. If that is the case, then you can generate a source distribution and make the tarball available on an internal server -- ./setup.py sdist will generate a tarball in the dist directory. Then pip install whatever/myproject-0.0.0.tar.gz will install your script.

How could I share my python command line tool to the wild

I have just created a Python command line tool (xiber) called xiber.py that creates an iPad.xib file from an iPhone.xib. On my own computer I do:
alias xiber='python path_to/xiber.py'
Then I can use 'xiber' anywhere on my own computer. I want to share this tool with other developers.
My question is: how could they use it without doing the alias xiber='....' stuff, and just use the xiber command?
Thanks.
Setup your Git repo like this:
bin/
    xiber    # Formerly index.py
README.md
setup.py
In the setup.py file:
from distutils.core import setup

# Typing this from memory (not tested).
setup(
    name='Xiber',
    version='0.1',
    scripts=['bin/xiber'],
    # Other options if desired ...
)
People who want to install your script can use ordinary tools like pip or easy_install, which will put your script in the appropriate bin directory of the Python installation. For example:
pip install git+git://github.com/liaa/xiber
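For illustration, bin/xiber could simply be the existing Python code with a shebang and no .py extension; the xiber_convert helper below is a hypothetical placeholder for the real conversion logic:
#!/usr/bin/env python
# bin/xiber -- sketch of an installable command-line script (formerly index.py)
import sys

def xiber_convert(iphone_xib, ipad_xib):
    # hypothetical placeholder: real code would read iPhone.xib and write iPad.xib
    raise NotImplementedError

if __name__ == '__main__':
    xiber_convert(sys.argv[1], sys.argv[2])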
A way is to provide a xiber bash script and a xiber.bat batch file in a /bin/ folder that is distributed with the project. Then the end user can make sure to add that bin folder to their PATH.
Create an executable with PyInstaller
Another option could be creating a standalone executable file using PyInstaller.
If you check the tutorial, you will find an example that starts from exactly your scenario, turning a Python script into an executable (I used it and eventually settled on the "single file" method); it works.
