Where to put a Python module I need for several projects? - python

Let's say I have a Python module called custom_colors.py in which I create some variables that I use in multiple projects in different working directories. So I am wondering what the best practice is for working with such helper files.
Copy them into each project's folder?
So far I have thought about creating a folder for all helper files and importing them whenever needed:
import os
# change working directory to the helper files folder and load own modules
os.chdir(helper_files_path)
import utils.custom_colors as cc
but with this approach I always need to change the working directory first.
How do you guys handle such stuff?

You keep them in a central location, but you don't make your script responsible for finding that location. Instead, you tell the interpreter where it is by using PYTHONPATH.
import os
import utils.custom_colors as cc
...
If the path to your module is something like /path/to/lib/utils/custom_colors.py, then run Python with
PYTHONPATH=/path/to/lib python myscript.py
You can add this setting to your environment so that Python will always look in /path/to/lib when searching for a module.
Taking this one step further, package your library so that you can install it using python setup.py install or pip install mylib, then create a virtual environment for each project and install mylib in the virtual environment.
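For that last approach, a minimal setup.py might look like the sketch below; the distribution name mylib is a placeholder, and the utils package matches the example above:
from setuptools import setup, find_packages

setup(
    name='mylib',              # placeholder distribution name
    version='0.1.0',
    packages=find_packages(),  # picks up the utils/ package and its modules
)
After pip install /path/to/mylib in each project's virtual environment, import utils.custom_colors as cc works with no path manipulation at all.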

Related

Import module works in PyCharm but giving error in terminal python 3.7

I have a project where I have created multiple Python files based on their usage. It works completely fine when I run it from PyCharm. However, when I run the same thing from the terminal, I get the error: ModuleNotFoundError: No module named 'dataflow'
I need to build the dataflow out of this and deploy it, and it gives an error while doing so.
[Screenshot: folder structure of the project, which works when run from PyCharm]
[Screenshot: error while running it from the terminal]
Educated guess: does PyCharm run your code in a venv too? If not, you might check whether the missing package is installed in your venv.
Update
If you intend to have a dataflow package whose modules you want to import and use, you need an __init__.py file in your dataflow folder; this makes it a package for Python. If you want to use the modules in dataflow with the dot in an import, you need to do an import in __init__.py,
like so:
from . import driver_main
This makes the contents of driver_main available in dataflow, but better practice would be to specify what you want to access from driver_main, like
from .driver_main import MyDriver
This will give you access to MyDriver via
dataflow.MyDriver
If you really just want to access something from one module on the same level, you should be able to do so with the same approach. So in the example you showed in the picture, try to change
from dataflow import driver_main
to
from . import driver_main
This would apply to an import in a module on the same level as driver_main.py, like app.py.
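For illustration, the resulting layout might look like this sketch (only driver_main.py, app.py, and MyDriver come from the question; treat the rest as assumptions):
# dataflow/__init__.py -- marks the folder as a package and re-exports the driver
from .driver_main import MyDriver

# dataflow/app.py -- a sibling module importing driver_main relatively
from . import driver_main

driver = driver_main.MyDriver()  # hypothetical usage
Note that relative imports like these only work when the code runs as part of the package (for example via python -m dataflow.app), not when the file is executed directly.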
Update on comments in the original post
By the way, the environment in PyCharm has nothing to do with the venv in the console. You are simply telling PyCharm to use Python 3.7, whereas with your venv you copy binaries into a folder structure. That said, if you run a virtual environment, everything you pip install there goes into this folder structure, not into the global site-packages. This means that if you installed something globally you won't have it right away in the virtual environment, and the other way around!

How can we build and distribute python scripts in a windows environment

My team is enjoying using python to solve problems for our business. We write many small independent scripty applications.
However, we have to have a central windows box that runs these along with legacy applications.
Our challenge is going through a build and deploy process.
We want to have Bamboo check the script out of git, install requirements and run tests, then if all is green, just deploy to our production box.
We'd like libraries to be isolated from script to script so we don't have dependency issues.
We've tried to get virtualenvs to be portable, but that seems to be a no-go.
Pex looked promising, but it doesn't work on windows.
Ideally you'd see a folder like so:
AppOne
  /Script.py
  /Libs
    /bar.egg
    /foo.egg
AppTwo
  /Script2.py
  /Libs
    /fnord.egg
    /fleebly.py
Are we thinking about this wrong? What's the pythonic way to distribute scripts within an enterprise?
You may be able to do that with a neat if relatively unknown feature that was sneaked into Python 2.6 without much ado: executing zip files as Python applications. It got a bit (just a bit) more publicity after PEP 441 (which is the one PEX is inspired by), although I think most people are still unaware of it. The idea is that you create a zip file (the recommended extension is .pyz, or .pyzw for windowed applications, but that's obviously not important) with all the code and modules that you want, and then you simply run it with Python. The interpreter will add the contents of the zip file to sys.path, look for a top-level module named __main__, and run it. Python 3.5 even introduced the convenience module zipapp to create such packaged applications, but there is really no magic in it and you may as well create it by hand or with a script.
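As a minimal sketch of that workflow using the zipapp module (the myapp directory name is a placeholder; it must contain a __main__.py plus any modules it needs):
import zipapp

# Bundle the myapp/ directory into a single runnable archive.
zipapp.create_archive('myapp', target='myapp.pyz')
The resulting file then runs with python myapp.pyz on any machine with a compatible interpreter.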
In your case, I guess Bamboo could do the check out, dependency install and tests in virtualenvs and then package the application along with the environment libraries. It's not a one-click solution but it may do the trick without additional tools.
TL;DR:
Use Docker
A short story long:
You can use Docker to create an independent image for every script that you want to deploy.
You can install a Python image (slim is the lightest) as a base environment for each script or group of scripts/applications and use it like a "virtualenv" in which you install all the dependencies for that script.
There is also an integration for Bamboo and Docker which you may find useful.
Here is the Docker documentation for reference.
You can test each script individually in a separated container and if it passes then you can use the same container to deploy it in your main server.
It is not exactly what you are asking for, but you can use this solution on every platform (Windows, Linux, etc.), you can deploy all your scripts to the enterprise server (or anywhere for that matter), and use them across your company.
Disclaimer: This is not THE solution, it is a solution that I am aware of which applies to the time of this answer (2017)
Another possibility is pyinstaller. It creates an executable that can be deployed. Python is not even required to be installed on the deployed production box. It is harder to debug problems that occur only on the deployed box. You also can't modify the scripts on the deployed box which depending on your trust of the owners of the machine is either a positive or negative. See http://www.pyinstaller.org/
As I understand it, you want to create self-contained application directories on a build server, then copy them over to a production server and run scripts directly from them. In particular, you want all dependencies (your own and external packages) installed within a Libs subdirectory in each application directory. Here's a fairly robust way to do that:
Create the top-level application directory (AppOne) and the Libs subdirectory inside it.
Use pip install --ignore-installed --target=Libs package_name to install dependencies into the Libs subdirectory.
Copy your own packages and modules into the Libs subdirectory (or install them there with pip).
Copy Script.py into the top-level directory.
Include code at the top of Script.py to add the Libs directory to sys.path:
import os, sys

# Resolve the Libs directory relative to this script rather than the
# current working directory, so imports work from any launch location.
app_path = os.path.dirname(os.path.abspath(__file__))
lib_path = os.path.join(app_path, 'Libs')
sys.path.insert(0, lib_path)
This will make packages like Libs\bar.egg and modules like Libs\fleebly.py available to your script via import bar or import fleebly. Without code like this, there is no way for your script to find those packages and modules.
If you want to streamline this part of your script, there are a couple of options: (1) Put these lines in a separate fix_path.py module in the top-level directory and just call import fix_path at the start of your script. (2) Create a Libs\__init__.py file with the line sys.path.insert(0, os.path.dirname(__file__)), and then call import Libs from your script. After that, Libs\x can be imported via import x. This is neat, but it's a nonstandard use of the package and path mechanisms (it uses Libs as both a library directory and a package), so it could create some confusion about how importing works.
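As an illustration of option (1), a minimal fix_path.py could look like this sketch:
# fix_path.py -- lives next to Script.py; 'import fix_path' runs this once
import os
import sys

# Prepend the adjacent Libs directory to the module search path.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), 'Libs'))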
Once these directories and files are in place, you can copy this whole structure over to any Windows system with Python installed, and then run it using cd AppOne; python Script.py or python AppOne\Script.py. If you name your top-level script __main__.py instead of Script.py, then you can run your app just by executing python AppOne.
Further, as @jdehesa pointed out, if your script is named __main__.py, you can compress the contents of the AppOne directory (but not the AppOne directory itself) into a file called AppOne.zip, and then copy that to your production server and run it by calling python AppOne.zip. (On Python 3.5 or later, you can also create the zip file via python -m zipapp AppOne if your script is called __main__.py. You may also be able to use python -m zipapp AppOne -m Script if your script is called Script.py. See https://docs.python.org/3/library/zipapp.html.)
This kind of thing can easily be handled with python setup.py.
Sample setup.py (the values here are placeholders for your own project):
from setuptools import setup

setup(
    name='my-distribution',    # name for the distribution
    version='0.1.0',           # version number
    py_modules=['my_script'],  # Python files to include, without the .py suffix
    install_requires=[
        # Python packages that need to be installed, e.g. 'requests'
    ],
)
Create a virtual environment, activate it, and run:
python setup.py install
I feel this is the most pythonic way to distribute and package your project.
Reading links:
https://pythonhosted.org/an_example_pypi_project/setuptools.html
https://docs.python.org/2/distutils/setupscript.html

Hidden Markov Models (HMM) in Python

I am working with Hidden Markov Models in Python. For that I came across a package/module named hmmpytk. The problem is that hmmpytk isn't pre-installed, and when I download the hmmpytk module, I only get source code without an installation file. I use the Windows operating system. If I run a script simply with "from hmmpytk import hmm_faster" I get an ImportError, so I have no idea how to get started with hmmpytk.
You need to make sure that the folder hmmpytk (and possibly also lame_tagger) is "in the directory containing the script that was used to invoke the Python interpreter." See the documentation about the Python path sys.path
EDIT: Alternatively, you can make sure that those folders are on your Python path. To see the folders in your path, type import sys; sys.path - the CWD is the first entry in the path if you started the interactive interpreter.
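For example (the download location below is an assumption; the folder you add must be the one that contains the hmmpytk directory, not hmmpytk itself):
import sys

# Hypothetical path to the unpacked download.
sys.path.append(r'C:\Users\me\Downloads\hmmpytk-master')

from hmmpytk import hmm_faster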
You probably want pip or easy_install to install Python packages.

Python import library from tar.gz?

I am working on a box where I don't have root access. However, there is a folder /share that everyone can read and write.
I want to figure out a way to put Python libraries there so that everyone can access and use them.
I figured out that I can put an egg file in the /share/pythonLib folder and then, in the Python script:
import sys
sys.path.append("/share/pythonLib/foo.egg")
import foo
and it would work for everyone. However, I am not sure every library has an egg version. For example, I am trying to install BeautifulSoup4, but there is only a tar.gz file, and I am not sure whether it is possible to convert it to an egg, etc.
OR! I am wrong right at the BEGINNING, and there is indeed some Pythonic magic like the below:
magicadd /share/pythonLib/foo.tar.gz
import foo
The tar.gz file contains the source code of the library. You should unpack it, and you will find a setup.py script inside. Run:
python setup.py install --prefix=/share/pythonLib
This will create:
/share/pythonLib/lib/python2.7/site-packages/
In your scripts append that path to sys.path and everything should work fine.
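For example, each user's script could start with a few lines like this sketch (the python2.7 segment matches the prefix install above; adjust it to your interpreter version):
import sys

# Make the shared install location importable before importing from it.
sys.path.append('/share/pythonLib/lib/python2.7/site-packages')

import bs4  # BeautifulSoup4 installs under the package name 'bs4'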

Simple way to import python modules in Linux using symlinks

I am tinkering with some pet projects with Python in Linux (Mint 13) and I plan to do the following:
Create a Dropbox subfolder named "pybin" where I put all my home-made python modules;
Put a symlink to this folder somewhere in the system (first candidate: /usr/lib/python2.7/dist-packages, which is in sys.path, or some similar path);
Then I just do import mymodule from any python session, and the module is imported.
I tried it and it didn't work. I suspect this has to do with the differences between modules and packages, and __init__.py files, but I confess that every time I read something about this stuff I get pretty confused. Besides learning a bit more about this, all I really want is to find a way to import my modules in the described way. It is crucial that the actual folder is inside Dropbox (or any other file-syncing folder), not in a system folder.
Thanks for any help!
Why not simply set the PYTHONPATH environment variable in your .bash_profile? That way, every time you execute a bash shell (which normally happens upon login), this environment variable will be set to wherever you place your user-defined modules. The Python interpreter uses this variable to determine where to search for module imports:
PYTHONPATH="${PYTHONPATH}:/path/to/some/cool/python/package/:/path/to/another/cool/python/package/"
export PYTHONPATH
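A quick sketch to verify the setup from a Python session (assuming one of the paths you added is your Dropbox pybin folder and it contains mymodule.py):
import sys

# The pybin folder should now appear on the module search path.
print([p for p in sys.path if 'pybin' in p])

import mymodule  # the module name from the question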
