Why is the `bin` directory named differently (`Scripts`) on Windows?

The venv module (shipped with Python 3.3 and later) and virtualenv, which is still widely used, allow a project's dependencies to be installed not into the system-wide Python installation but into a directory specific to that project.
One of the subdirectories of such a "virtual environment" contains a copy of the Python interpreter as well as the "activate" and "deactivate" scripts - but this subdirectory is called Scripts on Windows and bin on all other systems.
This is somewhat surprising. Why did they special-case Windows?
(Neither PEP 405 nor the venv or virtualenv sources (or docs) contain any explanation; a commit message in virtualenv refers to "convention".)

"I think the commit message is the best you'll get. Everything else will be pure speculation." (Bryan's comment, refering to the commit message in virtualenv)

Most MS Windows programs have a GUI that is started from an icon or a menu entry, so there is no need for a standardized location for binaries (which would then be put on the $PATH) the way UNIX has one. Also, the name bin wouldn't mean anything to Windows users the way it does to UNIX users.
Additionally, MS Windows has only very primitive package management (if you can even call it package management), so applications tend to be installed in their own directory tree where they won't interfere with each other.
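Whatever the historical reason, code that needs to find that directory portably can simply branch on the platform. A minimal sketch (the venv_scripts_dir helper name is mine, purely for illustration):

import os
import sysconfig

def venv_scripts_dir(venv_root):
    # venv/virtualenv layout: "Scripts" on Windows, "bin" everywhere else
    subdir = "Scripts" if os.name == "nt" else "bin"
    return os.path.join(venv_root, subdir)

# For the interpreter that is currently running, the standard library
# reports the same location:
print(sysconfig.get_path("scripts"))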

Related

PyCharm and buildout with another package under development

I have a specialized buildout package, say, my.buildout, and under src there is the MyProject package (which does not have a buildout.cfg by itself, but of course does have a setup.py). These are the relevant lines in buildout.cfg in addition to sources:
develop = src/MyProject
auto-checkout = MyProject
When I run bootstrap and bin/buildout, src/MyProject is automatically checked out, and the bin/pserve script (and many other scripts under bin) contains paths to all dependencies.
The Python interpreter comes from one of the virtual envs I have. Note that the dependencies are installed by buildout under eggs, not in the virtual env.
I want PyCharm to know the whereabouts of both MyProject and the eggs, that is, everything that is normally available when the project is running.
I tried adding my.buildout as a project and set the correct Python interpreter. When I go into MyProject, the dependencies are underlined in red.
I also tried adding MyProject as a project, with the same result.
I am aware of this answer:
PyCharm doesn't recognize Buildout dependencies
but I have set the interpreter to the one in the #! of bin/pserve, and the bin/py script, which is used as a wrapper, naturally can't be added as an interpreter.
Is it possible to keep my.buildout and MyProject as they are, or is PyCharm's buildout support intended for a different buildout structure / development workflow?
(Sometimes there are many projects under development in src; I've simplified.)
I am very new to PyCharm (trying it out), so may be missing something obvious.
Update: Of course, I've enabled Buildout support in the settings and used the full path to the my.buildout/bin/buildout script.
Just to make it clearer: in the bin/pserve script (generated by buildout), which is used to run the app, a lot of paths are inserted into sys.path. PyCharm just does not read those. The question is how to make it aware of those paths.
Update 2: And even more: when I give bin/py as the buildout script in the Settings, the project panel faithfully lists "Buildout Eggs" (from sys.path?), but still shows the "Package requirements ..." Install / Ignore suggestion.
OK, so after adding .../bin/py (instead of bin/buildout, which contains only two paths) as the "Use paths from script" entry in Settings > Build, Execution, Deployment > Buildout Support, I am getting lookups right!
(Even though PyCharm still suggests installing the requirements.)
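For context, a console script generated by buildout (such as bin/pserve) typically looks roughly like the sketch below; all paths here are made up for illustration. It is these hard-coded sys.path entries that PyCharm only picks up once it is pointed at such a script via "Use paths from script".

#!/home/me/virtualenvs/proj/bin/python
# (interpreter path above comes from the virtual env; everything here is illustrative)
import sys

# buildout inserts one entry per resolved egg, plus the develop packages
sys.path[0:0] = [
    '/home/me/my.buildout/src/MyProject',
    '/home/me/my.buildout/eggs/pyramid-1.4-py2.7.egg',
    '/home/me/my.buildout/eggs/WebOb-1.2.3-py2.7.egg',
    # ...one line per dependency...
]

import pyramid.scripts.pserve

if __name__ == '__main__':
    sys.exit(pyramid.scripts.pserve.main())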

Why use Python's 'virtualenv' on Linux when one has 'chroot' (and union/overlay filesystems)?

First of all, let me state that I am a proponent of generic software (in general ;-). I am no expert on Python, but it seems that the 'virtualenv' utility solves pretty much the same problem 'chroot' can help to solve: bootstrapping a directory tree that can be passed as root, thus effectively protecting the real directory tree, if needed.
Since, as already mentioned, I am no expert in Python, I wonder: what problem can virtualenv solve that chroot cannot? I mean, can't I just set up a nice fake root tree (possibly using union mounting), chroot into it, pip install whatever packages I want in my new environment, and then play around within the bounds of that environment, running Python scripts and whatnot?
Am I missing something here?
Update:
Can't one install packages/modules locally in any application directory, that is, without root privileges and consequently without overwriting or adding files to /usr/lib or /usr/local/lib? It appears that this is what virtualenv does, although I think it has to symlink or otherwise provide a Python interpreter for each environment one creates, does it not?
bootstrapping a directory tree that can be passed as root
That's not what virtualenv does, except (to some degree) for Python packages. It provides a place where these can be installed without replacing the rest of the filesystem. It also works without root privileges and it's portable as it needs no kernel support, unlike chroot, which (I presume) won't work on Windows.
Can't one install packages/modules locally in whatever application directory
Yes, but virtualenv does one more thing, which is that it disables (by default at least) the system's Python package directories. That means you can test whether your package correctly installs all of its dependencies (you might have forgotten to list one because it's already installed on your system) and it allows installing different versions in case you need either newer or older versions. The ability to install older versions should not be overlooked because sometimes new versions of packages introduce bugs.
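One quick way to see that isolation from inside an activated virtualenv is to ask the interpreter where it thinks it lives; a small sketch:

import sys

print(sys.prefix)  # points at the virtualenv directory, not /usr or /usr/local
print(sys.path)    # no system-wide site-packages entries unless the env was
                   # created with --system-site-packages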

Bundling/executing a Python script + modules on a remote machine

I have looked into other Python module distribution questions. My need is a bit different (I think! I am a Python newbie+).
I have a bunch of Python scripts that I need to execute on remote machines. Here is what the target environment looks like:
The machines will have base python run time installed
I will have a SSH account; I can login or execute commands remotely using ssh
I can copy files (scp) into my home dir
I am NOT allowed to install anything on the machine; the machines may not even have access to the Internet
my scripts may use some 'exotic' python modules -- most likely they won't be present in the target machine
after the audit, my home directory will be nuked from the machine (leave no trace)
So what I would like to do is:
copy a directory structure of Python scripts + modules to the remote machine (say into /home/audituser/scripts, with the modules copied into /home/audituser/scripts/python_lib)
then execute a script (say /home/audituser/scripts/myscript.py). This script will need to resolve all modules it uses from the python_lib subdirectory.
Is this possible? Or is there a better way of doing this? I guess what I am looking for is a way to 'relocate' the 3rd-party modules into the scripts dir.
thanks in advance!
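If the goal is just to make myscript.py resolve its imports from the python_lib subdirectory it ships with, the script can prepend that directory to sys.path before any third-party imports. A minimal sketch of what the top of myscript.py could look like (the module name at the end is a hypothetical bundled dependency):

#!/usr/bin/env python
import os
import sys

# Make the bundled library directory win over anything on the system path.
# The modules can be placed there on the dev machine with, for example,
#   pip install --target /path/to/scripts/python_lib <package>
_here = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(_here, "python_lib"))

import some_exotic_module  # hypothetical third-party module copied into python_lib

This only helps for pure-Python dependencies; compiled extension modules still have to match the target machine's platform and Python version.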
Are the remote machines the same as each other? And, if so, can you set up a development machine that's effectively the same as the remote machines?
If so, virtualenv makes this almost trivial. Create a virtualenv on your dev machine, use the virtualenv copy of pip to install any third-party modules into it, build your script within it, then just copy that entire environment to each remote machine.
There are three things that make it potentially non-trivial:
If the remote machines don't (and can't) have virtualenv installed, you need to do one of the following:
In many cases, simply copying a --relocatable environment over just works. See the documentation section on "Making Environments Relocatable".
You can always bundle virtualenv itself, and pip install --user virtualenv (and, if they don't even have pip, a few steps before that) on each machine. This will leave the user account in a permanently-changed state. (But fortunately, your user account is going to be nuked, so who cares?)
You can write your own manual bootstrapping. See the section on "Creating Your Own Bootstrap Scripts".
By default, you get a lot more than you need—the Python executable, the standard library, etc.
If the machines aren't identical, this may not work, or at least might not be as efficient.
Even if they are, you're still often making your bundle orders of magnitude bigger.
See the documentation sections on Using Virtualenv without bin/python, --system-site-packages, and possibly bootstrapping.
If any of the Python modules you're installing also need C libraries (e.g., libxml2 for lxml), virtualenv doesn't help with that. In fact, you will need the C libraries to be almost exactly the same (same path, compatible version).
Three other alternatives:
If your needs are simple enough (or the least-simple parts involve things that virtualenv doesn't help with, like installing libxml2), it may be easier to just bundle the .egg/.tgz/whatever files for third-party modules, and write a script that does a pip install --user and so on for each one, and then you're done.
Just because you don't need a full app-distribution system doesn't mean you can't use one. py2app, py2exe, cx_freeze, etc. aren't all that complicated, especially in simple cases, and having a click-and-go executable to copy around is even easier than having an explicit environment.
zc.buildout is an amazingly flexible and manageable tool that can do the equivalent of any of the three alternatives. The main downside is that there's a much, much steeper learning curve.
You can use virtualenv to create a self-contained environment for your project. This can house your own script, as well as any dependency libraries. Then you can make the env relocatable (--relocatable), and sync it over to the target machine, activate it, and run your scripts.
If these machines do have network access (not internet, but just local network), you can also place the virtualenv on a shared location and activate from there.
It looks something like this:
virtualenv --no-site-packages portable_proj
cd portable_proj/
source bin/activate
# install some deps
pip install xyz
virtualenv --relocatable .
Now portable_proj can be distributed to other machines.

Multi-platform portable Python

I want to install python on a flash drive in a virtual environment so that I can develop code wherever I am. Is this possible to do in such a way that I can use my flash drive on windows/mac/linux computers?
For Windows, head to Portable Python (http://PortablePython.com) to see the various options you have.
For Linux and Mac you don't need to install it on the USB drive, as those systems usually come with Python pre-installed. If you need specific packages for those systems, bring them on the USB drive together with a command-line script that can load them into a virtualenv on those systems with one call, and you are good to go!
Be aware that this is never 100% bullet-proof, as you are depending on the Python version you are using and bringing packages for.
You could try setting something up using VirtualEnv-type environments, with the various Python versions installed on your machines.
Not sure how you'd get around the different paths on the different operating systems, though.
Virtualenv: http://pypi.python.org/pypi/virtualenv
As @millimoose pointed out, you could install three different versions of Python.
For each Python package you are working on, you can create a .pth file in the site-packages directory of each Python version that you would like to use the package from.
Note that, as described here:
If you put a .pth file containing a path into the site-packages directory, Python searches that path for imports.
For example, suppose you have a package named my_package that you are working on, residing at C:\Users\Me\Documents\dev_packages\my_package. You can add a file with the extension .pth (the name doesn't matter; in particular, it doesn't have to have any relation to the package name) with the contents:
C:\Users\Me\Documents\dev_packages
This will add C:\Users\Me\Documents\dev_packages to the Python import search-path, causing the my_package package to be discovered. By placing this .pth file in the site-packages directory of each Python version, my_package will be available in all corresponding versions of Python.
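As a sketch of that setup, the .pth file can be written by hand or with a few lines of Python run under each interpreter that should see the package (path and file name are illustrative; site.getsitepackages() assumes a standard, non-virtualenv installation):

import pathlib
import site

dev_packages = r"C:\Users\Me\Documents\dev_packages"  # directory that contains my_package

# First site-packages directory of the interpreter this is run with;
# rerun the snippet with each Python version that should import my_package.
site_packages = pathlib.Path(site.getsitepackages()[0])
(site_packages / "dev_packages.pth").write_text(dev_packages + "\n")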

Developing and using the same Python on the same computer

I'm developing a Python utility module to help with file downloads, archives, etc. I have a project set up in a virtual environment along with my unit tests. When I want to use this module on the same computer (essentially as "Production"), I move the files to the mymodule directory at ~/dev/modules/mymodule.
I keep all 3rd-party modules under ~/dev/modules/contrib. This contrib path is on my PYTHONPATH, but mymodule is NOT, because I've noticed that if mymodule is on my PYTHONPATH, my unit tests cannot distinguish between the "Development" version and the "Production" version. But now, if I want to use this common utility module, I have to add it to the PYTHONPATH manually.
This works, but I'm sure there's a better, more automated way.
What is the best way to have a Development and Production module on the same computer? For instance, is there a way to set PYTHONPATH dynamically?
You can add/modify Python paths in sys.path; just make sure that the first path is the current directory ".", because some third-party modules rely on importing from the directory of the current module.
More information on python paths:
http://djangotricks.blogspot.com/2008/09/note-on-python-paths.html
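A minimal sketch of that approach, using the directory layout from the question (paths are illustrative):

import sys

# Some third-party modules expect the current directory to come first.
if sys.path[0] not in ("", "."):
    sys.path.insert(0, ".")

# Make the "Production" copy of the utility module importable on demand.
sys.path.append("/home/me/dev/modules")  # illustrative parent directory of mymodule/

import mymodule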
I'm guessing by virtual environment you mean the virtualenv package?
http://pypi.python.org/pypi/virtualenv
What I'd try (and apologies if I've not understood the question right) is:
Keep the source somewhere that isn't referenced by PYTHONPATH (e.g. ~/projects/myproject)
Write a simple setuptools or distutils script for installing it (see Python distutils - does anyone know how to use it?)
Use the virtualenv package to create a dev virtual environment with the --no-site-packages option - this way your "dev" version won't see any packages installed in the default python installation.
(Also make sure your PYTHONPATH doesn't have any of your source directories)
Then, for testing:
Activate dev virtual environment
Run the install script (usually something like python setup.py build install). Your package ends up in /path/to/dev_virtualenv/lib/python2.x/site-packages/
Test, break, fix, repeat
And, for production:
Make sure dev virtualenv isn't activated
Run install script
All good to go, the "dev" version is hidden away in a virtual environment that production can't see...
...And there's no (direct) messing around with PYTHONPATH
That said, I write this with the confidence of someone who's not actually tried using virtualenv in anger, and the hope that I've vaguely understood your question... ;)
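For reference, the "simple setuptools or distutils script" mentioned in the steps above can be as small as the sketch below (name, version, and package list are placeholders):

# setup.py
from setuptools import setup, find_packages

setup(
    name="mymodule",
    version="0.1",
    packages=find_packages(),  # or an explicit list such as ["mymodule"]
)

Running python setup.py install inside the activated dev virtualenv then drops the package into that environment's site-packages, as described in the testing steps.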
You could set the PYTHONPATH as a global environment variable pointing to your Production code, and then in any shell in which you want to use the Development code, change the PYTHONPATH to point to that code.
(Is that too simplistic? Have I missed something?)
