I installed some packages, but I think some of them are either corrupted or have conflicting versions.
Is there a good way to just uninstall every package and python itself?
If you just want to remove all the packages you've installed (as opposed to all of Python), you'd want to nuke your site-packages directory.
To find it, from Python run >>> import some_package (where some_package is a package you've installed; setuptools is one you're likely to have), then run some_package.__file__. The output should be something like /path/to/site-packages/distribute-0.6.19-py2.6.egg/setuptools/__init__.pyc. Delete (or, better yet, rename) and recreate /path/to/site-packages. That will get rid of everything you've installed.
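If you'd rather not pick a package to import, the standard library can report the directory directly; a quick sketch:

```python
import os
import sysconfig

# "purelib" is the directory where pip installs pure-Python packages
# (typically a path ending in "site-packages" or, on Debian, "dist-packages").
site_packages = sysconfig.get_paths()["purelib"]
print(site_packages)
```

Either way, once you have the path, renaming and recreating that directory wipes everything you've installed.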
Well, I think that the "Pinax" package is no longer maintained ...
Anyway, it refers to a now-deprecated django.utils.tzinfo package. I need to make a one-line change to one source module, and then cause that version of the package to be loaded instead of the one in the module directory. Is there a generic way in Django to do that sort of thing? (I know that some packages such as the Machina forum software can do this.)
I would uninstall the module you want to modify from your virtual environment, fork it on GitHub, commit your change, and then install it into your virtual environment from your fork, like:
python -m pip install git+https://github.com/path/to/repo
I have an installable Python package (mypackage) and it needs to use specific versions of a number of dependencies. At the moment I have some .sh scripts that just pip-install these into an internal package folder (e.g. C:\Python27\Lib\site-packages\mypackage\site-packages). When mypackage executes, it adds this internal folder to the beginning of the Python path so that it overrides any other versions of the required dependencies elsewhere on the path.
I know this will only work if the user doesn't import the dependencies prior to importing mypackage but I will document this.
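The sys.path trick described above might look like this as the first thing mypackage's `__init__.py` does (the folder name and layout here are assumptions for illustration):

```python
import os
import sys

# Hypothetical internal folder shipped inside mypackage that holds the
# pinned copies of its dependencies (the internal "site-packages" above).
VENDOR_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "site-packages")

# Prepend so these copies shadow any other installed versions -- this only
# works for dependencies that have not been imported yet.
if VENDOR_DIR not in sys.path:
    sys.path.insert(0, VENDOR_DIR)
```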
I want to remove the .sh scripts and integrate the above into either the distutils install or the standard pip installation process. What is the best way to do this? I know about install_requires, but it does not seem to allow specification of a location.
I eventually found the virtualenv tool which solved the above problem much more elegantly than the solution I'd put in place:
https://docs.python.org/3/tutorial/venv.html
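For reference, the virtualenv approach turns the version pinning into a plain requirements file installed into the environment (the package names and versions below are purely illustrative):

```
# requirements.txt -- exact pins for mypackage's dependencies
somelib==1.2.3
otherlib==0.9.1
```

Then create the environment with `python -m venv env` (or `virtualenv env`), activate it, and run `pip install -r requirements.txt`.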
I'm new to installing python packages.
I just installed Biopython by going to the source code (downloaded from github) and running:
python setup.py build
My question is - can I now throw out the source code files (in folder) on my desktop? Or do I have to keep them there for the Biopython package to work?
Possibly, but likely not.
If you were to install it, like this:
python setup.py install
… then you could throw away the whole source directory. That's exactly what pip does, in effect. (Speaking of pip: for most packages, it's better to use pip to install them. That way you can, for example, uninstall them, or upgrade them, or list them out so you know what you had installed before upgrading Python or moving to a new machine. The main exception is, of course, packages that don't work with pip, which you'll discover when you try and it fails.)
But if all you did was build it, so you can run it from within its own directory, any top-level scripts aren't compiled, other modules may be compiled but may still expect to find source somewhere, etc. (Of course you can always test and see—back up the directory, delete all the .py files, try it, and if it doesn't work, restore the backed up copy…)
I want to use the default (no site packages) of virtualenv.
But some modules are difficult to install in a virtualenv (for example gtk). By "difficult" I mean that you need a lot of C header files installed, and a lot of stuff needs to be compiled.
I know that I can solve this by not installing these packages with pip, but by creating symlinks to make some modules available from the global site-packages directory.
But is this the right direction?
Is there a way to create the symlinks with pip or virtualenv?
Update
In 2013 I wanted some modules like psycopg2, gtk, python-ldap and other which are installed on my linux server via rpm/dpkg in the virtualenv.
The symlinking and other workarounds made things more complicated, not simpler. Today (2017) we use this option:
--system-site-packages
Give the virtual environment access to the global
site-packages.
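The same option is available from the standard-library venv module; a small sketch that creates a throwaway environment in a temp directory:

```python
import os
import tempfile
import venv

# Equivalent of `virtualenv --system-site-packages env` with the stdlib:
# the created environment can also import globally installed packages.
target = os.path.join(tempfile.mkdtemp(), "env")
venv.create(target, system_site_packages=True, with_pip=False)

# The setting is recorded in the environment's pyvenv.cfg.
print(open(os.path.join(target, "pyvenv.cfg")).read())
```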
I'd say yeah, that's the right direction.
Your question sounds similar to something I dealt with: installing OpenCV into a virtualenv. My problem was that OpenCV wasn't available via pip (the Python Package Index). What I ended up doing was querying the system-wide global Python installation for the module in question, and then copying the .so into my virtualenv.
The whole process, including the boilerplate Makefile I used, is captured here: https://stackoverflow.com/a/19213369/1510289
You could do something similar by sym-linking instead of copying. The reason I ended up copying the library was because I use Make, and Make doesn't handle dependencies for symbolic links in a way I needed (as explained in the URL above.)
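The symlink variant could be sketched like this (the paths below are temporary stand-ins for the demo; in practice you would link your real global module, e.g. a system-wide cv2.so, into your environment's site-packages):

```python
import os
import tempfile

# Stand-in paths for the demo: a fake "global" compiled module and a
# fake virtualenv site-packages directory.
root = tempfile.mkdtemp()
global_mod = os.path.join(root, "dist-packages", "cv2.so")
venv_site = os.path.join(root, "venv", "site-packages")
os.makedirs(os.path.dirname(global_mod))
os.makedirs(venv_site)
open(global_mod, "wb").close()  # pretend this is the real .so

# Symlink the globally installed module into the virtualenv.
link = os.path.join(venv_site, os.path.basename(global_mod))
os.symlink(global_mod, link)
print(os.path.islink(link))
```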
Hope this helps...
How are you compiling each of these 'hard' packages from scratch?
Are you doing something like:
python setup.py install
If so, replace that with:
python setup.py bdist_wheel
Then look in the ./dist directory for a .whl file. Take whatever that file is called and, after activating the environment, run:
pip install ./dist/whateverTheFileIsCalled.whl
The following works:
pip install git+git://github.com/pydata/pandas#master
But the following doesn't:
pip install -e git+git://github.com/pydata/pandas#master
The error is:
--editable=git+git://github.com/pydata/pandas#master is not the right format; it must have #egg=Package
Why?
Also, I read that the -e does the following:
--egg
Install as self contained egg file, like easy_install does.
what is the value of this? When would this be helpful? (I always work on a virtualenv and install through pip)
Generally, you don't want to install as a .egg file. However, there are a few rare cases where you might. For example:
It's one of a handful of packages that needs to override a built-in package, and knows how to do so when installed as a .egg. With Apple Python, readline is such a package. I don't know of any other common exceptions.
The egg has binary dependencies that point to other eggs on PyPI, and can serve as a binary dependency for yet other eggs on PyPI. This is pretty rare nowadays, because it doesn't actually work in many important cases.
You want a package embedded in a single file that you can copy-and-paste, FTP, etc. from one installation to another.
You want a package that you can install into another installation straight out of site-packages.
The package is badly broken (and you can't fix it, for whatever reason), so that setup.py install does not work, but it can properly build an egg and run out of an egg.
Meanwhile, if you want to use editable mode, the package, and all other packages it depends on, have to be egg-compatible, whether or not you install them as eggs; pip will add #egg=<project name> to the VCS URL for each one, and if any of them don't understand that, it will fail. That's also why your -e command above fails: as the error message says, you have to supply the egg name yourself, e.g.:
pip install -e git+git://github.com/pydata/pandas#egg=pandas