Can anyone explain how the relationship between LD_LIBRARY_PATH and the lib-dynload directory works in Python on a Unix machine?
The reason I ask is that at my place of employment we have a network install of Python that is shared across several Unix machines (don't ask why; it's the result of some odd political circumstances). It works fine on most of the systems, which are older, but on the newer systems it runs into problems when people try to use the tkinter framework (since those machines have newer versions of the underlying libraries installed).
I did some poking around, and inside the lib-dynload directory there is a shared library file that seems simply to tell Python which underlying library to use for tkinter.
Doing some fiddling, I found a way to bypass the problem: placing the new versions of the library at the front of the user's LD_LIBRARY_PATH seems to solve it. I assume this works because the dynamic linker finds that version of the library before the version in the lib-dynload folder, but the same change breaks things if you attempt it on one of the older machines. Either way, it is an inelegant solution.
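For what it's worth, here is a minimal sketch of that workaround as a wrapper script (assuming, hypothetically, that the newer Tcl/Tk libraries live in /opt/newtk/lib). The dynamic linker reads LD_LIBRARY_PATH only at process start, which is why the wrapper has to re-exec the interpreter after changing it:

#!/usr/bin/env python
# wrapper sketch: prepend the newer Tcl/Tk libs, then re-exec the interpreter
import os, sys

NEW_LIBS = "/opt/newtk/lib"  # hypothetical location of the newer libtcl/libtk

path = os.environ.get("LD_LIBRARY_PATH", "")
if NEW_LIBS not in path.split(":"):
    os.environ["LD_LIBRARY_PATH"] = NEW_LIBS + (":" + path if path else "")
    # re-exec so the linker sees the new path before _tkinter is loaded
    os.execv(sys.executable, [sys.executable] + sys.argv)

import tkinter  # now resolved against the newer libraries

On the older machines you would simply not install (or not use) the wrapper, which keeps the two setups from interfering with each other.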
Related
I am trying to install GeoDjango, which is turning out to be much harder than I thought. After I installed OSGeo4W on my 64-bit Windows 10 system, I set everything up in the settings.py file, but now I get this error:
FileNotFoundError: Could not find module 'C:\OSGeo4W\bin\gdal304.dll' (or one of its dependencies). Try using the full path with constructor syntax.
I also set GDAL_LIBRARY_PATH, but it just won't work.
GDAL_LIBRARY_PATH = "C:\\OSGeo4W\\bin\\gdal304.dll"
This is my C:\OSGeo4W\bin path, and the gdal304.dll file is definitely there.
My Python is on version 3.10.6
Django is on version 4.1
I have been trying to solve this by myself for a week, but I am slowly running out of ideas.
I ran into this problem too when I updated my old GeoDjango setup today.
You may use a Docker image as suggested by the others, but I prefer a native solution, since I don't want to spin up Docker every time I start coding.
The clue is in the parentheses: (or one of its dependencies).
You can look up the transitive dependencies of gdal304.dll; there are several tools for this (see here). I am using the MinGW shell that ships with Git for Windows, which has ldd installed; that should be the case for any recent Git installation on Windows.
As you can see, some dependencies are already fulfilled by your operating system; the ones that are missing have to be fulfilled by OSGeo4W. If you compare ldd's output with your bin directory from OSGeo4W, you will see the problem: one of the required DLLs is not there.
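If you don't have ldd handy, a quick check from Python itself reproduces the loader failure; this is just a sketch using the path from the question:

import ctypes

try:
    ctypes.WinDLL(r"C:\OSGeo4W\bin\gdal304.dll")
    print("gdal304.dll and all of its dependencies loaded fine")
except OSError as exc:
    # this is the same "or one of its dependencies" failure as above
    print("load failed:", exc)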
Sadly, a simple "renaming" does not do the trick. I was lucky and had not yet deleted my old OSGeo4W version, and among the old files I found the necessary DLL.
So, long story short: You need the jpeg.dll file.
There are sites like "windll.com" or "dll-files.com", but I would not recommend using them; I don't trust these sites. You may instead install something like MSYS2, Cygwin, or even MSVC, install the libjpeg-turbo library, and then copy the necessary DLL file from there.
This is also suggested on the official site for libjpeg-turbo: https://libjpeg-turbo.org/Documentation/OfficialBinaries
That seems like a lot of work for someone who just wants the DLL file, but then again: never download a library blindly from the Internet and load it into your application. These libraries could do anything!
Up until recently I have only worked with one version of Python and used virtual environments every now and then. Now I am working with some libraries that require older versions of Python, so I am very confused. Could anyone please clear up some of my confusion?
How do I install multiple Python versions?
I initially had Python version 3.8.x but upgraded to 3.10.x last month. There is currently only that one version on my PC.
I wanted to install one of the Python 3.8.x versions and went to https://www.python.org/downloads/. It lists a lot of versions and subversions, like 3.6, 3.7, 3.8, and so on, with 3.8.1, 3.8.2, up to 3.8.13. Which one should I pick?
I actually went ahead with 3.8.12 and downloaded the tarball from the page https://www.python.org/downloads/release/python-3812/.
I extracted the tarball (23.6 MB), and it created a folder with a setup.py file.
Is Python 3.8.12 now installed? Clicking on the setup.py file simply flashes the terminal for a second.
I have a few more questions. Hopefully, they won't get me downvoted. I am just confused and couldn't find proper answers for them.
Why does Python have such a heavy dependency on exact versions of libraries, packages, etc.?
For example, this question: How can I run Mozilla TTS/Coqui TTS training with CUDA on a Windows system? This seems very beginner-unfriendly; a slightly mismatched package version can prevent any program from running.
Do virtual environments copy all the files from the main Python installation to create a virtual environment and then install specific packages inside it? Isn't that a lot of wasted resources in duplication, given that almost all projects require their own virtual environment?
Your questions depend a bit on "all the other software". For example, as @leiyang indicated, the answers will be different if you use conda rather than just pip on vanilla CPython (the standard Windows Python).
I'm also going to assume you're actually on Windows, because on Linux I would recommend looking at pyenv. There is a pyenv-win, which may be worth looking into, but I don't use it myself because it doesn't play as nicely if you also want (mini)conda environments.
1. (a) How do I install multiple Python versions?
Simply download the various installers and install them in sensible locations. E.g. "C:\Program Files\Python39" for Python 3.9, or some other location where you're allowed to install software.
Don't have Python add itself to the PATH, though, since that will only find whichever version added itself last and can really confuse things.
Also, you probably want to use virtual environments consistently, as this ties a specific project very clearly to a specific Python version, avoiding future confusion or problems.
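A minimal sketch of that workflow (make_env.py is just a hypothetical name): run it with whichever interpreter the project should be tied to, e.g. "C:\Program Files\Python38\python.exe" make_env.py, and the environment inherits that version:

# make_env.py -- create a per-project environment with the stdlib venv module
import venv

venv.create(".venv", with_pip=True)  # the venv uses the interpreter running this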
1. (b) "3.8.1, 3.8.2 till 3.8.13" which should I pick?
Always pick the latest 3.x.y: if there's a 3.8.13 for Windows but no 3.8.14, pick 3.8.13. Check that the version is actually available for your operating system; sometimes there are later versions for one OS but not for another.
The reason is that between versions like 3.6 and 3.7 there may be major changes that alter how Python works. Generally there will be backwards compatibility, but some changes may break how some of your packages work. However, when going up the last number (the y in 3.x.y), there won't be any such breaking changes, just fixes and additions that don't get in the way of what was already there. A change from 2.x to 3.x only happens if the language itself goes through a major change, which rarely happens (and perhaps never will again, depending on who you ask).
An exception to the "no problems within 3.x.y" rule is of course a script that very specifically relies on something that was broken in 3.8.6 but not fixed until 3.8.7+ (as an example). However, relying on broken behaviour instead of fixing the code is very bad practice, so only go along with that if you have no other recourse. Otherwise, just pick the latest 3.x.y of whichever 3.x you're after.
Also: make sure you pick the correct architecture. If there's no specific requirement, just pick 64-bit, but if your script needs to interact with other installed software at the binary level, it may require you to install 32-bit Python (and 32-bit packages as well). If you have no such requirement, 64-bit allows more memory access and has some other benefits on modern computers.
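If you're ever unsure which architecture a given interpreter is, a two-line check from that interpreter settles it:

import platform, struct

print(struct.calcsize("P") * 8, "bit")  # pointer size in bits: 64 or 32
print(platform.architecture()[0])       # e.g. '64bit'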
2. Why does Python have such a heavy dependency on exact versions of libraries, packages, etc.?
It's not just Python; this is true for many languages. It's just more visible to the end user with Python, because you run it as an interpreted language: it's only compiled at the very last moment, on the computer it's running on.
This has the advantage that the code can run on a variety of computers and operating systems, but the downside that you need the right environment wherever you run it. People who code in languages like C++ have to deal with this problem while coding, but they target a much smaller number of environments (although there are still runtimes to contend with, DirectX versions, etc.). Other languages roll everything up into the program that's being distributed, while a Python script by itself can be tiny. It's a design choice.
There are a lot of tools to help you automate the process, though, and well-written packages will make the process quite painless. If you feel Python is very shaky when it comes to this, that's probably down to the packages or scripts you're using, not the language itself. The only fault of the language is that it makes it very easy for developers to create such a mess for you and make your life hard with overly specific requirements.
Look for alternatives, but if you can't avoid using a specific script or package, once you figure out how to install or use it, document it or better yet, automate it so you don't have to think about it again.
3. Do virtual environments copy all the files from the main Python installation to create a virtual environment and then install specific packages inside it? Isn't that a lot of wasted resources in duplication, given that almost all projects require their own virtual environment?
Not all of the files, but quite a few of them. You still need the original installation to be present on the system, though. Also, you can't pick up a virtual environment and put it somewhere else, not even on the same PC, without some careful changes (it's often better to just recreate it).
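You can see the link back to the original installation from inside any environment; a venv records its parent installation (in its pyvenv.cfg file) rather than copying everything:

import sys

print(sys.prefix)                     # the virtual environment's own directory
print(sys.base_prefix)                # the installation it was created from
print(sys.prefix != sys.base_prefix)  # True when running inside a venv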
You're right that this is a bit wasteful - but this is a difficult choice.
Either Python would be even more complicated, having to manage many different versions of packages in a single environment (Java developers will be able to tell you war stories about this and their dependency management, or wax lyrical about it once they've mastered it).
Or you get what we have: a bit wasteful, but in the end disk space is a lot cheaper than your time, and unlike your time, disk space is almost infinitely expandable.
You can share virtual environments between very similar projects, but especially if you get your code from someone else, it's best not to have to worry about it and just give up a few dozen MB per project. On the upside: you can just delete a virtual environment's directory, and that pretty much gets rid of the whole thing. Some applications like PyCharm may remember that it was once there, but other than that, the virtual environment is gone.
Just install them. You can have any number of Python installations side by side. Unless you need two different releases of the same minor version, for example 3.10.1 and 3.10.2, there is no need to do anything special (and if you do need that, then you don't need any advice). Just set up separate shortcuts for each one.
Remember that you have to install any third-party libraries you need in each version separately. To do this, navigate to the Scripts folder of the version you want to install into, and run pip from that folder.
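Equivalently, and less error-prone than cd'ing around, you can let each interpreter drive its own pip. A sketch, with requests standing in for whatever library you need:

import subprocess, sys

# sys.executable is the interpreter running this script, so "-m pip"
# always installs into that version's site-packages
subprocess.check_call([sys.executable, "-m", "pip", "install", "requests"])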
Python's 3rd-party libraries are open source and come from projects with release schedules that don't necessarily coincide with Python's, so they will not always have a version available for the latest version of Python.
Often you can get around this by downloading unofficial binaries from Christoph Gohlke's site; google "Python Gohlke".
Install Python using the Windows executable installers from python.org. If the version is 3.x.y, use the highest y that has a Windows executable installer. Unless your machine is very old, use the 64-bit versions. Do not have them add Python to your PATH environment variable, but in exactly one of the installs, have it install the Python launcher py. That will help you use multiple versions; for example, py -3.8 myscript.py runs the script under 3.8. See e.g. here.
Python itself does not, but some modules/libraries do, especially those that are not purely written in Python but contain extensions written in C(++). The reason for this is that compiling programs on ms-windows can be a real PITA. Unlike UNIX-like operating systems such as Linux, ms-windows doesn't come with development tools as standard, nor does it have decent package management. Since the official Python installers are built with Microsoft tools, you need to use those tools for C(++) extensions as well. Before 2015 you even had to use exactly the same version of the compiler that Python was built with; that is still a good idea, but no longer strictly necessary. So it is a significant amount of work for developers to release binary packages for each supported Python version, and it is much easier for them to say "requires Python 3.x".
In continuing to research a solution for this question on ServerFault:
https://serverfault.com/questions/221203/mercurial-hook-fails-on-windows
I discovered an interesting and somewhat disturbing thing: I have seen three different versions of Python on my machine (four if you count the "official" version, which doesn't appear to have a DLL with it...). Here's a shot from my file search tool:
More Info:
I am running Windows 7 64-bit
Both the TortoiseHG and the Mercurial directories are on my path, with the Mercurial directory listed first.
I have Python 2.6 installed in c:\Python26
I have no entry for any PYTHON-based environment variable. (Should I?)
I suspect that this is the source of my problem from the question above, but I thought I'd ask here, as this particular issue is a Python matter.
I tried to swap the two DLLs, but when I use the one that comes with Mercurial, TortoiseHg stops working.
It seems to me that "there should only be one" Python on my machine. How do I achieve that?
For the problem that you mentioned earlier: the mercurial package got installed into the Python that lives under the Mercurial home directory, but you are executing your scripts with the Python under C:\Python26. You need to install the package into, and run your script with, Mercurial's Python.
As seth mentioned earlier, it is perfectly OK to have multiple Python homes on the same machine; you just need to pay attention when installing Python libraries to make sure that you are under the right home, which means setting the path correctly before calling python.
Side note: The Python installation in "C:\Python26" installs its DLL to the Windows directory, in your case "C:\Windows\SysWOW64".
Answering your ServerFault question: since you installed Mercurial as a standalone version, you'll have to place any packages that are accessed by hooks into Mercurial's library folder (if it has one; it could also be "library.zip").
I would recommend you to uninstall the Mercurial standalone version and instead install Mercurial with pip. This makes updates easier and you can use your normal "site-packages" directory for both normal Python libraries and hg hooks.
I would assume that TortoiseHg and Mercurial have simply embedded their own versions of Python to do whatever they need to do.
I wouldn't worry about it; the DLLs won't stomp on each other -- the PATH is the last place that Windows searches to find DLLs.
See: http://msdn.microsoft.com/en-us/library/7d83bc18(v=vs.80).aspx
Each DLL belongs to its own application, and there is only one in your search path, so you don't need to worry about conflicts.
Is something not working that prompted you to worry about this?
Your assumption that there should only be one is wrong: each application has bundled a specific version with a fixed API, and you can't just drop another one in and hope it'll work.
The Python DLL's name only encodes the major and minor version numbers, so you are probably looking at the DLLs for versions 2.6.1, 2.6.4, 2.6.5, and 2.6.6.
None of this really matters as long as each application contains its own copy of python26.dll: Windows will not search the PATH environment variable if there is a local copy of the file.
I once read about a minimal Python installation, without a lot of the libraries that come with the default installation, but I could not find it on the web...
What I want to do is to pack a script together with just the Python pieces required to execute it, and make that portable.
Does any one know about something like that?
Thanks
Micro Python is actively maintained and has been ported to a bunch of microcontrollers.
For other small implementations, you might also want to check out tinypy or PyMite.
If you don't care about size, but really just want an easy way to distribute a python program, consider PyInstaller or one of the others on this list.
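For the PyInstaller route, the usual entry point is the pyinstaller command, but it can also be driven from Python; a sketch, with myscript.py as a placeholder for your script:

import PyInstaller.__main__

# bundle the script and everything it imports into a single executable
PyInstaller.__main__.run(["myscript.py", "--onefile"])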
Portable Python might do what you want; it's a Python installation for USB thumb drives.
There's now finally Micro Python, which claims to be a full reimplementation of the Python 3 core, fitting even into medium-sized 32-bit microcontrollers. The API will be different of course, so C modules will require porting. The project was funded via Kickstarter; the source code will be released some time after the campaign (the author has been asked not to delay the release, to help bootstrap the Micro Python community sooner).
http://micropython.org/
You can also look for already installed instances.
OpenOffice / LibreOffice
Look at the environment variable UNO_PATH, or in the default install directories; for example, for Windows and LO5:
%ProgramFiles(x86)%\LibreOffice 5\program\python.exe
Gimp
Look in the default install directories; for example, for Windows:
C:\Program Files\GIMP 2\Python
and so on...
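A quick probe along those lines (a sketch using the locations above; adjust the paths to your installed versions):

import os
from pathlib import Path

candidates = [
    os.environ.get("UNO_PATH", ""),                    # OpenOffice / LibreOffice
    r"C:\Program Files (x86)\LibreOffice 5\program",
    r"C:\Program Files\GIMP 2\Python",
]
for directory in candidates:
    if directory and Path(directory, "python.exe").exists():
        print("bundled Python found:", Path(directory, "python.exe"))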
I am a member of a team that is about to launch a beta of a python (Django specifically) based web site and accompanying suite of backend tools. The team itself has doubled in size from 2 to 4 over the past few weeks and we expect continued growth for the next couple of months at least. One issue that has started to plague us is getting everyone up to speed in terms of getting their development environment configured and having all the right eggs installed, etc.
I'm looking for ways to simplify this process and make it less error-prone. Both zc.buildout and virtualenv look like they would be good tools for addressing this problem, but both seem to concentrate primarily on the Python-specific issues. We have a couple of small subprojects in other languages (Java and Ruby, specifically) as well as numerous Python extensions that have to be compiled natively (lxml, MySQL drivers, etc.). In fact, one of the biggest thorns in our side has been getting some of these extensions compiled against the appropriate versions of the shared libraries so as to avoid segfaults, malloc errors, and all sorts of similar issues. It doesn't help that among 4 people we have 4 different development environments: one Leopard on PPC, one Leopard on Intel, one Ubuntu, and one Windows.
Ultimately, what would be ideal would be something that works roughly like this, from the DOS/Unix prompt:
$ git clone [repository url]
...
$ python setup-env.py
...
that then does what zc.buildout/virtualenv does (copies/symlinks the Python interpreter, provides a clean space to install eggs), then installs all required eggs including any native shared-library dependencies, installs the Ruby project, the Java project, and so on.
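To make the idea concrete, here is a rough sketch of what such a setup-env.py could look like (using the stdlib venv module for illustration where the original idea used virtualenv; build_java.sh and build_ruby.sh are hypothetical site-specific hooks):

# setup-env.py -- bootstrap a development environment in one step
import os, subprocess, sys

ENV = "env"
subprocess.check_call([sys.executable, "-m", "venv", ENV])  # clean interpreter
pip = os.path.join(ENV, "Scripts" if os.name == "nt" else "bin", "pip")
subprocess.check_call([pip, "install", "-r", "requirements.txt"])  # the eggs

# hand off to the non-Python subprojects (Unix-style hooks shown)
for hook in ("build_java.sh", "build_ruby.sh"):
    if os.path.exists(hook):
        subprocess.check_call(["sh", hook])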
Obviously this would be useful for both getting development environments up as well as deploying on staging/production servers.
Ideally I would like for the tool that accomplishes this to be written in/extensible via python, since that is (and always will be) the lingua franca of our team, but I am open to solutions in other languages.
So, my question then is: does anyone have any suggestions for better alternatives or any experiences they can share using one of these solutions to handle larger/broader install bases?
Setuptools may be capable of more of what you're looking for than you realize -- if you need a custom version of lxml to work correctly on MacOS X, for instance, you can put a URL to an appropriate egg inside your setup.py and have setuptools download and install that inside your developers' environments as necessary; it also can be told to download and install a specific version of a dependency from revision control.
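For instance, a sketch of that setup.py approach (dependency_links is the old setuptools mechanism for this; the URL and versions here are hypothetical):

from setuptools import setup

setup(
    name="ourproject",
    version="0.1",
    install_requires=["lxml==2.1.1"],
    dependency_links=["http://example.com/eggs/lxml-2.1.1-py2.5-macosx-10.5-ppc.egg"],
)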
That said, I'd lean towards using a scriptably generated virtual environment. It's pretty straightforward to build a kickstart file which installs whichever packages you depend on, and then boot virtual machines (or production hardware!) against it, with Puppet or similar software doing the rest of the administration (adding users, setting up services [where does your database come from?], etc.). This comes in particularly handy when your production environment includes multiple machines: just script the generation of multiple VMs within their handy little sandboxed subnet (I use libvirt+kvm for this; while KVM isn't available on all the platforms your developers work on, QEMU certainly is, or you can do as I do and have a small number of beefy VM hosts shared by multiple developers).
This gets you out of the headaches of supporting N platforms -- you only have a single virtual platform to support -- and means that your deployment process, as defined by the kickstart file and puppet code used for setup, is source-controlled and run through your QA and review processes just like everything else.
I always create a develop.py file at the top level of the project, along with a packages directory holding all of the .tar.gz files from PyPI that I want to install, and an unpacked copy of virtualenv that is ready to run right from the checkout. All of this goes into version control. Every developer can simply check out the trunk, run develop.py, and a few moments later have a virtual environment ready to use that includes all of our dependencies at exactly the versions the other developers are using. And it works even if PyPI is down, which is very helpful at this point in that service's history.
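In outline, the pattern looks something like this (a sketch; the directory names are from the description above, virtualenv.py is the single-file script virtualenv shipped as, and it assumes a virtualenv recent enough to bundle pip):

# develop.py -- build an environment offline from vendored sdists
import glob, os, subprocess, sys

subprocess.check_call([sys.executable, os.path.join("virtualenv", "virtualenv.py"), "env"])
pip = os.path.join("env", "Scripts" if os.name == "nt" else "bin", "pip")
for sdist in sorted(glob.glob(os.path.join("packages", "*.tar.gz"))):
    subprocess.check_call([pip, "install", "--no-index", sdist])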
Basically, you're looking for a cross-platform software/package installer (along the lines of apt-get/yum/etc.). I'm not sure something like that exists.
An alternative might be to specify the list of packages that need to be installed via an OS-specific package management system (such as Fink or DarwinPorts for Mac OS X), and to have a script that sets up the build environment for the in-house code.
I have continued to research this issue since I posted the question. It looks like there are some attempts to address some of the needs I outlined, e.g. Minitage and Puppet, which take different approaches; either may accomplish what I want, although Minitage does not explicitly state that it supports Windows. Lacking any better options, I will try to make one of these (or just extensive customized use of zc.buildout) work for our needs, but I still feel like there must be better options out there.
You might consider creating virtual machine appliances with whatever production OS you are running, and all of the software dependencies pre-built. Code can be edited either remotely, or with a shared folder. It worked pretty well for me in a past life that had a fairly complicated development environment.
Puppet doesn't (easily) support the Win32 world either. If you're looking for a deployment mechanism and not just a "dev setup" tool, you might consider looking into ControlTier (http://open.controltier.com/), which has an open-source, cross-platform solution.
Beyond that, you're looking at "enterprise" software such as BladeLogic or OpsWare, typically with an outrageous price tag for the functionality offered (my opinion, obviously).
A lot of folks have been using a combination of Puppet and Capistrano aggressively (even non-Rails developers) as deployment automation tools, to pretty good effect. The downside, again, is that it expects a somewhat homogeneous environment.