Multiple Python Version Env Setup - python

I'm working in the VFX industry and we deal with different software packages that ship with their own Python interpreter. We are running on Linux and use modules to handle our environments to make sure that people are using the correct version of all applications depending on the project they are working on.
For months we have been trying to set up an environment that supports multiple versions of Python. What is blocking us right now are the additional Python libraries that we use in our in-house tools, like sqlalchemy, psycopg2, openpyxl, zmq, etc.
So far, for each project, we have a config file that defines the version of each package to be used, including the additional Python modules. To pick the correct build of each Python module, we look up the main Python interpreter defined in that same module definition file. This works as long as the major and minor versions of all Python interpreters line up.
But now I would like to start an application that ships with a Python 3.7 interpreter, another application with a Python 3.9 interpreter, and so on. All applications use our in-house tools, which need the additional Python modules. Of course, this fails when trying to import any additional module.
For now, the only solution I see is to install the corresponding Python modules in the 'site-packages' of each application that comes with its own Python interpreter. That should work. But it means we have to install all necessary Python modules for each application version (ideally the same version of each, to avoid compatibility issues), and when we decide to update one of them, this needs to be done again for every 3rd party application.
That does not sound super-efficient to me.
Do you have similar experiences, and what did you come up with to handle this? I know there are more advanced packages like rez for handling complex environment setups, but although I do not know the details of rez, I could imagine that the problem stays the same. I guess it is not possible to globally populate PYTHONPATH with additional modules in a way that works across multiple Python interpreter versions.
Another solution I could imagine is to make sure that, on startup of each application that needs additional Python modules, we do our own sys.path modification depending on the interpreter version. That would imply some coding, but we could keep global version handling without installing the modules everywhere.
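Roughly, I imagine something like the following; the tools root and directory layout below are just placeholders for whatever we would actually standardize on:

import sys
import site

TOOLS_ROOT = "/pipeline/python-modules"  # placeholder for a shared modules location

def add_matching_site_packages(root=TOOLS_ROOT):
    # e.g. "3.9" for a host application that ships Python 3.9
    tag = "{}.{}".format(*sys.version_info[:2])
    candidate = "{}/py{}/site-packages".format(root, tag)
    # site.addsitedir also processes .pth files, unlike a bare sys.path.append
    site.addsitedir(candidate)
    return candidate

add_matching_site_packages()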
Anyway, if you have any further hints, please let me know.
Greets,
Carlo

Related

How to install multiple versions of Python in Windows?

Up until recently I have only worked with one version of Python and used virtual environments every now and then. Now I am working with some libraries that require an older version of Python, so I am very confused. Could anyone please clear up some of my confusion?
How do I install multiple Python versions?
I initially had Python version 3.8.x but upgraded to 3.10.x last month. There is currently only that one version on my PC now.
I wanted to install one of the Python 3.8.x versions and went to https://www.python.org/downloads/. It lists a lot of versions and subversions like 3.6, 3.7, 3.8, etc., with 3.8.1, 3.8.2 up to 3.8.13. Which one should I pick?
I actually went ahead with 3.8.12 and downloaded the Tarball on the page: https://www.python.org/downloads/release/python-3812/
I extracted the tarball (23.6MB) and it created a folder with a setup.py file.
Is Python 3.8.12 now installed? Clicking on the setup.py file simply flashes the terminal for a second.
I have a few more questions. Hopefully, they won't get me downvoted. I am just confused and couldn't find proper answers for them.
Why does Python have such heavy dependency on the exact versions of libraries and packages etc?
For example, this question: How can I run Mozilla TTS/Coqui TTS training with CUDA on a Windows system? This seems very beginner-unfriendly. A slightly mismatched package version can prevent any program from running.
Do virtual environments copy all the files from the main Python installation to create a virtual environment and then install specific packages inside it? Isn't that a lot of wasted resources in duplication, because almost all projects require their own virtual environment?
Your questions depend a bit on "all the other software". For example, as @leiyang indicated, the answer will be different if you use conda vs just pip on vanilla CPython (the standard Windows Python).
I'm also going to assume you're actually on Windows, because on Linux I would recommend looking at pyenv. There is a pyenv-win, which may be worth looking into, but I don't use it myself because it doesn't play as nice if you also want (mini)conda environments.
1. (a) How do I install multiple Python versions?
Simply download the various installers and install them in sensible locations. E.g. "C:\Program Files\Python39" for Python 3.9, or some other location where you're allowed to install software.
Don't have Python add itself to the PATH though, since that'll only find the last version to do so and can really confuse things.
Also, you probably want to use virtual environments consistently, as this ties a specific project very clearly to a specific Python version, avoiding future confusion or problems.
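For instance, you can create a per-project environment with the stdlib venv module by running it with the interpreter you want the project tied to; a minimal sketch (the path is just an example):

import venv  # run this with the specific python.exe the project should use
venv.create(r"C:\projects\myproject\.venv", with_pip=True)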
1. (b) "3.8.1, 3.8.2 till 3.8.13" which should I pick?
Always pick the latest 3.x.y, so if there's a 3.8.13 for Windows, but no 3.8.14, pick that. Check if the version is actually available for your operating system, sometimes there are later versions for one OS, but not for another.
The reason is that between versions like 3.6 and 3.7, there may be major changes to how Python works. Generally there will be backwards compatibility, but some changes may break how some of your packages work. However, when going up a patch version (the y in 3.x.y), there won't be any such breaking changes, just fixes and additions that don't get in the way of what was already there. A change from 2.x to 3.x only happens if the language itself goes through a major change, and that rarely happens (and perhaps never will again, depending on who you ask).
An exception to the "no patch version change problems" rule is of course if you run some script that very specifically relies on something that was broken in 3.8.6 but fixed in 3.8.7+ (as an example). However, relying on broken behaviour that later gets fixed is very bad coding, so only go along with that if you have no other recourse. Otherwise, just pick the latest patch release of whatever version you're after.
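If a script really does depend on a minimum interpreter version, a small guard at the top makes the failure obvious instead of mysterious; a minimal sketch (the required version is only an example):

import sys
# fail early with a clear message instead of an obscure SyntaxError or ImportError later
if sys.version_info < (3, 8):
    raise SystemExit("This script needs Python 3.8+, found " + sys.version.split()[0])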
Also: make sure you pick the correct architecture. If there's no specific requirement, just pick 64-bit, but if your script needs to interact with other installed software at the binary level, it may require you to install 32-bit Python (and 32-bit packages as well). If you have no such requirement, 64-bit allows more memory access and has some other benefits on modern computers.
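If you're ever unsure which build you're actually running, a quick check from inside the interpreter tells you both the exact version and the architecture:

import struct, sys
print(sys.version)                      # full version string of this interpreter
print(struct.calcsize("P") * 8, "bit")  # 64 on a 64-bit build, 32 on a 32-bit build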
2. Why does Python have such heavy dependency on the exact versions of libraries and packages etc?
It's not just Python, this is true for many languages. It's just more visible to the end user for Python, because you run it as an interpreted language. It's only compiled at the very last moment, on the computer it's running on.
This has the advantage that the code can run on a variety of computers and operating systems, but the downside that you need the right environment where you're running it. For people who code in languages like C++, they have to deal with this problem when they're coding, but target a much smaller number of environments (although there's still runtimes to contend with, and DirectX versions, etc.). Other languages just roll everything up into the program that's being distributed, while a Python script by itself can be tiny. It's a design choice.
There are a lot of tools to help you automate the process though, and well-written packages will make the process quite painless. If you feel Python is very shaky when it comes to this, that's probably down to the packages or scripts you're using, not really the language. The only fault of the language is that it makes it very easy for developers to make such a mess for you and make your life hard with getting specific requirements.
Look for alternatives, but if you can't avoid using a specific script or package, once you figure out how to install or use it, document it or better yet, automate it so you don't have to think about it again.
3. Do virtual environments copy all the files from the main Python installation to create a virtual environment and then install specific packages inside it? Isn't that a lot of wasted resources in duplication because almost all projects require their own virtual environment?
Not all of them, but quite a few. However, you still need the original installation to be present on the system. Also, you can't pick up a virtual environment and put it somewhere else, not even on the same PC, without some careful changes (it's often better to just recreate it).
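You can see the link back to the base installation for yourself: every venv contains a small pyvenv.cfg that records where the original install lives, for example (the path is just an illustration):

from pathlib import Path
# prints something like: home = C:\Program Files\Python39 ...
print(Path(r"C:\projects\myproject\.venv\pyvenv.cfg").read_text())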
You're right that this is a bit wasteful - but this is a difficult choice.
Either Python would be even more complicated, having to manage many different versions of packages in a single environment (Java developers will be able to tell you war stories about this and their dependency management - or wax lyrical about it, once they get it working).
Or you get what we have: a bit wasteful, but in the end disk space is a lot cheaper than your time. And unlike your time, disk space is almost infinitely expandable.
You can share virtual environments between very similar projects, but especially if you get your code from someone else, it's best not to have to worry and just give up a few dozen MB per project. On the upside: you can just delete a virtual environment directory and that pretty much gets rid of the whole thing. Some applications like PyCharm may remember that it was once there, but other than that, the virtual environment is gone.
Just install them. You can have any number of Python installations side by side. Unless you need two different patch versions of the same release, for example 3.10.1 and 3.10.2, there is no need to do anything special. (And if you do need that, then you don't need any advice.) Just set up separate shortcuts for each one.
Remember you have to install any 3rd-party libraries you need in each version. To do this, navigate to the Scripts folder of the version you want to install into, and run pip from that folder.
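Alternatively, you can take "which pip am I running?" out of the equation by invoking pip through the interpreter you want to install into; for example, running this snippet with a particular python.exe installs into that install (the package name is only an example):

import subprocess, sys
# installs into whichever Python is running this script, not whichever pip is first on PATH
subprocess.check_call([sys.executable, "-m", "pip", "install", "openpyxl"])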
Python's 3rd-party libraries are open-source and come from projects that have release schedules that don't necessarily coincide with Python's. So they will not always have a version available that coincides with the latest version of Python.
Often you can get around this by downloading unofficial binaries from Christoph Gohlke's site. Google Python Gohlke.
Install Python using the Windows executable installers from python.org. If the version is 3.x.y, use the highest y that has a Windows executable installer. Unless your machine is very old, use the 64-bit versions. Do not have them add Python to your PATH environment variable, but in only one of the installs, have it install the Python launcher py. That will help you in using multiple versions. See e.g. here.
Python itself does not. But some modules/libraries do, especially those that are not purely written in Python but contain extensions written in C(++). The reason for this is that compiling programs on ms-windows can be a real PITA. Unlike UNIX-like operating systems such as Linux, ms-windows doesn't come with development tools as standard, nor does it have decent package management. Since the official Python installers are built with Microsoft tools, you need to use those for C(++) extensions as well. Before 2015, you even had to use exactly the same version of the compiler that Python was built with. That is still a good idea, but no longer strictly necessary. So it is a significant amount of work for developers to release binary packages for each supported Python version. It is much easier for them to say "requires Python 3.x".

Python sys.path vs import

What I wish to understand is what is good/bad practice, and why, when it comes to imports. What I want to know is the agreed-upon view of the community on the matter, if there is one, in some PEP document or similar.
What I see normally is that people have a Python environment, use conda/pip to install packages, and all that's needed in the code is "import X" (and variants). In my current understanding this is the right way to do things.
Whenever python interacts with C++ at my firm, though, it always ends up with the need to use sys.path and absolute imports (but we have some standardized paths to use as "base" and usually define relative paths based on those).
There are 2 major cases:
A Python wrapper for a C++ library (pybind/ctypes/etc.) - in this case the user's Python code must use sys.path to specify where the C++ library to import is located.
A project that establishes communication between Python and C++ (say a C++ server, Python clients, TCP connections and flatbuffer serialization between the two) - here the Python code lives together with the C++ code, and some Python files end up using sys.path to import Python modules from the same project that live in a different directory. Essentially we deploy the Python together with the C++ through our C++ deployment procedure.
I am not fully sure we can do something better for case #1, but case #2 seems quite unnecessary, and basically forced just by the choice not to deploy the Python code through a Python package manager. That choice ends up forcing us to use sys.path in both the library and the user code.
This seems very bad to me, as basically this way of doing things doesn't allow us to fully manage our Python environments (since we import some libraries even though they are not technically installed in the environment), and that is probably why I have a negative view of using sys.path for imports. But I need to find out if I'm right, and if so I need some official (or nearly official) documents to support my case if I'm to propose fixes to our procedures.
For your scenario 2, my understanding is you have some C++ and accompanying python in one place, and a separate python project wants to import that python.
Could you structure the imported Python as a package and install it into your environment with pip install path/to/package? If it's a package that you'll continue to edit, you can add the -e flag to pip install so that your imports pick up the latest code when the package changes.
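As a rough sketch, the shared Python next to the C++ only needs a small setup.py; the name and layout below are placeholders, not your actual project:

# setup.py, placed next to the shared Python packages
from setuptools import setup, find_packages

setup(
    name="cpp_bridge_py",       # placeholder project name
    version="0.1.0",
    packages=find_packages(),   # picks up any directory containing an __init__.py
)

After that, pip install -e path/to/package makes the code importable like anything else in the environment, with no sys.path edits in user code.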

Using python project in c# application

I'm looking for a way to call my Python project and display its console output in my C# app.
My Python project is a bit particular: I use some specific libraries, and I even have to install a package from a wheel that is not available with pip and is only available for Python 3.7 (not 3.8), etc. So to run the project, I need exactly Python 3.7.
The problem is also that I want to deploy my C# app with ClickOnce to be used by clients, without having them install a local Python version.
I've seen on the net two ways to make Python and C# work together, but neither seems to work for my needs.
using python in c# app
1. Call Python in a shell
I imported the Python project into my C# app, called the python.exe available in the venv, and deployed the app. Everything seemed to work, but then I discovered that the python executable in the venv refers to the local Python installation and isn't self-contained. So it only worked for me and not for the clients.
2. Use IronPython in C#
Really far from working, starting with encoding errors blowing up everywhere, even in imported libraries like numpy. I've referenced my venv libraries in SetSearchPaths(), and even with that, it doesn't seem to work.
Any suggestions? The best way, I think, would be to have an independent python.exe shipped with the project that can load my venv.
Python.Included and the Numpy.NET package made with it may work for this:
https://github.com/henon/Python.Included
https://github.com/SciSharp/Numpy.NET
They are introduced in this post from 2019 and seem to be in active development today, https://medium.com/scisharp/using-python-libraries-in-net-without-a-python-installation-11124d6190cf
That solution does not use IronPython, but Python.NET: https://github.com/pythonnet/pythonnet
IronPython is an implementation of Python in C#, championed by enthusiasts and then by Microsoft itself back in the 2010 era. Later Microsoft dropped it, and largely the whole notion of supporting dynamic languages on .NET (they built the DLR system for it back then). It is very cool and works for pure Python code.
But NumPy and many other useful and popular Python modules are written in C, using the C API of python.org's default C-implemented Python, aka CPython. This is what Microsoft decided to back instead, because C-based Python modules don't work (easily and well) with IronPython. Also, IronPython remains at Python 2.7.
Python.NET just bridges the .NET world with the normal CPython interpreter, so that you can call code across the language boundary. So NumPy and everything else work the same as with Python usually, which is what you want.
Python.Included is one way to deploy that in C# projects - which may or may not work for you, but at least provides a starting point:
Python.Included is an automatic deployment mechanism for .NET packages which depend on the embedded Python distribution. This allows libraries depending on Python and/or Python packages to be deployed via Nuget without having to worry about any local Python installations.
It packages embedded Python (python-3.7.3-embed-amd64.zip) in its .NET assembly and automatically deploys it in the user's home directory upon first execution. On subsequent runs, it will find Python already deployed and therefore doesn't install it again.
If you don't want to use that install mechanism, I think you can just bundle the CPython interpreter with your C# application and use the Python.NET mechanism to call that from your app's directory instead.

How do I set up a Python development environment on Linux?

I'm a .NET developer who knows very little about Python, but want to give it a test drive for a small project I'm working on.
What tools and packages should I install on my machine? I'm looking for a common, somewhat comprehensive, development environment.
I'll likely run Ubuntu 9.10, but I'm flexible. If Windows is a better option, that's fine too.
Edit: To clarify, I'm not looking for the bare minimum to get a Python program to run. I wouldn't expect a newbie .NET dev to use notepad and a compiler. I'd recommend Visual Studio, NUnit, SQL Server, etc.
Your system already has Python on it. Use the text editor or IDE of your choice; I like vim.
I can't tell you what third-party modules you need without knowing what kind of development you will be doing. Use apt as much as you can to get the libraries.
To speak to your edit:
This isn't minimalistic, like handing a .NET newbie notepad and a compiler: a decent text editor and the stdlib are all you really need to start out. You will likely need third-party libraries to develop whatever kind of applications you are writing, but I cannot think of any third-party modules all Python programmers will really need or want.
Unlike the .NET/Windows programming world, there is no one set of dev tools that stands above all others. Different people use very different editors. In Python, a module namespace is fully within a single file and project organization is based on the filesystem, so people do not lean on their IDEs as hard. Different projects use different version control software, a space which has been booming with new faces recently. Most of these are better than TFS and all are 1000 times better than SourceSafe.
When I want an interactive session, I use the vanilla Python interpreter. Various fancier interpreters exist: bpython, ipython, IDLE. bpython is the least fancy of these and is supposed to be good about not doing weird stuff. ipython and IDLE can lead to strange bugs where code that works in them doesn't work in normal Python and vice-versa; I've seen this first hand with IDLE.
For some of the tools you asked about and some others
In .NET you would use NUnit. In Python, use the stdlib unittest module. There are various third-party extensions and test runners, but unittest should suit you okay.
If you really want to look into something beyond this, get unittest2, a backport of the 2.7 version of unittest. It has incorporated all the best things from the third-party tools and is really neat.
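A minimal sketch of what a unittest test looks like (the function under test is made up, just to show the shape):

import unittest

def add(a, b):
    return a + b

class AddTests(unittest.TestCase):
    def test_add_integers(self):
        self.assertEqual(add(2, 3), 5)

if __name__ == "__main__":
    unittest.main()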
In .NET you would use SQL Server. In Python, you may use PostgreSQL, MySQL, sqlite, or some other database. Python specifies a unified API for databases and porting from one to another typically goes pretty smoothly. sqlite is in the stdlib.
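As a quick illustration of that unified DB-API, here is the stdlib sqlite3 module in action; other drivers (psycopg2, MySQLdb, ...) follow the same connect/execute/fetch pattern:

import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES (?)", ("alice",))
print(conn.execute("SELECT name FROM users").fetchall())
conn.close()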
There are various object-relational mappers (ORMs) to make using databases more abstract. SQLAlchemy is the most notable of these.
If you are doing network programming, get Twisted.
If you are doing numerical math, get numpy and scipy.
If you are doing web development, choose a framework. There are about 200000: Pylons, zope, Django, CherryPy, werkzeug...I won't bother starting an argument by recommending one. Most of these will happily work with various servers with a quick setting.
If you want to do GUI development, there are quite a few Python bindings. The stdlib ships with Tk bindings I would not bother with. There are wx bindings (wxpython), GTK+ bindings (pygtk), and two sets of Qt bindings. If you want to do native Windows GUI development, get IronPython and do it in .NET. There are win32 bindings, but they'll make you want to pull your hair out trying to use them directly.
In order to reduce the chance of affecting/hosing the system install of Python, I typically install virtualenv on the Ubuntu Python install. I then create a virtualenv in my home directory so that subsequent packages I install via pip or easy_install do not affect the system installation. And I add the bin from that virtualenv to my path via .bashrc:
$ sudo apt-get install python-virtualenv
$ virtualenv --no-site-packages ~/local
$ PATH=~/local/bin:$PATH #<----- add this to .bashrc to make it permanent
$ easy_install virtualenv #<--- so that project environments are based off your local environment rather than the system, probably not necessary
Install your favorite editor, I like emacs + rope, but editors are a personal preference and there are plenty of choices.
When I start a new project/idea I create a new virtual environment for that project, so that I don't affect dependencies anywhere else, since I would hate for some of my projects to break due to an upgrade of a library that both that project and the new one depend on.
~/projects $ virtualenv --no-site-packages my_new_project.env
~/projects/my_new_project.env $ source bin/activate
(my_new_project.env)~/projects/my_new_project.env $ easy_install paste ipython #whatever else I think I need
(my_new_project.env)~/projects/my_new_project.env $ emacs ./ & # start hacking
When creating a new package... in order to have something that will be easy_installable/pip-installable, use paster create:
(my_new_project.env)~/projects/my_new_project.env$ paster create new_package
(my_new_project.env)~/projects/my_new_project.env/new_package$ python setup.py develop
That's the common stuff as far as I can think of it. Everything else would be editor/version control tool specific
Since I'm accustomed to Eclipse, I find Eclipse + PyDev convenient for Python. For quick computations, Idle is great.
I've used Python on Windows and on Ubuntu, and Linux is much cleaner.
If you launch a terminal and type python you'll get an interpreter, where you can start trying stuff.
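For example, a first interactive session might look like this:

>>> 2 + 2
4
>>> "hello".upper()
'HELLO'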
Just in case you haven't seen it, check out the book Dive Into Python, which is free online.
http://www.diveintopython.org/
Follow the examples in the book using the interpreter.
For storing your work you could use any editor; Vim or EMACS could be the most powerful, but also the most difficult to learn at first. If you want a more "traditional" IDE, you could try WingIDE.
http://www.wingware.com/
After you start to get more comfortable with python you should try an enhanced interpreter; try ipython.
http://ipython.scipy.org/moin/
When you start to develop a more serious project you'll need to get additional modules. Here you have two options: 1) use your distribution's tools to install additional modules, or 2) download the modules you need directly from their sites and install them manually. You'll be responsible for upgrading them, of course.
You'll have to decide for yourself which way to go. Personally I prefer to download and install additional modules manually.
Python (duh), setuptools or pip, virtualenv, and an editor. I suggest geany, but that's just me. And of course, any other Python modules you'll need.
Getting to Python from .NET world
Jumping into the Linux world from a .NET / Windows background can be a bit disconcerting (but I do encourage you to keep trying Linux).
But I would suggest to anyone coming from Windows to stick with Windows for a little while. Go to www.Activestate.com and download their Python package - it includes the full win32com extensions by Mark Hammond, and it also includes a complete, fast IDE, "pythonwin".
I have done real professional development with just this setup alone on a Windows box - one 14MB .msi and off you go!
Now, to use Python on the DLR (Dynamic Language Runtime), you need to download IronPython. This is a separate interpreter, originally created by Jim Hugunin (who later continued it at Microsoft), and it lives at ironpython.org.
With this you can run code like (from Wikipedia):
import clr
clr.AddReference("System.Windows.Forms")
from System.Windows.Forms import MessageBox
MessageBox.Show("Hello World")
Now you can access any .NET code from python.
If you're just starting out with Python, I'd actually argue against bringing in the complexity of virtualenv (which I think can be pretty overwhelming), at least until you've got a firm grasp of Python basics (especially regarding library/dependency management).
If you're using Ubuntu and the Gnome desktop environment, gedit is the default (gui) text editor, and has great support for Python built in. So my recommendation is to start with the pre-installed Python and gedit (which is pretty extensible on its own).
You don't need much. Python comes with "Batteries Included."
Visual Studio == IDLE. You already have it. If you want a more IDE-like environment, install Komodo Edit.
NUnit == unittest. You already have it in the standard library.
SQL Server == sqlite. You already have it in the standard library.
Stop wasting time getting everything ready. It's already there in the basic Python installation.
Get to work.
Linux, BTW, is primarily a development environment. It was designed and built by developers for developers. Windows is an end-user environment which has to be supplemented for development.
Linux was originally focused on developers. All the tools you need are either already there or are part of simple yum or RPM installs.
You would probably like to give NetBeans Python IDE a shot. You can choose to use either Windows/Linux.
Database: sqlite (inbuilt). You might want SQLAlchemy though.
GUI: Tk (via tkinter) is inbuilt, but wxPython or PyQt are recommended.
IDE: I use idle (inbuilt) on windows, TextMate on Mac, but you might like PyDev. I've also heard good things about ulipad.
Numerics: numpy.
Fast inline code: lots of options. I like boost weave (part of scipy), but you could look into ctypes (to use dlls), Cython, etc.
Web server: too many options. Django (plus Apache) is the biggest.
Unit testing: inbuilt.
Pyparsing, just because.
BeautifulSoup (or another good HTML parser).
hg, git, or some other nice VC.
Trac, or another bug system.
Oh, and StackOverflow if you have any questions.
PyCharm Community is worth a try.

Recommendations for Python development on a Mac?

I bought a low-end MacBook about a month ago and am finally getting around to configuring it for Python. I've done most of my Python work in Windows up until now, and am finding the choices for OS X a little daunting. It looks like there are at least five options to use for Python development:
"Stock" Apple Python
MacPython
Fink
MacPorts
roll-your-own-from-source
I'm still primarily developing for 2.5, so the stock Python is fine from a functionality standpoint. What I want to know is: why should I choose one over the other?
Update:
To clarify, I am looking for a discussion of the various options, not links to the documentation. I've marked this as a Community Wiki question, as I don't feel there is a "correct" answer. Thanks to everyone who has already commented for their insight.
One advantage I see in using the "stock" Python that's included with Mac OS X is that it makes deployment to other Macs a piece of cake. I don't know what your deployment scenario is, but for me this is important. My code has to run on any number of Macs at work, and I try to minimize the amount of work it takes to run my code on all of those systems.
I would highly recommend using MacPorts with Porticus for managing your Python installation. It takes a while to build everything, but the advantage is that whatever you build yourself will be built against the same libraries, so you won't have to futz around with statically linked shared objects, etc. if you want your Python stuff to work with Apache, PostgreSQL, etc.
If you choose to go this way, remember to install the python_select port and use it to make your system use the Python installed from MacPorts.
As an added bonus, MacPorts has packages for most mainstream Python eggs, so you should be able to have MacPorts keep you up to date with the latest versions of all that stuff :)
Here's some helpful info to get you started. http://www.python.org/download/mac/
It depends what you are using Python for. If you are using Mac OS functionality and things like PyObjC, you are probably best off with MacPython or the Python provided by Apple.
I use Python on my Mac mostly for development of server-side applications which later run on FreeBSD and Linux boxes. For that I used Fink Python for a few years, and ever since then, MacPorts Python. With MacPorts it is simple to add required C modules (like database drivers etc.). It's also easy to keep two Python versions (2.5 & 2.6 in my case) around.
I used "compile your own" python to test pre-3.0 python but generally I find managing dependencies to c modules painfull if done by hand.
Thanks to easy_install, installing pure Python modules is fast and easy for all the options mentioned above.
I was never much of an IDE person. For development I use command-line Subversion installed by MacPorts, TextMate, and occasionally ExpanDrive to access files directly on servers. I personally am very dependent on BicycleRepairMan for TextMate to handle my refactoring needs.
Others seem to be very happy with Eclipse & Pydev.
How about EPD from Enthought? Yes, it's large but it is a framework build and includes things like wxPython, vtk, numpy, scipy, and ipython built-in.
I recommend using Python virtual environments, especially if you use a Time Capsule, because Time Capsule will back everything up except modules you added to Python!
Based on the number of bugs and omissions people have been encountering in Leopard python (just here on SO!), I couldn't recommend that version. e.g., see:
Why do I get wrong results for hmac in Python but not Perl?
Problems on select module on Python 2.5
I would choose MacPorts.
It does not eliminate your existing python supplied by Apple since it installs by default in /opt/local/bin (plays nice with it) and plus it is easy to download and install additional python modules (even binary modules that you need to compile!). I use Porticus GUI to maintain my MacPorts installed list of packages, including python.
In my Windows environment I use Eclipse and PyDev, which work quite well together, even if the setup is a bit sparse. Apparently the exact same environment is available for the Mac as well, so I suggest downloading Eclipse and using the internal software update function to add PyDev via the URL http://pydev.sourceforge.net/updates/. To look further into PyDev, look here.
Apple's supplied python is quite old – my tiger install has 2.3.5. This may not be a problem for you, but you would be missing out on a lot. Also, there is a risk that Apple will update it. I'm not sure if moving from 2.3.5 to (say) 2.4 would cause code to break, but I guess it's possible. This happened to perl people recently: http://developers.slashdot.org/article.pl?sid=09/02/18/1435227
Macpython is a framework build (as is Apple's, I believe). To be honest, I'm not sure exactly what that means, but it's a prerequisite for some modules, in particular wxPython. If you get python from macports or fink, you will not be able to run wxPython (unless you run it through X11).
And guess what was forgotten by every answer here ... ActivePython.
No compilation required, even for third-party modules such as numpy, lxml, pyqt and thousands of others.
I recommend python (any python?) plus the ipython shell. My most recent experience with MacPython was MacPython 2.5, and I found IDLE frustrating to use as an editor. It's not very featureful, and it's very slow to scroll large quantities of output.
