Python IDE with integrated package management system

My friends often ask me which Python IDE is easiest to use, and I don't have an answer for them, because depending on their platform, package management can be a headache. Today I tried looking for a Python IDE with some kind of integrated package management (e.g. pip) that asks the user whether they would like the IDE to search online repositories for a missing package and install it -- the way their favorite TeX editor does.
So, do you know if such an IDE exists? If not, do you have any idea why?

Well, if you do Django development, I think you should definitely check out sourceLair, which offers integration with pip. You just right-click your requirements.txt file, select "Install dependencies", and you're done.
Currently, you cannot search for pip packages but I suppose this could be a future feature.
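For reference, a requirements.txt is just a plain-text list of pip requirement specifiers, one per line; the entries below are only illustrative:
Django>=1.5
numpy>=1.5
requests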

I think it mostly has to do with how painful Python packaging is. Binary package management is very important to Windows users (of which Python has many, because it doesn't treat Windows as a second-class platform), and building many C extensions with distutils on Windows often just doesn't work (try 'pip install numpy' on Windows, for instance). The state of "packaging" in Python is confusing in general; I would say that's the primary reason nobody (that I know of) has gone through the effort of integrating the existing tools into their IDEs.
As an aside, if you want something free and are familiar with Eclipse, I recommend PyDev.

How to easily distribute Python software that has Python module dependencies? Frustrations in Python package installation on Unix

My goal is to distribute a Python package that has several other widely used Python packages as dependencies. My package depends on well-written, PyPI-indexed packages like pandas, scipy and numpy, and specifies in its setup.py that certain versions or higher of these are needed, e.g. "numpy >= 1.5".
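For concreteness, a setup.py along those lines might look roughly like this (the package name, version pins and script path are purely illustrative):
from setuptools import setup, find_packages

setup(
    name="mypackage",                     # illustrative name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "numpy >= 1.5",                   # versions as described above
        "scipy >= 0.9",
        "pandas >= 0.7",
    ],
    scripts=["scripts/run_mypackage"],    # the executable scripts mentioned further down
)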
I found that it's immensely frustrating and nearly impossible for Unix-savvy users who are not experts in Python packaging (even if they know how to write Python) to install a package like mine, even when using what are supposed to be easy-to-use package managers. I am wondering if there is an alternative to this painful process that someone can offer, or if my experience just reflects the very difficult current state of Python packaging and distribution.
Suppose users download your package onto their system. Most will try to install it "naively", using something like:
$ python setup.py install
This is usually what comes up if you google instructions on installing Python packages. It will fail for the vast majority of users, since most do not have root access on their Unix/Linux servers. With more searching, they will discover the "--prefix" option and try:
$ python setup.py install --prefix=/some/local/dir
Since the users are not aware of the intricacies of Python packaging, they will pick an arbitrary directory as an argument to --prefix, e.g. "~/software/mypackage/". It will not be a cleanly curated directory where all other Python packages reside, because again, most users are not aware of these details. If they install another package "myotherpackage", they might pass it "~/software/myotherpackage", and you can imagine how down the road this will lead to frustrating hacking of PYTHONPATH and other complications.
Continuing with the installation, a package installed with "setup.py install --prefix" will also fail once users try to use it, even though it appeared to install correctly, since one of its dependencies (e.g. pandas, scipy or numpy) might be missing and no package manager was used to pull them in. They will try to install these packages individually. Even if that succeeds, the packages will inevitably not be on the PYTHONPATH, due to the non-standard directories passed to "--prefix", and patient users will dabble with modifications of their PYTHONPATH to make the dependencies visible.
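(The PYTHONPATH dabbling typically ends up as something like the following in a shell startup file; the paths here are illustrative.)
$ export PYTHONPATH=~/software/mypackage/lib/python2.7/site-packages:~/software/myotherpackage/lib/python2.7/site-packages:$PYTHONPATH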
At this stage, users might be told by a Python savvy friend that they should use a package manager like "easy_install", the mainstream manager, to install the software and have dependencies taken care of. After installing "easy_install", which might be difficult, they will try:
$ easy_install setup.py
This too will fail, since users again do not typically have permission to install software globally on production Unix servers. With more reading, they will learn about the "--user" option, and try:
$ easy_install setup.py --user
They will get the error:
usage: easy_install [options] requirement_or_url ...
or: easy_install --help
error: option --user not recognized
They will be extremely puzzled why their easy_install does not have the --user option when there are clearly pages online describing the option. They might try to upgrade their easy_install to the latest version and find that it still fails.
If they continue and consult a Python packaging expert, they will discover that there are two versions of easy_install, both named "easy_install" so as to maximize confusion, but one is part of "distribute" and the other is part of "setuptools". It happens that only the "easy_install" from "distribute" supports "--user", and the vast majority of servers/sysadmins install "setuptools"'s easy_install, so local installation will not be possible. Keep in mind that these distinctions between "distribute" and "setuptools" are meaningless and hard to understand for people who are not experts in Python package management.
At this point, I would have lost 90% of even the most determined, savvy and patient users who try to install my software package -- and rightfully so! They wanted to install a piece of software that happened to be written in Python, not to become experts in state of the art Python package distribution, and this is far too confusing and complex. They will give up and be frustrated at the time wasted.
The tiny minority of users who continue on and ask more Python experts will be told that they ought to use pip/virtualenv instead of easy_install. Installing pip and virtualenv and figuring out how these tools work and how they are different from the conventional "python setup.py" or "easy_install" calls is in itself time consuming and difficult, and again too much to ask from users who just wanted to install a simple piece of Python software and use it. Even those who pursue this path will be confused as to whether whatever dependencies they installed with easy_install or setup.py install --prefix are still usable with pip/virtualenv or if everything needs to be reinstalled from scratch.
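(For reference, the pip/virtualenv route they are being pointed at boils down to something like the following; the directory and package names are illustrative.)
$ virtualenv ~/mypackage-env
$ source ~/mypackage-env/bin/activate
$ pip install /path/to/mypackage   # or "pip install mypackage" if it is published on PyPI; dependencies come along automatically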
This problem is exacerbated if one or more of the packages in question depends on a different version of Python than the default one. The difficulty of ensuring that your Python package manager is using the Python version you want it to, and that the required dependencies are installed in the relevant Python 2.x directory and not Python 2.y, will be so endlessly frustrating to users that they will certainly give up at that stage.
Is there a simpler way to install Python software that doesn't require users to delve into all of these technical details of Python packages, paths and locations? For example, I am not a big Java user, but I do use some Java tools occasionally, and don't recall ever having to worry about X and Y dependencies of the Java software I was installing, and I have no clue how Java package managing works (and I'm happy that I don't -- I just wanted to use a tool that happened to be written in Java.) My recollection is that if you download a Jar, you just get it and it tends to work.
Is there an equivalent for Python? A way to distribute software in a way that doesn't depend on users having to chase down all these dependencies and versions? A way to perhaps compile all the relevant packages into something self-contained that can just be downloaded and used as a binary?
I would like to emphasize that this frustration happens even with the narrow goal of distributing a package to savvy Unix users, which makes the problem simpler by not worrying about cross platform issues, etc. I assume that the users are Unix savvy, and might even know Python, but just aren't aware (and don't want to be made aware) about the ins and outs of Python packaging and the myriad of internal complications/rivalries of different package managers. A disturbing feature of this issue is that it happens even when all of your Python package dependencies are well-known, well-written and well-maintained Pypi-available packages like Pandas, Scipy and Numpy. It's not like I was relying on some obscure dependencies that are not properly formed packages: rather, I was using the most mainstream packages that many might rely on.
Any help or advice on this will be greatly appreciated. I think Python is a great language with great libraries, but I find it virtually impossible to distribute the software I write in it (once it has dependencies) in a way that is easy for people to install locally and just run. I would like to clarify that the software I'm writing is not a Python library for programmatic use, but software that has executable scripts that users run as individual programs. Thanks.
We also develop software projects that depend on numpy, scipy and other PyPI packages. Hands down, the best tool currently available out there for managing remote installations is zc.buildout. It is very easy to use. You download a bootstrapping script from their website and distribute that with your package. You write a "local deployment" file, normally called buildout.cfg, that explains how to install the package locally. You ship both the bootstrap.py file and buildout.cfg with your package - we use the MANIFEST.in file in our Python packages to force the inclusion of these two files in the zip or tarballs distributed on PyPI. When the user unpacks it, they should execute two commands:
$ python bootstrap.py # this will download zc.buildout and setuptools
$ ./bin/buildout # this will build and locally install your package + deps
The package is compiled and all dependencies are installed locally, which means that the user installing your package doesn't even need root privileges, which is an added feature. The scripts are (normally) placed under ./bin, so the user can just execute them after that. zc.buildout uses setuptools for interaction with PyPI so everything you expect works out of the box.
You can extend zc.buildout quite easily if all that power is not enough - you create the so-called "recipes" that can help the user create extra configuration files, download other things from the net or instantiate custom programs. The zc.buildout website contains a video tutorial that explains in detail how to use buildout and how to extend it. Our project Bob makes extensive use of buildout for distributing packages for scientific usage. If you would like, please visit the following page that contains detailed instructions for our developers on how they can set up their Python packages so other people can build and install them locally using zc.buildout.
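To give a feel for it, a minimal buildout.cfg for this kind of setup might look roughly like this (the part and egg names are illustrative):
[buildout]
parts = scripts

[scripts]
recipe = zc.recipe.egg
eggs = mypackage
interpreter = python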
We're currently working to make it easier for users to get started installing Python software in a platform-independent manner (in particular see https://python-packaging-user-guide.readthedocs.org/en/latest/future.html and http://www.python.org/dev/peps/pep-0453/).
Right now, the problem of two competing versions of easy_install has been resolved, with the competing fork "distribute" being merged back into the setuptools main line of development.
The best currently available advice on cross-platform distribution and installation of Python software is captured here: https://packaging.python.org/

Python Code Completion

After using C# for a long time I finally decided to switch to Python.
The question I am facing at the moment has to do with auto-completion.
I guess I am spoiled by C#, and especially by ReSharper, and I was expecting something similar to exist for Python.
My editor of choice is emacs, and after doing some research I found autocomplete.pl, yasnippet and rope, although it is not clear to me whether and how they can be installed on a Cygwin-based system (which is what I use), since all the related documentation appears to be Linux-specific...
The version of emacs I currently use is 23.2.1, which bundles a python mode that, although useful, is far behind what my research turned up.
My question to Python users is: how common is autocompletion versus manual typing (using M-/ where possible)?
I am thinking about just memorizing Python built-in functions like len, append, extend etc. and reverting to something close to a pre-autocomplete editing mode. How different is such an approach from what other Pythonistas are doing?
I found this post
My Emacs Python environment
to be the most useful and comprehensive list of instructions and references on how to setup a decent Python development environment in Emacs regardless of OS platform. It is still a bit of work to setup but at least it covers the popular packages and components generally recommended for Python in Emacs that provide auto-completion functionality.
I loosely used this post as a guide to do the setup on my Windows machine with Emacs 23.2.1 and Python 2.6.5. Although I also have Cygwin installed, in some cases, instead of running the *nix shell commands mentioned in the post, I just downloaded the packages via a web browser, unzipped them with 7-Zip, and copied them to my Emacs plugin directory.
Also, to install Pymacs, Rope, and Ropemacs, I used Python's EasyInstall package manager. To use it, I downloaded and installed the setuptools package using the Windows install version. Once installed, at the command line, cd to their respective download locations and run the command
easy_install .
instead of the shell commands shown in the post.
Generally, I saved any *.el files in my ~\.emacs.d\plugins (e.g. in %USERPROFILE%\Application Data\.emacs.d\) and then updated my .emacs file to reference them as documented in the post.
Despite all this, on occasion, I've used DreamPie since it does have overall better auto-completion out of the box than my Emacs setup.
I'm spoiled by Intellisense too. The PyDev extensions for Eclipse offer a pretty good auto-complete substitute.
I find that PyDev + Eclipse can meet most of my needs. There is also PyCharm from the Intellij team. PyCharm has the added advantage of smooth integration with git.
I've been using PyScripter, an IDE for Windows, for a while now, and have found it very good. It has autocompletion among many other features. It's written in Delphi -- not that there's anything wrong with that -- it just bothers me a bit, though...
Take a look at Spyderlib; it supports most of these features, including code completion.
IMO, by far the easiest way to take advantage of the Python tools available for Emacs is to use the defaults that are all set up at:
https://github.com/gabrielelanaro/emacs-for-python
I actually took the time to get pymacs and ropemacs and python-mode all working independently before finding that little gem, and now I rely on it entirely for all my python based customizations. If you are new, I would definitely start there.

Deploying python app to Mac and Windows users

I've written an app in Python that depends on wxPython and some other Python libraries. I know about py2exe for making Python scripts executable on Windows, but what would be the easiest way to share this with my Mac-using friends who wouldn't know how to install the required dependencies? One option would be to bundle my dependencies in the same package, but that seems kind of clunky. How do people usually deploy such apps? For once I miss Java...
You could check out py2app, which is similar to py2exe
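As a rough sketch, a py2app setup.py looks something like the following (the script name is illustrative); running "python setup.py py2app" then produces a self-contained .app bundle:
from setuptools import setup

setup(
    app=["myapp.py"],                               # your wxPython entry point (illustrative)
    options={"py2app": {"argv_emulation": True}},   # common py2app option for GUI apps
    setup_requires=["py2app"],
)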
How do people usually deploy such apps?
Two choices:
With instructions.
All bundled up.
You write simple instructions like this; folks can follow these pretty reliably, unless they don't have enough privileges (sometimes they need to sudo in Linux environments):
Download easy_install (or pip)
easy_install this, easy_install that (or pip this, pip that)
easy_install whatever package you wrote.
It works really well. If you download some Python packages you'll see this in action.
Sphinx requires docutils. Django requires docutils and PIL. It works out really well to simply document the dependencies. Other folks seem to do it without serious problems. Follow their lead.
Bundling things up means you have to
(a) provide the entire original distribution (as required by most open source licenses)
(b) provide an open source license compatible with the licenses of the things you bundled. This can be easy if you depend on things that all have the same license. Otherwise, you basically can't redistribute them and have to resort to installation instructions.

How do I set up a Python development environment on Linux?

I'm a .NET developer who knows very little about Python, but want to give it a test drive for a small project I'm working on.
What tools and packages should I install on my machine? I'm looking for a common, somewhat comprehensive, development environment.
I'll likely run Ubuntu 9.10, but I'm flexible. If Windows is a better option, that's fine too.
Edit: To clarify, I'm not looking for the bare minimum to get a Python program to run. I wouldn't expect a newbie .NET dev to use notepad and a compiler. I'd recommend Visual Studio, NUnit, SQL Server, etc.
Your system already has Python on it. Use the text editor or IDE of your choice; I like vim.
I can't tell you what third-party modules you need without knowing what kind of development you will be doing. Use apt as much as you can to get the libraries.
To speak to your edit:
This isn't minimalistic, like handing a .NET newbie notepad and a compiler: a decent text editor and the stdlib are all you really need to start out. You will likely need third-party libraries to develop whatever kind of applications you are writing, but I cannot think of any third-party modules all Python programmers will really need or want.
Unlike the .NET/Windows programming world, there is no one set of dev tools that stands above all others. Different people use very different editors. In Python, a module namespace is fully within a single file and project organization is based on the filesystem, so people do not lean on their IDEs as hard. Different projects use different version control software, an area which has been booming with new faces recently. Most of these are better than TFS and all are 1000 times better than SourceSafe.
When I want an interactive session, I use the vanilla Python interpreter. Various fancier interpreters exist: bpython, ipython, IDLE. bpython is the least fancy of these and is supposed to be good about not doing weird stuff. ipython and IDLE can lead to strange bugs where code that works in them doesn't work in normal Python and vice versa; I've seen this first hand with IDLE.
For some of the tools you asked about, and some others:
In .NET you would use NUnit. In Python, use the stdlib unittest module. There are various third-party extensions and test runners, but unittest should suit you okay.
If you really want to look into something beyond this, get unittest2, a backport of the 2.7 version of unittest. It has incorporated all the best things from the third-party tools and is really neat.
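For a taste, a minimal unittest test case (the test class and assertion are illustrative) looks like this:
import unittest

class TestMath(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

if __name__ == "__main__":
    unittest.main()   # discovers and runs the tests in this module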
In .NET you would use SQL Server. In Python, you may use PostgreSQL, MySQL, sqlite, or some other database. Python specifies a unified API for databases and porting from one to another typically goes pretty smoothly. sqlite is in the stdlib.
There are various object-relational mappers (ORMs) to make using databases more abstract. SQLAlchemy is the most notable of these.
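And a quick, illustrative taste of the stdlib sqlite3 module mentioned above (the table and data are made up):
import sqlite3

conn = sqlite3.connect(":memory:")   # use a filename instead for a persistent database
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", 30))
for row in conn.execute("SELECT name, age FROM users"):
    print(row)
conn.close()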
If you are doing network programming, get Twisted.
If you are doing numerical math, get numpy and scipy.
If you are doing web development, choose a framework. There are about 200000: Pylons, zope, Django, CherryPy, werkzeug...I won't bother starting an argument by recommending one. Most of these will happily work with various servers with a quick setting.
If you want to do GUI development, there are quite a few Python bindings. The stdlib ships with Tk bindings I would not bother with. There are wx bindings (wxpython), GTK+ bindings (pygtk), and two sets of Qt bindings. If you want to do native Windows GUI development, get IronPython and do it in .NET. There are win32 bindings, but they'll make you want to pull your hair out trying to use them directly.
In order to reduce the chance of affecting/hosing the system install of Python, I typically install virtualenv on the Ubuntu Python install. I then create a virtualenv in my home directory so that the packages I subsequently install via pip or easy_install do not affect the system installation. And I add the bin from that virtualenv to my path via .bashrc
$ sudo apt-get install python-virtualenv
$ virtualenv --no-site-packages ~/local
$ PATH=~/local/bin:$PATH #<----- add this to .bashrc to make it permanent
$ easy_install virtualenv #<--- so that project environments are based off your local environment rather than the system, probably not necessary
Install your favorite editor, I like emacs + rope, but editors are a personal preference and there are plenty of choices.
When I start a new project/idea I create a new virtual environment for that project, so that I don't affect dependencies anywhere else; I would hate for one of my projects to break because of an upgrade to a library that both it and a newer project depend on.
~/projects $ virtualenv --no-site-packages my_new_project.env
~/projects/my_new_project.env $ source bin/activate
(my_new_project.env)~/projects/my_new_project.env $ easy_install paste ipython #whatever else I think I need
(my_new_project.env)~/projects/my_new_project.env $ emacs ./ & # start hacking
When creating a new package, use paster create in order to have something that will be easy_installable/pip-installable:
(my_new_project.env)~/projects/my_new_project.env$ paster create new_package
(my_new_project.env)~/projects/my_new_project.env/new_package$ python setup.py develop
That's the common stuff as far as I can think of it. Everything else would be editor/version control tool specific
Since I'm accustomed to Eclipse, I find Eclipse + PyDev convenient for Python. For quick computations, Idle is great.
I've used Python on Windows and on Ubuntu, and Linux is much cleaner.
If you launch a terminal and type python you'll get an interpreter, where you can start trying stuff.
Just in case you haven't seen it, check out the book Dive Into Python, is free on-line.
http://www.diveintopython.org/
Follow the examples in the book using the interpreter.
For storing your work you could use any editor; Vim or EMACS could be the most powerful, but also the most difficult to learn at first. If you want a more "traditional" IDE, you could try WingIDE.
http://www.wingware.com/
After you start to get more comfortable with python you should try an enhanced interpreter; try ipython.
http://ipython.scipy.org/moin/
When you start to develop a more serious project you'll need to get additional modules. Here you have two options: 1) use your distribution's tools to install additional modules, or 2) download the modules you need directly from their sites and install them manually. You'll be responsible for upgrading them, of course.
You'll have to decide for yourself which way to go. Personally I prefer to download and install additional modules manually.
Python (duh), setuptools or pip, virtualenv, and an editor. I suggest geany, but that's just me. And of course, any other Python modules you'll need.
Getting to Python from the .NET world
Jumping into the Linux world from a .NET/Windows background can be a bit disconcerting (but I do encourage you to keep trying Linux).
But I would suggest to anyone coming from Windows to stick with Windows for a little while. Go to www.activestate.com and download their Python package - it includes the full win32com extensions by Mark Hammond and also a complete, fast IDE, "PythonWin".
I have done real professional development with just this setup alone on a Windows box - one 14 MB .msi and off you go!
Now, to use Python on the DLR (Dynamic Language Runtime) you need to download IronPython. This is a separate interpreter, originally written by Jim Hugunin at Microsoft, and it lives at ironpython.org.
With this you can run code like the following (from Wikipedia):
import clr
clr.AddReference("System.Windows.Forms")
from System.Windows.Forms import MessageBox
MessageBox.Show("Hello World")
Now you can access any .NET code from python.
If you're just starting out with Python, I'd actually argue against bringing in the complexity of virtualenv (which I think can be pretty overwhelming), at least until you've got a firm grasp of Python basics (especially regarding library/dependency management).
If you're using Ubuntu and the Gnome desktop environment, gedit is the default (gui) text editor, and has great support for Python built in. So my recommendation is to start with the pre-installed Python and gedit (which is pretty extensible on its own).
You don't need much. Python comes with "Batteries Included."
Visual Studio == IDLE. You already have it. If you want a more IDE-like environment, install Komodo Edit.
NUnit == unittest. You already have it in the standard library.
SQL Server == sqlite. You already have it in the standard library.
Stop wasting time getting everything ready. It's already there in the basic Python installation.
Get to work.
Linux, BTW, is primarily a development environment. It was designed and built by developers for developers. Windows is an end-user environment which has to be supplemented for development.
Linux was originally focused on developers. All the tools you need are either already there or are part of simple yum or RPM installs.
You might like to give the NetBeans Python IDE a shot. You can use it on either Windows or Linux.
Database: sqlite (inbuilt). You might want SQLAlchemy though.
GUI: Tkinter is inbuilt, but wxPython or PyQt are recommended.
IDE: I use IDLE (inbuilt) on Windows, TextMate on Mac, but you might like PyDev. I've also heard good things about ulipad.
Numerics: numpy.
Fast inline code: lots of options. I like boost weave (part of scipy), but you could look into ctypes (to use dlls), Cython, etc.
Web server: too many options. Django (plus Apache) is the biggest.
Unit testing: inbuilt.
Pyparsing, just because.
BeautifulSoup (or another good HTML parser).
hg, git, or some other nice VC.
Trac, or another bug system.
Oh, and StackOverflow if you have any questions.
PyCharm Community is worth a try.

Are there any other good alternatives to zc.buildout and/or virtualenv for installing non-python dependencies?

I am a member of a team that is about to launch a beta of a python (Django specifically) based web site and accompanying suite of backend tools. The team itself has doubled in size from 2 to 4 over the past few weeks and we expect continued growth for the next couple of months at least. One issue that has started to plague us is getting everyone up to speed in terms of getting their development environment configured and having all the right eggs installed, etc.
I'm looking for ways to simplify this process and make it less error prone. Both zc.buildout and virtualenv look like they would be good tools for addressing this problem but both seem to concentrate primarily on the python-specific issues. We have a couple of small subprojects in other languages (Java and Ruby specifically) as well as numerous python extensions that have to be compiled natively (lxml, MySQL drivers, etc). In fact, one of the biggest thorns in our side has been getting some of these extensions compiled against appropriate versions of the shared libraries so as to avoid segfaults, malloc errors and all sorts of similar issues. It doesn't help that out of 4 people we have 4 different development environments -- 1 leopard on ppc, 1 leopard on intel, 1 ubuntu and 1 windows.
Ultimately what would be ideal would be something that works roughly like this, from the dos/unix prompt:
$ git clone [repository url]
...
$ python setup-env.py
...
that then does what zc.buildout/virtualenv does (copy/symlink the Python interpreter, provide a clean space to install eggs), then installs all required eggs, including any native shared-library dependencies, and installs the Ruby project, the Java project, etc.
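(Purely as a sketch of what the Python side of such a hypothetical setup-env.py could do, assuming a copy of virtualenv and a requirements file ship with the repository -- all file names here are made up:)
import subprocess, sys

def run(cmd):
    print(" ".join(cmd))
    subprocess.check_call(cmd)

# create an isolated environment using a virtualenv copy shipped in the repo (hypothetical layout)
run([sys.executable, "tools/virtualenv.py", "env"])
# install the Python dependencies listed in a (hypothetical) requirements file
run(["env/bin/pip", "install", "-r", "requirements.txt"])
# building native shared libraries and the Java and Ruby subprojects would have to be scripted here as well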
Obviously this would be useful for both getting development environments up as well as deploying on staging/production servers.
Ideally I would like for the tool that accomplishes this to be written in/extensible via python, since that is (and always will be) the lingua franca of our team, but I am open to solutions in other languages.
So, my question then is: does anyone have any suggestions for better alternatives or any experiences they can share using one of these solutions to handle larger/broader install bases?
Setuptools may be capable of more of what you're looking for than you realize -- if you need a custom version of lxml to work correctly on MacOS X, for instance, you can put a URL to an appropriate egg inside your setup.py and have setuptools download and install that inside your developers' environments as necessary; it also can be told to download and install a specific version of a dependency from revision control.
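For example, something along these lines in setup.py (the URL and version are illustrative) tells setuptools where to fetch a specific egg for a dependency:
from setuptools import setup

setup(
    name="ourproject",
    version="0.1",
    install_requires=["lxml==2.2.8"],   # pinned version, illustrative
    dependency_links=[
        "http://example.com/eggs/lxml-2.2.8-py2.6-macosx-10.6.egg",  # illustrative URL to a prebuilt egg
    ],
)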
That said, I'd lean towards using a scriptably generated virtual environment. It's pretty straightforward to build a kickstart file which installs whichever packages you depend on and then boot virtual machines (or production hardware!) against it, with puppet or similar software doing other administration (adding users, setting up services [where does your database come from?], etc). This comes in particularly handy when your production environment includes multiple machines -- just script the generation of multiple VMs within their handy little sandboxed subnet (I use libvirt+kvm for this; while kvm isn't available on all the platforms you have developers working on, qemu certainly is, or you can do as I do and have a small number of beefy VM hosts shared by multiple developers).
This gets you out of the headaches of supporting N platforms -- you only have a single virtual platform to support -- and means that your deployment process, as defined by the kickstart file and puppet code used for setup, is source-controlled and run through your QA and review processes just like everything else.
I always create a develop.py file at the top level of the project, and have also a packages directory with all of the .tar.gz files from PyPI that I want to install, and also included an unpacked copy of virtualenv that is ready to run right from that file. All of this goes into version control. Every developer can simply check out the trunk, run develop.py, and a few moments later will have a virtual environment ready to use that includes all of our dependencies at exactly the versions the other developers are using. And it works even if PyPI is down, which is very helpful at this point in that service's history.
Basically, you're looking for a cross-platform software/package installer (along the lines of apt-get/yum/etc.). I'm not sure something like that exists.
An alternative might be specifying the list of packages that need to be installed via the OS-specific package management system such as Fink or DarwinPorts for Mac OS X and having a script that sets up the build environment for the in-house code?
I have continued to research this issue since I posted the question. It looks like there are some attempts to address some of the needs I outlined, e.g. Minitage and Puppet which take different approaches but both may accomplish what I want -- although Minitage does not explicitly state that it supports Windows. Lacking any better options I will try to make either one of these or just extensive customized use of zc.buildout work for our needs, but I still feel like there must be better options out there.
You might consider creating virtual machine appliances with whatever production OS you are running, and all of the software dependencies pre-built. Code can be edited either remotely, or with a shared folder. It worked pretty well for me in a past life that had a fairly complicated development environment.
Puppet doesn't (easily) support the Win32 world either. If you're looking for a deployment mechanism and not just a "dev setup" tool, you might consider looking into ControlTier (http://open.controltier.com/) which has a open-source cross-platform solution.
Beyond that you're looking at "enterprise" software such as BladeLogic or OpsWare and typically an outrageous pricetag for the functionality offered (my opinion, obviously).
A lot of folks have been aggressively using a combination of Puppet and Capistrano (even non-rails developers) for deployment automation tools to pretty good effect. Downside, again, is that it's expecting a somewhat homogeneous environment.
