Is there a pip / easy_install for Scala? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 9 years ago.
I want to organize my Scala packages and love how Python solves this issue with pip.
Can you recommend a similar tool for the management of Scala packages?
EDIT:
I am looking for an easy way to install new packages with all their dependencies, like
>>> pip install <a_package> # installs a_package with all dependencies.

The closest equivalent is probably sbt, the Scala Build Tool, and specifically its Library Dependencies feature. Scala is built on Java, so the Java ecosystem's many libraries and build tools are available to you. You can leverage tools like:
Maven
Gradle
Scala Build Tool
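For a taste of sbt's dependency handling (a minimal sketch; the library and version below are merely illustrative), a single line in build.sbt pulls a library and its transitive dependencies from Maven Central, much like pip install <a_package>:
// build.sbt
// sbt fetches the jar plus everything it depends on at the next compile/run
libraryDependencies += "org.scala-lang.modules" %% "scala-parser-combinators" % "2.3.0"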
Further, because everything runs inside a virtual machine, there is no "system"-level install. You can start with your CLASSPATH and, for more, investigate class loading. For example, a self-contained Scala shell script can assemble its own classpath:
#!/bin/sh
# From http://www.scalaclass.com/node/10 - build the CLASSPATH from ../lib/*.jar
L=`dirname $0`/../lib
cp=`echo $L/*.jar|sed 's/ /:/g'`
exec scala -classpath "$cp" "$0" "$@"
!#
// com.my.Goodness stands in for a class shipped in one of the jars under ../lib
import com.my.Goodness
val goodness = new Goodness
goodness.hello

Pythonistas traditionally install system-wide packages which are then used by all of their Python projects. This leads to a bunch of problems which virtualenv tries to solve. Scala folks, and Java people in general, have a per-project definition written for a dependency management tool: mvn (XML), sbt (Scala), gradle (Groovy), etc.
Most of these tools have a system-wide cache (e.g. ~/.ivy2 or ~/.m2), so a given version of a dependency is usually downloaded only once and then kept in a well-known place on your disk. When you run or assemble your Java or Scala program, the tool constructs the so-called CLASSPATH variable, which consists of paths to the required libraries. The CLASSPATH variable (the analogue of PYTHONPATH in the Python world) is then used by the runtime environment to look up the required parts. Note that the CLASSPATH varies a lot from project to project, whereas PYTHONPATH is fairly constant. There may well be tools that do the very same job pip does, but that isn't the accepted way in the JVM world.
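For illustration only (a rough sketch: exact cache layouts vary by tool and version, and com.example/foo is a made-up library), a JVM launch with an explicitly constructed classpath might look like:
$ java -cp "$HOME/.ivy2/cache/com.example/foo/jars/foo-1.0.jar:target/classes" com.example.Main
In practice the build tool assembles this string for you; you almost never write it by hand.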

Related

Building a Python package to publish on PyPI

I am greatly confused by the process of building a Python package that I want to distribute on PyPI.
There are some specific, basic things that I did not understand:
What exactly is it that gets published? Binaries? Source code? How do I do one or the other?
How do I build multiple platform-specific, OS-specific builds from the same codebase?
How do I build the package for multiple versions of Python from the same codebase? Is it necessary if I want to support many Python versions?
I am using a .toml file for the setup configuration.
I found some answers, but they all refer to procedures using either a setup.py or a setup.cfg.
What exactly is it that gets published? Binaries? Source code?
Yes, and yes. It depends on the details of your project and your package config. Arbitrary commands can be run during a package build. You might, for example, run a Fortran compiler locally and ship binaries, or you might insist that each person installing the package run their own local Fortran compiler. We usually expect the full *.py source code to appear on pypi.org. "Binaries" here usually means compiled machine code, not *.pyc bytecode files.
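As a concrete sketch (assuming a pyproject.toml-configured project and the PyPA build frontend), producing and publishing both kinds of artifact looks like:
$ pip install build twine
$ python -m build            # writes an sdist (.tar.gz) and a wheel (.whl) into dist/
$ twine upload dist/*
The sdist is source code; the wheel may additionally contain compiled, platform-specific binaries.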
How do I build multiple platform-specific, OS-specific builds from the same codebase?
I have only done this via git pull on a target platform, followed by a local build, but there is certainly support for cross-target toolchains if you need that.
How do I build the package for multiple versions of Python from the same codebase?
Same as above -- do a build under each separate target version.
Is it necessary if I want to support many Python versions?
Typically the answer is "no". Pick a minimum required interpreter version, such as 3.7, and do all your development, testing, and release work on that. Backward compatibility of interpreters is excellent; folks running 3.8 or 3.11 should have no trouble with your package.
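A minimal sketch of declaring that minimum version, assuming the .toml file the question mentions is a standard pyproject.toml (the project name here is hypothetical):
# pyproject.toml (fragment)
[project]
name = "my-package"
requires-python = ">=3.7"
pip records and checks this marker and will refuse to install the package on older interpreters.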
There can be a fly in the ointment. Suppose your project depends on library X, or depends on X which depends on Y, and one of them stopped being updated a few years ago, or went through a big change like a rename. Your users who are on 3.11 might find it inconvenient to obtain a compatible version of X or Y. This might motivate you to do split releases, for example via major version number or by slightly altering your project name. Clearly you haven't crossed that bridge quite yet.
The poetry ecosystem is reasonably mature. It has tried to fix many of the rough edges surrounding the Python packaging practices of the last few decades. I recommend that you prefer modern over ancient practices, and that you adopt poetry for your project.
If that won't fly for some reason, and especially if binaries are a big deal for your project, consider publishing via conda. There are many pip pitfalls around target systems needing compilers and libraries; conda does an excellent job of ensuring that conda install ... will Just Work.

What is best practice for working on a Python library package? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
Basically we have a Python library with modules and functions that we use in many of our programs. Currently, we check out the SVN repository directly into C:\Python27\Lib so that the library is on the Python path. When someone makes modifications to the library, everyone updates to get them.
Some of our programs are frozen (using cx_Freeze) and delivered, so we have to keep track of the library version used in the deliveries, but cx_Freeze automatically packages the modules imported in the code.
I don't think it is a good idea to rely on people to verify that they have no uncommitted local changes in the library or that they are up to date before freezing any program importing it.
The only version tracking we have is the commit number of the library repository, which is not linked anywhere to the program delivery version, and which should not be used as a delivery version of the library in my opinion.
I was thinking about using a setup.py to build a distribution of a specific version of that library, and then indicating that version in a requirements.txt file in the project folder of the program importing it. But it becomes complicated if we want to make modifications to the library, because we would have to build and install a distribution each time we want to test it. It is not that complicated, but I think someone will eventually freeze a program with a test version of the library, and then we are back where we started...
I kept looking for a best practice for that specific case but I found nothing, any ideas?
Ultimately you're going to have to trust your users to follow whatever development process you establish. You can create tools to make that easier, but you'll always end up extending some trust.
Things that have been helpful to a number of people include:
All frozen/shipped builds of an executable are built on a central machine by something like BuildBot or Jenkins, not by individual developers. That gives you a central point for making sure that builds are shipped from clean checkouts.
Provide scripts that do the build and error out if there are uncommitted changes (see the sketch after this list).
Where possible, make it so that pointing PYTHONPATH at your distribution's source tree just works, even though there is a setup.py that can build the distribution. That makes testing easier. As always, make sure that your tools for building shipped versions check for this and fail if it happens.
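As a sketch of the uncommitted-changes check (assuming Subversion and a setup.py-based build, as in the question):
#!/bin/sh
# Refuse to build a shippable artifact from a dirty working copy.
if [ -n "$(svn status -q)" ]; then
    echo "error: uncommitted local changes; commit or revert first" >&2
    exit 1
fi
python setup.py sdist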
I don't personally think that a distribution has a lot of value over a clean tagged subversion checkout for a library included in closed-source applications.
You can take either approach, but I think you will find that the key is in having good automation for whichever approach you choose, not in the answer to distribution vs. subversion checkout.

How do I install streamparse from source? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 6 years ago.
I need to use streamparse on a CentOS machine that does not have internet access, meaning I cannot use pip. The only net-enabled services I can use are scp and ssh. My plan is to get streamparse on my local machine (Ubuntu) and then scp the streamparse files to the CentOS machine and manually install from there.
Any ideas on how to do this?
edit:
Since this is "on hold as off-topic," I'll explain why it just might be considered "on topic" by addressing the 4 "on topic" bullet points from the community help page (https://stackoverflow.com/help/on-topic).
a specific programming problem: installation is a kind of programming problem, especially when you have to write (program, verb) shell scripts (program, noun) to accomplish the installation of software that leads to more programming.
a software algorithm: I am looking for a sequence of steps (aka an algorithm) to install something within specified technical constraints.
software tools commonly used by programmers: the thing I am trying to install is a software tool. It is called streamparse. It is used by programmers.
a practical, answerable problem that is unique to software development: I was not asking this question for theoretical reasons, hence it is practical, and I believe installing things by getting around firewalls is unique to software development. I'll concede that this could be viewed not as "software development" but rather as "devops", but those two things are merging, so throw me a bone here.
Once you have the lib files on the CentOS box you can use pip to install by passing the -e (editable) flag:
$ pip install -e path/to/SomeProject
Here's a link to pip's #editable-installs documentation section
Thanks, @dougdragon. I also got pointed to the solution below. I'll leave yours as the accepted answer since you got it first.
$ wget https://pypi.python.org/packages/8d/f8/9ccde77a90a30ef491bee431f157aee38dbd93b5f3c7545779a0acee71db/streamparse-3.0.1.tar.gz
$ tar -zxvf streamparse-3.0.1.tar.gz
$ cd streamparse-3.0.1
$ python setup.py develop
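Note that setup.py develop will still try to fetch streamparse's dependencies from the internet, so a useful variant (a sketch, assuming a reasonably recent pip on both machines; user@centos-host is a placeholder) is to download everything on the connected machine and point pip at the copied directory:
$ pip download streamparse -d ./pkgs     # on the Ubuntu machine
$ scp -r ./pkgs user@centos-host:
$ ssh user@centos-host
$ pip install --no-index --find-links=pkgs streamparse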

How to uninstall Python [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 9 years ago.
I am an amateur with a Mac, and I would like to reset all of my Python installations. I want to delete every version other than the one that ships with the OS, so that Python is as if I had just gotten a new OS.
Thanks!
Unfortunately, there is no one-size-fits all answer here, because there are a number of different ways to install Python, many of which do not come with uninstallers.
Everything Apple installs is inside /System, or inside /usr but not /usr/local. So, those are the areas not to touch, no matter what.
Apple's Pythons put their system-wide site packages in /Library/Python/X.Y. Some third-party Pythons may also use site packages there. If you've mixed and matched, there's no way to straighten that up except to wipe the whole thing. To restore these directories to a clean slate, each one should have nothing but a site-packages directory, in which there should be nothing but a README and an easy-install.pth and/or Extras.pth.
Some third-party packages that have binary installers meant to work with Apple's Python install things into /usr/local/lib/pythonX.Y/site-packages. Again, these are shared with other Python installations. If you want to restore to a clean slate, delete everything in any such directory.
If you've configured user-specific site packages, or virtual environments, you should know which ones go with which Python—and, if you don't, just scrap them entirely.
Apple's Pythons install or link any scripts/executables that come with any site packages into /usr/local/bin. Unfortunately, most third-party Pythons will do the same thing. And of course non-Python executables installed from elsewhere also end up here. The only way to really be safe is to only delete files here that:
Are symlinks to something in /Library/Frameworks/Python.framework or into a site-packages directory.
Are scripts whose shebang line points to a non-Apple Python (that is, one not in /System or /usr/bin).
Are executables that link to a non-Apple Python (visible with otool -L).
If you're also trying to kill site packages installed into Apple Python, symlinks, shebangs, and executable links that point to Apple Python can go too.
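A rough sketch of how you might list (not delete) the suspects just described; the otool -L check is omitted, and you should review the output by hand before removing anything:
#!/bin/sh
# List likely non-Apple Python entries in /usr/local/bin. Nothing is deleted.
for f in /usr/local/bin/*; do
    if [ -L "$f" ]; then
        # symlinks into a framework build or a site-packages directory
        readlink "$f" | grep -Eq 'Python\.framework|site-packages' && echo "symlink: $f"
    elif head -n 1 "$f" 2>/dev/null | grep -q python; then
        # scripts whose shebang points somewhere other than Apple's Python
        head -n 1 "$f" | grep -Evq '^#!/usr/bin|^#!/System' && echo "script:  $f"
    fi
done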
Anything installed with a package manager—Homebrew, MacPorts, Fink, etc.—should be uninstalled the same way: brew uninstall python, sudo port uninstall python3.2, etc.
Anything that has an uninstaller (either inside a Python X.Y, MacPython, or similar folder in Applications, or on the original disk image): obviously, run that.
Meanwhile, non-Apple standard framework builds—that is, any binary installer from python.org, or anything you build yourself according to simple instructions—will put files into the following places:
/Library/Frameworks/Python.framework/Versions/X.Y. This is the main guts. Kill it. In fact, kill the whole Python.framework to remove all versions at once.
/usr/local/bin, shared with Apple, as mentioned above.
/usr/local/lib. Anything named libpython* here is killable.
/Applications/[Mac]Python*.
Non-framework builds by default install everything into /usr/local; they will be effectively the same as framework builds, but with no /Library/Frameworks/Python.framework/Versions/X.Y, and instead a /usr/local/lib/pythonX.Y that contains things besides just site-packages. Kill those things.
For third-party installations like Enthought or Python(x,y), you will have to figure out what you have and find the uninstallation instructions on their site. There's really no way around that.

distutils, distutils2, pip and requirements [closed]

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 10 years ago.
I am diving into the world of packaging Python applications and managed to get into this state of confusion where my head starts to spin due to all the concepts and options I am supposed to deal with.
Question:
What do I need to accomplish? Deploy my Python project from source located on a git server. The deployment tool should get and install all the dependencies, most of which are available via pip, while one needs to be fetched and installed via git. The final result should be installable via pip, so I can do something like:
[~] git clone git://some/path/project.git
[~] pip install project/
Context:
Currently I am trying to get Distutils2 to do what I want, but it seems that a setup.py made using the 'generate-setup' command doesn't play along with pip.
I wanted to use Distutils2 since it is supposed to be the most future-proof of all. But the various documentation on all the tools is just horrible (accurate info mixed with outdated and inaccurate information), in a way that makes a guy question his sanity.
So what should I do? Stick to distutils and setup.py? Or do I need to take a look at something the likes of Buildout?
Could the kind answerer please lay out what I am supposed to do with each particular tool (e.g.: deploy your code using Distutils2, install dependencies using pip, for git dependencies write a script, and glue everything together doing XYZ)?
Edit: I am using Distutils2 1.0a4, which seems incompatible with the docs.
Edit2: Reformatted the question to make it clearer what my question is really about.
Edit3: This is my fourth attempt at cracking the packaging and distribution toolchain of Python. I am not trying to get other people to do my work for me; however, for a rookie it is pretty much impossible to work out what a particular tool is supposed to do, where it starts, and where another ends, especially because of the functional overlap between the tools. I am not located in Silicon Valley, encircled by sages who could initiate me into the secrets, and the publicly available documentation is of no use.
Final Edit:
Although I wasn't really thinking about replacing virtualenv with Buildout when starting this question, while doing my research I realized something I always knew, but which had never come to me in total clarity: there are many ways to go about Python packaging and deployment automation, and there are many tools which can help you get the stuff done. However, while there is significant functional overlap between the tools, the toolchain is ever evolving and thus far there is no clear "standard best practice". The distribution toolchain arms race is still in full heat and no clear victor has emerged yet. This may be confusing to us noobs, who expect that most of the shit in Python just works. What I was after (distutils/setuptools + pip + virtualenv in a Buildout fashion, or even semi-integrated with Buildout) certainly is doable, but it just doesn't make much sense; not because it's not possible, but because nobody does it. If you need this level of sophistication, then you need to commit. Personally I have decided to leave virtualenv behind (for this project) and embrace Buildout.
Take a look at buildout; together with a buildout plugin called mr.developer you can put together a deployment system that'll do all that you ask for.
There are plenty of examples and presentations on buildout configurations on the web, here are a few to get you started:
An introductory presentation on buildout: http://www.slideshare.net/djay/buildout-how-to-maintain-big-app-stacks-without-losing-your-mind
Includes a YouTube video of the presentation, so you can listen along.
Excellent blog post on how to use buildout to develop a Django application.
Includes details on how buildout and setup.py interplay.
The configuration for planet.plone.org: https://github.com/plone/planet.plone.org/blob/master/buildout.cfg
This builds a Venus RSS aggregator with configuration, style, apache config and cronjobs, pulling in eggs as needed.
The Plone core development buildout: https://github.com/plone/buildout.coredev
A complex buildout that pulls in all the sources needed to develop the Plone CMS; this is a complex beast, but it shows off what you can do with mr.developer.
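To make that concrete, here is a rough sketch of a buildout.cfg wired up with mr.developer (project and repository names are placeholders):
[buildout]
extensions = mr.developer
auto-checkout = somedependency
parts = app

[sources]
# mr.developer checks this out from git before the build runs
somedependency = git git://some/path/dependency.git

[app]
recipe = zc.recipe.egg
eggs =
    myproject
    somedependency
Running bin/buildout then fetches the git dependency, resolves the remaining eggs from PyPI, and wires everything together.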
It doesn't need to be difficult: install Jenkins and use pip's requirements.txt files to define the packages that your project needs. After that you can configure a build in Jenkins to perform various tasks, including installing the required packages. It can obtain the source code from your repository and install + build the whole project.
