In the Gradle/JVM world there is an excellent plugin for this: https://github.com/diffplug/spotless-changelog
Is there a similar publicly available solution that I am currently missing for building pip or conda packages?
This question: Automating Python package release process refers to using Poetry, but it is not nearly as fully fledged as spotless-changelog and seems to require some manual setup/scripts.
I know there is a Python package for validating the Keep a Changelog format, https://pypi.org/project/keepachangelog/, but I do not see any integration comparable to spotless-changelog.
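For reference, this is roughly the kind of manual chaining I have in mind, assuming the to_dict/release helpers that the keepachangelog PyPI page documents (file names and the printed values are just illustrative):

    # sketch only: relies on the to_dict()/release() helpers documented for the
    # keepachangelog package; CHANGELOG.md must follow the Keep a Changelog format
    import keepachangelog

    # parse the changelog into a dict keyed by version
    changes = keepachangelog.to_dict("CHANGELOG.md", show_unreleased=True)
    print(changes.get("unreleased", {}))

    # move the "Unreleased" section into a new release entry and get the new
    # version number back, which a release script could pass on to setup.py or a git tag
    new_version = keepachangelog.release("CHANGELOG.md")
    print("releasing", new_version)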
Are manual steps like that required to chain it all together? Or am I overlooking a great, already existing tool for such an automated process similar to spotless-changelog?
So far, https://zestreleaser.readthedocs.io/en/latest/ looks like the closest match.
Related
I'm trying to find a way to install a Python package together with its docs.
I have to use this on machines that have no internet connection, so online help is not an option for me. Similar questions already posted here say that this is not possible. Do you see any way to make this easier than what I'm currently doing (sketched in code after the list below):
downloading the source archive
extracting the docs folder
running Sphinx
opening the index file in a browser (Firefox et al.)
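For what it's worth, here is a rough sketch of those manual steps as a small script (the extracted directory name and docs path are placeholders; some projects use doc/ instead of docs/):

    # rough sketch of the manual steps above; the extracted source directory
    # and the docs path are placeholders
    import os
    import subprocess
    import webbrowser

    src_dir = "some_package-1.0"                      # extracted source archive
    docs_dir = os.path.join(src_dir, "docs")          # where conf.py lives
    build_dir = os.path.join(docs_dir, "_build", "html")

    # build the HTML docs with the sphinx-build command that ships with Sphinx
    subprocess.check_call(["sphinx-build", "-b", "html", docs_dir, build_dir])

    # open the generated index locally - no internet connection needed
    webbrowser.open(os.path.abspath(os.path.join(build_dir, "index.html")))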
Any ideas?
P.S. I'm very new to Python, so maybe I'm missing something... And I'm using Windows (virtual) machines...
Edit:
I'm talking about two possible ways to install a package:
installing the package via easy_install (or any other way unknown to me) on a machine while I'm online, and then copying the resulting changes in my installation to the target machine
downloading the source package (containing Sphinx-compatible docs) and installing the package on the target machine offline
But in either case I do not know of a way to install the package such that the supplied documentation is installed together with the module!
You might know that there is a folder for the docs: <python-folder>/Doc, which contains only python278.chm after installing Python 2.7.8 on Windows. So I would expect this folder to also contain the docs for a newly installed package. That would avoid looking at docs for a different package version on the internet, and would also sidestep my machine-specific setup problems.
Most packages I'm currently using ship with documentation generated with Sphinx, and their source packages contain all the files necessary to generate the docs offline.
So what I'm looking for is a CLI argument for a package installer, as is common for Unix/Linux package managers. I expected something like:
easy_install a_package --with-html-docs.
Here are some scenarios:
packages have documentation included within the zip/tar
packages have a separate -docs bundle to download/install
packages that have buildable documentation
packages that only have online documentation
packages with no documentation other than internal.
packages with no documentation anywhere.
The sneaky trick you can use for scenarios 1 & 3 is to download the package as a tar or zip and then run easy_install archive_name on the target machine; this will install the package from the zip or tar file including (I believe) any documentation. You will find that some packages have unmet dependencies; those should produce an error during the easy_install run mentioning what is missing, and you will need to fetch those and apply the same trick.
A couple of things are very handy here: virtualenv will let you run a library-free copy of Python so you can work out the real requirements, and pip install -d <dir> will download packages without installing them, storing them in dir.
You should be able to use the same trick for scenario 2.
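Roughly, that download-then-carry-across workflow could look like this (the directory and package name are placeholders, and the exact flags depend on your pip version - newer pips spell the first step pip download rather than pip install --download):

    # sketch only: the directory and package name are placeholders
    import subprocess

    pkg_dir = r"C:\pkgs"

    # on the machine with internet access: fetch the archives without installing
    subprocess.check_call(["pip", "install", "--download", pkg_dir, "some_package"])

    # after copying pkg_dir to the offline machine: install from the local
    # archives only, never touching the network
    subprocess.check_call(["pip", "install", "--no-index",
                           "--find-links", pkg_dir, "some_package"])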
For packages that only have online documentation (scenario 4), you could check whether a downloadable version exists, or scrape the web pages and use a tool like pandoc to convert them into something useful.
For scenario 5, I would suggest raising a ticket against the package stating that the lack of accessible documentation makes it virtually unusable, and running Sphinx on it yourself.
For scenario 6, I suggest raising the same ticket but leaving out "virtually", and avoiding that package on the basis that if it has no documentation it probably has a lot of other problems as well. If you are a package author feeling slandered reading this, then you should be feeling ashamed instead.
Mirror/Cache PyPI
Another possibility is to have a Linux box, or VM, initially outside your firewall, running a caching or mirroring service such as pypiserver. Install the required packages through it to populate the cache, then move it (or its cache, to another pip server) inside the firewall; you can then use pip with the documented settings to do all your installs inside the firewall. See also the answer here.
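Once such a mirror is reachable inside the firewall, pointing pip at it is just a matter of the index URL; a hypothetical sketch (host, port and package name are made up):

    # hypothetical host, port and package name; pypiserver-style mirrors expose
    # a /simple/ index that pip understands
    import subprocess

    internal_index = "http://pypi.internal.example.com:8080/simple/"
    subprocess.check_call([
        "pip", "install",
        "--index-url", internal_index,   # use the internal mirror instead of the public PyPI
        "some_package",
    ])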
My program requires specific versions of several Python packages. I don't want to require the user to install those specific versions themselves, so I feel the best solution is simply to install the packages within the source repository and distribute them along with my package.
What is the simplest way to do this?
(Please be detailed - I'm familiar with pip and easy_install, but they don't seem to do this, at least not by default).
Go for virtualenv. Life will be much easier. MUCH easier. Basically, it allows you to create isolated, project-specific Python environments as needed.
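A rough sketch of the workflow, with placeholder package names and versions (on Windows the scripts live under env\Scripts rather than env/bin):

    # rough sketch; package names/versions and the env name are placeholders
    import subprocess

    # create an isolated environment for this project
    subprocess.check_call(["virtualenv", "env"])

    # pin the exact versions the program needs in a requirements file...
    with open("requirements.txt", "w") as f:
        f.write("somepackage==1.2.3\nanotherpackage==0.9\n")

    # ...and install them into that environment instead of the system Python
    subprocess.check_call(["env/bin/pip", "install", "-r", "requirements.txt"])

Users then run your program with the environment's interpreter, so the pinned versions never conflict with whatever is installed system-wide.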
There are indeed two ways to get this done.
I usually use buildout (see a post by Jacob from Django: http://jacobian.org/writing/django-apps-with-buildout/) and have everything from Django upwards installed locally in the project's environment, with PyDev and Django support. It's very easy, since I have projects that use the latest versions of open source software and others that pin specific versions of the same packages.
Another alternative is, as Charlie says, virtualenv, which is designed to do just that. Many people recommend it; I've never used it myself, as I'm happy with buildout.
I need to build an internal python package index server, starting from scratch. In house right now we use Python 2.6 and 2.7, and prefer installing packages using pip.
We don't require authentication, but it would help.
What's the easiest way to get a PyPi equivalent running internally?
I am aware of the existence of mypypi, djangopypi, Plone Software Center, and EggBasket, but I have not been able to find a simple, clear set of steps to set one of these up in a modern environment; the most recent complete docs I found for any of them seemed to be Tarek Ziade's 2008 blog article, and it's not clear how up to date that is (and it seems to pull in a huge dependency tree, to boot).
tl;dr: what's the best PyPI implementation, and how do I install and configure it?
"Best" is a relative term, but I myself created ClueReleaseManager to deal with this need locally: http://pypi.python.org/pypi/ClueReleaseManager
Consider a website built using Python and Django. In many cases it uses third-party modules beyond the standard Python library, such as pytz, South, timezones, or debug toolbar.
What is the standard, or at least convenient, way to deploy such an application to production hosting with all the prerequisites (timezones, etc.) installed automatically?
I'm new to Python, and sorry if this question is lame.
There are at least two options available. Jacob Kaplan-Moss, one of the co-founders of Django, has written about packaging an application using buildout and djangorecipe. There is also the versatile Fabric. You should be able to tackle your problem using either of these alone or in combination with some custom scripts.
Fabric is definitely a nice way to accomplish this. There is a fairly extensive blog write-up on the process at http://www.caktusgroup.com/blog/2010/04/22/basic-django-deployment-with-virtualenv-fabric-pip-and-rsync/.
The key to fabric is "fabfile.py" - there's an example of one that does a deployment at http://bitbucket.org/copelco/caktus-deployment/src/tip/example-django-project/caktus_website/fabfile.py.
The variation of this that I've used to deploy to a Linode instance is http://gist.github.com/556508
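For a feel of what those examples boil down to, here is a minimal fabfile.py sketch in the Fabric 1.x style; the host, paths and individual commands are placeholders for your own setup:

    # minimal fabfile.py sketch in the Fabric 1.x style; host, paths and the
    # individual commands are placeholders for a real project's setup
    from fabric.api import cd, env, run

    env.hosts = ["myproject.example.com"]

    def deploy():
        with cd("/srv/myproject"):
            run("git pull")                                  # update the checkout
            run("env/bin/pip install -r requirements.txt")   # sync dependencies
            run("env/bin/python manage.py migrate")          # apply South migrations
            run("touch deploy/wsgi.py")                      # nudge the app server to reload

With Fabric installed locally you would run this as fab deploy.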
You can either use a deployment solution like Fabric (http://fabfile.org/) or you can try to package the entire thing up into a Python egg with dependencies that will be automatically installed when you easy_install it. See http://mxm-mad-science.blogspot.com/2008/02/python-eggs-simple-introduction.html for a simple introduction to Python eggs.
I am new at writing APIs in Python, or in any language for that matter. I was hoping to get pointers on how I can create an API that can be installed using the setup.py method and used in other Python projects, something similar to the twitterapi.
I have already created and coded all the methods I want to include in the API. I just need to know how to implement the installation so others can use my code to leverage ideas they may have, or whether I need to structure the code a certain way to facilitate installation.
I learn best with examples or tutorials.
Thanks so much.
It's worth noting that this part of Python is undergoing some changes right now. It's all a bit messy. The most current overview I know of is the Hitchhiker's Guide to Packaging: http://guide.python-distribute.org/
The "current state of packaging" section is important: http://guide.python-distribute.org/introduction.html#current-state-of-packaging
The Python packaging world is a mess (as poswald said). Here's a brief overview along with a bunch of pointers. Your basic problem (using setup.py etc.) is solved by reading the distutils guide that msw mentioned in his comment.
Now for the dirt. The basic distribution infrastructure in the Python standard library is distutils, referred to above. It's limited in some ways, so a series of extensions called setuptools was written on top of it. Besides actually increasing the functionality, setuptools provided a command-line "installer" called easy_install.
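To connect that back to your question: a minimal setuptools-based setup.py looks roughly like this (the project name, version and dependency below are placeholders):

    # minimal sketch of a setuptools-based setup.py; the project name, version
    # and dependency are placeholders
    from setuptools import setup, find_packages

    setup(
        name="myapi",                     # what people will easy_install / pip install
        version="0.1.0",
        packages=find_packages(),         # picks up every package under this directory
        install_requires=["requests"],    # third-party dependencies, pulled in automatically
    )

With that file in place, python setup.py sdist builds a distributable archive, and easy_install (or pip) can install it, pulling in the listed dependencies automatically.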
Setuptools maintenance was not too great, so it was forked and a more active branch called "distribute" was set up; that is the preferred alternative right now. In addition, a replacement for easy_install named pip was created, which is more modular and useful.
Now there's a huge project under way that attempts to fold all the changes from distribute and friends into a unified library that will go into the stdlib. It's tentatively called "distutils2".