I have a repository (on GitHub) consisting of a number of modules that can be added to the main project as plugins. I want to set up the repository such that an automatic PyPI deployment is triggered (only for the changed module) every time a pull request is accepted.
Is there any way to achieve this?
Travis CI supports automatic PyPI deployments, but only for the entire repository. I need it for just one folder (a module) inside the repo.
You can use the after_success: option to implement custom deployments on Travis CI.
Something like:
after_success:
  - cd $subfolder && python setup.py sdist upload -r pypi
You will have to provide your PyPI credentials yourself, using whichever method you find best.
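For example, a sketch that reads the credentials from (encrypted) Travis environment variables and uploads with twine instead of the older upload command; PYPI_USER, PYPI_PASS, and plugins/my_module are placeholder names, not anything from your repo:

after_success:
  # PYPI_USER / PYPI_PASS are defined as encrypted env vars in the Travis settings
  - pip install twine
  - cd plugins/my_module && python setup.py sdist && twine upload -u "$PYPI_USER" -p "$PYPI_PASS" dist/*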
I'm using the command func azure functionapp publish to publish my Python function app to Azure. As best I can tell, the command only packages up the source code and transfers it to Azure; then, on a remote machine in Azure, the function app is "built" and deployed. The build phase includes collecting dependencies from PyPI. Is there a way to override where it looks for these dependencies? I'd like to point it to my own PyPI server, or alternatively provide the wheels locally in my source tree and have it use those. I have a few questions/problems:
Are my assumptions correct?
Assuming they are, is this possible, and how?
I've tried a few things: read some docs, looked at the various --help options in the CLI tool, and set up a pip.conf file that I've verified works for local pip usage. I then deliberately "broke" it to see whether the publish would fail. It did not, which leads me to believe it either ignores pip.conf, or the build (and collection of dependencies) happens on the remote end. I'm at a loss, and any tips, pointers, or answers are appreciated!
You can add an additional pip index that points to your own PyPI server. Check https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python#remote-build-with-extra-index-url
Remote build with extra index URL:
When your packages are available from an accessible custom package index, use a remote build. Before publishing, make sure to create an app setting named PIP_EXTRA_INDEX_URL. The value for this setting is the URL of your custom package index. Using this setting tells the remote build to run pip install using the --extra-index-url option. To learn more, see the Python pip install documentation.
You can also use basic authentication credentials with your extra package index URLs. To learn more, see Basic authentication credentials in Python documentation.
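For example, a minimal sketch using the Azure CLI; my-func-app, my-rg, and the index URL are placeholders you would replace with your own values:

# create the app setting, then publish so the remote build picks it up
az functionapp config appsettings set \
  --name my-func-app \
  --resource-group my-rg \
  --settings PIP_EXTRA_INDEX_URL=https://pypi.example.com/simple
func azure functionapp publish my-func-app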
Referencing local packages is also possible. Check https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python#install-local-packages
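The local-package route amounts to vendoring the dependencies into the project and skipping the remote build. A rough sketch based on my reading of that doc (the target path and the app name may need adjusting for your setup):

# install dependencies into the project tree, then publish without a remote build
pip install --target=".python_packages/lib/site-packages" -r requirements.txt
func azure functionapp publish my-func-app --no-build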
I hope both of your questions are answered now.
I would like to use my own package, located in a private repository.
Currently, I use a requirements.txt file to install dependencies in my Python App Engine app.
However, I don't understand where I have to add my private dependency.
Thanks
You have to add the private package to requirements.txt using the git+ssh protocol. More detailed instructions can be found here: Is it possible to use pip to install a package from a private github repository?
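For example, a requirements.txt entry might look like this (the repository path, tag, and egg name are made up for illustration):

git+ssh://git@github.com/myorg/my-private-package.git@v1.2.0#egg=my_private_package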
However, I am not sure how GAE handles the private keys required to access the package. If GAE simply copies the lib directory from the App Engine project repository on deployment, without any other magical operations, it should work. If App Engine instead only uses the lib dependencies to collect the package names and then does something funny by itself, you would be out of luck, unless there is a way to give App Engine credentials for the deployment.
I am unable to test this now, but will update as soon as I have.
I'm trying to install private python-based git repos from a requirements.txt into a docker container such that they are easily editable during development.
For example, I have a Django project which contains a Dockerfile that allows building that project inside of a docker container. (It might look something like this https://github.com/JoeJasinski/docker-django-demo/blob/master/Dockerfile).
Now, say that project has a requirements.txt file that pulls in code from private repos as follows.
django==1.11.2
-e git+git@github.com:myorg/my-private-project.git#egg=my_private_project
-e git+ssh://git@git.example.com/second-private-project@mytag#egg=second_private_project
-e git+https://github.com/myorg/third-private-project#egg=third_private_project
Ideally, I'd make it so I can edit both my main project, and the dependent repos without having to re-build the docker container each time. The Dockerfile "ADD . dest/" command makes it possible for the main project to be edited in place, but I'm having difficulty finding a good solution for installing these private repositories.
Normally (outside of Docker), the pip -e flag makes repos editable in place, which is great since I can edit and commit to them like any other repo.
However, inside of Docker, the container doesn't have access to the ssh private key needed to download the private repos (and this is probably a good thing, so we don't build the key into the docker images).
One thought I had is to download the private repos outside of the container, prior to building. Then somehow those repos would be "ADD"ed to the Docker container at build time and then individually added to the PYTHONPATH (maybe during runtime?). However, I feel like I'm over-complicating the situation.
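Concretely, the approach I have in mind would look roughly like this (repo name and paths are made up):

# On the host, before docker build:
#   git clone git@github.com:myorg/my-private-project.git deps/my_private_project
# Then in the Dockerfile:
ADD deps/my_private_project /srv/deps/my_private_project
ENV PYTHONPATH=/srv/deps/my_private_project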
Any suggestions as to a good, simple (Pythonic) way to install private python-based git repositories into a container so that it's easy to develop on both the main project and dependent repositories?
Here is the scenario I am dealing with:
I want to (and have) set up a CircleCI build for my project, with unit tests etc.
In this project I use another one of my libraries, which needs to be installed on the build container in CircleCI; otherwise my tests fail.
I need to find a way to either:
pull the Git repository of the external dependency and install it
Or download it as a zip
Or some other way?
Happy to add more explanation if needed.
From the section Using Resources External to Your Repository:
CircleCI supports git submodule, and has advanced SSH key management to let you access multiple repositories from a single test suite. From your project's Project Settings > Checkout SSH keys page, you can add a "user key" with one click, allowing you to access code from multiple repositories in your test suite. Git submodules can be easily set up in your circle.yml file (see example 1).
CircleCI’s VMs are connected to the internet. You can download dependencies directly while setting up your project, using curl or wget.
(Or just using git clone without submodules.)
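For example, a minimal circle.yml sketch (classic CircleCI 1.0 syntax; the repository URL is a placeholder), assuming a user key is configured so the clone can authenticate:

checkout:
  post:
    # pull in submodules, if the dependency is tracked that way
    - git submodule sync
    - git submodule update --init
dependencies:
  pre:
    # or install the library straight from its private repo
    - pip install git+ssh://git@github.com/myorg/my-library.git#egg=my_library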
I'm deploying an app to AWS Elastic Beanstalk using the API:
https://elasticbeanstalk.us-east-1.amazon.com/?ApplicationName=SampleApp
&SourceBundle.S3Bucket=amazonaws.com
&SourceBundle.S3Key=sample.war
...
My impression from reading around a bit is that Java deployments use .war files, that .zips are supported (docs), and that one can use .git (but only with PHP, or by using eb? doc).
Can I use the API to create an application version from a .git for a Python app? Or are zips the only type supported?
(Alternatively, can I git push to AWS without using the commandline tools?)
There are two ways to deploy to AWS Elastic Beanstalk:
The API backend, where it is basically a .zip file referenced from S3. When deploying, the instance will unpack it and run some custom scripts (which you can override from your AMI, or via Custom Configuration Files, which are the recommended way). Note that in order to create and deploy a new version in an AWS Elastic Beanstalk environment, you need three calls: upload to S3, CreateApplicationVersion, and UpdateEnvironment (see the CLI sketch below).
The git endpoint, which works like this:
You install the AWS Elastic Beanstalk DevTools, and run a setup script on your git repo
When run, the setup script patches your .git/config to support git aws.push and, in particular, git aws.remote (which is not documented)
git aws.push simply takes your keys, builds a custom URL (git aws.remote), and does a git push -f master
Once AWS receives this (the URL is basically <api>/<app>/<commitid>(/<envname>)), it creates the S3 .zip file (from the commit contents), then the application version on <app> for <commitid>, and, if <envname> is present, it also issues an UpdateEnvironment call. Your AWS IDs are hashed and embedded into the URL just like in all AWS calls, but sent as username / password auth tokens.
(full reference docs)
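For reference, the same three calls can be expressed with the AWS CLI roughly like this; the bucket, application, environment, and version label are placeholders:

# 1. upload the bundle to S3
aws s3 cp app-v42.zip s3://my-deploy-bucket/app-v42.zip
# 2. register it as an application version
aws elasticbeanstalk create-application-version \
  --application-name SampleApp --version-label v42 \
  --source-bundle S3Bucket=my-deploy-bucket,S3Key=app-v42.zip
# 3. point the environment at the new version
aws elasticbeanstalk update-environment \
  --environment-name SampleApp-env --version-label v42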
I ported that to a Maven plugin a few months ago, and this file shows how it is done in plain Java. It actually covers a lot of code (since it builds a custom git repo using jgit, calculates the hashes, and pushes into it).
I'm strongly considering backporting it as an Ant task, or perhaps simply making it work without a pom.xml file present, so users only need Maven to do the deployment.
Historically, only the first method was supported, while the second has grown in importance. Since the second is actually far easier (with beanstalk-maven-plugin you have to call three different methods, while a simple git push does all three), we're supporting git-based deployments, and have even published an archetype for it (you can see a sample project here; note the README.md in particular).
(BTW, if you're using .war files, my Elastic Beanstalk plugin supports both ways, and we actually favor git, since it allows us to do incremental deployments.)
So, do you still want to implement it yourself?
There are three files I suggest you read:
FastDeployMojo.java is the main façade
RequestSigner does the real magic
This is a testcase for RequestSigner
Want to do it in:
Python? I'd look for Dulwich
C#? The PowerShell version is based on it
Ruby? The Linux version is based on it
Java? Use mine, it uses jgit