Here is the scenario I am dealing with:
I have set up a CircleCI build for my project, with unit tests etc.
In this project I use another one of my libraries, which needs to be installed on the build container in CircleCI, otherwise my tests fail.
I need to find a way to either:
pull the git repository of the external dependency and install it
Or download it as a zip
Or some other way?
Happy to add more explanation if needed.
From the section Using Resources External to Your Repository:
CircleCI supports git submodule, and has advanced SSH key management to let you access multiple repositories from a single test suite. From your project’s Project Settings > Checkout SSH keys page, you can add a “user key” with one click, allowing you to access code from multiple repositories in your test suite. Git submodules can be easily set up in your circle.yml file (see example 1).
CircleCI’s VMs are connected to the internet. You can download dependencies directly while setting up your project, using curl or wget.
(Or just using git clone without submodules.)
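For example, a minimal circle.yml sketch combining both approaches (the submodule steps follow the documentation's example; the clone URL and the pip install step assume a hypothetical Python library):

checkout:
  post:
    - git submodule sync
    - git submodule update --init
dependencies:
  pre:
    # or, without submodules, fetch and install the library directly
    - git clone https://github.com/myorg/mylib.git ~/mylib
    - pip install ~/mylib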
Related
I'm trying to install private python-based git repos from a requirements.txt into a docker container such that they are easily editable during development.
For example, I have a Django project which contains a Dockerfile that allows building that project inside of a docker container. (It might look something like this https://github.com/JoeJasinski/docker-django-demo/blob/master/Dockerfile).
Now, say that project has a requirements.txt file that pulls in code from private repos as follows.
django==1.11.2
-e git+git@github.com:myorg/my-private-project.git#egg=my_private_project
-e git+ssh://git@git.example.com/second-private-project@mytag#egg=second_private_project
-e git+https://github.com/myorg/third-private-project#egg=third_private_project
Ideally, I'd make it so I can edit both my main project, and the dependent repos without having to re-build the docker container each time. The Dockerfile "ADD . dest/" command makes it possible for the main project to be edited in place, but I'm having difficulty finding a good solution for installing these private repositories.
Normally (outside of Docker), the pip -e flag makes repos editable in place, which is great since I can edit and commit to them like any other repo.
However, inside of Docker, the container doesn't have access to the ssh private key needed to download the private repos (and this is probably a good thing, so we don't build the key into the docker images).
One thought I had is to download the private repos outside of the container, prior to building. Then somehow those repos would be "ADD"ed to the Docker container at build time and then individually added to the PYTHONPATH (maybe during runtime?). However, I feel like I'm over-complicating the situation.
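To illustrate, what I have in mind would look something like this (repo name and paths are hypothetical):

# on the host, where the SSH key is available:
git clone git@github.com:myorg/my-private-project.git vendor/my-private-project

# Dockerfile fragment: copy the pre-cloned source in and install it editable
COPY vendor/ /app/vendor/
RUN pip install -e /app/vendor/my-private-project

# during development, bind-mount over the copy so edits don't require a rebuild:
docker run -v "$(pwd)/vendor:/app/vendor" myimage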
Any suggestions as to a good, simple (Pythonic) way to install private python-based git repositories into a container so that it's easy to develop on both the main project and dependent repositories?
I have a repository (on GitHub) consisting of a number of modules that can be added to the main project as plugins. I want to set up the repository such that an automatic PyPI deployment is triggered (only for the changed module) every time a pull request is accepted.
Is there any way to achieve this?
Travis-CI supports automatic PyPI deployments but for the entire repository. I need it only for a folder inside the repo (a module).
You can use the after_success: option to implement custom deployments on travis-ci.
Something like:
after_success:
  - "cd $subfolder && python setup.py sdist upload -r pypi"
You will have to provide your pypi credentials yourself using whichever method you find best.
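For example, one way to wire it up (the $PYPI_USER and $PYPI_PASS variables are placeholders you would store as encrypted Travis environment variables):

after_success:
  # write a minimal ~/.pypirc so `setup.py upload -r pypi` can authenticate
  - printf "[distutils]\nindex-servers = pypi\n\n[pypi]\nusername = $PYPI_USER\npassword = $PYPI_PASS\n" > ~/.pypirc
  - cd $subfolder && python setup.py sdist upload -r pypi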
I would like to be able to install python packages to openshift, but those packages live in my private repositories, on bitbucket.
How can I create a SSH key for Openshift, and how do I make Openshift use it when installing packages? (after adding the corresponding public key to bitbucket as a Deploy Key)
What I've tried:
I used ssh-keygen to create a key in ~/.openshift_ssh/. It was created, but I'm not sure it is being used.
I also tried adding the public key at <jenkins_dir>/app-root/data/.ssh/jenkins_id_rsa.pub, but the result is always the same. In the Jenkins console output of the build job:
Doing git clone from ssh://git@bitbucket.org/jpimentel/zed.git to /tmp/easy_install-FpEKam/zed.git
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
Is there anything that can be done?
So, at this time OpenShift does not offer a simple mechanism to do this. I would urge developers to look at https://gondor.io/support/dependencies/ for an effective solution to the problem.
That said, I was finally able to find an acceptable (at least, for me) workaround that works on both scalable and non-scalable apps, with the following procedure:
create a deploy/ directory in the repository
put a copy of your private deploy key in said directory
create a bash script deploy/wrapper.sh that will run ssh with the provided key:
#!/bin/sh
# forward all arguments to ssh, authenticating with the deploy key
ssh -o StrictHostKeyChecking=no -i $OPENSHIFT_REPO_DIR/deploy/id_deploy "$@"
note the option passed to disable host key check; cloning will fail without it.
install dependencies in the build hook (.openshift/action_hooks/build). In my case I added something like
echo "Cloning private repo..."
source $VIRTUAL_ENV/bin/activate
GIT_SSH=$OPENSHIFT_REPO_DIR/deploy/wrapper.sh pip install git+ssh://git@bitbucket.org/team/reponame.git#egg=reponame
commit everything and push it to openshift.
profit!
If you want to deploy your own custom Python modules, the recommended way is to create a libs directory in the application source code root and push the modules to your application's git repository. OpenShift will pick them up automatically.
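For instance, a layout along these lines (module name hypothetical) lets the app import mymodule directly, since the Python cartridge picks up libs/:

myapp/
├── libs/
│   └── mymodule/
│       └── __init__.py
├── setup.py
└── wsgi/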
I installed OpenStack through DevStack because I had to modify some files.
When I install DevStack, I have all the files under /opt/stack. There I have service folders (glance, keystone...) and library folders (python-glanceclient, python-keystoneclient).
If I modify those files, how can I replicate the modifications on an already-deployed OpenStack? Installing OpenStack without DevStack produces a different folder structure.
I mean, where are the python-'service'client folders in a fresh OpenStack installation?
Thank you
DevStack pulls its OpenStack software stack from GitHub. The git repos it installs from are located in /opt/stack.
What you may want to do is fork OpenStack, as well as the repos of the OpenStack projects you wish to modify, then make your DevStack deployment install not from the official OpenStack repos but from your own forks.
You can do this by modifying the stack.sh script (I believe). devstack.org has a line-by-line explanation of the entire script on their site, which can point you in the right direction:
http://devstack.org/stack.sh.html
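As a possibly simpler alternative to editing stack.sh itself, DevStack defines its repo locations as shell variables (in stackrc), which you should be able to override from your localrc; the exact variable names depend on your DevStack version:

# localrc: point one project at your fork instead of the official repo
NOVA_REPO=https://github.com/myorg/nova.git
NOVA_BRANCH=my-feature-branch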
Once you've deployed using your own git repository you can of course edit it, commit back to it, and even push to it.
Then any other DevStack deployment you have can also be set up to pull from your repository instead of the public OpenStack repos.
Of course, rebasing later against OpenStack will become increasingly difficult, as OpenStack development moves at a fairly brisk pace.
If the modification you are making is one you want to commit back to the open source project check out this site:
http://wiki.openstack.org/HowToContribute
Basically openstack has a commit review and continuous integration environment that is based off of gerrit and jenkins. This is the method by which commits back to the open source repository are gate tested and manually reviewed by other developers before being merged.
If you are intending to deploy this for production use, I recommend against deploying from devstack. This is not the proper way to do that.
I am currently using github to develop a python application and am looking to deploy it on EC2.
Is there a good way to automatically handle the messiness this entails (setting up SSH key pairs on the EC2 instance for github, pulling from the github repository every time a commit is pushed to the master branch, etc.) without a bunch of custom scripts? Alternatively, is there an open-source project that has focused on this?
I wrote a simple python script to do this once. I also posted about it on my blog.
You set up mappings of your repositories and branches to point to local folders which already contain a checkout of that repo and branch. Then you enable GitHub's post-receive hooks to hit the script, which will automatically trigger a git pull in the appropriate folder.
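Not the author's actual script, but a minimal sketch of the idea in Python (the mapping, port, and payload fields are assumptions based on GitHub's JSON push payload):

import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# hypothetical mapping: (repository, branch) -> local checkout to update
MAPPINGS = {("myorg/myapp", "master"): "/srv/myapp"}

class HookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # read and parse the webhook payload GitHub POSTs on each push
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        repo = payload["repository"]["full_name"]
        branch = payload["ref"].rsplit("/", 1)[-1]  # "refs/heads/master" -> "master"
        path = MAPPINGS.get((repo, branch))
        if path:
            # pull the latest commits into the mapped working copy
            subprocess.run(["git", "-C", path, "pull"], check=True)
        self.send_response(200)
        self.end_headers()

HTTPServer(("", 8080), HookHandler).serve_forever()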