I am currently using github to develop a python application and am looking to deploy it on EC2.
Is there a good way to automatically handle the messiness this entails (setting up SSH key pairs on the EC2 instance for github, pulling from the github repository every time a commit is pushed to the master branch, etc.) without a bunch of custom scripts? Alternatively, is there an open-source project that has focused on this?
I wrote a simple python script to do this once. I also posted about it on my blog.
You set up mappings of your repositories and branches to point to local folders which already contain a checkout of that repo and branch. Then you enable GitHub's post-receive hooks to hit the script, which will automatically trigger a git pull in the appropriate folder.
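A minimal sketch of such a hook receiver, assuming GitHub is configured to send JSON payloads; the repo name, branch, and folder in the REPOS mapping are hypothetical:

    import json
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # map (repository, ref) pairs to local checkouts -- names are hypothetical
    REPOS = {
        ("myuser/myapp", "refs/heads/master"): "/srv/myapp",
    }

    class HookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers["Content-Length"])
            payload = json.loads(self.rfile.read(length))
            key = (payload["repository"]["full_name"], payload["ref"])
            folder = REPOS.get(key)
            if folder:
                # pull the pushed branch into the mapped checkout
                subprocess.run(["git", "-C", folder, "pull"], check=True)
            self.send_response(200)
            self.end_headers()

    HTTPServer(("", 8080), HookHandler).serve_forever()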
I'm using venv for my python repo on github, and I want to run the same code on 10+ ec2 instances (each instance will have a cronjob that just runs the same code on the same schedule).
Any recommendations on how to best achieve this + continue to make sure all instances get the latest release branches on github? I'd like to try and automate any configuration I need to do, so that I'm not doing this:
Create one ec2 instance, set up all the configuration I need, like downloading the latest python version, etc. Then git clone, and set up all the python packages I need using venv. Verify the code works on this instance.
Repeat for the remaining 10+ ec2 instances.
Whenever someone releases a new version on the master branch, I have to ssh into every ec2 instance, git pull to the correct branch, re-apply any new configuration I need, and repeat for all the remaining 10+ ec2 instances.
Ideally I can just run some script that pushes everything that's needed to make the code work on all ec2 instances. I have little experience with this type of thing, but from reading around this is an approach I'm considering. Am I on the right track?:
Create a script I run to ssh into all my ec2 instances and git clone/update to the correct branch (a rough sketch follows this list)
Use Docker to make sure all ec2 instances are set up properly so the python code works (is this the right use-case for Docker?). The above script will run the necessary Docker commands
Do a similar thing with venv and the requirements.txt file so all ec2 instances have the right python packages and versions
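For the first bullet, a rough sketch of the "push to every instance" script; the hostnames, key path, user, and repo location are all hypothetical:

    import subprocess

    HOSTS = ["ec2-host-1", "ec2-host-2"]  # ...your 10+ instances
    SSH = ["ssh", "-i", "deploy-key.pem"]

    REMOTE_CMDS = " && ".join([
        "cd /home/ec2-user/myapp",
        "git fetch origin",
        "git checkout release && git pull origin release",  # latest release branch
        "./venv/bin/pip install -r requirements.txt",       # sync packages into the venv
    ])

    for host in HOSTS:
        subprocess.run(SSH + [f"ec2-user@{host}", REMOTE_CMDS], check=True)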
Depending on your app and requirements (is EC2 100% necessary?) I can recommend the following:
Capistrano-like SSH deployments (https://github.com/dlapiduz/fabistrano) if your fleet is static and you need fast deployments (see the sketch after this list). Not a best practice and not terribly secure, but you mentioned a similar scheme in your post
Using AWS Image Builder (https://aws.amazon.com/image-builder/) or Packer (https://www.packer.io/) to build new release image and then replace old image with new in your EC2 autoscaling group
Build docker image of your app and use ECS or EKS to host it. I would recommend this approach if you're not married to running code directly on EC2 hosts.
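For option 1, a minimal sketch using modern Fabric (the library that tools like fabistrano wrap); the hosts, user, and paths are hypothetical:

    from fabric import ThreadingGroup

    # run the update across the whole fleet in parallel
    fleet = ThreadingGroup("ec2-host-1", "ec2-host-2", user="ec2-user")
    fleet.run("cd /home/ec2-user/myapp && git pull origin release")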
Ok, this seems like a stupid question and I don't have a lot of experience using git, as I have always depended on GitHub Desktop for my work. So my question is: can a github remote repo be committed to and pushed to via only the github api? Can it be done without installing git on your system? I have used PyGithub and git-python, and even python subprocess, to create a remote repo, initialize and add to the local repo, and perform commits and pushes to the remote repo, and I know that these services require the git client to be installed on the system.
So I was just wondering whether the standalone github api can be called through python requests to do the same, the requirement being that I don't have to have Git installed on my local system. Any help on the matter would be really enlightening.
Can a github remote repo be committed to and pushed to via only the github api? Can it be done without installing git on your system?
Kinda: you can interact with the raw objects via the API, but I'm not sure you can behave as if git were on your machine and push/pull from a local working copy as you would if you had git installed locally.
My experience of it is that it requires some understanding of the low-level fundamentals of Git (blobs, trees, commits, refs, and their relations): the v3 API exposes a git data / git database endpoint which provides transparent access to the low-level git structures. There are some limitations (e.g. you can't interact with a brand-new empty repository this way, you have to have at least one commit in it somehow; high-level operations like "cherry-pick" or "rebase" are unavailable and have to be hand-rolled if you want them; ...) but aside from that it works fine.
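As a concrete illustration, here is a sketch of creating a commit through the v3 git data endpoints with plain python requests; the owner, repo, branch, token, and file content are all placeholders:

    import requests

    API = "https://api.github.com"
    OWNER, REPO, BRANCH = "someuser", "somerepo", "master"  # hypothetical
    HEADERS = {"Authorization": "token <personal-access-token>"}

    def gh(method, path, **kwargs):
        r = requests.request(method, API + path, headers=HEADERS, **kwargs)
        r.raise_for_status()
        return r.json()

    # 1. find the current head commit and its tree
    ref = gh("GET", f"/repos/{OWNER}/{REPO}/git/ref/heads/{BRANCH}")
    head_sha = ref["object"]["sha"]
    head = gh("GET", f"/repos/{OWNER}/{REPO}/git/commits/{head_sha}")

    # 2. create a blob holding the new file content
    blob = gh("POST", f"/repos/{OWNER}/{REPO}/git/blobs",
              json={"content": "hello from the API\n", "encoding": "utf-8"})

    # 3. create a tree layering the new blob on top of the current tree
    tree = gh("POST", f"/repos/{OWNER}/{REPO}/git/trees",
              json={"base_tree": head["tree"]["sha"],
                    "tree": [{"path": "hello.txt", "mode": "100644",
                              "type": "blob", "sha": blob["sha"]}]})

    # 4. create the commit and move the branch ref to it
    commit = gh("POST", f"/repos/{OWNER}/{REPO}/git/commits",
                json={"message": "commit created via the API",
                      "tree": tree["sha"], "parents": [head_sha]})
    gh("PATCH", f"/repos/{OWNER}/{REPO}/git/refs/heads/{BRANCH}",
       json={"sha": commit["sha"]})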
To understand the low-level objects I would recommend the "Git Internals" chapter of the git book, sections 10.2 Git Objects and 10.3 Git References. There are also tutorials out there which explain these structures in a more hands-on way by building partial git clients from the ground up.
So I was just wondering whether the standalone github api can be called through python requests to do the same, the requirement being that I don't have to have Git installed on my local system.
See above, kinda: you can most certainly interact with a github repository via the API in almost every way possible, but rebuilding a full git client on top of the API would be difficult.
PyGithub lets you work with anything hosted on Github.com. To manage your local git repository (committing, pushing, stashing, and so on), you'll need a python module that deals with Git itself, not Github. One example is
https://github.com/gitpython-developers/GitPython
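For example, a tiny sketch with GitPython; note it still shells out to the git binary under the hood, so it doesn't remove the git requirement, and the path and file name below are hypothetical:

    from git import Repo

    repo = Repo("/path/to/local/repo")
    repo.index.add(["hello.txt"])             # stage a file
    repo.index.commit("commit via GitPython")
    repo.remote("origin").push()              # push to the GitHub remote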
As a fledgling Django developer, I was wondering if it was customary, or indeed possible, to create a site with Django then transfer the complete file structure to a different machine where it would "go live".
Thanks,
~Caitlin
You could use Git, Mercurial, or another version control system to put the site structure on a central server. After that, you could deploy the site to multiple servers, for example with Fabric. For the deployment process you should consider using virtualenv to isolate the project from global python packages and requirements.
Of course that's possible, and in fact it's the only way to "go live". You don't want to develop on your live server, do you? And this is true for any platform, not just django.
If I understood your question correctly, you need a system to push your development code to live.
Use a version control system: git, svn, mercurial etc.
Identify environment-specific code like settings/config files and keep separate instances of them for each environment.
Create a testing/staging/PP environment which has live data or live-like data and deploy your code there before pushing it to live.
To avoid downtime during the deployment process, usually a symbolic link is created which points to the existing code folder. When a new release is to be pushed, a new folder is created with the new code; after all the other dependencies are taken care of (like settings and database changes), the symlink is pointed at the new folder, as in the sketch below.
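A minimal sketch of that atomic symlink swap in python; the release layout below is hypothetical:

    import os

    # /srv/app/releases/<version> holds each release; the web server
    # serves from the /srv/app/current symlink
    release = "/srv/app/releases/v42"
    current = "/srv/app/current"
    tmp = current + ".tmp"

    os.symlink(release, tmp)   # build the new link under a temporary name
    os.replace(tmp, current)   # atomically swap it into place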
I'm deploying an app to AWS Elastic Beanstalk using the API:
https://elasticbeanstalk.us-east-1.amazon.com/?ApplicationName=SampleApp
&SourceBundle.S3Bucket=amazonaws.com
&SourceBundle.S3Key=sample.war
...
My impression from reading around a bit is that Java deployments use .war, that .zips are supported (docs), and that one can use .git (but only with PHP, or by using eb? doc).
Can I use the API to create an application version from a .git for a Python app? Or are zips the only type supported?
(Alternatively, can I git push to AWS without using the commandline tools?)
There are two ways to deploy to AWS:
The API backend, where it is basically a .zip file referenced from S3. When deploying, the instance will unpack it and run some custom scripts (which you can override from your AMI, or via Custom Configuration Files, which are the recommended way). Note that in order to create and deploy a new version in an AWS Elastic Beanstalk environment, you need three calls: upload to S3, CreateApplicationVersion, and UpdateEnvironment (a minimal sketch of the three calls follows this list).
The git endpoint, which works like this:
You install the AWS Elastic Beanstalk DevTools, and run a setup script on your git repo
When run, the setup script patches your .git/config to support git aws.push and, in particular, git aws.remote (which is not documented)
git aws.push simply takes your keys, builds a custom URL (git aws.remote), and does a git push -f master
Once AWS receives this (the url is basically <api>/<app>/<commitid>(/<envname>)), it creates the s3 .zip file (from the commit contents), then the application version on <app> for <commitid>, and if <envname> is present, it also issues an UpdateEnvironment call. Your AWS ids are hashed and embedded into the URL just like in all AWS calls, but sent as username/password auth tokens.
(full reference docs)
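For the first method, a minimal sketch of the three calls using boto3 (the modern Python SDK, rather than the Java plugin discussed below); the bucket, key, application, and environment names are all hypothetical:

    import boto3

    s3 = boto3.client("s3")
    eb = boto3.client("elasticbeanstalk")

    # 1. upload the source bundle to S3
    s3.upload_file("app.zip", "my-deploy-bucket", "app-v42.zip")

    # 2. register it as a new application version
    eb.create_application_version(
        ApplicationName="SampleApp",
        VersionLabel="v42",
        SourceBundle={"S3Bucket": "my-deploy-bucket", "S3Key": "app-v42.zip"},
    )

    # 3. point the environment at the new version
    eb.update_environment(EnvironmentName="SampleApp-env", VersionLabel="v42")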
I ported that to a Maven plugin a few months ago, and this file shows how it is done in plain Java. It actually covers a lot of ground (it builds a custom git repo using jgit, calculates the hashes, and pushes into it).
I'm strongly considering backporting it as an Ant task, or perhaps simply making it work without a pom.xml file present, so users only need Maven to run the deployment itself.
Historically, only the first method was supported, while the second has grown in importance. Since the second is actually far easier (in beanstalk-maven-plugin you have to call three different methods, while a simple git push does all three), we're supporting git-based deployments, and have even published an archetype for it (you can see a sample project here; note the README.md in particular).
(btw, if you're using .war files, my elastic beanstalk plugin supports both ways, and we're actually in favor of git, since it allows us to do incremental deployments)
So, you still wanna implement it?
There are three files I suggest you read:
FastDeployMojo.java is the main façade
RequestSigner does the real magic
This is a testcase for RequestSigner
Wanna do it in:
Python? I'd look at Dulwich (a sketch follows this list)
C#? The PowerShell version is based on it
Ruby? The Linux version is based on it
Java? Use mine, it uses jgit
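For the python route, a rough sketch with Dulwich's porcelain layer (pure Python, no git binary needed); the remote URL is a placeholder for the signed Beanstalk endpoint described above:

    from dulwich import porcelain

    porcelain.add(".", ["application.py"])              # stage
    porcelain.commit(".", message="deploy via dulwich")
    porcelain.push(".", "https://<signed-endpoint>/<app>/<commitid>",
                   ["refs/heads/master"])               # push the branch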
I installed openstack through DevStack because I had to modify some files.
When I install DevStack, I have all the files under /opt/stack. There I have service folders (glance, keystone, ...) and library folders (python-glanceclient, python-keystoneclient).
If I modify those files, how can I replicate the modifications on an already-deployed OpenStack? Installing OpenStack without DevStack builds a different folder structure.
I mean, where are the python-'service'client folders in a fresh OpenStack installation?
Thank you
DevStack pulls its OpenStack software stack from GitHub; the git repos it installs from are located in /opt/stack.
What you may want to do is fork OpenStack, as well as the repos of the OpenStack projects you wish to modify, then make your DevStack deployment pull not from the OpenStack repos but from your own forked repos.
You can do this by modifying the stack.sh script (I believe); see the snippet below. devstack.org has a line-by-line explanation of the entire script on their site, and that can point you in the right direction.
http://devstack.org/stack.sh.html -- read this.
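For instance, DevStack's stackrc defines per-project *_REPO / *_BRANCH variables that you can override in your localrc / local.conf so the deployment pulls from your fork; the fork URL and branch names below are hypothetical:

    # localrc / local.conf override -- deploy glance from your own fork
    GLANCE_REPO=https://github.com/yourname/glance.git
    GLANCE_BRANCH=my-feature-branch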
Once you've deployed using your own git repository, you can of course edit and commit back to it, and even push to it.
Then any other DevStack deployment you have can also be set up to pull from your repository instead of the public OpenStack repos.
Of course, rebasing later against OpenStack will be increasingly difficult, as OpenStack development moves at a fairly brisk pace.
If the modification you are making is one you want to commit back to the open source project, check out this site:
http://wiki.openstack.org/HowToContribute
Basically, OpenStack has a commit-review and continuous-integration environment based on Gerrit and Jenkins. This is the method by which commits to the open source repository are gate-tested and manually reviewed by other developers before being merged.
If you are intending to deploy this for production use, I recommend against deploying from devstack. This is not the proper way to do that.