Is there a way to check whether files have been updated on the server in Google App Engine, and then download only the updated files locally so they match what's on the server?
It may be possible if you're talking about the standard environment and you always follow a disciplined app deployment procedure:
deploy from a git repository (unsure if other VCS systems work)
don't deploy when you have uncommitted changes in the repository from which you deploy
If you meet these requirements then you can access the source code for a particular deployed version of a service via StackDriver, as described in Google Cloud DataStore automatic indexing.
At least in my case, among the files in the root directory I found an automatically generated file called source-context.json, containing the git URL and revision ID of the repository from which the deployment was made, which you can use to selectively update your local repo.
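For reference, in my case the contents looked roughly like this (the URL and revision ID below are made-up placeholders):

{
  "git": {
    "url": "https://source.developers.google.com/p/my-project/r/my-repo",
    "revisionId": "2f0e1d3c4b5a69788796a5b4c3d2e1f0aabbccdd"
  }
}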
Another approach requires the same deployment discipline mentioned above, plus always deploying from the same repository, which you'll treat as the unique master copy of your code, or one of its mirrors (needed due to git's distributed nature). Then you only need to compare your local repo against the version deployed from this master copy repo (or its mirror).
You might want to check out Google Cloud Source Repositories as a possible master copy repo, or mirror a public master copy repo (see Connecting a Hosted Repository). The advantage would be some convenience and integration with other Google Cloud Platform tools, for example:
gcloud source repos
the Source Browser
deployments from the cloud shell, see for example Google Cloud: How to deploy mirrored Repository
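For example, cloning a Cloud Source Repository locally is a one-liner (the repo name below is a placeholder):

gcloud source repos clone my-repo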
Otherwise, directly downloading the app code from GAE is monolithic: you can't select just specific files (see Downloading Your Source Code).
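For completeness, that monolithic download looks like this with the SDK's appcfg.py tool (the app ID and version below are placeholders):

appcfg.py download_app -A my-app-id -V 1 ./downloaded-src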
I'm implementing continuous integration and continuous delivery for a large enterprise data warehouse project.
All the code resides in a Google Cloud Repository, and I'm able to set up a Google Cloud Build trigger so that every time files of a specific type (Python scripts) are pushed to the master branch, a Google Cloud build starts.
The Python scripts don't make up an app. Each contains an ODBC connection string and logic to extract data from a source and store it as a CSV file. The Python scripts are to be executed on a Google Compute Engine VM instance with Airflow installed.
So the deployment of the Python scripts is as simple as can be: the .py files just have to be copied from the Google Cloud repository folder to a specific folder on the Google VM instance. There isn't really a traditional build to run, as all the Python files are separate from each other and not part of an application.
I thought this would be really easy, but I've now spent several days trying to figure it out, with no luck.
Google Cloud Platform provides several Cloud Builders, but as far as I can see none of them can do this simple task. Using gcloud also doesn't work: it can copy files, but only from the local PC to the VM, not from the source repository to the VM.
What I'm looking for is a YAML or JSON build config file to copy those Python files from source repository to Google Compute Engine VM Instance.
Hoping for some help here.
The files/folders in the Google Cloud repository aren't directly accessible (it's like a bare git repository); you need to first clone the repo, then copy the desired files/folders from the cloned repo to their destinations.
It might be possible to use a standard Fetching dependencies build step to clone the repo, but I'm not 100% certain of it in your case, since you're not actually doing a build:
steps:
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders']
If not you may need one (or more) custom build steps. From Creating Custom Build Steps:
A custom build step is a container image that the Cloud Build worker
VM pulls and runs with your source volume-mounted to /workspace.
Your custom build step can execute any script or binary inside the
container; as such, it can do anything a container can do.
Custom build steps are useful for:
Downloading source code or packages from external locations
...
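Putting the pieces together, a cloudbuild.yaml along these lines might do it. Note this is an untested sketch: the repo URL, zone, VM name, and folder paths are all placeholders, and the Cloud Build service account would need permission to SSH into the instance:

steps:
# clone the source repo (the URL is a placeholder)
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://source.developers.google.com/p/my-project/r/my-repo']
# copy the Python scripts to the VM (zone, VM name, and paths are placeholders)
- name: gcr.io/cloud-builders/gcloud
  args: ['compute', 'scp', '--zone', 'us-central1-a', '--recurse',
         'my-repo/scripts', 'airflow-vm:/home/airflow/dags']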
I am trying to run through the creation of a Flask web app in Azure using this instruction page.
Creating Web apps with Flask in Azure
In the "Application Overveiw" section, it lists some FlaskWebProjectfiles saying.
Here's an overview of the files you'll find in the initial Git repository
\FlaskWebProject\__init__.py
\FlaskWebProject\views.py
\FlaskWebProject\static\content\
\FlaskWebProject\static\fonts\
\FlaskWebProject\static\scripts\
\FlaskWebProject\templates\about.html
\FlaskWebProject\templates\contact.html
\FlaskWebProject\templates\index.html
\FlaskWebProject\templates\layout.html
The problem is that I don't get these files when I connect Azure to a GitHub repository. I know they exist because my Azure app renders the default Flask web app, and the files exist in /wwwroot.
I am sure that I am missing something obvious here, so if anyone has followed the most recent Flask setup instructions for Azure and had success, their input would be great.
Your initial GitHub repository is empty, so you need to clone the repository.
The process is described in the same article you mentioned, but a little later.
Basically:
1) Go to the deployment source and configure it - for example, Local Git
2) Go to Settings => Properties. Here you should see the Git URL where your files are placed
3) Go to your workstation, and execute
git clone https://yourdeploymentusername@todeleteflask.scm.azurewebsites.net:443/todeleteflask.git
Enter password.
You should be all set now. From here on, if you make changes, you can push them to the repository and they will arrive on the site.
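For example, the subsequent edit-and-deploy cycle is just the usual git workflow:

git add .
git commit -m "describe your change"
git push origin master
# the push itself triggers deployment to the site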
I'm deploying an app to AWS Elastic Beanstalk using the API:
https://elasticbeanstalk.us-east-1.amazonaws.com/?ApplicationName=SampleApp
&SourceBundle.S3Bucket=amazonaws.com
&SourceBundle.S3Key=sample.war
...
My impression from reading around a bit is that Java deployments use .war, that .zips are supported (docs), and that one can use .git (but only with PHP, or by using eb? doc).
Can I use the API to create an application version from a .git for a Python app? Or are zips the only type supported?
(Alternatively, can I git push to AWS without using the commandline tools?)
There are two ways to deploy to AWS Elastic Beanstalk:
The API backend, where the deployment is basically a .zip file referenced from S3. When deploying, the instance unpacks it and runs some custom scripts (which you can override from your AMI, or via Custom Configuration Files, which are the recommended way). Note that in order to create and deploy a new version in an AWS Elastic Beanstalk environment, you need three calls: upload to S3, CreateApplicationVersion, and UpdateEnvironment (a CLI sketch of these three calls appears after this list).
The git endpoint, which works like this:
You install the AWS Elastic Beanstalk DevTools, and run a setup script on your git repo
When run, the setup script patches your .git/config to support git aws.push and, in particular, git aws.remote (which is not documented)
git aws.push simply takes your keys, builds a custom URL (git aws.remote), and does a git push -f master
Once AWS receives this (the URL is basically <api>/<app>/<commitid>(/<envname>)), it creates the S3 .zip file (from the commit contents), then the application version on <app> for <commitid>, and, if <envname> is present, it also issues an UpdateEnvironment call. Your AWS IDs are hashed and embedded into the URL just like in all AWS calls, but sent as username/password auth tokens.
(full reference docs)
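To make the first method concrete, here's roughly what the three calls look like with today's AWS CLI (the bucket, application, environment, and version names below are made-up placeholders):

# 1. upload the source bundle to S3
aws s3 cp app-v42.zip s3://my-bucket/app-v42.zip
# 2. register it as a new application version
aws elasticbeanstalk create-application-version \
    --application-name MyApp --version-label v42 \
    --source-bundle S3Bucket=my-bucket,S3Key=app-v42.zip
# 3. point the environment at the new version
aws elasticbeanstalk update-environment \
    --environment-name MyApp-env --version-label v42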
I ported that to a Maven plugin a few months ago, and this file shows how it's done in plain Java. It actually covers a lot of code (it builds a custom git repo using jgit, calculates the hashes, and pushes into it).
I'm strongly considering backporting it as an Ant task, or perhaps simply making it work without a pom.xml file present, so users only need Maven for the deployment itself.
Historically, only the first method was supported, while the second grew in importance. Since the second is actually far easier (in beanstalk-maven-plugin you have to call three different methods, while a simple git push does all three), we're supporting git-based deployments, and have even published an archetype for it (you can see a sample project here, especially the README.md).
(By the way, if you're using .war files, my Elastic Beanstalk plugin supports both ways, and we actually favor git, since it allows for incremental deployments.)
So, you still wanna implement it?
There are three files I suggest you read:
FastDeployMojo.java is the main façade
RequestSigner does the real magic
This is a testcase for RequestSigner
Wanna do it in:
Python? I'd look at Dulwich
C#? The PowerShell version is based on it
Ruby? The Linux version is based on it
Java? Use mine; it uses jgit
I installed openstack through DevStack because I had to modify some files.
When I install DevStack, I have all the files under /opt/stack. There I have services folders (glance, keystone...) and libraries folders (python-glanceclient, python-keystoneclient).
If I modify those files, how can I replicate the modifications on an already deployed OpenStack? Installing OpenStack without DevStack produces a different folder structure.
I mean, where are the python-'service'client folders in a fresh OpenStack installation?
Thank you
DevStack pulls its OpenStack software stack from GitHub. The git repos it installs from are located in /opt/stack.
What you may want to do is fork OpenStack, as well as the repos of the OpenStack projects you wish to modify, then make your DevStack deployment deploy not from the OpenStack repos but from your own forked repos.
You can do this by modifying the stack.sh script (I believe). devstack.org has a line-by-line explanation of the entire script on their site, and that can point you in the right direction:
http://devstack.org/stack.sh.html
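If I remember correctly, you may not even need to patch stack.sh itself: DevStack's stackrc defines per-project repo and branch variables that you can override in your localrc, along these lines (the fork URL and branch below are placeholders; glance is just the example project, and the others have analogous variables):

# in devstack's localrc: make stack.sh clone glance from your fork
GLANCE_REPO=https://github.com/yourname/glance.git
GLANCE_BRANCH=my-changes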
Once you've deployed using your own git repository, you can of course edit and commit back to it, and even push to it.
Then any other DevStack deployment you have can also be set up to pull from your repository instead of the public OpenStack repos.
Of course, rebasing later against OpenStack will become increasingly difficult, as OpenStack development moves at a fairly brisk pace.
If the modification you are making is one you want to commit back to the open source project, check out this site:
http://wiki.openstack.org/HowToContribute
Basically, OpenStack has a commit-review and continuous-integration environment based on Gerrit and Jenkins. This is the method by which commits to the open source repository are gate-tested and manually reviewed by other developers before being merged.
If you intend to deploy this for production use, I recommend against deploying from DevStack; that is not the proper way to do it.
I am currently using github to develop a python application and am looking to deploy it on EC2.
Is there a good way to automatically handle the messiness this entails (setting up SSH key pairs on the EC2 instance for GitHub, pulling from the GitHub repository every time a commit is pushed to the master branch, etc.) without a bunch of custom scripts? Alternatively, is there an open-source project that has focused on this?
I wrote a simple Python script to do this once, and posted about it on my blog.
You set up mappings of your repositories and branches to local folders that already contain a checkout of that repo and branch. Then you enable GitHub's post-receive hook to hit the script, which automatically triggers a git pull in the appropriate folder.
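The script itself isn't reproduced here, but a minimal sketch of the idea looks something like this (a tiny Flask app, not the actual blogged script; the repo names and paths are made up, and it assumes GitHub's classic post-receive hook, which delivers the event as a form-encoded payload field):

# Minimal post-receive webhook handler: runs `git pull` in a pre-configured
# local checkout whenever GitHub reports a push to a mapped repo/branch.
import json
import subprocess

from flask import Flask, request

app = Flask(__name__)

# Map (repository name, branch) -> local folder already containing a checkout.
CHECKOUTS = {
    ("myapp", "master"): "/srv/myapp",
}

@app.route("/post-receive", methods=["POST"])
def post_receive():
    # The classic hook sends the event JSON as a form-encoded "payload" field.
    payload = json.loads(request.form["payload"])
    repo = payload["repository"]["name"]
    branch = payload["ref"].rsplit("/", 1)[-1]  # "refs/heads/master" -> "master"
    folder = CHECKOUTS.get((repo, branch))
    if folder:
        subprocess.check_call(["git", "pull"], cwd=folder)
    return "OK"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)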