We have a Python application on a Windows Azure VM that reads data from an API and loads the data into an on-prem DB. The application works just fine, and the code is source controlled in an Azure DevOps repo. The current deployment process is for someone to pull the main branch and copy the application from their local machine to c:\someapplication\ on the STAGE/PROD server. We would like to automate this process. There are plenty of tutorials on how to do a build for API and web applications, but they require an Azure subscription and app name (which we don't have). My two questions are:
Is there a way to do a simple copy to the c:\someapplication folder from Azure DevOps for this application?
If there is, would a copy be the best solution, or should I consider something else?
Should I simply clone the main repo to each folder location above and then automate the git pull via an Azure pipeline? Any advice or links would be greatly appreciated.
According to your description, you could try the CopyFiles@2 task and set the local folder up as a shared folder so that it can be used as the TargetFolder (the target folder or UNC path that will contain the copied files).
YAML like:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)' # string. Source Folder.
    Contents: '**' # string. Required. Contents. Default: **.
    TargetFolder: 'c:\someapplication' # string. Required. Target Folder.
Please check if it meets your requirements.
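For context, here is a fuller pipeline sketch. It assumes you install a self-hosted agent directly on the STAGE/PROD server and register it in a hypothetical agent pool named OnPremServers (a Microsoft-hosted agent cannot see c:\someapplication); the trigger branch and the CleanTargetFolder option are likewise illustrative:

trigger:
- main

pool:
  name: OnPremServers   # hypothetical self-hosted pool running on the target server

steps:
- checkout: self        # pulls the repo onto the agent, i.e. onto the server

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**'
    TargetFolder: 'c:\someapplication'   # local path on the server hosting the agent
    CleanTargetFolder: true              # optional: clear out stale files first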
After having worked on an Azure Functions application, I have now deployed the app and had it running for a while. Now I want to continue my work on another computer; however, I can't seem to find any way to download the source code in either VS Code or the Azure Portal.
For a Python function app, we cannot download the content from the Azure portal or VS Code; it is in read-only mode.
Workaround:
1. Copy your project to another computer.
2. Create a new project on another computer and copy the main files from the Azure portal:
   host.json and requirements.txt files from App files.
   __init__.py and function.json files from Code + Test.
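For illustration, a recreated function folder on the new computer might look like the minimal sketch below. It assumes an HTTP-triggered function; your real trigger and bindings come from the function.json you copy out of Code + Test, and the greeting logic here is made up:

# __init__.py - minimal sketch of an HTTP-triggered Python function,
# recreated by pasting the body shown under Code + Test in the portal.
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Python HTTP trigger function processed a request.")
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!")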
I'm implementing continuous integration and continuous delivery for a large enterprise data warehouse project.
All the code resides in a Google Cloud Repository, and I'm able to set up a Google Cloud Build trigger so that every time code of a specific file type (Python scripts) is pushed to the master branch, a Google Cloud Build run starts.
The Python scripts don't make up an app. Each contains an ODBC connection string and code to extract data from a source and store it as a CSV file. The Python scripts are to be executed on a Google Compute Engine VM instance with Airflow installed.
So the deployment of the Python scripts is as simple as can be: the .py files only need to be copied from the Google Cloud repository folder to a specific folder on the Google VM instance. There is no traditional build to run, as all the Python files are independent of each other and not part of an application.
I thought this would be really easy, but I have now spent several days trying to figure this out with no luck.
Google Cloud Platform provides several Cloud Builders, but as far as I can see none of them can do this simple task. Using gcloud also does not work: it can copy files, but only from a local PC to the VM, not from the source repository to the VM.
What I'm looking for is a YAML or JSON build config file that copies those Python files from the source repository to the Google Compute Engine VM instance.
Hoping for some help here.
The files/folders in the Google Cloud repository aren't directly accessible (it's like a bare git repository); you need to first clone the repo and then copy the desired files/folders from the cloned repo to their destinations.
It might be possible to use a standard Fetching dependencies build step to clone the repo, but I'm not 100% certain of it in your case, since you're not actually doing a build:
steps:
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders']
If not, you may need one (or more) custom build steps. From Creating Custom Build Steps:
A custom build step is a container image that the Cloud Build worker
VM pulls and runs with your source volume-mounted to /workspace.
Your custom build step can execute any script or binary inside the
container; as such, it can do anything a container can do.
Custom build steps are useful for:
Downloading source code or packages from external locations
...
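As one possible concrete approach (a sketch only, not verified against your project), a cloudbuild.yaml could first clone the repo with the stock git builder, as above, and then use the gcloud builder to scp the .py files to the instance. This assumes the Cloud Build service account is allowed to SSH into the VM; the repository URL, zone, instance name and target folder are all hypothetical:

steps:
# Clone the repository into /workspace (hypothetical project and repo names).
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://source.developers.google.com/p/your-project/r/your-repo', 'repo']

# Copy the .py files from the clone to the VM over SSH.
- name: gcr.io/cloud-builders/gcloud
  args:
    - compute
    - scp
    - --zone=europe-west1-b            # hypothetical zone
    - --recurse
    - repo/scripts                     # hypothetical folder holding the .py files
    - airflow-vm:/home/airflow/dags    # hypothetical instance name and target path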
I am working on a Python-based web application for collaborative XML/document editing, and one requirement from the client is that users should be able to push the files they created (and saved on the server) directly to a GitHub remote repo, without ever creating a local clone on the server (i.e., no local working directory or tracking of any sort). In GUI terms, this would correspond to going to the GitHub website and manually adding the file to the remote repo by clicking the "Upload files" or "Create new file" button, or simply editing an existing file on the remote repo on the GitHub website and then committing the change inside the web browser. Is this functionality even possible to achieve, either using some Python GitHub module or by writing code from scratch against the GitHub API?
You can create files via the API, and if the user has their own GitHub account, you can upload them as that user.
Let's use github3.py as an example of how to do this:
import github3

gh = github3.login(username='foo', password='bar')
repository = gh.repository('organization-name', 'repository-name')

for file_info in files_to_upload:
    with open(file_info, 'rb') as fd:
        contents = fd.read()
    repository.create_file(
        path=file_info,
        message='Start tracking {!r}'.format(file_info),
        content=contents,
    )
You will want to check that it returns the object you'd expect, to verify the file was successfully uploaded. You can also specify committer and author dictionaries to attribute the commit to your service, so people aren't under the impression that the person authored it on a local git set-up.
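For instance, with made-up path, names and emails, attributing the commit to your service while crediting the user as the author might look like this:

# Hypothetical path, names and emails, purely for illustration.
repository.create_file(
    path='docs/example.xml',
    message='Add example.xml on behalf of a user',
    content=b'<doc>example</doc>',
    author={'name': 'Jane Doe', 'email': 'jane@example.com'},
    committer={'name': 'XML Editor Bot', 'email': 'bot@example.com'},
)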
I am trying to run through the creation of a Flask web app in Azure using this instruction page.
Creating Web apps with Flask in Azure
In the "Application Overveiw" section, it lists some FlaskWebProjectfiles saying.
Here's an overview of the files you'll find in the initial Git repository
\FlaskWebProject\__init__.py
\FlaskWebProject\views.py
\FlaskWebProject\static\content\
\FlaskWebProject\static\fonts\
\FlaskWebProject\static\scripts\
\FlaskWebProject\templates\about.html
\FlaskWebProject\templates\contact.html
\FlaskWebProject\templates\index.html
\FlaskWebProject\templates\layout.html
The problem is that I don't get these files when I connect Azure to a GitHub repository. I know they exist because my Azure app renders this default Flask web app. The files exist in /wwwroot.
I am sure I am missing something obvious here, so if anyone has followed the most recent Flask setup instructions for Azure and had success, their input would be great.
Your initial GitHub repository is empty, so you need to clone the repository.
The process is described in the same article you mentioned, but a little later.
Basically:
1) Go to the deployment source and configure it - for example, Local Git.
2) Go to Settings => Properties. Here you should have the Git URL where your files are placed.
3) Go to your workstation and execute:
git clone https://yourdeploymentusername@todeleteflask.scm.azurewebsites.net:443/todeleteflask.git
Enter password.
You should be all set now. If you make a change, you can push to the repository and it will arrive on the site.
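For example, after editing a file in the clone (the file name and commit message below are made up):

git add templates/index.html
git commit -m "Tweak the index page"
git push origin master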
I am currently using GitHub to develop a Python application and am looking to deploy it on EC2.
Is there a good way to automatically handle the messiness this entails (setting up SSH key pairs on the EC2 instance for GitHub, pulling from the GitHub repository every time a commit is pushed to the master branch, etc.) without a bunch of custom scripts? Alternatively, is there an open-source project that has focused on this?
I wrote a simple Python script to do this once. I also posted about it on my blog.
You set up mappings of your repositories and branches to local folders that already contain a checkout of that repo and branch. Then you enable GitHub's post-receive hooks to hit the script, which automatically triggers a git pull in the appropriate folder.
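The idea is roughly as in the sketch below. This is not the actual script from the blog post; the repository name, branch and folder are made up, and it assumes an old-style GitHub post-receive hook pointed at /hook on the instance:

# Minimal sketch of a post-receive webhook handler that maps a repository
# and branch to a local checkout and runs `git pull` there.
import json
import subprocess

from flask import Flask, request

app = Flask(__name__)

# Map (repository name, branch) to the folder that already holds a checkout.
MAPPINGS = {
    ('myapp', 'master'): '/var/www/myapp',   # hypothetical repo and path
}

@app.route('/hook', methods=['POST'])
def hook():
    # Old post-receive hooks POST a form field named 'payload';
    # newer webhooks send JSON in the request body.
    payload = request.get_json(silent=True) or json.loads(request.form['payload'])
    repo = payload['repository']['name']
    branch = payload['ref'].rsplit('/', 1)[-1]   # 'refs/heads/master' -> 'master'
    folder = MAPPINGS.get((repo, branch))
    if folder:
        subprocess.check_call(['git', 'pull'], cwd=folder)
    return 'ok'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)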