I am trying to create an Azure CI/CD pipeline for my Python application. I have tried many approaches without success: I can create the CI part, and in some cases the CD part as well, but I am not able to see the output on the Azure App Service.
I use a Linux App Service running Python 3.7.
I can create the CI/CD pipeline successfully using a YAML file, but I want to create it with the classic editor instead, as I have some restrictions on using YAML.
Here are the steps I use to deploy a simple hello-world project with an Azure DevOps CI/CD pipeline.
1. Create pipeline:
2. Create Release pipeline:
3. Save and queue your build pipeline; the release pipeline will then be triggered. Here is the file structure on Azure Kudu:
I am trying to deploy a Python Azure Function into an Azure Function App. The function's __init__.py script imports an SDK which is stored as an Azure Artifacts Python package. I can build and publish the function to Azure successfully using a pipeline from the DevOps repo; however, the function fails at the import mySDK line when I run it.
I assume the issue is that, because the function is serverless, the SDK needs to be pip-installed again when the function is called. How do I do this?
I have tried adding a PIP_EXTRA_INDEX_URL for the artifact feed to the Function App, with no success.
PIP_EXTRA_INDEX_URL worked for me.
What error did you receive when you tried it?
Basically, before you deploy your function, you should alter the application settings on your Function App and add the PIP_EXTRA_INDEX_URL key-value pair. Then add the Python package from your Azure Artifacts feed to the requirements.txt file in your function app code.
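For illustration, the setting and the requirements file might look roughly like this; the organisation, feed name, token and package name are placeholders, and the exact index URL is the one shown on the feed's "Connect to feed" page:

# Application setting on the Function App (placeholder values)
PIP_EXTRA_INDEX_URL = https://<PAT>@pkgs.dev.azure.com/<organisation>/_packaging/<feed-name>/pypi/simple/

# requirements.txt in the function project
azure-functions
mySDK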
There is a good guide here: EasyOps - How To connect to azure artifact feed from Function App.
Is it possible to run a Robot Framework test suite using an Azure Databricks notebook?
I have a set of Robot Framework test suites that use the Database Library, the OperatingSystem library, etc.
On my local machine, I install Python, pip install all the necessary libraries, and then run my Robot code like this:
python -m robot filename.robot
I want to do the same using Azure Databricks notebooks. Is it possible?
Databricks notebooks support four default languages: Python, Scala, SQL, and R.
I was unable to find any documentation that shows the use of Robot Framework on Databricks.
However, you can try running the same commands on Azure Databricks that you ran on your local machine, as sketched below.
Databricks is essentially a cloud infrastructure provider for running your Spark workloads, with some add-on capabilities.
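A rough sketch of what that could look like in a Python notebook on Databricks (untested there; the DBFS paths are placeholders and assume the .robot suite has been uploaded to DBFS first):

# Cell 1: install Robot Framework and the extra libraries the suite needs
%pip install robotframework robotframework-databaselibrary

# Cell 2: run the suite through Robot Framework's Python API
import robot

rc = robot.run(
    "/dbfs/FileStore/tests/filename.robot",     # placeholder path to the uploaded suite
    outputdir="/dbfs/FileStore/tests/results",  # reports and logs written back to DBFS
)
print("Robot return code:", rc)  # 0 means all tests passed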
I have an existing Azure DevOps pipeline that trains ML models. To guarantee the robustness of the models, they will need to be retrained periodically. For this I decided to create an Azure Function that will be executed each month. It will collect the new data, apply the data pre-processing, and finally trigger the Azure DevOps training pipeline. All of this must be done with Python.
From my research, I understood that this can be done using an Azure DevOps REST API request.
I found this Python Git repo, https://github.com/microsoft/azure-devops-python-api, which provides an API to communicate with Azure DevOps. I executed the sample code provided by this package, which displays the list of my DevOps projects, but I can't find out how to trigger a pipeline.
Assuming that my organisation is named ORGA1, the project is named PROJ1, and the pipeline that I want to execute is named PIPELINE1, how can I launch it using an Azure Function or even a simple Python script?
PS: I am using a Python 3.9 timer-triggered Azure Function.
Thank you in advance for your help.
EDIT
I tried to use a Logic App to do this, as #Mohammed described in the comment, and I think this is a good solution. Here is the workflow that I created:
I launch the Logic App every X hours; this triggers Azure DevOps, and as soon as the training ends successfully it sends me an email.
There is one problem here: I am creating a new release rather than triggering a specific pipeline each time. Navigating through the different actions under the DevOps service, I cannot find anything related to launching a DevOps pipeline. Does anyone have an idea how to do this?
You can use a Logic App with a timer to trigger your DevOps pipeline instead of an Azure Function, as it has all the built-in connectors required to interface with Azure DevOps. See: https://www.serverlessnotes.com/docs/keep-your-team-connected-using-azure-devops-and-azure-logic-apps
Thanks to the tips provided by #Mohammed, I found a solution. Logic Apps provide what I am looking for. Under the list of DevOps connectors provided by the Logic App, there is a connector named Queue a new build, and this is exactly what I need. This is my first experimental architecture, and I will update it later by adding an Azure Function before calling the DevOps pipeline.
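For anyone who prefers to stay with a plain Python script or the timer-triggered function, the same "queue a new build" operation can also be called directly on the Azure DevOps REST API. A minimal sketch; the numeric definition id of PIPELINE1 and the personal access token are placeholders, and the PAT needs the Build (read & execute) scope:

import requests

ORG = "ORGA1"
PROJECT = "PROJ1"
DEFINITION_ID = 12               # placeholder: numeric id of PIPELINE1 (visible in its URL)
PAT = "<personal-access-token>"  # placeholder

# Queue a new build of the pipeline definition (the same operation the Logic App connector performs)
url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds?api-version=6.0"
response = requests.post(
    url,
    json={"definition": {"id": DEFINITION_ID}},
    auth=("", PAT),              # basic auth: empty user name, PAT as the password
)
response.raise_for_status()
print("Queued build id:", response.json()["id"])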
You may try using Azure Durable Functions; you can replicate much of what a Logic App does while still using Azure Functions. See the documentation here.
I am trying to implement Azure DevOps for a few of my PySpark projects.
Some of the projects are developed in PyCharm and some in IntelliJ with the Python API.
Below is the code structure committed to the Git repository.
setup.py is the build file used to create the .egg file.
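(For illustration, a minimal setup.py of this kind might look like the sketch below; the package name and version are placeholders.)

# Minimal setup.py used to package the project as an .egg (placeholder names)
from setuptools import setup, find_packages

setup(
    name="my_pyspark_project",
    version="0.1.0",
    packages=find_packages(),
)

# The egg is then built with: python setup.py bdist_egg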
I have tried a few steps, as shown below, to create a build pipeline in Azure DevOps.
But the Python installation/execution part is failing with the error below:
##[error]The process 'C:\hostedtoolcache\windows\Python\3.7.9\x64\python.exe' failed with exit code 1
I would prefer to build and create the .egg files through the UI; if that is not possible, then YAML files.
Any leads appreciated!
I used the Use Python step and then everything worked like a charm.
Can you show the details of the Install Python step?
I have used the following steps and it succeeds!
I'm implementing continuous integration and continuous delivery for a large enterprise data warehouse project.
All the code resides in a Google Cloud Source Repository, and I'm able to set up a Google Cloud Build trigger so that every time code of a specific file type (Python scripts) is pushed to the master branch, a Google Cloud Build run starts.
The Python scripts don't make up an app. They contain an ODBC connection string and the logic to extract data from a source and store it as a CSV file. The Python scripts are to be executed on a Google Compute Engine VM instance with Airflow installed.
So the deployment of the Python scripts is as simple as can be: the .py files only need to be copied from the Google Cloud repository folder to a specific folder on the Google VM instance. There is not really a traditional build to run, as all the Python files are separate from each other and not part of an application.
I thought this would be really easy, but I have now spent several days trying to figure it out with no luck.
Google Cloud Platform provides several cloud builders, but as far as I can see none of them can do this simple task. Using gcloud also does not work: it can copy files, but only from a local PC to a VM, not from a source repository to a VM.
What I'm looking for is a YAML or JSON build config file that copies those Python files from the source repository to the Google Compute Engine VM instance.
Hoping for some help here.
The files/folders in the Google Cloud repository aren't directly accessible (it's like a bare Git repository); you need to first clone the repo and then copy the desired files/folders from the cloned repo to their destinations.
It might be possible to use a standard Fetching dependencies build step to clone the repo, but I'm not 100% certain of it in your case, since you're not actually doing a build:
steps:
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders']
If not, you may need one (or more) custom build steps. From Creating Custom Build Steps:
A custom build step is a container image that the Cloud Build worker VM pulls and runs with your source volume-mounted to /workspace.
Your custom build step can execute any script or binary inside the container; as such, it can do anything a container can do.
Custom build steps are useful for:
Downloading source code or packages from external locations
...
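One possible sketch of such a step is to run gcloud compute scp from the gcloud builder; this assumes SSH access from Cloud Build to the VM has already been set up (the Cloud Build service account needs permission to SSH into the instance), and the zone, instance name and paths below are placeholders:

steps:
- name: gcr.io/cloud-builders/gcloud
  args:
    - compute
    - scp
    - --recurse
    - --zone=<zone>
    - ./scripts/                       # path inside /workspace (placeholder)
    - <instance-name>:/home/airflow/   # destination folder on the VM (placeholder)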