Python azure module: how to create a new deployment

azure.servicemanagement.servicemanagementservice.py contains:
def create_deployment(self, service_name, deployment_slot, name,
                      package_url, label, configuration,
                      start_deployment=False,
                      treat_warnings_as_error=False,
                      extended_properties=None):
What are package_url and configuration? The method's docstring says:
package_url:
    A URL that refers to the location of the service package in the
    Blob service. The service package can be located either in a
    storage account beneath the same subscription or a Shared Access
    Signature (SAS) URI from any storage account.
....
configuration:
    The base-64 encoded service configuration file for the deployment.
All over the internet there are references to Visual Studio and PowerShell for creating those files. What do they look like? Can I manually create them? Can the azure module create them? Why is the Microsoft service so confusing and so poorly documented?
I am using the Python Azure SDK from https://pypi.python.org/pypi/azure. I am running Mac OS X on my dev box, so I don't have Visual Studio or cspack.exe.
Any help appreciated. Thank you.

According to your description, it looks like you are trying to use the Python Azure SDK to create a cloud service deployment. Here is the documentation on how to use the create_deployment function.
Can I manually create them? Can the azure module create them?
If you mean you want to know how to create an Azure deployment package for your Python app, then based on my experience there are several options you can leverage.
If you have Visual Studio, you can create a cloud project from the project templates and package the project with one click. In VS, create a new project -> Cloud -> ... (screenshot omitted). Then package the project (screenshot omitted).
Without VS, you can use the Microsoft Azure PowerShell cmdlets or the cspack command-line tool to create a deployment package. A similar question can be found at: Django project to Azure Cloud Services without Visual Studio
After packaging the project, you will have a .cspkg file (screenshot omitted):
For your better reference, I have uploaded the testing project at:
https://onedrive.live.com/redir?resid=7B27A151CFCEAF4F%21143283
As for 'configuration', it means the base-64 encoded service configuration file (.cscfg) for the deployment.
In Python, we can set up 'configuration' via the code below:
configuration = base64.b64encode(open('E:\\TestProjects\\Python\\YourProjectFolder\\ServiceConfiguration.Cloud.cscfg', 'rb').read())
Hope the info above provides a quick clarification. Now, let's go back to the Python SDK itself and see how we can use the create_deployment function to create a cloud service deployment.
First, I'd suggest referring to https://azure.microsoft.com/en-us/documentation/articles/cloud-services-python-how-to-use-service-management/ to get a basic idea of what Azure Service Management is and how it works.
In general, we can make the create_deployment function work in five steps:
1. Create your project's deployment package and set up a configuration file (.cscfg). For a quick test, you can use the one that I have uploaded.
2. Store your project's deployment package in a Microsoft Azure Blob Storage account under the same subscription as the hosted service to which the package is being uploaded, and get the blob file's URL (or use a Shared Access Signature (SAS) URI from any storage account). You can use Azure Storage Explorer to upload the package file, and then it will be shown on the Azure portal; for a programmatic alternative, see the upload sketch at the end of this answer.
3. Use OpenSSL to create your management certificate. You need to create two certificates, one for the server (a .cer file) and one for the client (a .pem file); the article mentioned just now provides detailed info: https://azure.microsoft.com/en-us/documentation/articles/cloud-services-python-how-to-use-service-management/ (screenshot of my created certificates omitted). Then upload the .cer certificate to the Azure portal: SETTINGS -> management certificates tab -> click the upload button at the bottom of the page.
4. Create a cloud service in Azure and keep the name in mind.
5. Create another project to test the Azure SDK's create_deployment; code snippet for your reference:
import base64

from azure.servicemanagement import ServiceManagementService

subscription_id = 'Your subscription ID, found in the Azure portal'
certificate_path = 'E:\\YourFolder\\mycert.pem'

sms = ServiceManagementService(subscription_id, certificate_path)

def TestForCreateADeployment():
    service_name = "Your Cloud Service Name"
    deployment_name = "name"
    slot = 'Production'
    package_url = ".cspkg file's URL - from your blob"
    configuration = base64.b64encode(open(
        'E:\\TestProjects\\Python\\YourProjectFolder\\ServiceConfiguration.Cloud.cscfg',
        'rb').read())
    label = service_name

    result = sms.create_deployment(service_name,
                                   slot,
                                   deployment_name,
                                   package_url,
                                   label,
                                   configuration)

    operation = sms.get_operation_status(result.request_id)
    print('Operation status: ' + operation.status)
Here is the running result's screenshot (omitted).
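As a programmatic alternative to step 2 (uploading the .cspkg and getting its URL), here is a minimal sketch against the blob API that shipped with the azure package of that era. The account name, key, container, and file names are placeholders, and depending on your SDK version the import may be from azure.storage instead of azure.storage.blob:

from azure.storage.blob import BlobService

blob_service = BlobService(account_name='mystorageaccount',
                           account_key='<storage-account-key>')

# Create (or reuse) a container for deployment packages.
blob_service.create_container('deployments')

# Upload the package, then build the URL to pass as package_url.
blob_service.put_block_blob_from_path('deployments',
                                      'YourProject.cspkg',
                                      '/path/to/YourProject.cspkg')

package_url = ('https://mystorageaccount.blob.core.windows.net'
               '/deployments/YourProject.cspkg')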

Related

Move Python code from an Azure DevOps repo to a Windows Azure VM

We have a Python application on a Windows Azure VM that reads data from an API and loads the data into an on-prem DB. The application works just fine and the code is source-controlled in an Azure DevOps repo. The current deployment process is for someone to pull the main branch and copy the application from their local machine to c:\someapplication\ on the STAGE/PROD server. We would like to automate this process. There are a bunch of tutorials on how to do a build for API and web applications, which require your Azure subscription and app name (which we don't have). My two questions are:
1. Is there a way to do a simple copy to the c:\someapplication folder from Azure DevOps for this application?
2. If there is, would the copy solution be the best solution for this, or should I consider something else?
Should I simply clone the main repo to each folder location above and then automate the git pull via the Azure pipeline? Any advice or links would be greatly appreciated.
According to your description, you could try the CopyFiles@2 task: set the local folder on the VM up as a shared folder and use it as the TargetFolder, i.e. the target folder or UNC path that will contain the copied files.
YAML like:
- task: CopyFiles@2
  inputs:
    SourceFolder: $(Build.SourcesDirectory) # string. Source Folder.
    Contents: '**' # string. Required. Contents. Default: **.
    TargetFolder: 'c:\someapplication' # string. Required. Target Folder.
Please check if it meets your requirements.

Google Application Credentials Python (Jupyter Notebook)

I'm trying to start using the Google Analytics Data API v1 as instructed, with Python and Jupyter Notebook. I follow the instructions at https://developers.google.com/analytics/devguides/reporting/data/v1/quickstart-client-libraries and get to Step 3, Configure authentication.
The instructions then say that you need to set GOOGLE_APPLICATION_CREDENTIALS="[PATH]". I downloaded this file to my computer and added it to the project folder, but I can't get verified using the service account.
On GitHub (https://github.com/googleapis/python-analytics-data#installation) they write that you need to use a virtual environment. Is that so? Will it work without one?
I am using a service account, not OAuth 2.0.
To be clear, GOOGLE_APPLICATION_CREDENTIALS is an environment variable. This variable is used by many of the Google client libraries to load the credentials for any of the APIs. The question I marked this as a duplicate of shows a number of ways to set it.
As you seem to still be a little unsure, here is some additional information.
As stated in the docs:
An easy way to provide service account credentials is by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable; the API client will use the value of this variable to find the service account key JSON file.
You need to set an environment variable on your machine to the path of the service account key file. There are a number of examples of how to do that:
Authenticating as a service account
Set GOOGLE_APPLICATION_CREDENTIALS in Python project to use Google API
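In a Jupyter notebook, the simplest approach is to set the variable in-process before constructing the client. A minimal sketch (the key-file path is a placeholder you must adjust):

import os

# Point the Google client libraries at your downloaded key file.
# The path below is a placeholder.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"

from google.analytics.data_v1beta import BetaAnalyticsDataClient

# The client picks up the environment variable automatically.
client = BetaAnalyticsDataClient()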
I did it. I'll tell you how it turned out, for beginners:
https://developers.google.com/analytics/devguides/reporting/data/v1/quickstart-client-libraries
Click the blue button "Enable the Google Analytics Data API v1": a project is created, a service account is created in Google Cloud, and the Google Analytics API is activated. Download the JSON file.
After that, you need to add the service account to the GA4 property.
Then you need to authenticate according to the method from https://cloud.google.com/docs/authentication/production#setting_the_environment_variable.
This requires passing credentials manually: using the code in the article, you point it at the path of the JSON file on your computer.
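For reference, passing credentials manually (instead of relying on the environment variable) looks roughly like the sketch below; the file path is a placeholder:

from google.oauth2 import service_account
from google.analytics.data_v1beta import BetaAnalyticsDataClient

# Load the key file explicitly; the path is a placeholder.
credentials = service_account.Credentials.from_service_account_file(
    "/path/to/service-account-key.json")

# Hand the credentials to the client instead of using the env var.
client = BetaAnalyticsDataClient(credentials=credentials)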
When trying to authenticate the service account, a 403 from GET storage.googleapis.com/storage/v1/… may be issued: starting-account-smhpwtovr5jj@test-data-api-1654114095791.iam.gserviceaccount.com does not have storage.buckets.list access to the Google Cloud project.
To fix this, you need to go to IAM and grant the service account a role such as Owner or Storage Admin. After that, GOOGLE_APPLICATION_CREDENTIALS is found and you can start installing the Data API library.

Zip deploy azure function from storage account

I'm trying to zip deploy an Azure Function from blob storage.
I have set SCM_DO_BUILD_DURING_DEPLOYMENT to true.
I have also set WEBSITE_RUN_FROM_PACKAGE to the remote URL.
I am able to deploy easily if the function package is at a remote URL; however, I can't seem to do it if I have it as a blob on Azure.
The preferred runtime for this is Python.
To zip deploy from a storage account, navigate to your .zip blob in the storage account and get the generated SAS token/URL for that blob.
Then set that URL in your Function App's application settings as WEBSITE_RUN_FROM_PACKAGE.
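If you'd rather generate the SAS URL programmatically than copy it from the portal, here is a minimal sketch using the azure-storage-blob package; the account, container, blob name, and key are placeholders:

from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# All of the names and the key below are placeholders.
account_name = "mystorageacct"
container_name = "deployments"
blob_name = "functionapp.zip"
account_key = "<storage-account-key>"

# Create a read-only SAS token for the package blob.
sas = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(days=365),
)

# Set this full URL as WEBSITE_RUN_FROM_PACKAGE in the app settings.
package_url = (f"https://{account_name}.blob.core.windows.net/"
               f"{container_name}/{blob_name}?{sas}")
print(package_url)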
NOTE: This option is the only one supported for running from a package on Linux hosted in a Consumption plan.
For more information you can refer to Run your functions from a package file in Azure.

How to update AzureWebJob after it is created

I created an Azure WebJob manually via the Azure portal, but now I need to update the source code. I have not found a way to re-upload the source code content (a Python script). Is there any CLI I can use to update the job, or do I have to delete the job and re-create it every time I need an update?
There is a guide to deployment via VS, but only for .NET; is there something similar for Python?
https://learn.microsoft.com/en-us/azure/app-service/webjobs-dotnet-deploy-vs
In the end it is very easy!
It is possible to connect to the App Service via an FTPS client.
The credentials can be found in the Azure portal under
Home > <App Service Name> > Deployment Center > FTPS credentials
The source code of the jobs is in the folder
/site/wwwroot/App_Data/jobs
and that's it!
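If you want to script the update rather than use a GUI client, here is a minimal sketch using Python's built-in ftplib; the host, credentials, job folder, and script name are placeholders you would take from the FTPS credentials blade:

from ftplib import FTP_TLS

# Host and credentials come from Deployment Center > FTPS credentials.
ftps = FTP_TLS("waws-prod-xx-000.ftp.azurewebsites.windows.net")
ftps.login(user="myapp\\$myapp", passwd="<ftps-password>")
ftps.prot_p()  # encrypt the data channel as well

# Triggered jobs live under App_Data/jobs/triggered/<job-name>,
# continuous ones under App_Data/jobs/continuous/<job-name>.
ftps.cwd("/site/wwwroot/App_Data/jobs/triggered/myjob")

# Overwrite the job's script with the local copy.
with open("run.py", "rb") as f:
    ftps.storbinary("STOR run.py", f)

ftps.quit()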

Migrating from Amazon S3 to Azure Storage (Django web app)

I maintain this Django web app where users congregate and chat with one another. They can post pictures too if they want. I process these photos (i.e. optimize their size) and store them on an Amazon S3 bucket (like a 'container' in Azure Storage). To do that, I set up the bucket on Amazon, and included the following configuration code in my settings.py:
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_S3_FORCE_HTTP_URL = True
AWS_QUERYSTRING_AUTH = False
AWS_SECRET_ACCESS_KEY = os.environ.get('awssecretkey')
AWS_ACCESS_KEY_ID = os.environ.get('awsaccesskeyid')
AWS_S3_CALLING_FORMAT='boto.s3.connection.OrdinaryCallingFormat'
AWS_STORAGE_BUCKET_NAME = 'dakadak.in'
Additionally Boto 2.38.0 and django-storages 1.1.8 are installed in my virtual environment. Boto is a Python package that provides interfaces to Amazon Web Services, whereas django-storages is a collection of custom storage backends for Django.
I now want to stop using Amazon S3 and instead migrate to Azure Storage. I.e. I need to migrate all existing files from S3 to Azure Storage, and next I need to configure my Django app to save all new static assets on Azure Storage.
I can't find precise documentation on what I'm trying to achieve. Though I do know django-storages support Azure. Has anyone done this kind of a migration before, and can point out where I need to begin, what steps to follow to get everything up and running?
Note: Ask me for more information if you need it.
In my experience, you can migrate a Django web app from Amazon S3 to Azure Storage in two steps.
The first step is moving all files from S3 to Azure Blob Storage. There are two ways you can try.
Using tools for S3 and Azure Storage: move the files from S3 to a local directory, then from the local directory to Azure Blob Storage.
These S3 tools can help move files to a local directory:
AWS Command Line Interface (https://aws.amazon.com/cli/): aws s3 cp s3://BUCKET/FOLDER localfolder --recursive
S3cmd Tools (http://s3tools.org/): s3cmd -r sync s3://BUCKET/FOLDER localfolder
S3 Browser (http://s3browser.com/): a GUI client.
For moving local files to Azure Blob Storage, you can use the AzCopy command-line utility, which offers high-performance uploading, downloading, and copying of data to and from Microsoft Azure Blob, File, and Table storage; see the document https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/.
Example: AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer
Migrating programmatically with the Amazon S3 and Azure Blob Storage APIs in a language you know, like Python or Java. Refer to their API usage docs at https://azure.microsoft.com/en-us/documentation/articles/storage-python-how-to-use-blob-storage/ and http://docs.aws.amazon.com/AmazonS3/latest/API/APIRest.html; a sketch follows below.
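As a rough illustration of the programmatic route, here is a minimal sketch assuming the boto 2.x API already in your virtualenv and the blob API shipped with the azure SDK of that era (the import path varies by SDK version); the bucket, container, account names, keys, and staging directory are placeholders:

import os

import boto
from azure.storage.blob import BlobService  # in older SDKs: from azure.storage import BlobService

# Connect to S3 with the same credentials Django already uses.
s3 = boto.connect_s3(os.environ.get('awsaccesskeyid'),
                     os.environ.get('awssecretkey'))
bucket = s3.get_bucket('dakadak.in')

blob_service = BlobService(account_name='myaccount', account_key='account_key')
blob_service.create_container('mycontainer')

for key in bucket.list():
    # Download each S3 object to a local staging directory.
    local_path = os.path.join('/tmp/s3-export', key.name)
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    key.get_contents_to_filename(local_path)

    # Re-upload under the same name so existing references keep working.
    blob_service.put_block_blob_from_path('mycontainer', key.name, local_path)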
The second step is following the document http://django-storages.readthedocs.org/en/latest/backends/azure.html to re-configure your Django settings.py file. django-storages will then store any uploads you make in your Azure storage container automatically.
DEFAULT_FILE_STORAGE='storages.backends.azure_storage.AzureStorage'
AZURE_ACCOUNT_NAME='myaccount'
AZURE_ACCOUNT_KEY='account_key'
AZURE_CONTAINER='mycontainer'
You can find these settings on the Azure portal, as #neolursa said.
Edit: (the original answer attached screenshots of where to find these on the old and new Azure portals)
The link you've shared has the configuration for Django to Azure Blobs. When I did this, all I had to do was go to the Azure portal, get the access keys under the storage account (screenshots omitted), then create a container and give the container name to the Django configuration. That should be enough; however, I did this a while ago.
For the second part, migrating the current files from the S3 bucket to blob storage, there are a couple of tools you can use:
If you are using Visual Studio, you can find the blobs under the Server Explorer after entering your Azure account credentials in Visual Studio.
Alternatively, you can use third-party tools like Azure Storage Explorer or CloudBerry Explorer.
Hope this helps!
