Situation
I have an existing Python app in Google Colab that calls the Twitter API and sends the response to Cloud Storage.
I'm trying to automate the Twitter API call in GCP, and am wondering how to install the requests library for the API call and os for authentication.
I tried doing the following library installs in a Cloud Function:
import requests
import os
Result
That produced the following error message:
Deployment failure: Function failed on loading user code.
Do I need to install those libraries in a Cloud Function? I'm trying to understand this in the context of my Colab Python app, but am not clear whether the library installs are necessary.
Thank you for any input.
When you create your Cloud Function source code, there are two files:
main.py
requirements.txt
Add the packages you need to requirements.txt, as below:
# Function dependencies, for example:
requests==2.20.0
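Note that os is part of the Python standard library, so it does not need an entry in requirements.txt; only third-party packages such as requests do. For illustration, a minimal sketch of what main.py could look like for your case (the environment variable name, query, and endpoint wiring are assumptions, not taken from your Colab app):

import os        # standard library, nothing to add to requirements.txt
import requests  # installed automatically because it is listed in requirements.txt

def call_twitter(request):
    # Read the bearer token from an environment variable (hypothetical name).
    token = os.environ.get("TWITTER_BEARER_TOKEN")
    # Call the Twitter API and return the raw response body.
    resp = requests.get(
        "https://api.twitter.com/2/tweets/search/recent",
        params={"query": "python"},
        headers={"Authorization": "Bearer " + token},
    )
    return resp.text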
Creating a new Python environment for your project might help, and it is a good starting point for any project. It is easy to create:
## for unix-based systems
## create a python environment
python3 -m venv venv
## activate your environment
## in linux-based systems
. ./venv/bin/activate
If you are using Google Colab, add "!" before these commands and they should work fine.
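Once the environment is active, install the third-party libraries your script needs (os ships with Python, so only requests has to be installed), for example:

## install external dependencies inside the environment
pip install requests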
Does anybody know how to integrate MNE-Python with Google Cloud?
We are doing basic MEG analysis on our data in Python, and we want to move the data onto Google Cloud and use the open-source library MNE-Python.
I have tried using sudo apt install mne-python in the shell.
To use MNE-Python on Google Cloud, you need to set up a Python environment with the required dependencies installed. The steps below outline the process; a minimal script sketch follows them.
Create a Google Cloud account and set up a project.
Create a virtual machine (VM) instance in Google Cloud and install Python along with any required dependencies on it.
Install MNE-Python with conda or pip.
Upload your MEG data to the VM instance, or store it in Google Cloud Storage.
Write a Python script that uses MNE-Python and your data to carry out the MEG analysis.
Execute the script on the VM instance.
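For illustration, a minimal sketch of such a script, assuming MNE-Python was installed with pip install mne and a raw .fif recording (file name is a placeholder) was uploaded to the VM:

import mne

# Load a raw MEG recording (placeholder file name).
raw = mne.io.read_raw_fif("sample_meg_raw.fif", preload=True)

# Band-pass filter the data between 1 and 40 Hz as a simple example step.
raw.filter(l_freq=1.0, h_freq=40.0)

print(raw.info)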
I am building a CI/CD Azure Pipeline to build and publish an Azure Function from a DevOps repo to Azure. The function in question uses a custom SDK stored as a Python package artifact in an organisation-scoped feed.
If I use a PipAuthenticate task to be able to access the SDK, the task passes, but the pipeline then crashes when installing requirements.txt. Strangely, before we even get to the SDK, there is an error installing the azure-functions package. However, if I remove the SDK requirement and the PipAuthenticate task, this error does not occur. So something about the authenticate task means the agent cannot access azure-functions.
Additionally, if I swap the order of 'azure-functions' and 'CustomSDK' in the requirements.txt, the agent is still unable to install the SDK artifact, so something must be wrong with the authentication task:
steps:
- task: PipAuthenticate@1
  displayName: 'Pip Authenticate'
  inputs:
    artifactFeeds: <organisation-scoped-feed>
    pythonDownloadServiceConnections: <service-connection-to-SDK-URL>
Why can I not download these packages?
This was due to confusion around the extra index URL. In order to access both PyPI and the artifact feed, the following settings need to be set:
- task: PipAuthenticate@1
  displayName: 'Pip Authenticate'
  inputs:
    pythonDownloadServiceConnections: <service-connection-to-SDK-Feed>
    onlyAddExtraIndex: true
This way pip will consult PyPI first, and then the artifact feed.
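With onlyAddExtraIndex set to true, the task exposes the feed as an extra index for pip rather than replacing the primary index, so packages that only exist on PyPI (such as azure-functions) keep resolving. As a sketch, the install step that would follow the task above might look like this (the step contents are assumptions about a typical pipeline, not your exact setup):

- script: |
    pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'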
Try running the function while the __init__.py file is active on the screen.
If you're just trying out the Quickstart, you shouldn't need to change anything in the function.json file. When you start debugging, make sure you're looking at the __init__.py file.
When you run the trigger, make sure you're on the __init__.py file. Otherwise, VS Code will try to run the file in the currently active window.
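For reference, a minimal sketch of a Quickstart-style __init__.py for an HTTP trigger (the greeting logic is illustrative, not the exact Quickstart code):

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Return a simple greeting so the trigger is easy to verify in the browser.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!")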
So I was looking into scheduling a python script on a daily basis and, rather than using Task Scheduler on my own machine, I was wondering if it is possible to do so using an Azure cloud account.
For your needs, I suggest you use WebJobs in the Azure Web Apps service.
There are two types of Azure WebJobs to choose from: Continuous and Triggered. For your needs, a Triggered WebJob is the right choice.
You could refer to the document here for more details; in addition, here is an example of how to run tasks in WebJobs.
You could follow the steps below to create your WebJob.
Step 1: Use the virtualenv package to create an independent Python runtime environment on your system. Please install it first with the command pip install virtualenv if you don't have it.
If it installed successfully, you will see it in your Python/Scripts folder.
Step 2: Run the command to create the independent Python runtime environment.
Step 3: Then go into the created directory's Scripts folder and activate the environment (this step is important, don't miss it).
Don't close this command window; use pip install <your library name> in this window to download the external libraries you need.
Step 4: Compress your Webjob.py (your own business code) together with the packages you rely on from the environment's Lib/site-packages folder into a single zip file (a minimal Webjob.py sketch follows these steps).
Step 5: Create the WebJob in the Web App service and upload the zip file; then you can execute your WebJob and check the log.
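For illustration, a minimal Webjob.py sketch, assuming the packages copied from Lib/site-packages sit in a site-packages folder next to the script inside the zip (the folder name and the HTTP call are assumptions):

import os
import sys

# Make the bundled third-party packages importable from inside the zip.
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), "site-packages"))

import requests  # example external library copied from Lib/site-packages

# The daily task itself: here just a placeholder HTTP call.
print(requests.get("https://api.github.com/").status_code)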
You could also refer to these SO threads:
1.Options for running Python scripts in Azure
2.Python libraries on Web Job
BTW, you need to create an Azure Web App first, because WebJobs run inside an Azure Web App.
Hope it helps you.
I'm trying to deploy a Flask web app with mysql connectivity. It's my first time using Azure, and coming off Linux it all seems pretty confusing.
My understanding is that one lists the required packages in requirements.txt. When I build the default Flask app from Azure, the file looks like this:
Flask<1
At this stage the site loads fine.
If I then include an additional line
https://cdn.mysql.com/Downloads/Connector-Python/mysql-connector-python-2.1.14.tar.gz
As per this answer https://stackoverflow.com/a/34489738/2697874
Then in my views.py file (which seems to be broadly synonymous with my old app.py file) I include import mysql.connector.
I then restart and reload my site...which then returns the error The page cannot be displayed because an internal server error has occurred.
Error logging spits out a load of HTML (which seems a pretty weird way to deliver error logs, so I must be missing something here). When I save it as HTML and load it up I get this...
How can I include the mysql.connector library within my Flask web app?
Per my experience, the resource https://cdn.mysql.com/Downloads/Connector-Python/mysql-connector-python-2.1.14.tar.gz is for Linux, not for Azure Web Apps based on Windows, and the link no longer seems to be available.
I used the command pip search mysql-connector to list the related packages. Then I tried to use mysql-connector instead of mysql-connector-python via pip install, and importing mysql.connector in a local Python interpreter works fine.
So please use mysql-connector==2.1.4 instead of mysql-connector-python== in the requirements.txt file of your project (using your IDE), then re-deploy the project to Azure and try again. The package will be installed automatically, as the official doc says below.
Package Management
Packages listed in requirements.txt will be installed automatically in the virtual environment using pip. This happens on every deployment, but pip will skip installation if a package is already installed.
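For illustration, a minimal connection sketch once mysql-connector is installed (host, credentials and database name are placeholders, not values from your app):

import mysql.connector

# Connect using placeholder credentials; replace them with your own MySQL details.
conn = mysql.connector.connect(
    host="<your-mysql-host>",
    user="<user>",
    password="<password>",
    database="<database>",
)
cur = conn.cursor()
cur.execute("SELECT VERSION()")
print(cur.fetchone())
conn.close()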
If there is any update, please feel free to let me know.
I'm following this guide and trying to develop a Flask app to run on Google App Engine. I followed the guide to the letter, but when I launch the dev app server from the Launcher and go to http://localhost:8080/, I get an HTTP 500 error.
I check the logs and it says No module named flask. Then I check the interactive console in the admin console by running import flask and I get the same error message. I can import flask in any other Python file without error.
Is there a way to fix this?
Working a bit with GAE and Flask I have realized this:
Running directly with Python
To run the app with Python directly (python app.py) you need to have the dependent packages installed in your environment, using the command pip install flask.
Running with dev_appserver.py
To run the app with the dev_appserver.py provided by the GAE SDK you need to have all dependent packages inside your project, e.g. Flask, Jinja2... Look at my other answer for an example of how to configure these packages: https://stackoverflow.com/a/14248647/1050818
UPDATED
Running Python, Virtualenv, Flask and GAE on Windows
Install Python
Install Python http://www.python.org/ftp/python/2.7.2/python-2.7.2.msi
Click the Windows Start button, search for "Edit the system environment", and open it
Go to the Advanced tab and click the "Environment Variables…" button
When the Environment Variables window opens, choose Path from the System variables list and click Edit…
Add this ;C:\Python27;C:\Python27\Scripts at the end of the value and save
Install setuptools via the MS Windows installer (necessary to install pip on Windows)
Choose the correct installer for you on this page http://pypi.python.org/pypi/setuptools#files (I used this one: http://pypi.python.org/packages/2.7/s/setuptools/setuptools-0.6c11.win32-py2.7.exe#md5=57e1e64f6b7c7f1d2eddfc9746bbaf20)
Download the installer and install it
Install PIP
Download PIP http://pypi.python.org/pypi/pip#downloads
Extract it to any folder
From that directory, type python setup.py install
Install Virtualenv
Execute pip install virtualenv
Execute mkdir c:\virtualenvs to create a folder for the virtual envs
Execute cd c:\virtualenvs to enter that folder
Execute virtualenv flaskdemo to create a virtualenv for your project
Activate the virtualenv: c:\virtualenvs\flaskdemo\scripts\activate
Install Google App Engine SDK
Install the SDK https://developers.google.com/appengine/downloads
Create the project
Create a directory for your project
Create the main file of your application: https://github.com/maxcnunes/flaskgaedemo/blob/master/main.py
Create the configuration of your application for Google App Engine: https://github.com/maxcnunes/flaskgaedemo/blob/master/app.yaml
Create a file to let GAE initialize your application: https://github.com/maxcnunes/flaskgaedemo/blob/master/initialize_gae.py
(Look at an example of the code here: https://github.com/maxcnunes/flaskgaedemo)
Install Flask to run Locally
Execute pip install flask
Install Flask to run on the GAE
Download Flask https://github.com/mitsuhiko/flask/archive/0.9.zip and extract the folder flask inside your project
Download Werkzeug https://github.com/mitsuhiko/werkzeug/archive/0.8.3.zip and extract the folder werkzeug inside your project
Download Jinja2 https://github.com/mitsuhiko/jinja2/archive/2.6.zip and extract the folder jinja2 inside your project
Download Simple Json https://github.com/simplejson/simplejson/archive/v3.0.5.zip and extract the folder simplejson inside your project
Running the application with GAE SDK
Open Google App Engine Launcher
Add a new application
Run the application
Click the Browse button to open your application in the browser
Finally, click the Deploy button to deploy your application
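For reference, a minimal sketch of the kind of Flask main.py such a project uses (illustrative only, not the exact code from the linked repository), assuming Flask and its dependencies were extracted into the project as described above:

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # A trivial handler, just enough to confirm the app serves on the dev server.
    return "Hello from Flask on App Engine!"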
Usually, templates come with a requirements.txt; if not, create one. Add your dependencies there and then run pip install -t lib -r requirements.txt to force the libraries to be saved in the lib folder.
Make sure you've added lib to appengine_config.py with vendor.add('lib') if it's not already there.
I was also facing the same issue, and after spending a day on it I found my silly mistake: while refactoring my Flask app I had renamed appengine_config.py to something else.
Ideally, appengine_config.py should look like the below if all your dependencies are in the lib folder:
from google.appengine.ext import vendor
#Add any libraries installed in the "lib" folder.
vendor.add('lib')
Because GAE was not able to find and execute appengine_config.py, the lib folder was not registered as a dependency folder. To verify this, you can try printing something in appengine_config.py to check whether it is being executed on server startup.
tldr: use appengine_config.py and copy your virtualenv to a folder called lib, then make SURE you are running the app via dev_appserver.py
(the below is via bash on Ubuntu)
So after a long battle, I find that virtualenv and gcloud don't play nice:
I copied everything from my virtual env dir
.../.virtualenvs/nimble/local/lib/python2.7/site-packages
into
[projectdir]/lib
and my appengine_config.py finally worked locally like it does in the cloud, but I absolutely HAVE to run
dev_appserver.py [my proj dir here]
or the google.appengine module won't load. I did not know I should be using the dev server. I feel very dumb.
For reference, here's the appengine_config.py:
"""`appengine_config` gets loaded when starting a new application instance."""
print 'running app config yaya!'
from google.appengine.ext import vendor
vendor.add('lib')
print 'I am the line after adding lib, it should have worked'
import os
print os.getcwd()
Do you have the Extra Libraries component for Python installed? It can be installed with:
gcloud components install app-engine-python-extras
After installing this extra component, you should be able to use the built-in Flask library without a problem. For more information, refer to this page.
Source