Accessing BigQuery on Google App Engine dev server - python

I am successfully able to get BQ data from one project to another using the advice in this answer. However, this only works when deployed on my development/staging instance, not on my local App Engine development server.
My findings are that it works in production because you include:
libraries:
- name: pycrypto
version: "latest"
in app.yaml. However, these libraries are not available on the dev server. After digging through some docs and tracing the error, I tried installing everything locally (PyCrypto, oauth2client, OpenSSL), both through pip and by building/installing manually from the raw files, but still cannot get it to work. Any advice on getting these queries to work on the local Django server? I'm working on Ubuntu, if that matters; perhaps it's looking in the wrong place for the libraries?

If it's just the libraries that are missing, follow this answer https://stackoverflow.com/a/11405769/3877822 to install PyCrypto into the root of your project.
As @Udi suggests in the comment below, the following command also
installs pycrypto and can be used in a virtualenv as well:
easy_install
http://www.voidspace.org.uk/downloads/pycrypto26/pycrypto-2.6.win32-py2.7.exe
Be sure to choose the link relevant to your setup from this list.
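With PyCrypto installed into a lib/ folder at the project root, the dev server still needs that folder on sys.path. A minimal sketch of the usual vendoring hook (the appengine_config.py filename and the lib/ location are assumptions about your layout):

```python
# appengine_config.py (hypothetical) -- make a project-local lib/ folder
# importable, so packages installed with `pip install -t lib <pkg>` are
# found by the App Engine dev server.
import os
import sys

# __file__ is available when this module is loaded by the dev server;
# fall back to the working directory otherwise.
project_root = (os.path.dirname(os.path.abspath(__file__))
                if "__file__" in globals() else os.getcwd())
lib_dir = os.path.join(project_root, "lib")
if lib_dir not in sys.path:
    sys.path.insert(0, lib_dir)
```

After this, `import Crypto` on the dev server resolves to the locally installed copy instead of failing.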

Related

Python app hosted on App Engine is not able to connect with Firestore, whereas it works locally

I have created a new project on GCP. I am trying to select/add data from Firestore (already in Native mode) from a Python Flask app. From my local environment I am able to connect to Firestore, but once I host the app on App Engine, my API cannot connect to Firestore. It throws the error below:
TypeError: with_scopes_if_required() got an unexpected keyword argument 'default_scopes'
_create_composite_credentials (/layers/google.python.pip/pip/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:238)
I am not sure what I am missing here. I appreciate any support.
Thanks
I had the same issue, and I just fixed it today.
As you can see in the changelog https://googleapis.dev/python/google-api-core/latest/changelog.html, they recently updated the requirement on the google-auth library.
You need to check your requirements.txt.
In my case, we had two libs pinned that we weren't using directly.
google-auth==1.21.1
google-auth-oauthlib==0.4.1
I just removed them from requirements.txt so that dependency versions are resolved automatically, and that solved the issue.
So I guess it depends on your case, on how you are managing dependencies, but you might want to play with google-auth versions.
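To spot such stale pins before deploying, a small stdlib-only helper can scan requirements.txt for hard-pinned google-auth packages (the helper and the sample requirements are hypothetical, not part of any Google library):

```python
def find_pinned_auth_libs(requirements_text):
    """Return requirement lines that hard-pin google-auth packages."""
    flagged = []
    for line in requirements_text.splitlines():
        line = line.strip()
        # Flag exact pins on the auth libraries mentioned in the answer.
        if line.startswith(("google-auth==", "google-auth-oauthlib==")):
            flagged.append(line)
    return flagged

reqs = """\
Flask==1.1.2
google-auth==1.21.1
google-auth-oauthlib==0.4.1
google-cloud-firestore
"""
print(find_pinned_auth_libs(reqs))
# -> ['google-auth==1.21.1', 'google-auth-oauthlib==0.4.1']
```

Any lines it flags are candidates for removal so that pip can resolve a version compatible with google-api-core.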

Deploying python application

I have created a Python application containing REST APIs that call machine-learning code, built in the PyCharm IDE. I want to deploy the REST APIs on IIS.
I copied and pasted the complete PyCharm project into the virtual directory. The issue I am facing is that dependencies like TensorFlow and Keras are not being found, due to which the API gives "Internal server error"; however, I am able to call the REST services.
Please guide.
As FishingCode suggested, please share a code snippet so we can understand the exact issue.
Please include a requirements.txt file and install it in the virtual environment.
Also, you may try deploying with Docker using the Docker tutorial.
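One quick way to confirm exactly which dependencies are missing from the interpreter IIS is using is an import probe. A stdlib-only sketch (the module list is an example; you would pass the names your app needs, e.g. "tensorflow", "keras"):

```python
import importlib

def missing_modules(names):
    """Return the module names that fail to import in this environment."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

# Run from a one-off script (or a temporary view) on the server:
print(missing_modules(["json", "no_such_module_xyz"]))
# -> ['no_such_module_xyz']
```

Anything it reports as missing needs to be installed into the same environment (via `pip install -r requirements.txt` in that environment), since copying the project directory alone does not bring the site-packages along.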

Deploying a Django App to OpenShift 3.0 that needs NLTK

I am working on a Django project that needs Newspaper3k to retrieve some information over the internet. When deploying my Django web app to the free OpenShift Online 3.0 Starter tier, the build fails while installing Newspaper3k and hence its dependency NLTK.
Please advise the correct steps to achieve this "Django with NLTK deploying to OpenShift 3" installation. Thanks!
It turned out the build was blocked by the lxml package: the default version of pip fails to compile it. Thanks to the answer here
https://stackoverflow.com/a/46125643/8583561
I set the environment variable
UPGRADE_PIP_TO_LATEST=1
in the .s2i/environment file.
This upgrades pip to the latest version, and the build completes without issue.

Importing mysql.connector into Azure Flask Project

I'm trying to deploy a Flask web app with mysql connectivity. It's my first time using Azure, and coming off Linux it all seems pretty confusing.
My understanding is that you list the required packages in requirements.txt. When I build the default Flask app from Azure, the file looks like this:
Flask<1
At this stage the site loads fine.
If I then include an additional line
https://cdn.mysql.com/Downloads/Connector-Python/mysql-connector-python-2.1.14.tar.gz
As per this answer https://stackoverflow.com/a/34489738/2697874
Then in my views.py file (which seems broadly synonymous with my old app.py file) I include import mysql.connector.
I then restart and reload my site, which returns the error The page cannot be displayed because an internal server error has occurred.
Error logging spits out a load of HTML (which seems a pretty weird way to deliver error logs, so I must be missing something here). When I save it as HTML and load it up I get this...
How can I include the mysql.connector library within my Flask web app?
Per my experience, the resource https://cdn.mysql.com/Downloads/Connector-Python/mysql-connector-python-2.1.14.tar.gz is for Linux, not for Azure Web Apps, which are Windows-based, and the link seems to be unavailable now.
I used the command pip search mysql-connector to list the related packages. Then I tried mysql-connector instead of mysql-connector-python via pip install, and importing mysql.connector in a local Python interpreter works fine.
So please use mysql-connector==2.1.4 instead of mysql-connector-python in the requirements.txt file of your project, then re-deploy the project on Azure and try again. The package will be installed automatically, as the official doc says below.
Package Management
Packages listed in requirements.txt will be installed automatically in the virtual environment using pip. This happens on every deployment, but pip will skip installation if a package is already installed.
If there is any update, please feel free to let me know.
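For reference, the resulting requirements.txt would look something like this (the Flask pin is from the question; the connector version follows the answer above):

```text
Flask<1
mysql-connector==2.1.4
```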

Access Xero from Google App Engine

I am trying to access Xero (Accounting Software) from my Google App Engine system. (Python 2.7).
The Xero example uses M2Crypto, but this relies on C extension files which don't seem to work on App Engine. I also downloaded pycrypto from Google, but this has the same problem.
At this stage I would just like my App Engine program to have a button for the user to log in to Xero. Any pointers for me?
Note: some of these packages appear to need Visual Studio or a C compiler, which I don't have.
An update: it appears that only a 'private' application needs pycrypto or m2crypto, neither of which I can install. A public application does not need these; the example I am following covers both. I am continuing to work through this.
Thanks in advance
David (Windows Vista, Python 2.7, Current Google App Engine SDK)
I was able to access Xero using pyxero; however, there were a couple of gotchas, the most significant being the need to upgrade the version of urllib3 that comes packaged as part of the requests library.
I've created a simple project that demonstrates it in use.
https://github.com/hamish/gae_xero
The libraries that I needed to install were:
https://codeload.github.com/freakboy3742/pyxero/zip/master
http://labix.org/download/python-dateutil/python-dateutil-1.5.tar.gz
https://codeload.github.com/kennethreitz/requests/zip/master
https://codeload.github.com/requests/requests-oauthlib/zip/master
https://pypi.python.org/packages/source/o/oauthlib/oauthlib-0.6.1.tar.gz
https://pypi.python.org/packages/source/u/urllib3/urllib3-1.7.1.tar.gz (installed into the requests/packages/ directory)
Additionally the pycrypto library must be installed and enabled:
sudo pip install pycrypto
[excerpt from app.yaml]
- name: pycrypto
version: latest
Your best bet will be to access the Xero API endpoints via the App Engine URL Fetch service. You'll probably have to satisfy Xero API authentication along the way.
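To illustrate why a public Xero application can avoid PyCrypto: public apps sign OAuth 1.0a requests with HMAC-SHA1, which the standard library handles, while private apps sign with RSA-SHA1 (hence the PyCrypto/M2Crypto requirement). A simplified, stdlib-only sketch of the HMAC signing step (not a full OAuth implementation; parameter percent-encoding is abbreviated, and the key/URL values are placeholders):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote, urlencode

def oauth1_hmac_signature(method, url, params, consumer_secret, token_secret=""):
    """Sign an OAuth 1.0a request with HMAC-SHA1 (simplified sketch)."""
    # Signature base string: METHOD & encoded-URL & encoded-sorted-params.
    base_string = "&".join([
        method.upper(),
        quote(url, safe=""),
        quote(urlencode(sorted(params.items())), safe=""),
    ])
    # Signing key: consumer secret and (possibly empty) token secret.
    signing_key = "%s&%s" % (quote(consumer_secret, safe=""),
                             quote(token_secret, safe=""))
    digest = hmac.new(signing_key.encode(), base_string.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

sig = oauth1_hmac_signature(
    "GET",
    "https://api.xero.com/api.xro/2.0/Invoices",
    {"oauth_consumer_key": "key", "oauth_nonce": "abc"},
    "consumer-secret",
)
```

Since everything above is hashing rather than public-key cryptography, no C extensions are required, which is why the public-application flow works on App Engine.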
