I am not able to get, update, or create documents in a Google Firebase (Cloud Firestore) database using Python.
What I have:
A) The database with a collection and documents (inserted manually in the web console).
B) A credential JSON file saved as test.json (often called path/to/serviceKey.json in the documentation), which looks like this (redacted):
{
"type": "service_account",
"project_id": "test-6f02d",
"private_key_id": "fffca ... 5b7",
"private_key": "-----BEGIN PRIVATE KEY-----\n ... 1IHE=\n-----END PRIVATE KEY-----\n",
"client_email": "test-admin#test-6f02d.iam.gserviceaccount.com",
"client_id": "112 ... 060",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/ ... .gserviceaccount.com"
}
This service account has the Owner role.
C) firebase_admin installed (in a virtualenv, via pip). I can do:
import firebase_admin
from firebase_admin import credentials, firestore
databaseURL = {'databaseURL': "https://test-6f02d.firebaseio.com"}
cred = credentials.Certificate("test.json")
firebase_admin.initialize_app(cred, databaseURL)
<firebase_admin.App object at 0x7f20056534e0>
The following is working:
db = firestore.client()
for k in db.collection('items').get():
    print(k)
I get the 3 documents and can access the id of each document:
<google.cloud.firestore_v1beta1.document.DocumentSnapshot object at 0x7f2003bebc18>
<google.cloud.firestore_v1beta1.document.DocumentSnapshot object at 0x7f2003bebdd8>
<google.cloud.firestore_v1beta1.document.DocumentSnapshot object at 0x7f2003bebcf8>
print(k.id)
a3BxcpWpavHmuz6DpZH3
However, that is as far as I can get.
1) I do not know how to access the values of the documents. I tried something like this:
from firebase_admin import db
ref = db.reference('items')
print(ref)
<firebase_admin.db.Reference object at 0x7f20013b2828>
# GET?
ref.get()
# empty
2) I do not know how to access the values directly (e.g., using a browser or requests); something like:
https://test-6f02d.firebaseio.com/items.json
returns
{
"error" : "Permission denied"
}
3) I do not know how to update an existing document or create a new one in the collection items.
# UPDATE?
# PUSH?
I tried to follow this blog post and the documentation (which does not have examples) and several answers here on SO, but without any success.
Thanks in advance.
Another night and I can answer myself (thanks to Doug for the hint in the discussion):
The problem for me was that there are two similar sets of documentation (for the Python part; the second one covers much more than just Python). I found the first one more helpful, but sometimes I needed parts of the second one, too:
https://googleapis.github.io/google-cloud-python/latest/firestore/index.html (particularly the API Reference)
https://firebase.google.com/docs/firestore/
1) Accessing the documents:
import firebase_admin
from firebase_admin import credentials, firestore
databaseURL = {
'databaseURL': "https://test-6f02d.firebaseio.com"
}
cred = credentials.Certificate("test.json")
firebase_admin.initialize_app(cred, databaseURL)
database = firestore.client()
col_ref = database.collection('items') # col_ref is CollectionReference
results = col_ref.where('name', '==', 'Pepa').get() # one way to query
results = col_ref.order_by('date',direction='DESCENDING').limit(1).get() # another way - get the last document by date
for item in results:
    print(item.to_dict())
    print(item.id)
    # item is a DocumentSnapshot
# note: the documentation says get() is deprecated in favour of stream(), however stream() did not work for me (see the sketch below)
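For reference, this is how the stream() variant from the API reference is meant to be called (same col_ref as above); it yields the DocumentSnapshot objects lazily instead of returning a list:
for item in col_ref.order_by('date', direction='DESCENDING').limit(1).stream():
    print(item.id, item.to_dict())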
2) I still do not know, but I do not need it since 1) works OK (see the note below for a likely explanation).
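For what it's worth, the Permission denied is expected: https://test-6f02d.firebaseio.com/items.json is the Realtime Database REST endpoint, while this data lives in Cloud Firestore, which exposes its own REST API. A rough, untested sketch using the same test.json service account (the scope and URL format come from the Firestore REST documentation):
import requests
from google.oauth2 import service_account
from google.auth.transport.requests import Request

creds = service_account.Credentials.from_service_account_file(
    "test.json", scopes=["https://www.googleapis.com/auth/datastore"]
)
creds.refresh(Request())  # fetches an OAuth 2.0 access token
resp = requests.get(
    "https://firestore.googleapis.com/v1/projects/test-6f02d/databases/(default)/documents/items",
    headers={"Authorization": "Bearer {}".format(creds.token)},
)
print(resp.json())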
3) Update or create document:
# Continuing from 1)
# Update:
doc = col_ref.document(item.id) # doc is DocumentReference
field_updates = {"description": "Updated description"}
doc.update(field_updates)
# Create:
import datetime
new_values = {
"name": "Newbie",
"description": "Shiny New Document",
"date": datetime.datetime.now()
}
col_ref.document().create(new_values)
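A related option not shown above is set(), which writes a document at a known ID whether or not it already exists; with merge=True it only touches the listed fields. A quick sketch using the document id printed earlier:
col_ref.document("a3BxcpWpavHmuz6DpZH3").set({"description": "Upserted description"}, merge=True)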
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore
cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred)
db = firestore.client()
docs = db.collection("persons").where("Province","==",cprovince.get()).get()
for doc in docs:
print(doc.to_dict())
pip install --upgrade google-cloud-firestore
Then use
# Import
from google.cloud import firestore
# Create your Firebase client
firebase = firestore.Client(project="your project name")
# Define the collection you're working in
collection = firebase.collection("myCollection")
# Filter for docs, get their ref's, grab the first, and convert it back to a dict
collection.where(...).get()[0].to_dict()
# Add a doc by explicit ID (a DocumentReference uses set()/create(); add() belongs to collections)
doc_explicit_ref = collection.document("my unique doc ID")
doc_explicit_ref.set({"my data": "is here"})
# Add an implicit doc (auto-generated ID)
collection.add({"my cool": "data"})
This follows the same convention as pretty much any other Python library GCP has: ensure your GOOGLE_APPLICATION_CREDENTIALS environment variable points to your service account JSON file.
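For example, a minimal sketch of that setup from within Python (the key path and project ID are placeholders):
import os
# Point Application Default Credentials at the service account key before creating the client.
os.environ.setdefault("GOOGLE_APPLICATION_CREDENTIALS", "/path/to/serviceAccountKey.json")
from google.cloud import firestore
firebase = firestore.Client(project="your-project-id")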
Related
I followed this official document to run Airflow in Docker. In my local Airflow webserver (localhost:8080), I created a Google Cloud connection by pasting my Google credential.json into Keyfile JSON and adding 2 scopes (shown in the picture below).
(screenshot: the Airflow Google Cloud connection settings)
my credential looks like:
{
"type": "service_account",
"project_id": "xxxx-tech",
"private_key_id": "xxxxx",
"private_key": "-----BEGIN PRIVATE KEY-----\nXXXXX=\n-----END PRIVATE KEY-----\n",
"client_email": "xxx#tech.iam.gserviceaccount.com",
"client_id": "10xxxx17",
"auth_uri": "https://urldefense.proofpoint.com/v2/url?u=https-3xxxxx= ",
"token_uri": "https://urldefense.proofpoint.com/v2/url?u=https-3xxxxx ",
"auth_provider_x509_cert_url": "https://urldefense.proofpoint.com/v2/url?u=https-3xxxxxx= ",
"client_x509_cert_url": "https://urldefense.proofpoint.com/v2/url?u=https-3xxxxxx= "
}
I want to use airflow's hook to get this credential for reading data in Google Sheet.
But in my python code, I tried:
from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
from googleapiclient.discovery import build
gcp_hook = GoogleBaseHook(gcp_conn_id="spx-service-account")
creds = gcp_hook._get_credentials()
service = build('sheets', 'v4', credentials=creds, cache_discovery=False)
sheet = service.spreadsheets()
result = sheet.values().get(spreadsheetId="1yN3atY6NG7PfY8yNcNAiVqURra8WtQJWCKXc-ccymk0", range="Sheet1!A1:B4").execute()['values']
But executing this Python code in the terminal shows json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0).
I don't know why it shows a JSONDecodeError. I also tried the following code; printing cred_dict in the terminal shows exactly the same content as my credential.json:
import json

gcp_hook = GoogleBaseHook(gcp_conn_id="spx-service-account")
cred_dict = json.loads(gcp_hook._get_field('keyfile_dict'))
From the test above, I guess it may not be a JSON decode issue, but I still cannot use GoogleBaseHook in my Python code. How can I set up the Airflow connection in the right way?
Thank you in advance!
Oops, it turned out to be something wrong with my credential.json. I have no idea why the token_uri and the other URLs were rewritten when I downloaded the file from Gmail (the tech team sent it to us via email) and opened it on my Windows laptop. If your credentials are correct, the code below should work:
from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
from googleapiclient.discovery import build
gcp_hook = GoogleBaseHook(gcp_conn_id="spx-service-account")
creds = gcp_hook._get_credentials()
service = build('sheets', 'v4', credentials=creds, cache_discovery=False)
sheet = service.spreadsheets()
result = sheet.values().get(spreadsheetId="1yN3atY6NG7PfY8yNcNAiVqURra8WtQJWCKXc-ccymk0", range="Sheet1!A1:B4").execute()['values']
I'm new to Python and KivyMD, and also to working with databases. I want to check whether the data provided by the user through the KivyMD app is already in the Firebase Realtime Database. This is the data in Firebase.
The Code
def send_data(self, email):
    from firebase import firebase
    firebase = firebase.FirebaseApplication("https://infinity-mode-default-rtdb.firebaseio.com/", None)
    data = {
        'Email': email
    }
    if email.split() == []:
        cancel_btn_checkpoint_dialogue = MDFlatButton(text='Retry', on_release=self.close_checkpoint_dialogue)
        self.checkpoint_dialog = MDDialog(title='Access Denied', text="Invalid Username",
                                          buttons=[cancel_btn_checkpoint_dialogue])
        self.checkpoint_dialog.open()
    else:
        firebase.post('Users', data)
If the user enters a value that already exists in the database, it should not be saved, and a dialog box should be shown saying the email is already in use. If the value provided by the user is not in the database, it should be saved. Please help me do this.
Did you try firebase_admin?
I check my data like this:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore
cred = credentials.Certificate(
    "firestore-cred.json"
)
firebase_admin.initialize_app(cred)
db = firestore.client()
data = {
    "Email": email
}
query_email = db.collection(u'Users') \
    .where(u"Email", u"==", data["Email"]) \
    .get()
if query_email:
    # exists
    ...
else:
    # does not exist
    ...
If the user's email does not exist, query_email will be an empty list.
Do not forget that query_email is not JSON data. You can convert a result to a dict with to_dict():
email_query_result_as_json = query_email[0].to_dict()
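Note that this checks Cloud Firestore; if the data actually lives in the Realtime Database (as in the question), a roughly equivalent sketch with firebase_admin.db would be the following (the key file path is a placeholder, the database URL is taken from the question, and order_by_child queries also need an ".indexOn": "Email" rule on /Users):
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("serviceAccountKey.json")  # placeholder path
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://infinity-mode-default-rtdb.firebaseio.com/"
})

def email_exists(email):
    # Returns the children of /Users whose Email field equals the given value.
    matches = db.reference("Users").order_by_child("Email").equal_to(email).get()
    return bool(matches)

if not email_exists("someone@example.com"):
    db.reference("Users").push({"Email": "someone@example.com"})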
I'm guessing you use python-firebase, which so far has very little documentation, so I'll try to do my best based on the package source...
You have to use the firebase.get method to retrieve a dictionary with the current data stored for a given user, then check whether that data contains the information you are going to add (untested, since firebase doesn't even import on my machine):
record = firebase.get('/Users', 'id')
if not record.get('Email'):
    firebase.post('/Users/'+'id', data)
Try using the dataSnapshot.exists() method or the snapshot.hasChild() method. I'm not 100% sure whether they work in Python, but it is worth a go.
I am using a free plan for Firebase Storage. When I store a picture from Python, it works fine:
import os
import firebase_admin
from firebase_admin import credentials, storage
file_extension = os.path.splitext(request.files["input_image"].filename)[1] # ex: jpg
cred = credentials.Certificate(os.getenv("FIREBASE_CREDS"))
firebase_admin.initialize_app(cred)
bucket = storage.bucket(os.getenv("FIREBASE_BUCKET"))
# ex: profile_pictures/user1.jpg
blob = bucket.blob(f"profile_pictures/{current_user.username}{file_extension}")
blob.upload_from_string(request.files["input_image"].filename)
profile_picture_url = blob.public_url
However, when I go to the picture's public URL, I get an error page instead of the image.
The link is as follows:
https://storage.googleapis.com/bucket_name/folder_name/filename
Note that my rule policy is the following:
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write;
    }
  }
}
The problem was the following: when uploading a file from the Python SDK, the resulting object is not publicly readable by default, so other clients cannot access its URL.
The good news is that there is a very easy fix. Just add this line of code after uploading the file:
blob.make_public()
That's it.
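For clarity, in the context of the original snippet this looks like (a minimal sketch):
blob = bucket.blob(f"profile_pictures/{current_user.username}{file_extension}")
blob.upload_from_string(request.files["input_image"].filename)
blob.make_public()  # the object becomes readable by anyone with the URL
profile_picture_url = blob.public_url  # this URL now serves the content instead of an error page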
The code I've written seems to be what I need; however, it doesn't work and I get a 401 (authentication) error. I've tried everything: 1. service account permissions, 2. creating a secret ID and key (not sure how to use those to get an access token, though), 3. basically everything for the past 2 days.
import requests
from google.oauth2 import service_account
METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
METADATA_HEADERS = {'Metadata-Flavor': 'Google'}
SERVICE_ACCOUNT = [NAME-OF-SERVICE-ACCOUNT-USED-WITH-CLOUD-FUNCTION-WHICH-HAS-COMPUTE-ADMIN-PRIVILEGES]
def get_access_token():
    url = '{}instance/service-accounts/{}/token'.format(
        METADATA_URL, SERVICE_ACCOUNT)
    # Request an access token from the metadata server.
    r = requests.get(url, headers=METADATA_HEADERS)
    r.raise_for_status()
    # Extract the access token from the response.
    access_token = r.json()['access_token']
    return access_token

def start_vms(request):
    request_json = request.get_json(silent=True)
    request_args = request.args
    if request_json and 'number_of_instances_to_create' in request_json:
        number_of_instances_to_create = request_json['number_of_instances_to_create']
    elif request_args and 'number_of_instances_to_create' in request_args:
        number_of_instances_to_create = request_args['number_of_instances_to_create']
    else:
        number_of_instances_to_create = 0
    access_token = get_access_token()
    address = "https://www.googleapis.com/compute/v1/projects/[MY-PROJECT]/zones/europe-west2-b/instances?sourceInstanceTemplate=https://www.googleapis.com/compute/v1/projects/[MY-PROJECT]/global/instanceTemplates/[MY-INSTANCE-TEMPLATE]"
    headers = {'token': '{}'.format(access_token)}
    for i in range(1, number_of_instances_to_create):
        data = {'name': 'my-instance-{}'.format(i)}
        r = requests.post(address, data=data, headers=headers)
        r.raise_for_status()
        print("my-instance-{} created".format(i))
Any advice/guidance? If someone could tell me how to get an access token using the secret ID and key, that would help. Also, I'm not too sure OAuth 2.0 will work, because I essentially want to turn these machines on, have them do some processing, and then self-destruct, so there is no user involved to allow access. If OAuth 2.0 is the wrong way to go about it, what else can I use?
I tried using gcloud, but shelling out to gcloud commands via subprocess isn't recommended.
I did something similar to this, though I used the Node 10 Firebase Functions runtime; it should be very similar nevertheless.
I agree that OAuth is not the correct solution since there is no user involved.
What you need to use is 'Application Default Credentials' which is based on the permissions available to your cloud functions' default service account which will be the one labelled as "App Engine default service account" here:
https://console.cloud.google.com/iam-admin/serviceaccounts?folder=&organizationId=&project=[YOUR_PROJECT_ID]
(For my project that service account already had the permissions necessary for starting and stopping GCE instances, but for other APIs I have had to grant it permissions manually.)
ADC is for server-to-server API calls. To use it I called google.auth.getClient (from the Google APIs auth library) with just the scope, i.e. "https://www.googleapis.com/auth/cloud-platform".
This API is very versatile in that it returns whatever credentials you need, so when I am running on cloud functions it returns a 'Compute' object and when I'm running in the emulator it gives me a "UserRefreshClient" object.
I then include that auth object in my call to compute.instances.insert() and compute.instances.stop().
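Since this question is in Python, here is a rough sketch of the same ADC approach with the google-auth library (untested; the zone and instance template names are placeholders):
import google.auth
from google.auth.transport.requests import AuthorizedSession

# On Cloud Functions, Application Default Credentials resolve to the runtime
# service account; locally they come from GOOGLE_APPLICATION_CREDENTIALS.
credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)  # attaches the Authorization: Bearer header for you

zone = "europe-west2-b"
url = "https://compute.googleapis.com/compute/v1/projects/{}/zones/{}/instances".format(project_id, zone)
response = session.post(
    url,
    params={"sourceInstanceTemplate": "projects/{}/global/instanceTemplates/[MY-INSTANCE-TEMPLATE]".format(project_id)},
    json={"name": "my-instance-1"},
)
response.raise_for_status()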
Here's the template I used for testing my code...
{
name: 'base',
description: 'Temporary instance used for testing.',
tags: { items: [ 'test' ] },
machineType: `zones/${zone}/machineTypes/n1-standard-1`,
disks: [
{
autoDelete: true, // you will want this!
boot: true,
type: 'PERSISTENT',
initializeParams: {
diskSizeGb: '10',
sourceImage: "projects/ubuntu-os-cloud/global/images/ubuntu-minimal-1804-bionic-v20190628",
}
}
],
networkInterfaces: [
{
network: `https://www.googleapis.com/compute/v1/projects/${projectId}/global/networks/default`,
accessConfigs: [
{
name: 'External NAT',
type: 'ONE_TO_ONE_NAT'
}
]
}
],
}
Hope that helps.
If you're getting a 401 error, that means the access token you're using is either expired or invalid.
This guide will be able to show you how to request OAuth 2.0 access tokens and make API calls using a Service Account: https://developers.google.com/identity/protocols/OAuth2ServiceAccount
The .json file mentioned is the private key you create in IAM & Admin under your service account.
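A minimal Python sketch of that service-account flow with the google-auth library (the key file path and scope are placeholders):
from google.oauth2 import service_account
from google.auth.transport.requests import Request

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
credentials.refresh(Request())  # exchanges a signed JWT for an OAuth 2.0 access token
print(credentials.token)        # send this as "Authorization: Bearer <token>" on API calls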
Hi there, I'm new to Python.
I would like to implement a listener on my Firebase DB.
When I change one or more parameters in the DB, my Python code has to do something.
How can I do it?
Thanks a lot.
My DB is a simple list of data from 001 to 200:
"remote-controller"
001 -> 000
002 -> 020
003 -> 230
my code is:
from firebase import firebase
firebase = firebase.FirebaseApplication('https://remote-controller.firebaseio.com/', None)
result = firebase.get('003', None)
print(result)
It looks like this is supported now (October 2018): although it's not documented in the 'Retrieving Data' guide, you can find the needed functionality in the API reference. I tested it and it works like this:
def listener(event):
    print(event.event_type)  # can be 'put' or 'patch'
    print(event.path)        # relative to the reference, it seems
    print(event.data)        # new data at /reference/event.path. None if deleted

firebase_admin.db.reference('my/data/path').listen(listener)
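As a small usage note based on the same API reference: listen() returns a registration object that you can keep and close when you no longer want events, e.g.:
registration = firebase_admin.db.reference('my/data/path').listen(listener)
# ... later, when updates are no longer needed:
registration.close()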
As Peter Haddad suggested, you should use Pyrebase to achieve something like that, given that the Python SDK still does not support realtime event listeners.
import pyrebase
config = {
"apiKey": "apiKey",
"authDomain": "projectId.firebaseapp.com",
"databaseURL": "https://databaseName.firebaseio.com",
"storageBucket": "projectId.appspot.com"
}
firebase = pyrebase.initialize_app(config)
db = firebase.database()
def stream_handler(message):
    print(message["event"])  # put
    print(message["path"])   # /-K7yGTTEp7O549EzTYtI
    print(message["data"])   # {'title': 'Pyrebase', "body": "etc..."}

my_stream = db.child("posts").stream(stream_handler)
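If you later want to stop receiving events, the Pyrebase README shows that the stream object can be closed with my_stream.close().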
If anybody wants to create multiple listeners using the same listener function and get more info about the triggered node, you can do it like this.
A normal listener function gets an Event object that only has the data, the node name, and the event type. If you add multiple listeners and want to differentiate between the data changes, you can write your own class and add some info to it while creating the object.
class ListenerClass:
    def __init__(self, appname):
        self.appname = appname

    def listener(self, event):
        print(event.event_type)  # can be 'put' or 'patch'
        print(event.path)        # relative to the reference, it seems
        print(event.data)        # new data at /reference/event.path. None if deleted
        print(self.appname)      # extra info related to the change; add your own member variables
Creating Objects:
listenerObject = ListenerClass(my_app_name + '1')
db.reference('PatientMonitoring', app= obj).listen(listenerObject.listener)
listenerObject = ListenerClass(my_app_name + '2')
db.reference('SomeOtherPath', app= obj).listen(listenerObject.listener)
Full Code:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db
# Initialising Database with credentials
json_path = r'E:\Projectz\FYP\FreshOnes\Python\PastLocations\fyp-healthapp-project-firebase-adminsdk-40qfo-f8fc938674.json'
my_app_name = 'fyp-healthapp-project'
xyz = {'databaseURL': 'https://{}.firebaseio.com'.format(my_app_name),'storageBucket': '{}.appspot.com'.format(my_app_name)}
cred = credentials.Certificate(json_path)
obj = firebase_admin.initialize_app(cred,xyz , name=my_app_name)
# Create objects here. You can use loops to create many listeners, but each listener spawns its own thread, so don't create irrelevant listeners; it won't work on a machine with thread constraints.
listenerObject = ListenerClass(my_app_name + '1')  # decide your own parameters and how you want to differentiate; it's up to you
db.reference('PatientMonitoring', app= obj).listen(listenerObject.listener)
listenerObject = ListenerClass(my_app_name + '2')
db.reference('SomeOtherPath', app= obj).listen(listenerObject.listener)
As you can see in the per-language feature chart on the Firebase Admin SDK home page, Python and Go currently don't have realtime event listeners. If you need that on your backend, you'll have to use the Node.js or Java SDKs.
You can use Pyrebase, which is a python wrapper for the Firebase API.
more info here:
https://github.com/thisbejim/Pyrebase
To retrieve data you need to use val(), example:
users = db.child("users").get()
print(users.val())
Python Firebase Realtime Listener Full Code:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db
def listener(event):
    print(event.event_type)  # can be 'put' or 'patch'
    print(event.path)        # relative to the reference, it seems
    print(event.data)        # new data at /reference/event.path. None if deleted
json_path = r'E:\Projectz\FYP\FreshOnes\Python\PastLocations\fyp-healthapp-project-firebase-adminsdk-40qfo-f8fc938674.json'
my_app_name = 'fyp-healthapp-project'
xyz = {'databaseURL': 'https://{}.firebaseio.com'.format(my_app_name),'storageBucket': '{}.appspot.com'.format(my_app_name)}
cred = credentials.Certificate(json_path)
obj = firebase_admin.initialize_app(cred,xyz , name=my_app_name)
db.reference('PatientMonitoring', app= obj).listen(listener)
Output:
put
/
{'n0': '40', 'n1': '71'} # for first time its gonna fetch the data from path whether data is changed or not
put # On data changed
/n1
725
put # On data changed
/n0
401