Python Firebase realtime listener

Hi there, I'm new to Python.
I would like to implement a listener on my Firebase DB.
When one or more values change in the DB, my Python code has to react. How can I do it?
Thanks a lot.
My DB is a simple list of data from 001 to 200:
"remote-controller"
001 -> 000
002 -> 020
003 -> 230
My code is:
from firebase import firebase

firebase = firebase.FirebaseApplication('https://remote-controller.firebaseio.com/', None)
result = firebase.get('003', None)
print(result)

It looks like this is supported now (October 2018): although it's not documented in the 'Retrieving Data' guide, you can find the needed functionality in the API reference. I tested it, and it works like this:
import firebase_admin
from firebase_admin import db

def listener(event):
    print(event.event_type)  # can be 'put' or 'patch'
    print(event.path)        # relative to the reference, it seems
    print(event.data)        # new data at /reference/event.path. None if deleted

db.reference('my/data/path').listen(listener)

As Peter Haddad suggested, you should use Pyrebase to achieve something like that, given that the Python SDK still does not support realtime event listeners.
import pyrebase

config = {
    "apiKey": "apiKey",
    "authDomain": "projectId.firebaseapp.com",
    "databaseURL": "https://databaseName.firebaseio.com",
    "storageBucket": "projectId.appspot.com"
}

firebase = pyrebase.initialize_app(config)
db = firebase.database()

def stream_handler(message):
    print(message["event"])  # put
    print(message["path"])   # /-K7yGTTEp7O549EzTYtI
    print(message["data"])   # {'title': 'Pyrebase', "body": "etc..."}

my_stream = db.child("posts").stream(stream_handler)

If anybody wants to create multiple listeners using the same listener function and get more info about the node that triggered the event, you can do it like this.
A normal listener function receives an Event object that only carries the data, the path, and the event type. If you register multiple listeners and want to tell the changes apart, you can write your own class and attach extra info to each object when you create it.
class ListenerClass:
    def __init__(self, appname):
        self.appname = appname

    def listener(self, event):
        print(event.event_type)  # can be 'put' or 'patch'
        print(event.path)        # relative to the reference, it seems
        print(event.data)        # new data at /reference/event.path. None if deleted
        print(self.appname)      # extra info about the change; add your own member variables
Creating objects:
listenerObject = ListenerClass(my_app_name + '1')
db.reference('PatientMonitoring', app=obj).listen(listenerObject.listener)

listenerObject = ListenerClass(my_app_name + '2')
db.reference('SomeOtherPath', app=obj).listen(listenerObject.listener)
Full code:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db

# Initialise the database with credentials
json_path = r'E:\Projectz\FYP\FreshOnes\Python\PastLocations\fyp-healthapp-project-firebase-adminsdk-40qfo-f8fc938674.json'
my_app_name = 'fyp-healthapp-project'
xyz = {
    'databaseURL': 'https://{}.firebaseio.com'.format(my_app_name),
    'storageBucket': '{}.appspot.com'.format(my_app_name),
}
cred = credentials.Certificate(json_path)
obj = firebase_admin.initialize_app(cred, xyz, name=my_app_name)

# Create the listener objects here. You can use loops to create many listeners, but note
# that each listener runs in its own thread, so don't create listeners you don't need;
# this won't work on a machine with a tight thread constraint.
listenerObject = ListenerClass(my_app_name + '1')  # decide your own parameters for telling listeners apart
db.reference('PatientMonitoring', app=obj).listen(listenerObject.listener)

listenerObject = ListenerClass(my_app_name + '2')
db.reference('SomeOtherPath', app=obj).listen(listenerObject.listener)
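The context-binding trick above is plain Python rather than anything Firebase-specific: a bound method carries its instance's state into every callback. A minimal standalone illustration (no Firebase involved; the FakeEvent class is made up for the demo):

```python
class FakeEvent:
    """Stand-in for firebase_admin's Event object, just for this demo."""
    def __init__(self, event_type, path, data):
        self.event_type = event_type
        self.path = path
        self.data = data

class ListenerClass:
    def __init__(self, appname):
        self.appname = appname
        self.seen = []

    def listener(self, event):
        # self.appname tells us which registration fired
        self.seen.append((self.appname, event.path, event.data))

a = ListenerClass('app1')
b = ListenerClass('app2')

# Each bound method remembers its own instance:
a.listener(FakeEvent('put', '/n0', 40))
b.listener(FakeEvent('put', '/n1', 71))

print(a.seen)  # [('app1', '/n0', 40)]
print(b.seen)  # [('app2', '/n1', 71)]
```

In real code, `a.listener` and `b.listener` are what you pass to `listen(...)`; the SDK only sees a callable, but that callable still knows its own `appname`.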

As you can see in the per-language feature chart on the Firebase Admin SDK home page, Python and Go currently don't have realtime event listeners. If you need that on your backend, you'll have to use the Node.js or Java SDKs.

You can use Pyrebase, which is a Python wrapper for the Firebase API.
More info here: https://github.com/thisbejim/Pyrebase
To retrieve data you need to use val(), for example:
users = db.child("users").get()
print(users.val())

Python Firebase realtime listener, full code:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db

def listener(event):
    print(event.event_type)  # can be 'put' or 'patch'
    print(event.path)        # relative to the reference, it seems
    print(event.data)        # new data at /reference/event.path. None if deleted

json_path = r'E:\Projectz\FYP\FreshOnes\Python\PastLocations\fyp-healthapp-project-firebase-adminsdk-40qfo-f8fc938674.json'
my_app_name = 'fyp-healthapp-project'
xyz = {
    'databaseURL': 'https://{}.firebaseio.com'.format(my_app_name),
    'storageBucket': '{}.appspot.com'.format(my_app_name),
}
cred = credentials.Certificate(json_path)
obj = firebase_admin.initialize_app(cred, xyz, name=my_app_name)

db.reference('PatientMonitoring', app=obj).listen(listener)
Output:
put
/
{'n0': '40', 'n1': '71'}  # the first event delivers the current data at the path, whether it changed or not
put                       # on data change
/n1
725
put                       # on data change
/n0
401

Related

How to check if data exists in Firebase (KivyMD, Python)

I'm new to Python and KivyMD, and also to working with databases. I want to check if the data provided by the user through the KivyMD app is already in the Firebase Realtime Database. These are the data in Firebase.
The code:
def send_data(self, email):
    from firebase import firebase
    firebase = firebase.FirebaseApplication("https://infinity-mode-default-rtdb.firebaseio.com/", None)
    data = {
        'Email': email
    }
    if email.split() == []:
        cancel_btn_checkpoint_dialogue = MDFlatButton(text='Retry', on_release=self.close_checkpoint_dialogue)
        self.checkpoint_dialog = MDDialog(title='Access Denied', text="Invalid Username",
                                          buttons=[cancel_btn_checkpoint_dialogue])
        self.checkpoint_dialog.open()
    else:
        firebase.post('Users', data)
If the user enters a value that already exists in the database, that value should not be saved, and a dialog box should be shown saying the email is already in use. If the value provided by the user is not in the database, it should be saved. Please help me do this.
Did you try firebase_admin? I check my data like this:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore

cred = credentials.Certificate("firestore-cred.json")
firebase_admin.initialize_app(cred)
db = firestore.client()

data = {
    "Email": email
}
query_email = db.collection(u'Users') \
    .where(u"Email", u"==", data["Email"]) \
    .get()

if query_email:
    # exists
    ...
else:
    # does not exist
    ...
If the user email does not exist, query_email will be an empty list.
Don't forget that query_email is not JSON data; you can convert each result to a dict with to_dict():
email_query_result_as_json = query_email[0].to_dict()
I'm guessing you use python-firebase, which so far has very little documentation, so I'll try to do my best based on the package source.
You have to use the firebase.get method to retrieve a dictionary with the current data stored for a given user, then check whether that data already contains the information you are about to add (untested, since firebase doesn't even import on my machine):
record = firebase.get('/Users', 'id')
if not record.get('Email'):
    firebase.post('/Users/' + 'id', data)
Try the dataSnapshot.exists() method or the snapshot.hasChild method; I'm not 100% sure they work in Python, but they are worth a go.
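As an aside, the `email.split() == []` test in the question only rejects blank or whitespace-only input; any other non-empty string passes. If you also want a rough format check before hitting the database, here is a simple stdlib sketch (the pattern is deliberately loose and is my own illustration, not a full RFC 5322 validator):

```python
import re

# Loose pattern: something@something.tld; intentionally not exhaustive.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(value):
    """Return True if the string roughly looks like an email address."""
    return bool(EMAIL_RE.match(value.strip()))

print(looks_like_email("user@example.com"))  # True
print(looks_like_email("   "))               # False
print(looks_like_email("not-an-email"))      # False
```

Real validation is usually done by sending a confirmation mail; this check just filters out obvious junk before the dialog/database logic runs.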

python aiosmtpd server with basic authentication

I'm trying to create an aiosmtpd server to process received emails.
It works great without authentication, yet I simply cannot figure out how to set up the authentication.
I have gone through the documents and searched for examples on this.
A sample of how I'm currently using it:
from aiosmtpd.controller import Controller

class CustomHandler:
    async def handle_DATA(self, server, session, envelope):
        peer = session.peer
        mail_from = envelope.mail_from
        rcpt_tos = envelope.rcpt_tos
        data = envelope.content  # type: bytes
        # Process message data...
        print('peer:' + str(peer))
        print('mail_from:' + str(mail_from))
        print('rcpt_tos:' + str(rcpt_tos))
        print('data:' + str(data))
        return '250 OK'

if __name__ == '__main__':
    handler = CustomHandler()
    controller = Controller(handler, hostname='192.168.8.125', port=10025)
    # Run the event loop in a separate thread.
    controller.start()
    # Wait for the user to press Return.
    input('SMTP server running. Press Return to stop server and exit.')
    controller.stop()
This is the basic method from the documentation.
Could someone please provide me with an example of how to do simple authentication?
Alright, since you're using version 1.3.0, you can follow the documentation for Authentication.
A quick way to start is to create an "authenticator function" (can be a method in your handler class, can be standalone) that follows the Authenticator Callback guidelines.
A simple example:
from aiosmtpd.smtp import AuthResult, LoginPassword

auth_db = {
    b"user1": b"password1",
    b"user2": b"password2",
    b"user3": b"password3",
}

# The name can actually be anything
def authenticator_func(server, session, envelope, mechanism, auth_data):
    # For this simple example, we'll ignore the other parameters
    assert isinstance(auth_data, LoginPassword)
    username = auth_data.login
    password = auth_data.password
    # If we're using a set containing tuples of (username, password),
    # we can simply use `auth_data in auth_set`.
    # Or you can get fancy and use a full-fledged database to perform
    # a query :-)
    if auth_db.get(username) == password:
        return AuthResult(success=True)
    else:
        return AuthResult(success=False, handled=False)
Then, when you're creating the controller, create it like so:
controller = Controller(
    handler,
    hostname='192.168.8.125',
    port=10025,
    authenticator=authenticator_func,  # i.e., the name of your authenticator function
    auth_required=True,  # depending on your needs
)
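One hardening note on the dict-based check above: `auth_db.get(username) == password` compares secrets with ordinary equality, which can leak timing information. The stdlib's `hmac.compare_digest` does a constant-time comparison; here is a sketch of the same lookup using it (the `auth_db` dict mirrors the demo one above):

```python
import hmac

# Demo credential store, same shape as in the answer above
auth_db = {
    b"user1": b"password1",
    b"user2": b"password2",
}

def check_credentials(username, password):
    """Constant-time credential check against the demo store."""
    stored = auth_db.get(username)
    if stored is None:
        return False
    # compare_digest compares the full byte strings in constant time
    return hmac.compare_digest(stored, password)

print(check_credentials(b"user1", b"password1"))  # True
print(check_credentials(b"user1", b"wrong"))      # False
print(check_credentials(b"nobody", b"anything"))  # False
```

Inside the authenticator, you would return `AuthResult(success=True)` when `check_credentials(auth_data.login, auth_data.password)` is true, exactly as in the answer's if/else.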

List subscriptions for a given Azure account

I'm trying to list the subscriptions in an Azure account using the azure-python-sdk.
I have followed this link in the documentation:
https://learn.microsoft.com/en-us/python/api/azure-mgmt-subscription/azure.mgmt.subscription.operations.subscriptionsoperations?view=azure-python#list-custom-headers-none--raw-false----operation-config-
from azure.mgmt.subscription import SubscriptionClient
from msrestazure.azure_active_directory import UserPassCredentials
credentials = UserPassCredentials(username='xxxx', password='xxxx')
sub_client = SubscriptionClient(credentials)
subs = [sub.as_dict() for sub in sub_client.subscriptions.list()]
print(subs)
It is supposed to return a list of subscriptions, but I see only an empty list every time I run the above code.
Can anybody help?
Try this code:
from azure.common.client_factory import get_client_from_cli_profile
from azure.mgmt.subscription import SubscriptionClient
from knack.util import CLIError

# logger and _run_az_cli_login come from the linked tool
def list_subscriptions():
    try:
        sub_client = get_client_from_cli_profile(SubscriptionClient)
    except CLIError:
        logger.info("Not logged in, running az login")
        _run_az_cli_login()
        sub_client = get_client_from_cli_profile(SubscriptionClient)

    return [["Subscription_name", "Subscription ID"]] + [
        [sub.display_name, sub.subscription_id]
        for sub in sub_client.subscriptions.list()
    ]
You can find the handy tool here.
If the list is empty and you get no exception, it's likely that your credentials are correct (no exception) but your user doesn't have access to any subscriptions (no permissions).
In the Azure portal, the subscription panel has an "Access control (IAM)" button where you define which users are allowed on a given subscription.
https://learn.microsoft.com/azure/role-based-access-control/role-assignments-portal
https://learn.microsoft.com/azure/role-based-access-control/rbac-and-directory-admin-roles
(I work at MS in the SDK team)
I think I solved the issue using the Azure CLI, yet I still wonder why it didn't work as expected using azure-python-sdk.
Here is the code:
import subprocess
import json
subscriptions = json.loads(subprocess.check_output('az account list', shell=True).decode('utf-8'))
print(subscriptions)
Thank you for your responses.
I had a similar problem, so I used AzureCliCredential and it simply worked.
The code is this:
from azure.identity import AzureCliCredential
from azure.mgmt.subscription import SubscriptionClient

def subscription_list():
    credential = AzureCliCredential()
    subscription_client = SubscriptionClient(credential)
    sub_list = subscription_client.subscriptions.list()

    column_width = 40
    print("Subscription ID".ljust(column_width) + "Display name")
    print("-" * (column_width * 2))
    for group in list(sub_list):
        print(f'{group.subscription_id:<{column_width}}{group.display_name}')
Before trying this code, you have to log in to Azure through the command line (az login) in your dev environment.

Python ml engine predict: How can I make a googleapiclient.discovery.build persistent?

I need to make online predictions from a model that is deployed in cloud ml engine. My code in python is similar to the one found in the docs (https://cloud.google.com/ml-engine/docs/tensorflow/online-predict):
service = googleapiclient.discovery.build('ml', 'v1')
name = 'projects/{}/models/{}'.format(project, model)

if version is not None:
    name += '/versions/{}'.format(version)

response = service.projects().predict(
    name=name,
    body={'instances': instances}
).execute()
However, I receive the "instances" data from outside the script, and I wonder if there is a way to serve requests without running service = googleapiclient.discovery.build('ml', 'v1') before each one, since it takes time.
PS: this is my very first project on GCP. Thank you.
Something like this will work: initialize your service globally once, then use that service instance to make your calls.
import os

import googleapiclient.discovery

AI_SERVICE = None

def ai_platform_init():
    global AI_SERVICE
    # Set GCP authentication
    credentials = os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')
    # Path to your credentials
    credentials_path = os.path.join(os.path.dirname(__file__), 'ai-platform-credentials.json')
    if credentials is None and os.path.exists(credentials_path):
        os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = credentials_path
    # Create the AI Platform service
    if os.path.exists(credentials_path):
        AI_SERVICE = googleapiclient.discovery.build('ml', 'v1', cache=MemoryCache())

# Initialize AI Platform on load.
ai_platform_init()
Then later on, you can do something like this:
def call_ai_platform():
    response = AI_SERVICE.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()
Bonus! In case you were curious about the MemoryCache class in the googleapiclient.discovery call, it was borrowed from another SO answer:
class MemoryCache():
    """A workaround for cache warnings from Google.

    Check out: https://github.com/googleapis/google-api-python-client/issues/325#issuecomment-274349841
    """
    _CACHE = {}

    def get(self, url):
        return MemoryCache._CACHE.get(url)

    def set(self, url, content):
        MemoryCache._CACHE[url] = content
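A detail worth noting about this workaround: `_CACHE` is a class attribute, so every `MemoryCache` instance shares the same dict, and passing a fresh instance to each `build()` call still reuses previously cached discovery documents. A quick standalone check of that behavior (the class body is copied from the answer; the URL and contents are dummy values):

```python
class MemoryCache:
    """Same shape as the class above; _CACHE is shared at class level."""
    _CACHE = {}

    def get(self, url):
        return MemoryCache._CACHE.get(url)

    def set(self, url, content):
        MemoryCache._CACHE[url] = content

c1 = MemoryCache()
c2 = MemoryCache()
c1.set("https://example/discovery", "doc-contents")

# A different instance sees the same entry:
print(c2.get("https://example/discovery"))  # doc-contents
```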

Python way of polling longrunning operations from operation name in Google Cloud?

I'm calling a Google Cloud Function that returns an Operation object implementing the google.longrunning.Operations interface. I want to poll this operation from another Python process that will only receive the operation name (it will not have access to the operation object itself). So I need something like:
operation = getOperation(operationName)
isdone = operation.done()
AFAIK, you can't do the first step above. I haven't found it here: https://google-cloud-python.readthedocs.io/en/stable/core/operation.html
I would like to do what is explained in the docs about the google.longrunning interface (https://cloud.google.com/speech-to-text/docs/reference/rpc/google.longrunning#google.longrunning.Operations.GetOperation):
rpc GetOperation(GetOperationRequest) returns (Operation)
Where the GetOperationRequest simply requires the operation name. Is there a way to "re-create" an operation using functions from the google-cloud-python library?
Update for more recent clients: you need to refresh the operation using the OperationsClient.
To update an existing operation you will need to pass the channel across to the OperationsClient.
For example, backing up a Firestore datastore:
import time

from google.cloud import firestore_admin_v1
from google.api_core import operations_v1, grpc_helpers

def main():
    client = firestore_admin_v1.FirestoreAdminClient()
    channel = grpc_helpers.create_channel(client.SERVICE_ADDRESS)
    api = operations_v1.OperationsClient(channel)

    db_path = client.database_path('myproject', 'mydb')
    operation = client.export_documents(db_path)

    current_status = api.get_operation(operation.name)
    while not current_status.done:
        time.sleep(5)
        current_status = api.get_operation(operation.name)
        print('waiting to complete')
    print('operation done')
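The loop above polls forever if the operation never finishes. Here is a generic sketch of polling with a deadline, using a stand-in `fetch_status` callable so it runs without GCP (in real code you would pass a lambda wrapping `api.get_operation(operation.name)` and check the result's `done` field):

```python
import time

def poll_until_done(fetch_status, interval=5, timeout=300):
    """Call fetch_status() until it reports done, or raise after timeout seconds."""
    deadline = time.monotonic() + timeout
    while True:
        status = fetch_status()
        if status.get("done"):
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError("operation did not finish in time")
        time.sleep(interval)

# Fake operation that completes on the third poll:
calls = {"n": 0}
def fake_fetch():
    calls["n"] += 1
    return {"done": calls["n"] >= 3}

result = poll_until_done(fake_fetch, interval=0, timeout=10)
print(result)  # {'done': True}
```

`time.monotonic()` is used rather than `time.time()` so the deadline is immune to wall-clock adjustments.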
In my case, the AutoML Tables client didn't have SERVICE_ADDRESS or SCOPE properties, so I couldn't create a new gRPC channel.
But reusing the existing channel from the client seems to work!
from google.api_core import operations_v1
from google.cloud.automl_v1beta1 import TablesClient

automl_tables_client = TablesClient(
    credentials=...,
    project=...,
    region=...,
)

operation_name = ""
grpc_channel = automl_tables_client.auto_ml_client.transport._channel
api_client = operations_v1.OperationsClient(grpc_channel)
response = api_client.get_operation(operation_name)
You can use the get_operation method of the "Long-Running Operations Client":
from google.api_core import operations_v1

api = operations_v1.OperationsClient()
name = ...
response = api.get_operation(name)
