I'm following the MongoDB University course M220, on using Python with MongoDB.
It starts with defining the connection:
from pymongo import MongoClient
uri = "mongodb+srv://m220student:m220password#mflix.abcde.mongodb.net"
client = MongoClient(uri)
And then it uses client.stats
That gives:
Database(MongoClient(host=['mflix-shard-00-01.abcde.mongodb.net:27017', 'mflix-shard-00-00.9go7j.mongodb.net:27017', 'mflix-shard-00-02.9go7j.mongodb.net:27017'], document_class=dict, tz_aware=False, connect=True, authsource='admin', replicaset='atlas-js08eu-shard-0', ssl=True), 'stats')
I can't figure out where stats comes from.
There is nothing about it in the MongoDB API documentation. I even unpacked the source wheel file and searched it, but couldn't find it.
You created an instance of the MongoDB client. In PyMongo, attribute access such as client.stats simply returns the Database named "stats", which is why the output shows Database(MongoClient(...), 'stats'). What you are looking for is the shell's db.stats() method.
Here's the documentation for that method:
https://docs.mongodb.com/manual/reference/method/db.stats/
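For reference, a minimal sketch of how you might pull the same statistics in PyMongo itself, by running the dbstats command against a database (the connection string and database name below are only placeholders):

from pymongo import MongoClient

client = MongoClient("<your connection string>")
db = client["sample_mflix"]           # placeholder database name
print(db.command("dbstats"))          # equivalent of the shell's db.stats()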
When running the Azure CLI command:
az storage account blob-service-properties show --account-name sa36730 --resource-group rg-exercise1
The output JSON contains the field isVersioningEnabled.
I am trying to get this field using the Python SDK.
I wrote this code, but the output doesn't contain the versioning information.
import pprint

from azure.storage.blob import BlobServiceClient

def blob_service_properties():
    connection_string = "<connection string>"
    # Instantiate a BlobServiceClient using a connection string
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)
    properties = blob_service_client.get_service_properties()
    pprint.pprint(properties)
    # [END get_blob_service_properties]
My output looks like:
{'analytics_logging': <azure.storage.blob._models.BlobAnalyticsLogging object at 0x7ff0f8b7c340>,
'cors': [<azure.storage.blob._models.CorsRule object at 0x7ff1088b61c0>],
'delete_retention_policy': <azure.storage.blob._models.RetentionPolicy object at 0x7ff0f8b9b1c0>,
'hour_metrics': <azure.storage.blob._models.Metrics object at 0x7ff0f8b9b700>,
'minute_metrics': <azure.storage.blob._models.Metrics object at 0x7ff0f8b9b3d0>,
'static_website': <azure.storage.blob._models.StaticWebsite object at 0x7ff0f8ba5c10>,
'target_version': None}
Is there a way to get the versioning information using Python SDK for storage blob?
I tried the steps below and they worked in my environment.
To get the blob service properties, you can use the azure-mgmt-storage package.
Code:
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

storage_client = StorageManagementClient(credential=DefaultAzureCredential(),
                                          subscription_id="<your subscription id>")
blob_service_list = storage_client.blob_services.list("<your resource group name>",
                                                       "<your account name>")
for item in blob_service_list:
    print(item)
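Each printed item is a BlobServiceProperties object. If you only need the versioning flag, a minimal sketch (assuming the model exposes it as is_versioning_enabled, mirroring the isVersioningEnabled field in the CLI output):

for item in storage_client.blob_services.list("<your resource group name>", "<your account name>"):
    print(item.is_versioning_enabled)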
I followed the GCP quickstart (https://cloud.google.com/sql/docs/mysql/quickstart-connect-functions) and tried to connect Google Cloud Functions to MySQL on Cloud SQL. The quickstart uses an HTTP trigger, but I want to use the "Finalize/Create" event on a GCS bucket as the trigger for the Cloud Function. I therefore used the following Python code and requirements file.
【MAIN.PY】
import sqlalchemy
from google.cloud import storage
# Set the following variables depending on your specific
# connection name and root password from the earlier steps:
def sql_select(data, context):
    connection_name = "INSTANCE_CONNECTION_NAME"
    db_password = "DATABASE_USER_PASSWORD"
    db_name = "DATABASE_NAME"
    db_user = "root"
    driver_name = 'mysql+pymysql'
    query_string = '"unix_socket": "/cloudsql/{}".format(connection_name)'
    print("I am here 1")
    db = sqlalchemy.create_engine(
        sqlalchemy.engine.url.URL(
            drivername=driver_name,
            username=db_user,
            password=db_password,
            database=db_name,
            query={query_string},
        ),
        pool_size=5,
        max_overflow=2,
        pool_timeout=30,
        pool_recycle=1800)
    # stmt = sqlalchemy.text('INSERT INTO entries (guestName, content) values ("third guest", "Also this one");')
    # try:
    #     with db.connect() as conn:
    #         conn.execute(stmt)
    # except Exception as e:
    #     return 'Error: {}'.format(str(e))
    # return 'ok'
【REQUIREMENTS.TXT】
SQLAlchemy==1.3.18
PyMySQL==0.9.3
Click==7.0
Flask==1.0.2
itsdangerous==1.1.0
Jinja2==2.10
MarkupSafe==1.1.0
Pillow==5.4.1
qrcode==6.1
six==1.12.0
Werkzeug==0.14.1
google-cloud-storage==1.23.0
I tried to test the code from the "Testing" tab, but I got the following error and couldn't connect the Cloud Function to Cloud SQL (MySQL).
【The error message】
Error: function terminated. Recommended action: inspect logs for termination reason. Details:
'set' object has no attribute 'get'
I tried changing to the latest version of SQLAlchemy, but that didn't solve the error, and I have no idea how to fix it.
Are there any solutions to connect Cloud Functions to Cloud SQL?
Thank you
The link you referenced should give you almost everything you need.
You should (!?) be able to trigger the function on GCS bucket events, but you'll need to ensure you deploy the function correctly:
See: https://cloud.google.com/functions/docs/calling/storage#object_finalize
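For example, the deploy command would look roughly like this (the function name is taken from main.py; the bucket name and runtime are placeholders):

gcloud functions deploy sql_select \
  --runtime python37 \
  --trigger-resource YOUR_BUCKET_NAME \
  --trigger-event google.storage.object.finalize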
You'll need to import sqlalchemy. You've included the package in requirements.txt but missed the second step.
You'll need to update the values of connection_name, db_password etc. after having created the Cloud SQL instance, database etc.
I think your query_string statement is incorrect and would be better as:
query_string = dict({"unix_socket": "/cloudsql/{}".format(connection_name)})
That's possibly the origin of the error.
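For completeness, a rough sketch of how the engine creation could look with that dict passed directly; note that query={query_string} would wrap the value in a set literal, which is the likely source of the "'set' object has no attribute 'get'" message (same placeholder values as above):

query_string = {"unix_socket": "/cloudsql/{}".format(connection_name)}
db = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL(
        drivername=driver_name,
        username=db_user,
        password=db_password,
        database=db_name,
        query=query_string,  # pass the dict itself, not a set literal
    ),
    pool_size=5,
    max_overflow=2,
    pool_timeout=30,
    pool_recycle=1800)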
Cannot figure out how to get db.stats() in MongoEngine.
I've tried:
db = MongoEngine()
db.stats()
Also
db.Document.objects.stats()
db.Document.stats()
I also tried to execute JS, but nothing works and the documentation is very poor.
db.stats() is a mongo shell method.
You can try something like this:
from mongoengine.connection import get_connection
con = get_connection()
con.get_database().eval('db.stats()')
con.get_database().eval('db.getCollectionInfos()')
I also advise you to examine objects with the dir method; sometimes it can be useful:
from pprint import pprint
pprint(dir(con))
MongoEngine is a wrapper around PyMongo, so to get the stats of a database using MongoEngine you can run the 'dbstats' MongoDB command on the database, using PyMongo's command function like this:
from mongoengine import connect

client = connect()  # connect() returns the underlying PyMongo MongoClient
db = client.get_database('your_database_name')
db_stats = db.command('dbstats')
coll_stats = db.command('collstats', 'your_collection_name')
print(db_stats)
print(coll_stats)
I'm looking to increment the 'views' field by 1 in a document within my collection. I'm using a MongoDB Atlas database in my Flask app. I've included my route here. Any suggestions would be great, thanks.
@app.route('/view_count/<recipe_id>', methods=['POST'])
def view_count(recipe_id):
    mongo.db.recipes.update_one({"_id": ObjectId(recipe_id)}, {"$inc": {'views': 1}})
    return redirect(url_for('view_recipe.html'))
Your queries are correct if you are using PyMongo.
Maybe the problem is mongo.db.
Example
from bson import ObjectId
from pymongo import MongoClient

# connect to the server
client = MongoClient('mongodb://localhost:27017')

# Mongo accepts everything, so the queries below are fine
# note: client.db means the database called "db" inside Mongo
client.db.recipes.insert_one({'_id': ObjectId(), 'views': 0})
client.db.recipes.find_one({})  # the insertion above works
client.db.recipes.update_one({}, {'$inc': {'views': 1}})  # only one document, so it gets updated
But if you change:
client = MongoClient('mongodb://localhost:27017')
# to
client = MongoClient('mongodb://localhost:27017').db
# everything keeps working, but now the path to recipes is db.db.db.recipes
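If you are using Flask-PyMongo (an assumption, given the mongo.db usage in the route), also make sure the database name is part of MONGO_URI; otherwise mongo.db is None and mongo.db.recipes raises an AttributeError. A minimal sketch with a placeholder database name:

from flask import Flask
from flask_pymongo import PyMongo

app = Flask(__name__)
# the database name ("myRecipes" is only a placeholder) must appear in the URI
app.config["MONGO_URI"] = "mongodb+srv://<user>:<password>@<cluster>.mongodb.net/myRecipes"
mongo = PyMongo(app)
# mongo.db now refers to the myRecipes database, so mongo.db.recipes works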
I need to create a new user in Azure DevOps using the Python client library for the Azure DevOps REST API.
I wrote the following code:
from azure.devops.connection import Connection
from azure.devops.v5_0.graph.models import GraphUserCreationContext
from msrest.authentication import BasicAuthentication
personal_access_token = "<your personal access token>"
organization_url = "https://dev.azure.com/<your organization>"

credentials = BasicAuthentication('', personal_access_token)
connection = Connection(base_url=organization_url, creds=credentials)
graph_client = connection.clients_v5_0.get_graph_client()
addAADUserContext = GraphUserCreationContext("anaya.john@mydomain.com")
print(addAADUserContext)
resp = graph_client.create_user(addAADUserContext)
print(resp)
I get the output:
{'additional_properties': {}, 'storage_key': 'anaya.john@dynactionize.onmicrosoft.com'}
And an error occurs while calling the create_user method:
azure.devops.exceptions.AzureDevOpsServiceError: VS860015: Must have exactly one of originId or principalName set.
Actually, what I should pass to the create_user function is a GraphUserPrincipalNameCreationContext.
I found a .NET sample which does this in a function named AddRemoveAADUserByUPN():
https://github.com/microsoft/azure-devops-dotnet-samples/blob/master/ClientLibrary/Samples/Graph/UsersSample.cs
GraphUserPrincipalNameCreationContext is an interface in this sample, but Python doesn't support interfaces.
So how can I implement this in Python?
Some of the classes, like GraphUserPrincipalNameCreationContext, aren't currently available in the Python client API. They are working on it; you can track the issue in the GitHub repo:
https://github.com/microsoft/azure-devops-python-api/issues/176
You can use the User Entitlements - Add REST API for Azure DevOps instead of its Graph API. You can use the following Python client for this purpose:
https://github.com/microsoft/azure-devops-python-api/tree/dev/azure-devops/azure/devops/v5_0/member_entitlement_management
You can refer to the sample given in the following question to see how to use the mentioned Python client:
Unable to deserialize to object: type, KeyError: ' key: int; value: str '
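A rough sketch of how adding a user through the member entitlement client could look; the client getter and model names here (get_member_entitlement_management_client, UserEntitlement, GraphUser, AccessLevel), as well as the license type, are assumptions based on the linked package rather than verified signatures:

from azure.devops.connection import Connection
from azure.devops.v5_0.member_entitlement_management.models import (
    AccessLevel, GraphUser, UserEntitlement)
from msrest.authentication import BasicAuthentication

credentials = BasicAuthentication('', personal_access_token)
connection = Connection(base_url=organization_url, creds=credentials)
# assumed client getter for the member entitlement management area
member_client = connection.clients_v5_0.get_member_entitlement_management_client()

# assumed model shape, mirroring the REST API request body:
# user.principalName + user.subjectKind identify the user,
# accessLevel.accountLicenseType picks the license ("express" = Basic)
entitlement = UserEntitlement(
    user=GraphUser(principal_name="anaya.john@mydomain.com", subject_kind="user"),
    access_level=AccessLevel(account_license_type="express"))
resp = member_client.add_user_entitlement(entitlement)
print(resp)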