As the title indicates, how does one use Python to elegantly access an API, parse the JSON response, and save the contents to a relational database (MySQL) for later access?
So far I have saved the data into a pandas object. But how do I create a MySQL database, save the JSON contents to it, and access the contents later?
# Libraries
import json, requests
import pandas as pd
from pandas.io.json import json_normalize
# Set URL
url = 'https://api-v2.themuse.com/jobs'
# Loop over the first 100 pages of results
for i in range(100):
    data = json.loads(requests.get(
        url=url,
        params={'page': i}
    ).text)['results']
    data_norm = pd.read_json(json.dumps(data))
Create your MySQL table on your server using something like MySQL Workbench CE, then do the following in Python. I wasn't sure whether you want to use data from the for loop or data_norm, so for ease of use, here are some functions. insertDb() can be put inside your for loop, since data is overwritten on every iteration.
import sys
import MySQLdb

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='password',
            db='nameofdb'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

def insertDb():
    try:
        db = dbconnect()
        cursor = db.cursor()
        cursor.execute("""
            INSERT INTO nameoftable(nameofcolumn)
            VALUES (%s)""", (data,))
        db.commit()
        cursor.close()
    except Exception as e:
        print(e)
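For example, calling insertDb() from the paging loop in the question might look like this (a rough sketch; it serializes each page to a JSON string since insertDb() inserts a single column value and reads the module-level data variable):
for i in range(100):
    data = json.dumps(json.loads(requests.get(url=url, params={'page': i}).text)['results'])
    insertDb()  # inserts the `data` value built above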
If this is merely storage for later processing, kind of like a cache, a VARCHAR (or TEXT) field is enough. If, however, you need to query the structured data, a JSON column is what you need.
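For the JSON-column route, a minimal sketch (table and column names here are made-up placeholders; native JSON columns require MySQL 5.7+):
db = dbconnect()
cursor = db.cursor()
# Hypothetical table with a native JSON column
cursor.execute("""
    CREATE TABLE IF NOT EXISTS api_pages (
        page INT PRIMARY KEY,
        payload JSON
    )""")
cursor.execute(
    "INSERT INTO api_pages (page, payload) VALUES (%s, %s)",
    (i, json.dumps(data))
)
db.commit()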
Requirement: I want to create a Python API that inserts data into a BigQuery table. The API will be hosted in Swagger/Postman, where a user can provide input data so that it gets reflected in the BigQuery table.
Can anyone help me find a suitable solution with code?
import sqlite3 as sql
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file('path/to/file.json')
project_id = 'project_id'
client = bigquery.Client(credentials=credentials, project=project_id)

def add_data(group_name, user_name):
    try:
        # Connecting to database
        con = sql.connect('shot_database.db')
        # Getting cursor
        c = con.cursor()
        # Adding data
        job_config.use_legacy_sql = True
        query_job = client.query("""
            INSERT INTO `table_name` (group, user)
            VALUES (%s, %s)""", job_config=job_config)
        results = query_job.result()  # Wait for the job to complete.
        # Applying changes
        con.commit()
    except:
        print("An error has occurred")
The code you provided is a mix of SQLite and BigQuery, but it looks like you're trying to use BigQuery to insert data into a table. To insert data into a BigQuery table using Python, you can use the insert_rows_json() method of the Client class. Here's an example of how you can use this method to insert data into a table called "mytable" in a dataset called "mydataset":
import json

# Define the data you want to insert
data = [
    {
        "group": group_name,
        "user": user_name
    }
]

# Insert the data using the streaming insert API
table_id = "mydataset.mytable"
errors = client.insert_rows_json(table_id, data)

if errors == []:
    print("Data inserted successfully")
else:
    print("Errors occurred while inserting data:")
    print(json.dumps(errors, indent=2))
Then you can create an API using Flask or Django and call the add_data function you defined to insert data into the BigQuery table.
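A minimal sketch of such an endpoint with Flask (the route name and request fields below are assumptions, not part of the question):
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/add', methods=['POST'])
def add():
    # Input posted from Swagger/Postman as JSON
    body = request.get_json()
    add_data(body['group_name'], body['user_name'])  # function defined in the question
    return jsonify({"status": "ok"})

if __name__ == '__main__':
    app.run()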
I have a Python Flask app which primarily uses SQLAlchemy to execute all of its MySQL queries, and I need to write tests for it using a local database and behave.
After brief research, the database I've chosen for this task is a local sqlite3 db, mainly because I've read that it's pretty much compatible with MySQL and SQLAlchemy, and also because it's easy to set up and tear down.
I've established a connection to it successfully and managed to create all the tables I need for the tests.
I've encountered a problem when trying to execute some queries: the query statement is built as a SQLAlchemy TextClause object, and my sqlite3 connection cursor raises the following exception when trying to execute it:
TypeError: argument 1 must be str, not TextClause
How can I convert this TextClause object dynamically to a string and execute it?
I don't want to make drastic changes to the code just for testing.
A code example:
employees table:

id | name
---+------------
 1 | Jeff Bezos
 2 | Bill Gates
from sqlalchemy import text
import sqlite3

def select_employee_by_id(id: int):
    employees_table = 'employees'
    db = sqlite3.connect(":memory:")
    cursor = db.cursor()
    with db as session:
        statement = text("""
            SELECT *
            FROM {employees_table}
            WHERE
                id = :id
        """.format(employees_table=employees_table)
        ).bindparams(id=id)
        data = cursor.execute(statement)
        return data.fetchone()
Should return a row containing {'id': 1, 'name': 'Jeff Bezos'} for select_employee_by_id(1)
Thanks in advance!
If you want to test your TextClause query then you should execute it by using SQLAlchemy, not by using a DBAPI (SQLite) cursor:
from sqlalchemy import create_engine, text

def select_employee_by_id(id: int):
    employees_table = 'employees'
    engine = create_engine("sqlite://")
    with engine.begin() as conn:
        statement = text("""
            SELECT *
            FROM {employees_table}
            WHERE
                id = :id
        """.format(employees_table=employees_table)
        ).bindparams(id=id)
        data = conn.execute(statement)
        return data.one()
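Note that an in-memory SQLite engine starts out empty, so a behave test would first need to create and seed the employees table on the same engine the query runs against. A minimal sketch (assuming the engine is shared rather than created inside the function):
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")
with engine.begin() as conn:
    # Create and seed the table the query above expects
    conn.execute(text("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text(
        "INSERT INTO employees (id, name) VALUES (1, 'Jeff Bezos'), (2, 'Bill Gates')"
    ))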
I am trying to read data from a Snowflake database using FastAPI. I was able to create the connection, which is able to pull data from Snowflake.
The issue I am facing right now is that I am only getting 1 record (instead of 10 records).
I suspect I am not using the correct keyword while returning the data. I'd appreciate any help.
Here is my code:
from fastapi import FastAPI
import snowflake.connector as sf
import configparser

username = 'username_value'
password = 'password_value'
account = 'account_value'
warehouse = 'test_wh'
database = 'test_db'

ctx = sf.connect(user=username, password=password, account=account, warehouse=warehouse, database=database)

app = FastAPI()

@app.get('/test API')
async def fetchdata():
    cursor = ctx.cursor()
    cursor.execute("USE WAREHOUSE test_WH")
    cursor.execute("USE DATABASE test_db")
    cursor.execute("USE SCHEMA test_schema")
    sql = cursor.execute("SELECT DISTINCT ID,NAME,AGE,CITY FROM TEST_TABLE WHERE AGE > 60")
    for data in sql:
        return data
You use return inside your for loop, so the function returns the first row it encounters.
If you want to return all rows as a list, you can probably do (I'm not familiar with the Snowflake connector):
return list(sql)
instead of the for loop, or return sql.fetchall().
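Putting that into the endpoint from the question, a possible fix could look like this (a sketch based on the posted code; fetchall() returns a list of row tuples that FastAPI will serialize):
@app.get('/test API')
async def fetchdata():
    cursor = ctx.cursor()
    cursor.execute("USE WAREHOUSE test_WH")
    cursor.execute("USE DATABASE test_db")
    cursor.execute("USE SCHEMA test_schema")
    cursor.execute("SELECT DISTINCT ID,NAME,AGE,CITY FROM TEST_TABLE WHERE AGE > 60")
    # Return every matching row instead of only the first one
    return cursor.fetchall()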
I'm new to Python and I'm building a simple CRUD app using Flask. Here's how I'm doing it:
from flask import Flask, request
from pprint import pprint
import pymysql.cursors

app = Flask(__name__)

connection = pymysql.connect(
    host='localhost',
    db='mydb',
    user='root',
    password='password',
    cursorclass=pymysql.cursors.DictCursor
)

@app.route('/login', methods=['POST'])
def login():
    cursor = connection.cursor(pymysql.cursors.DictCursor)
    cursor.execute("SELECT id,hash,displayName,attempt,status FROM users WHERE id=%s", (request.form['username'],))
    user = cursor.fetchone()
    pprint(user)
But this code outputs something like this:
{u'displayName': 'John Smith',
u'hash': 'somehash.asdf!###$',
u'id': 'developer',
u'attempt': 0,
u'status': 1
}
The thing is, I can't seem to get these attributes using the standard user.hash syntax. Am I doing something wrong? I need to either:
convert it to JSON-like structure
get the properties of user when it's presented in this form
When using a pymysql.cursors.DictCursor cursor, the fetched data is returned as a dictionary, so you can use all the common dict traversal/accessing/modifying methods on it.
This means that you can access your returned hash as: user["hash"].
When it comes to JSON, Python comes with a built-in json module that can readily convert your retrieved dict into a JSON string, so in your case, to get a JSON representation of the returned user dictionary use:
import json
json_string = json.dumps(user)
print(json_string) # or do whatever you need with it
I am trying to retrieve the text, longitude, and latitude from a tweepy StreamListener and then store the data in my SQL database. I am able to store the coordinates fine, but for some reason the unicode is not working.
For the SQL I have:
mysql> CREATE TABLE tweets (tweet nvarchar(140), lat float(10,6) not null, lng float(10,6) not null) engine=myisam;
For my python script I have (not including the main()):
import mysql.connector
from tweepy import Stream
from tweepy import OAuthHandler
from tweepy.streaming import StreamListener
from authenticator import Authenticator
import json

# connecting to mysql database
conn = mysql.connector.connect(user='root', password='arlenyu123', host='localhost', database='twitter')
mycursor = conn.cursor()

class MyStreamListener(StreamListener):
    def on_status(self, status):
        if status.coordinates is not None:
            # for some reason the status text isn't being fed into the database properly... not sure why.
            mycursor.execute('INSERT INTO tweets (tweet, lat, lng) VALUES ({},{},{})'.format(
                status.text, status.coordinates['coordinates'][0], status.coordinates['coordinates'][1]))
        return True

    def on_error(self, status_code):
        if status_code == 403:
            print("The request is understood, but it has been refused or access is not allowed. Limit is maybe reached")
            return False
Please note that I am a beginner so any advice is appreciated.
You should never use string interpolation to create SQL commands. Use the parameter substitution that is provided by the SQL connector.
mycursor.execute('INSERT INTO tweets (tweet, lat, lng) VALUES (%s,%s,%s)',
(status.text, status.coordinates['coordinates'][0], status.coordinates['coordinates'][1]))
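As a side note (an assumption about your setup, not something covered in the question): mysql.connector does not autocommit by default, so you also need to commit for the inserted rows to persist:
conn.commit()  # persist the inserted tweet; mysql.connector does not autocommit by default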