I have a JSON column in a MySQL table that contains a multi-level JSON object. I can access the values at the first level using the function JSON_EXTRACT but I can't find how to go over the first level.
Here's my MySQL table:
CREATE TABLE ref_data_table (
    `id` INTEGER(11) AUTO_INCREMENT NOT NULL,
    `symbol` VARCHAR(12) NOT NULL,
    `metadata` JSON NOT NULL,
    PRIMARY KEY (`id`)
);
Here's my Python script:
import json
import mysql.connector
con = mysql.connector.connect(**config)
cur = con.cursor()
symbol = 'VXX'
metadata = {
    'tick_size': 0.01,
    'data_sources': {
        'provider1': 'p1',
        'provider2': 'p2',
        'provider3': 'p3'
    },
    'currency': 'USD'
}
sql = \
"""
INSERT INTO ref_data_table (symbol, metadata)
VALUES (%s, %s);
"""
cur.execute(sql, (symbol, json.dumps(metadata)))
con.commit()
The data is properly inserted into the MySQL table and the following statement in MySQL works:
SELECT symbol, JSON_EXTRACT(metadata, '$.data_sources')
FROM ref_data_table
WHERE symbol = 'VXX';
How can I request the value of 'provider3' in 'data_sources'?
Many thanks!
Try this:
'$.data_sources.provider3'
SELECT symbol, JSON_EXTRACT(metadata, '$.data_sources.provider3')
FROM ref_data_table
WHERE symbol = 'VXX';
The JSON_EXTRACT function in MySQL supports that: the '$' references the JSON root, while periods reference levels of nesting. In this JSON example:
{
"key": {
"value": "nested_value"
}
}
You could use JSON_EXTRACT(json_field, '$.key.value') to get "nested_value".
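As a usage sketch from the Python side (assuming the same config and table as in your script; JSON_UNQUOTE is standard MySQL and strips the surrounding double quotes from the extracted string):

import mysql.connector

con = mysql.connector.connect(**config)  # same config as in the question
cur = con.cursor()

# JSON_UNQUOTE turns the JSON string "p3" into the plain value p3
cur.execute(
    """
    SELECT symbol, JSON_UNQUOTE(JSON_EXTRACT(metadata, '$.data_sources.provider3'))
    FROM ref_data_table
    WHERE symbol = %s
    """,
    ('VXX',)
)
print(cur.fetchone())  # expected: ('VXX', 'p3')
con.close()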
Related
I'm new to using Python sqlite and parsing JSON files. I'm trying to create a three-table database using Python sqlite. I am able to create these tables using the code below, but now I need to populate them from a JSON file. How do I load the JSON file into the tables when there are multiple values (as seen below)? For example, there are multiple items, and I want to create a counter variable (itemid) for each item in the order.
import sqlite3
from sqlite3 import Error
def create_connection(db_file):
    """ create a database connection to the SQLite database
        specified by db_file
    :param db_file: database file
    :return: Connection object or None
    """
    conn = None
    try:
        conn = sqlite3.connect(db_file)
        return conn
    except Error as e:
        print(e)
    return conn
def create_table(conn, create_table_sql):
    """ create a table from the create_table_sql statement
    :param conn: Connection object
    :param create_table_sql: a CREATE TABLE statement
    :return:
    """
    try:
        c = conn.cursor()
        c.execute(create_table_sql)
    except Error as e:
        print(e)
def main():
    database = r"C:\sqlite\db\pythonsqlite.db"

    sql_create_items_table = """CREATE TABLE IF NOT EXISTS items (
                                    orderid integer,
                                    itemid integer,
                                    name text,
                                    price numeric,
                                    PRIMARY KEY (orderid, itemid));"""

    sql_create_charges_table = """CREATE TABLE IF NOT EXISTS charges (
                                      items_orderid integer,
                                      date datetime,
                                      subtotal numeric,
                                      taxes numeric,
                                      total numeric,
                                      FOREIGN KEY (items_orderid) REFERENCES items (orderid));"""

    sql_create_payment_table = """CREATE TABLE IF NOT EXISTS payment (
                                      items_orderid integer,
                                      card_type text,
                                      card_number integer,
                                      zip text,
                                      cardholder text,
                                      method text,
                                      FOREIGN KEY (items_orderid) REFERENCES items (orderid));"""

    # create a database connection
    conn = create_connection(database)

    # create tables
    if conn is not None:
        # create items table
        create_table(conn, sql_create_items_table)
        # create charges table
        create_table(conn, sql_create_charges_table)
        # create payment table
        create_table(conn, sql_create_payment_table)
    else:
        print("Error! cannot create the database connection.")

if __name__ == '__main__':
    main()
This is an example of the JSON file's entries:
{
    "orders": [
        {
            "items": [
                {
                    "name": "coffee",
                    "price": 2.75
                },
                {
                    "name": "espresso",
                    "price": 1.25
                }
            ],
            "charges": {
                "date": "04/01/21 11:10",
                "subtotal": 4.0,
                "taxes": 0.28,
                "total": 4.28
            },
            "payment": {
                "card_type": "visa",
                "last_4_card_number": "6072",
                "zip": "21213",
                "cardholder": "Andrew Luna",
                "method": "credit_card"
            }
        }
    ]
}
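A minimal sketch of the population step, assuming the tables above (with the composite (orderid, itemid) primary key, and the FOREIGN KEY clauses moved after the column definitions, since SQLite rejects the original forms), a file named orders.json holding that structure, and that orderid and itemid are simply generated with enumerate, since neither appears in the file; card_number is filled from the file's last_4_card_number field:

import json
import sqlite3

conn = sqlite3.connect(r"C:\sqlite\db\pythonsqlite.db")
cur = conn.cursor()

with open("orders.json") as f:  # assumed file name
    data = json.load(f)

for orderid, order in enumerate(data["orders"], start=1):
    # one row per item; itemid counts the items within this order
    for itemid, item in enumerate(order["items"], start=1):
        cur.execute(
            "INSERT INTO items (orderid, itemid, name, price) VALUES (?, ?, ?, ?)",
            (orderid, itemid, item["name"], item["price"]))

    charges = order["charges"]
    cur.execute(
        "INSERT INTO charges (items_orderid, date, subtotal, taxes, total) VALUES (?, ?, ?, ?, ?)",
        (orderid, charges["date"], charges["subtotal"], charges["taxes"], charges["total"]))

    payment = order["payment"]
    cur.execute(
        "INSERT INTO payment (items_orderid, card_type, card_number, zip, cardholder, method) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (orderid, payment["card_type"], payment["last_4_card_number"], payment["zip"],
         payment["cardholder"], payment["method"]))

conn.commit()
conn.close()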
I have created a Python script that creates a table in MySQL and another one that populates it with data from a JSON file.
Sample JSON file:
{
    "ansible_facts": {
        "ansible_network_resources": {
            "l3_interfaces": [
                {
                    "name": "GigabitEthernet0/0"
                },
                {
                    "name": "GigabitEthernet0/0.100",
                    "ipv4": [
                        {
                            "address": "172.1.1.1 255.255.255.252"
                        }
                    ]
                },
                {
                    "name": "GigabitEthernet0/0.101",
                    "ipv4": [
                        {
                            "address": "172.1.1.1 255.255.255.252"
                        }
                    ]
                },
                {
                    "name": "GigabitEthernet0/1",
                    "ipv4": [
                        {
                            "address": "56.2.1.1 255.255.255.252"
                        }
                    ]
                },
                {
                    "name": "GigabitEthernet0/2"
                }
            ]
        },
        "ansible_net_python_version": "3.6.9",
        "ansible_net_hostname": "host02342-mpls",
        "ansible_net_model": "CISCO-CHA",
        "ansible_net_serialnum": "F1539AM",
        "ansible_net_gather_subset": [
            "default"
        ],
        "ansible_net_gather_network_resources": [
            "l3_interfaces"
        ],
        "ansible_net_version": "15.3(2)T",
        "ansible_net_api": "cliconf",
        "ansible_net_system": "ios",
        "ansible_net_image": "flash0:/c3900-universalk9-mz.spa.153-2.t.bin",
        "ansible_net_iostype": "IOS"
    }
}
Table creation script
import mysql.connector
mydb = mysql.connector.connect(host="IPaddress", user="user", password="pw", database="db")
mycursor = mydb.cursor()
mycursor.execute("CREATE TABLE Routers (ansible_net_hostname NVARCHAR(255), ansible_net_model NVARCHAR(255), ansible_network_resources NVARCHAR(255))")
The script to import JSON data into MySQL
import json, pymysql

json_data = open("L3_out.json").read()
json_obj = json.loads(json_data)

con = pymysql.connect(host="IPaddress", user="user", password="pw", database="db")
cursor = con.cursor()

for item in json_obj:
    ansible_net_hostname = item.get("ansible_net_hostname")
    ansible_net_model = item.get("ansible_net_model")
    ansible_network_resources = item.get("ansible_network_resources")
    cursor.execute(
        "insert into Routers(ansible_net_hostname, ansible_net_model, ansible_network_resources) values(%s, %s, %s)",
        (ansible_net_hostname, ansible_net_model, ansible_network_resources)
    )

con.commit()
con.close()
I'm having issues importing ansible_network_resources field object into the Routers table. The other columns (ansible_net_hostname, ansible_net_model) get inserted perfectly. What am I doing wrong?
First of all, it's not clear how the following works:
for item in json_obj:
    ansible_net_hostname = item.get("ansible_net_hostname")
'item' in your case is a key of the dictionary, and in the file you've shown there is only one root key, "ansible_facts", so you are actually calling get() on a string.
To get the data of "ansible_network_resources" do the following:
for key in json_obj:
    ansible_network_resources = json_obj[key].get("ansible_network_resources")
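Note that "ansible_network_resources" is itself a nested object, and a dict can't be bound directly as a query parameter, so the insert would still fail for that column. One plausible fix (an assumption about your intent, not the only option) is to serialize it with json.dumps first, and to make sure the column is wide enough for the serialized JSON (NVARCHAR(255) may be too small):

import json

for key in json_obj:
    facts = json_obj[key]
    cursor.execute(
        "insert into Routers(ansible_net_hostname, ansible_net_model, ansible_network_resources) values(%s, %s, %s)",
        (
            facts.get("ansible_net_hostname"),
            facts.get("ansible_net_model"),
            # serialize the nested dict to a JSON string before binding
            json.dumps(facts.get("ansible_network_resources")),
        )
    )
con.commit()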
I have a table that contains an item with the following attributes:
{
"country": "USA",
"names": [
"josh",
"freddy"
],
"phoneNumber": "123",
"userID": 0
}
I'm trying to query an item in DynamoDB by looking for a name using Python. So I would write in my code that the item I need has "freddy" in the field "names".
I saw many forums mentioning "contains" but none that show an example...
My current code is the following:
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('users_table')
data = table.query(
    FilterExpression='names = :name',
    ExpressionAttributeValues={
        ":name": "freddy"
    }
)
I obviously cannot use that because "names" is a list and not a string field.
How can I look for "freddy" in names?
Since the names field isn't part of the primary key, you can't use query. The only way to look for an item by names is to use scan.
import boto3
from boto3.dynamodb.conditions import Attr

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('users_table')

data = table.scan(
    FilterExpression=Attr('names').contains('freddy')
)
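One caveat: a single scan call reads at most 1 MB of data, so on a larger table you have to paginate with LastEvaluatedKey to see every match. A minimal sketch:

import boto3
from boto3.dynamodb.conditions import Attr

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('users_table')

# Follow pagination until the whole table has been scanned.
items = []
response = table.scan(FilterExpression=Attr('names').contains('freddy'))
items.extend(response['Items'])
while 'LastEvaluatedKey' in response:
    response = table.scan(
        FilterExpression=Attr('names').contains('freddy'),
        ExclusiveStartKey=response['LastEvaluatedKey']
    )
    items.extend(response['Items'])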
I have a database structure like this:
CREATE TABLE person (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL,
    age INTEGER NOT NULL,
    hometown_id INTEGER REFERENCES town(id)
);

CREATE TABLE town (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL,
    population INTEGER NOT NULL
);
And I want to get the following result when selecting:
{
    "name": "<person.name>",
    "age": "<person.age>",
    "hometown": {
        "name": "<town.name>",
        "population": "<town.population>"
    }
}
I'm already using psycopg2.extras.DictCursor, so I think I need to play with SQL's SELECT AS.
Here's an example of what I tried with no result; I've tried many similar variants with minor adjustments, all of them raising different errors:
SELECT
person(name, age),
town(name, population) as town,
FROM person
JOIN town ON town.id = person.hometown_id
Any way to do this, or should I just select all columns individually and build the dict inside of Python?
Postgres version info:
psql (9.4.6, server 9.5.2)
WARNING: psql major version 9.4, server major version 9.5.
Some psql features might not work.
Something like this?
t=# with t as (
select to_json(town),* from town
)
select json_build_object('name',p.name,'age',age,'hometown',to_json) "NameItAsYou Wish"
from person p
join t on t.id=p.hometown_id
;
NameItAsYou Wish
--------------------------------------------------------------------------------
{"name" : "a", "age" : 23, "hometown" : {"id":1,"name":"tn","population":100}}
(1 row)
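And in case it helps, a hedged sketch of consuming that query from Python with the psycopg2 DictCursor you mentioned (connection parameters are placeholders; json_build_object and to_json both exist as of PostgreSQL 9.4, so your 9.5 server is fine):

import psycopg2
import psycopg2.extras

conn = psycopg2.connect("dbname=mydb user=me")  # placeholder connection string
cur = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)

cur.execute("""
    SELECT json_build_object(
               'name', p.name,
               'age', p.age,
               'hometown', to_json(t)
           ) AS person_json
    FROM person p
    JOIN town t ON t.id = p.hometown_id
""")

for row in cur.fetchall():
    # psycopg2 decodes json columns to Python dicts automatically
    print(row['person_json'])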
I want to return a SQL query output with the column names as JSON, so I can build a table on the client side, but I have not found a solution for this.
My code:
json_data = json.dumps(c.fetchall())
return json_data
The desired output looks like this:
{
    "name": "Toyota1",
    "product": "Prius",
    "color": [
        "white pearl",
        "Red Methalic",
        "Silver Methalic"
    ],
    "type": "Gen-3"
}
Does anyone know a solution?
Your code only returns the values. To also get the column names, you need to query a table called 'sqlite_master', which holds the SQL string that was used to create each table.
c.execute("SELECT sql FROM sqlite_master WHERE " \
"tbl_name='your_table_name' AND type = 'table'")
create_table_string = cursor.fetchall()[0][0]
This will give you a string from which you can parse the column names:
"CREATE TABLE table_name (columnA text, columnB integer)"