Store json file in MS SQL - python

I have a JSON file. How do I store it in MS SQL, and then read the file's data back after storing it in the database?
I'm using a python script to interact with SQL Server.
Note: I don't want to store the key-value pairs as individual records in the DB; I want to store the whole file in the DB using Python.

There is no specific data type for JSON in SQL Server, unlike, say, XML, which has its own xml data type.
If you are storing JSON data in SQL Server, then you will want to use nvarchar(MAX). If you are on SQL Server 2016+, I also recommend adding a CHECK constraint to the column to ensure the JSON is valid, as otherwise parsing it (in SQL) will be impossible. You can check whether a value is valid JSON using ISJSON. For example, if you were adding the column to an existing table:
ALTER TABLE dbo.YourTable ADD YourJSON nvarchar(MAX) NULL;
GO
ALTER TABLE dbo.YourTable ADD CONSTRAINT chk_YourTable_ValidJSON CHECK (ISJSON(YourJSON) = 1 OR YourJSON IS NULL);
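From Python, a minimal sketch of writing a whole JSON file into such a column. The connection string is a placeholder and `load_and_validate`/`store_json` are hypothetical helper names; only the table/column names come from the example above:

```python
import json

def load_and_validate(path):
    """Read the whole JSON file as text and confirm it parses before storing it."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    json.loads(text)  # raises ValueError if the file is not valid JSON
    return text

def store_json(cursor, json_text):
    # Parameterized INSERT into the nvarchar(MAX) column defined above
    cursor.execute("INSERT INTO dbo.YourTable (YourJSON) VALUES (?)", json_text)

# Usage sketch (requires pyodbc and a reachable SQL Server):
# import pyodbc
# conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=...;DATABASE=...")
# store_json(conn.cursor(), load_and_validate("data.json"))
# conn.commit()
```

Reading it back is the reverse: SELECT the YourJSON column and, if you need the parsed structure, pass the returned string to json.loads.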

Contrary to what another answer claims, SQL Server has no native JSON data type (through SQL Server 2022), so you store the document as a string with NVARCHAR (or VARCHAR).
This article reckons NVARCHAR(MAX) is the answer for documents larger than 8 KB; for documents under that size you can use NVARCHAR(4000), which reportedly has better performance.

Related

storing sql in config file

I am using a big nested sql query in python and need to run that query using different dates (date is a field name used in the query).
I also need to change the table name (assume that there are various tables where I need to update the date or insert new records).
Now I find this a bit clunky and want to move the query into a config (or .ini) file. I also want the user to be able to change the query easily without opening the code.
I can read the SQL from the file, but Python does not substitute the variables inside the loaded string.
For example in .ini file the sql is stored as [SQL]:
p_insert_query = insert into + tbl_p + <...nested sql>
I read this in Python with tbl_p already defined as 'My_tbl', but the query string does not pick up the table name.
Is there any other way to do this?
You could store a .sql or .txt file containing a "parameterized query".
If you use the psycopg2 library, you can do it this way (as described in the docs: http://initd.org/psycopg/docs/sql.html):
from psycopg2 import sql
my_query_text = "insert into {} values (%s, %s)"  # load this string from a .sql or .txt file instead
tbl_p = "my_table_name"
# cur is an open psycopg2 cursor; sql.Identifier() safely quotes the table name
cur.execute(sql.SQL(my_query_text).format(sql.Identifier(tbl_p)), [10, 20])  # [10, 20] are sample values
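To get the template out of the .ini file in the first place, the standard configparser module works. One wrinkle: configparser treats % as its own interpolation character, so the %s placeholders must be escaped as %%s in the file. The {table} placeholder and the inline INI text below are assumptions for this sketch; in practice the text would live in the .ini file:

```python
import configparser

# What the [SQL] section of query.ini would contain (note the %% escaping):
INI_TEXT = """\
[SQL]
p_insert_query = insert into {table} values (%%s, %%s)
"""

config = configparser.ConfigParser()
config.read_string(INI_TEXT)            # in practice: config.read("query.ini")
template = config["SQL"]["p_insert_query"]
query = template.format(table="My_tbl")  # substitute the table name at run time
print(query)  # insert into My_tbl values (%s, %s)
```

The formatted string still contains %s placeholders, so the values themselves are passed separately to cursor.execute as usual.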

saving python object in postgres table with pickle

I have a python script which creates some objects.
I would like to be able to save these objects into my postgres database for use later.
My thinking was I could pickle an object, then store that in a field in the db.
But I'm going round in circles about how to store and retrieve and use the data.
I've tried storing the pickle binary string as text but I can't work out how to encode / escape it. Then how to load the string as a binary string to unpickle.
I've tried storing the data as bytea both with psycopg2.Binary(data) and without.
Then reading into buffer and encoding with base64.b64encode(result) but it's not coming out the same and cannot be unpickled.
Is there a simple way to store and retrieve python objects in a SQL (postgres) database?
Following the comment from @SergioPulgarin I tried the following, which worked!
N.B. Edit 2 follows the comment by @Tomalak.
Storing:
Pickle the object to a binary string
pickle_string = pickle.dumps(object)
Store the pickled string in a bytea (binary) field in Postgres, using a simple INSERT query in psycopg2.
Retrieval:
Select the field with psycopg2 (a simple SELECT query)
Unpickle the retrieved value (psycopg2 returns bytea as a buffer/memoryview, which pickle.loads accepts directly)
retrieved_object = pickle.loads(retrieved_pickle_string)
Hope that helps anybody trying to do something similar!
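The steps above can be sketched end to end. This sketch uses an in-memory sqlite3 database so it is self-contained and runnable; with psycopg2 the pattern is identical, except the column type is bytea and the placeholder is %s instead of ?:

```python
import pickle
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE objects (id INTEGER PRIMARY KEY, data BLOB)")

obj = {"name": "example", "values": [1, 2, 3]}

# Storing: pickle to bytes, insert into the binary column
db.execute("INSERT INTO objects (data) VALUES (?)", (pickle.dumps(obj),))

# Retrieval: select the bytes back and unpickle them directly
blob = db.execute("SELECT data FROM objects").fetchone()[0]
restored = pickle.loads(blob)
print(restored == obj)  # True
```

No base64 or text encoding is needed as long as the column is a true binary type and the raw bytes go in and out untouched.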

Entire JSON into One SQLite Field with Python

I have what is likely an easy question. I'm trying to pull a JSON from an online source, and store it in a SQLite table. In addition to storing the data in a rich table, corresponding to the many fields in the JSON, I would like to also just dump the entire JSON into a table every time it is pulled.
The table looks like:
CREATE TABLE Raw_JSONs (ID INTEGER PRIMARY KEY ASC, T DATE DEFAULT (datetime('now','localtime')), JSON text);
I've pulled a JSON from some URL using the following python code:
from pyquery import PyQuery
from lxml import etree
import urllib
x = PyQuery(url='json')
y = x('p').text()
Now, I'd like to execute the following INSERT command:
import sqlite3
db = sqlite3.connect('a.db')
c = db.cursor()
c.execute("insert into Raw_JSONs values(NULL,DATETIME('now'),?)", y)
But I'm told that I've supplied the incorrect number of bindings (i.e. thousands, instead of just 1). I gather it's reading the y variable as all the different elements of the JSON.
Can someone help me store just the JSON, in its entirety?
Also, as I'm obviously new to this JSON game, any online resources to recommend would be amazing.
Thanks!
.execute() expects a sequence, better give it a one-element tuple:
c.execute("insert into Raw_JSONs values(NULL,DATETIME('now'),?)", (y,))
A Python string is a sequence too, one of individual characters. So the .execute() call tried to treat each separate character as a parameter for your query, and unless your string is one character short that means it'll not provide the right number of parameters.
Don't forget to commit your inserts:
db.commit()
or use the database connection as a context manager:
with db:
    # inserts executed here will automatically commit if no exceptions are raised
You may also be interested in the sqlite3 module's built-in adapters. These can convert any Python object to an SQLite column and back. See the adapter section of the standard sqlite3 documentation.
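Putting the pieces together as a self-contained sketch (in-memory sqlite3 database; the sample document below stands in for the JSON pulled from the URL):

```python
import json
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE Raw_JSONs (
    ID INTEGER PRIMARY KEY ASC,
    T DATE DEFAULT (datetime('now', 'localtime')),
    JSON text)""")

y = json.dumps({"status": "ok", "items": [1, 2]})  # the whole document as one string

with db:  # context manager: commits automatically on success
    # (y,) is the crucial one-element tuple
    db.execute("INSERT INTO Raw_JSONs VALUES (NULL, DATETIME('now'), ?)", (y,))

stored = db.execute("SELECT JSON FROM Raw_JSONs").fetchone()[0]
print(json.loads(stored)["status"])  # ok
```

The whole JSON lands in a single row, and json.loads turns it back into Python structures on the way out.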

How to read a csv using sql

I would like to know how to read a CSV file using SQL. I would like to use GROUP BY and join other CSV files together. How would I go about this in Python?
example:
select * from csvfile.csv where name LIKE 'name%'
SQL code is executed by a database engine. Python does not directly understand or execute SQL statements.
While some SQL databases store their data in CSV-like files, almost all of them use more complicated file structures. Therefore, you need to import each CSV file into a separate table in the SQL database engine. You can then use Python to connect to the engine and send it SQL statements (such as SELECT). The engine will execute the SQL, extract the results from its data files, and return them to your Python program.
The most common lightweight engine is SQLite.
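A minimal sketch of that import-then-query workflow with SQLite and the standard csv module. The column names and inline CSV text are made up for the example; a real file would be opened instead of the io.StringIO stand-in:

```python
import csv
import io
import sqlite3

CSV_DATA = "name,amount\nalice,10\nalbert,5\nbob,7\n"  # stands in for csvfile.csv

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE csvfile (name TEXT, amount INTEGER)")

# Import each CSV row into the table
rows = csv.DictReader(io.StringIO(CSV_DATA))
db.executemany("INSERT INTO csvfile VALUES (?, ?)",
               [(r["name"], int(r["amount"])) for r in rows])

# Now ordinary SQL works, including LIKE, GROUP BY, and joins:
for name, amount in db.execute("SELECT name, amount FROM csvfile WHERE name LIKE 'al%'"):
    print(name, amount)
```

Additional CSV files go into further tables the same way, after which they can be joined with regular SQL.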
littletable is a Python module I wrote for working with lists of objects as if they were database tables, but using a relational-like API, not actual SQL select statements. Tables in littletable can easily read and write from CSV files. One of the features I especially like is that every query from a littletable Table returns a new Table, so you don't have to learn different interfaces for Table vs. RecordSet, for instance. Tables are iterable like lists, but they can also be selected, indexed, joined, and pivoted - see the opening page of the docs.
# print a particular customer name
# (unique indexes will return a single item; non-unique
# indexes will return a Table of all matching items)
print(customers.by.id["0030"].name)
print(len(customers.by.zipcode["12345"]))

# print all items sold by the pound
for item in catalog.where(unitofmeas="LB"):
    print(item.sku, item.descr)

# print all items that cost more than 10
for item in catalog.where(lambda o: o.unitprice > 10):
    print(item.sku, item.descr, item.unitprice)

# join tables to create queryable wishlists collection
wishlists = customers.join_on("id") + wishitems.join_on("custid") + catalog.join_on("sku")

# print all wishlist items with price > 10
bigticketitems = wishlists().where(lambda ob: ob.unitprice > 10)
for item in bigticketitems:
    print(item)
Columns of Tables are inferred from the attributes of the objects added to the table. namedtuples work well, as does types.SimpleNamespace. You can insert dicts into a Table, and they will be converted to SimpleNamespaces.
littletable takes a little getting used to, but it sounds like you are already thinking along a similar line.
You can easily query a SQL database using a PHP script. PHP runs server-side, so all your code will have to be on a webserver (the one with the database). You could make a function to connect to the database like this:
$con= mysql_connect($hostname, $username, $password)
or die("An error has occured");
Then use $con to accomplish other tasks such as looping through data and creating a table, or even adding rows and columns to an existing table.
EDIT: I noticed you said .CSV file. You can upload a CSV file into a SQL database and create a table from it. If you are using a control-panel service such as phpMyAdmin, you can simply import the CSV file into your database through its Import feature.
If you are looking for a free web host to test your SQL and PHP files on, check out x10 hosting.

load data into table in Mysql

I created 4 columns for a table
cur.execute("""CREATE TABLE videoinfo (
id INT UNSIGNED PRIMARY KEY AUTO INCREMENT,
date DATETIME NOT NULL,
src_ip CHAR(32),
hash CHAR(150));
""")
I have a .txt file which has three columns of data inside. I want to use the LOAD DATA LOCAL INFILE command to insert the data, but the problem is that the table I created has four columns, the first being the id. Can MySQL automatically insert the data starting from the second column, or is an extra command needed?
Many thanks!
AUTO INCREMENT isn't valid syntax. If you check MySQL's documentation for the CREATE TABLE statement, you'll see the proper keyword is AUTO_INCREMENT.
Additionally, date is a keyword, so you'll need to quote it with backticks, as mentioned on the MySQL identifier documentation page. The documentation also lists all keywords, which must be quoted to use them as identifiers. To be safe, you could simply quote all identifiers.
To insert data only into some columns, you can explicitly specify columns. For LOAD DATA INFILE:
LOAD DATA INFILE 'file_name'
INTO TABLE videoinfo
(`date`, src_ip, hash)
For the INSERT statement:
INSERT INTO videoinfo (`date`, src_ip, hash)
VALUES (...);
This, too, is revealed in the MySQL manual. Notice a pattern?
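The explicit-column-list idea can be checked with a quick self-contained sketch. This uses sqlite3 rather than MySQL (SQLite's auto-increment spelling is INTEGER PRIMARY KEY instead of AUTO_INCREMENT, and the data values are placeholders), but the behavior is the same: list only the non-id columns and the engine fills id automatically:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE videoinfo (
    id INTEGER PRIMARY KEY,
    date TEXT NOT NULL,
    src_ip TEXT,
    hash TEXT)""")

# Naming only (date, src_ip, hash) leaves id to be generated,
# just like the column list given to LOAD DATA INFILE above.
db.execute("INSERT INTO videoinfo (date, src_ip, hash) VALUES (?, ?, ?)",
           ("2024-01-01 00:00:00", "10.0.0.1", "abc123"))

row = db.execute("SELECT id, src_ip FROM videoinfo").fetchone()
print(row)  # (1, '10.0.0.1')
```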
