First of all, I know that my question looks like a duplicate of this question, but I don't think it is the same.
I need to save a user's "search filter". As I understand it, the Django ORM creates backend-specific SQL for each database, so if I save the SQL query I can't migrate to another database with a different SQL syntax.
Am I wrong? If not, how can I save the Django side of the query, without touching the database layer?
The short answer is that you're correct -- mostly. If the SQL dialect that Django compiled the query for isn't compatible with a different backend, it wouldn't work or might work unpredictably.
To save the Django side of the query, why not just save the actual filter() statement that you're using or a representation of it that you can convert back on the fly?
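For example, something along these lines (a rough sketch; SavedSearch with a params text field and Article are assumed names, not anything from your code):
import json

# SavedSearch and Article are placeholder models for illustration.
# Store a serialisable representation of the filter arguments, not the SQL.
filter_kwargs = {"title__icontains": "django", "published": True}
SavedSearch.objects.create(params=json.dumps(filter_kwargs))

# Later, rebuild the queryset on whatever backend is in use.
saved = SavedSearch.objects.get(pk=1)
queryset = Article.objects.filter(**json.loads(saved.params))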
Edit: Okay, in that case I think you're on the right track, based on the comments and the answer above. If you're already parsing a query string, save it in the database as a CharField and then just use it to build a Django QuerySet when you retrieve it, if I'm understanding you correctly.
If you can suggest a better solution, I'm open to discussion.
So... pickling the .filter() call is not a great idea, and neither is saving an SQL string tied to a specific database. I think the best solution to this problem is saving the search parameters themselves. In my case that's the GET query string, which I get with:
request.META["QUERY_STRING"]
and save it to the DB.
When I need it back, I just parse it:
from django.http import QueryDict
QueryDict(request.META["QUERY_STRING"])
Additionally, I use a separate form, SearchTrustedForm(), to validate these values (optional), so that if the data structure changes I can keep backwards compatibility.
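Putting it together, a rough sketch of the whole flow (SavedFilter with a query_string field and Article are assumed names; SearchTrustedForm is the form mentioned above):
from django.http import QueryDict

# Saving the user's search (SavedFilter is a placeholder model for illustration)
SavedFilter.objects.create(user=request.user,
                           query_string=request.META["QUERY_STRING"])

# Restoring it later and validating it through the separate form
saved = SavedFilter.objects.filter(user=request.user).latest("id")
form = SearchTrustedForm(QueryDict(saved.query_string))
if form.is_valid():
    results = Article.objects.filter(**form.cleaned_data)  # Article is a placeholder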
Related
I have scraped data from a website using their API in a Django application. The data is JSON (a Python dictionary once I retrieve it on my end) and has many, many fields. I want to store it in a database so that I can create endpoints that allow lookups and modifications (updates). I need to use their fields to create the structure of my database. Any help with this, or with how to tackle it, would be greatly appreciated. I apologize if my question is not concise enough; please let me know if there is anything I need to specify.
I have seen many, many people saying to just populate it, such as in this example: How to populate a Django sqlite3 database. The issue is that there are so many fields that I can't go and create the Django model fields myself. From what I have read, it seems I might be able to use serializers.ModelSerializer, although that seems to just populate a pre-existing database with an already-defined model.
Tricky to answer without details, but I would consider doing this in two steps: first, convert your JSON data to a database schema, for example using a tool like sqlify: https://sqlify.io/convert/json/to/sqlite
Then create a database from the generated schema file and use inspectdb to generate your Django models: https://docs.djangoproject.com/en/2.2/ref/django-admin/#inspectdb
You'll probably need to tweak the generated schema and/or models, but this should go a long way towards automating the process.
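For the second step, a minimal sketch, assuming the generated schema has been loaded into a second database registered under the alias "legacy" in settings.DATABASES (the alias and output file name are assumptions):
from django.core.management import call_command

# Run inspectdb against the legacy database and capture the generated
# model definitions so they can be pasted into models.py and tweaked.
with open("generated_models.py", "w") as out:
    call_command("inspectdb", database="legacy", stdout=out)
The equivalent from the shell is python manage.py inspectdb --database=legacy > generated_models.py.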
I would go for a document database, like Elasticsearch or MongoDB.
Those are made for this kind of situation, look it up.
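For instance, a rough sketch with pymongo (the connection URL and the database/collection names are assumptions); a document store accepts the scraped dicts as-is, no matter how many fields they have:
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
records = client["scraper"]["api_records"]

# scraped_record stands in for the dict you get back from the API
scraped_record = {"id": 1, "title": "example", "nested": {"fields": "anything"}}
records.insert_one(scraped_record)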
I declared name, mobile and age in models.py and added the same fields to the database. How do I insert JSON data into a MySQL database using the Python Django framework? I tried this, but it's not working:
js_data = session.query(tbl_users).create({'name' : 'mmmmm','mobile':'123456','age':'35'})
session.add(js_data)
session.commit()
Firstly, please choose a better title and tags for your question. This question isn't really about JSON, it's about inserting values into the database. Even more importantly, your code does not appear to be Django at all; it seems to be SQLAlchemy. So asking about JSON and Django is unlikely to get you much help.
Now that we know it's SQLAlchemy, we can look at the documentation for how to add objects. There's no reference to query or create there at all. It seems like it should be:
# Unpack the dict straight into the mapped class's constructor
js_data = tbl_users(**{'name': 'mmmmm', 'mobile': '123456', 'age': '35'})
session.add(js_data)
session.commit()
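If you actually want this in Django rather than SQLAlchemy, the equivalent would look roughly like the sketch below; the model is an assumption based on the fields in your question:
from django.db import models

class TblUsers(models.Model):
    name = models.CharField(max_length=100)
    mobile = models.CharField(max_length=20)
    age = models.IntegerField()

# Unpack the incoming JSON dict straight into create()
data = {"name": "mmmmm", "mobile": "123456", "age": 35}
TblUsers.objects.create(**data)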
Another option to solve your problem (inserting JSON data into MySQL) is to install and use one of the JSON field packages listed here:
https://www.djangopackages.com/grids/g/json-fields/
This might sound like a bit of an odd question - but is it possible to load data from a (in this case MySQL) table to be used in Django without the need for a model to be present?
I realise this isn't really the Django way, but given my current scenario, I don't really know how better to solve the problem.
I'm working on a site which, for one aspect, makes use of a table of data that has been bought from a third party. The columns of interest are likely to remain stable, but the structure of the table could change with subsequent updates to the data set. The table is also massive (in terms of columns), so I'm not keen on typing out each field in the model one by one. I'd also like to leave the table intact, so coming up with a model that represents only the subset of columns I'm interested in isn't really an ideal solution either.
Ideally, I want to have this table in a database somewhere (possibly separate to the main site database) and access its contents directly using SQL.
You can always execute raw SQL directly against the database: see the docs.
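A minimal sketch of that, assuming a hypothetical table name and columns:
from django.db import connection

with connection.cursor() as cursor:
    # "thirdparty_table" and its columns are placeholders for illustration
    cursor.execute("SELECT id, name FROM thirdparty_table WHERE name = %s", ["example"])
    rows = cursor.fetchall()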
Django has a feature called inspectdb for exactly this: for legacy databases like MySQL, it creates models automatically by inspecting your DB tables and prints model definitions you can save into your app's models.py, so you don't need to type out every column manually. Do read the documentation carefully before adopting the generated models, though; by default they are created with managed = False and usually need some manual cleanup. I hope this is useful for you.
I guess you can use any SQL library available for Python, for example SQLAlchemy: http://www.sqlalchemy.org/
You then just connect to your database, run your queries, and use the data as you wish. I don't think you can use Django's ORM without its model system, but nothing prevents you from using another library for this in parallel.
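A rough sketch with SQLAlchemy (the connection URL and table name are assumptions), run alongside Django without defining any models:
from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://user:password@localhost/thirdparty_db")
with engine.connect() as conn:
    # Plain SQL against the bought-in table; no ORM mapping required
    for row in conn.execute(text("SELECT * FROM purchased_data LIMIT 10")):
        print(row)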
I have been playing around with Django for a couple of days and it seems great, but I find it a pain when I want to change the structure of my database; I'm then stuck with a few rather awkward options.
Is there a way to completely bypass Django's database abstraction, so that if I change the structure of the database I don't have to guess which model would have generated it, or use a tool (South or similar) to change things?
I essentially want this: https://docs.djangoproject.com/en/dev/topics/db/sql/ (Raw SQL Queries), but instead of referring to a model, referring to an external database.
Could I just create an empty model and then only perform raw queries on it (and set up the DB externally)?
Thanks
P.S. I don't really mind if I have separate databases for the admin stuff and the app data.
It's in your question already; just read the docs article from here: Executing custom SQL directly.
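Since you mentioned keeping the data separate, a minimal sketch (the "external" alias, connection settings and table name are assumptions; credentials are omitted):
# In settings.py: a second entry keeps the external data out of the site DB
DATABASES = {
    "default": {"ENGINE": "django.db.backends.mysql", "NAME": "site_db"},
    "external": {"ENGINE": "django.db.backends.mysql", "NAME": "thirdparty_db"},
}

# Anywhere in your code: query it directly, no model needed
from django.db import connections

with connections["external"].cursor() as cursor:
    cursor.execute("SELECT * FROM legacy_table LIMIT 10")
    rows = cursor.fetchall()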
Two questions:
I want to generate a view in my PostGIS DB. How do I add this view to my geometry_columns table?
What do I have to do to use a view with SQLAlchemy? Is there a difference between a table and a view as far as SQLAlchemy is concerned, or can I use a view the same way I use a table?
Sorry for my poor English.
If anything about my question is unclear, please feel free to ask and I'll try to explain it another way :)
Nico
Table objects in SQLAlchemy have two roles. They can be used to issue DDL commands to create the table in the database. But their main purpose is to describe the columns and types of tabular data that can be selected from and inserted to.
If you only want to select, then a view looks to SQLAlchemy exactly like a regular table. It's enough to describe the view as a Table with the columns that interest you (you don't even need to describe all of the columns). If you want to use the ORM you'll need to declare for SQLAlchemy that some combination of the columns can be used as the primary key (anything that's unique will do). Declaring some columns as foreign keys will also make it easier to set up any relations. If you don't issue create for that Table object, then it is just metadata for SQLAlchemy to know how to query the database.
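A minimal sketch of describing an existing PostGIS view that way (the view name and columns are assumptions; the geometry type comes from GeoAlchemy2):
from sqlalchemy import MetaData, Table, Column, Integer, String
from geoalchemy2 import Geometry

metadata = MetaData()

# Describe the existing view as if it were a table; as long as no CREATE is
# issued, this is just metadata that tells SQLAlchemy how to SELECT from it.
parcels_view = Table(
    "parcels_view", metadata,
    Column("id", Integer, primary_key=True),  # the ORM needs something unique as a key
    Column("name", String),
    Column("geom", Geometry("POLYGON")),
)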
If you also want to insert to the view, then you'll need to create PostgreSQL rules or triggers on the view that redirect the writes to the correct location. I'm not aware of a good usage recipe to redirect writes on the Python side.