Basically, I've created a view to populate my database with Serial models from 0000 to 9999. Below is the code I'm using for the view.
def insert_serials(request):
    for i in range(0, 10000):
        serial = Serial(i, False)
        serial.save()
    else:
        print 'The for loop is over'
What is the right way to do this? I'm getting an IntegrityError (duplicate keys). My model definition is below:
class Serial(models.Model):
    serial = models.CharField(max_length=4)
    closed = models.BooleanField()

    def __unicode__(self):
        return "%s" % (self.serial)

    def get_absolute_url(self):
        return "/draw/serial/%s/" % (self.serial)
Your code is working on my site (Mac OS X, Python 2.6.3, Django from trunk, sqlite3). I changed your view function code a bit, though:
from django.http import HttpResponse
from models import Serial

def insert_serials(request):
    for i in range(0, 10000):
        serial = Serial(i, False)
        serial.save()
    return HttpResponse("Serials are inserted")
The positional arguments may not be mapping to the fields you expect; try using keywords:
from django.db import transaction

@transaction.commit_manually
def insert_serials(request):
    for i in range(0, 10000):
        serial = Serial(serial=str(i), closed=False)
        serial.save()
    transaction.commit()
    print 'The for loop is over'
It's wrapped in a transaction, which should speed it up a bit. See transaction.commit_manually for details.
Your id field (implied by the absence of a PK definition in your model) is not being autonumbered correctly and therefore every INSERT after the first is failing with a duplicate id value. What's your database? Did you have Django create the table, or did you do it yourself?
Try adding unique=False in the closed field declaration.
Also, you're trying to put integers into a string field. You should do it like Serial('%04d' % i, False) to put values from '0000' to '9999'.
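As a quick check of that zero-padding suggestion, here is the formatting step in plain Python, independent of Django:

```python
# '%04d' left-pads an integer with zeros to four characters,
# producing the '0000'-'9999' strings the CharField expects.
serials = ['%04d' % i for i in (0, 7, 42, 9999)]
print(serials)  # ['0000', '0007', '0042', '9999']
```

`str(i).zfill(4)` is an equivalent alternative.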
I am trying to create a web page that displays a status that is uploaded to a MySQL server from a sonic sensor.
I've tried many different arrangements of the selections but none seem to work.
When I created my first Django project I followed the tutorial and their example, selecting objects created in the past day seemed to work.
return self.pub_date >= timezone.now() - datetime.timedelta(days=1)
In my project, I just changed days to 7 and I changed around >= to be <. Then I started getting "TypeError: unorderable types: DeferredAttribute() < datetime.datetime()".
My models.py:
from django.db import models
import datetime
from django.utils import timezone

class garage_door(models.Model):
    date = models.DateTimeField(auto_now_add=True)
    state = models.CharField(max_length=6)

    def __str__(self):
        return str(self.date)[:19]

    def select_old(self):
        return self.date < timezone.now() - datetime.timedelta(days=7)
My views.py where I am calling this function:
from django.shortcuts import render
from django.http import HttpResponse
from dash.models import garage_door

def dashboard(request):
    garage_door.select_old(garage_door).delete()
    return HttpResponse("Temp Response")
I want the program to (as stated above) delete all objects older than 7 days, but instead I get this error. What I take from it is that I am comparing two incomparable types, but I don't know how to fix that, and I don't even know if my guess is correct.
Can it have something to do with the date being inserted into the table from a different Raspberry Pi, like this?

sql = "INSERT INTO dash_garage_door (date, state) VALUES (%s, %s)"
now = datetime.datetime.now()
val = (now, g_state)
mycursor.execute(sql, val)
mydb.commit()
(The state is either "open" or "closed")
I would be very thankful for any help! If anything else is needed to solve this problem, I would be happy to provide it.
No it has nothing to do with how the data was inserted. It's that you are not doing a query.
Your method needs to be called on an instance of garage_door, and will return whether that particular instance is more than seven days old. But you are calling it on the class, apparently expecting it will query the database for all matching items. But that's not how Django works at all.
You need to do this in a filter expression:
seven_days_ago = timezone.now() - datetime.timedelta(days=7)
garage_door.objects.filter(date__lt=seven_days_ago)
You can wrap this into a custom manager:
class GarageManager(models.Manager):
    def select_old(self):
        seven_days_ago = timezone.now() - datetime.timedelta(days=7)
        return self.filter(date__lt=seven_days_ago)

class garage_door(models.Model):
    ...
    objects = GarageManager()
Now you can do:
garage_door.objects.select_old().delete()
Note, according to Python style your model should be called GarageDoor, not garage_door.
I'm trying to create an object from the console but not sure how to set that up.
This is my modelManager:
from django.db import models, IntegrityError

class MajorManager(models.Manager):
    def __str__(self):
        return self.name

    def createMajor(self, name):
        try:
            name = name.lower()
            major = self.create(name=name)
            return major
        except IntegrityError:
            print("This major has already been created")
And here is the model:
class Majors(models.Model):
    name = models.CharField(max_length=30, unique=True)

    objects = MajorManager()
Any help would be much appreciated.
You can go this route using Django's API - check out the docs.
First create a shell:
python manage.py shell
Then you can import your models and do basic CRUD on them.
>>> from polls.models import Choice, Question # Import the model classes we just wrote.
# No questions are in the system yet.
>>> Question.objects.all()
<QuerySet []>
# Create a new Question.
# Support for time zones is enabled in the default settings file, so
# Django expects a datetime with tzinfo for pub_date. Use timezone.now()
# instead of datetime.datetime.now() and it will do the right thing.
>>> from django.utils import timezone
>>> q = Question(question_text="What's new?", pub_date=timezone.now())
# Save the object into the database. You have to call save() explicitly.
>>> q.save()
Alternatively, you can try the dbshell route; here's the documentation.
This command assumes the programs are on your PATH so that a simple
call to the program name (psql, mysql, sqlite3, sqlplus) will find the
program in the right place. There’s no way to specify the location of
the program manually.
You can't use the Django's ORM though, it's pure SQL, so it would be instructions like:
CREATE TABLE user (
    id INT,
    name VARCHAR(100)
);
Despite numerous recipes and examples in peewee's documentation, I have not been able to find how to accomplish the following:
For finer-grained control, check out the Using context manager / decorator. This allows you to specify the database to use with a given list of models for the duration of the wrapped block.
I assume it would go something like...
db = MySQLDatabase(None)

class BaseModelThing(Model):
    class Meta:
        database = db

class SubModelThing(BaseModelThing):
    '''imagine all the fields'''
    class Meta:
        db_table = 'table_name'

runtime_db = MySQLDatabase('database_name.db', fields={'''imagine field mappings here'''}, **extra_stuff)

@Using(runtime_db, [SubModelThing])
@runtime_db.execution_context()
def some_kind_of_query():
    '''imagine the queries here'''
but I have not found examples, so an example would be the answer to this question.
Yeah, there's not a great example of using Using or the execution_context decorators, so the first thing is: don't use the two together. It doesn't appear to break anything, just seems to be redundant. Logically that makes sense as both of the decorators cause the specified model calls in the block to run in a single connection/transaction.
The only(/biggest) difference between the two is that Using allows you to specify the particular database that the connection will be using - useful for master/slave (though the Read slaves extension is probably a cleaner solution).
If you run with two databases and try using execution_context on the 'second' database (in your example, runtime_db) nothing will happen with the data. A connection will be opened at the start of the block and closed and the end, but no queries will be executed on it because the models are still using their original database.
The code below is an example. Every run should result in only 1 row being added to each database.
from peewee import *

db = SqliteDatabase('other_db')
db.connect()
runtime_db = SqliteDatabase('cmp_v0.db')
runtime_db.connect()

class BaseModelThing(Model):
    class Meta:
        database = db

class SubModelThing(BaseModelThing):
    first_name = CharField()

    class Meta:
        db_table = 'table_name'

db.create_tables([SubModelThing], safe=True)
SubModelThing.delete().where(True).execute()  # cleaning out previous runs

with Using(runtime_db, [SubModelThing]):
    runtime_db.create_tables([SubModelThing], safe=True)
    SubModelThing.delete().where(True).execute()

@Using(runtime_db, [SubModelThing], with_transaction=True)
def execute_in_runtime(throw):
    SubModelThing(first_name='asdfasdfasdf').save()
    if throw:  # to demo transaction handling in Using
        raise Exception()

# Create an instance in the 'normal' database
SubModelThing.create(first_name='name')

try:  # Try to create but throw during the transaction
    execute_in_runtime(throw=True)
except:
    pass  # Failure is expected, no row should be added

execute_in_runtime(throw=False)  # Create a row in the runtime_db

print 'db row count: {}'.format(len(SubModelThing.select()))
with Using(runtime_db, [SubModelThing]):
    print 'Runtime DB count: {}'.format(len(SubModelThing.select()))
I have a JSON file with data as such :
{'dbname': 'A', 'collection': 'ACollection', 'fields': ['name', 'phone_no', 'address']}
{'dbname': 'B', 'collection': 'BCollection', 'fields': ['name', 'phone_no', 'address', 'class']}
These are 2 examples amongst many other dictionaries of the same format.
I have a python code that does the following : Accepts 2 inputs from the user - phone_no and dbname. For example, the user enters phone_no as xxxxxxxxxx and dbname as A. The python code then reads the JSON file and matches the user input with the dictionary element having the name of the database as 'A'. It then opens the database 'A', opens the respective collection 'ACollection' and prints the respective fields of posts within the collection that have the phone_no value as xxxxxxxxxx. The databases are implemented with mongoDB.
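The config-matching step described above can be sketched in plain Python (the helper name `find_entry` and the inline entries are illustrative; the actual MongoDB lookup with pymongo is omitted so the example stays self-contained):

```python
import json

# Hypothetical helper: find the config entry whose 'dbname' matches user input.
def find_entry(entries, dbname):
    for entry in entries:
        if entry['dbname'] == dbname:
            return entry
    return None

# Inline stand-in for the JSON file described in the question.
entries = json.loads('''[
    {"dbname": "A", "collection": "ACollection", "fields": ["name", "phone_no", "address"]},
    {"dbname": "B", "collection": "BCollection", "fields": ["name", "phone_no", "address", "class"]}
]''')

entry = find_entry(entries, 'A')
print(entry['collection'])  # ACollection
print(entry['fields'])      # ['name', 'phone_no', 'address']
```

With `entry` in hand, the real code would open the named database/collection and query it for the given `phone_no`.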
I need to build a django rest api for this code. The end goal is to access the code from a browser. The user provides the 2 inputs in the browser and the code is executed, returning the data, which is displayed on the browser. I have gone through the django-rest framework documentation but I'm new to this whole concept and would like some guidance.
How do I implement these functions and create an API? What code should the models, serializers, views and urls files have related to my program?
models.py
from django.db import models

class App(object):
    def __init__(self, phone_no, name, address, categories):
        self.phone_no = phone_no
        self.name = name
        self.address = address
        self.categories = categories
This is what I'm working with so far, to get started. The problem, however, is that the model class should essentially be dynamic. For example, if 'A' is the database the program returns 3 fields, but if 'B' is the database it returns 4, so I'm not sure what the model class should look like.
views.py
from django.views.decorators.csrf import csrf_exempt
from rest_framework.decorators import api_view
from rest_framework.response import Response
from pymongo import Connection
from models import App
from serializers import AppSerializer
import json
import pymongo
from os import listdir
import re
from django import forms

@csrf_exempt
@api_view(['GET'])
def pgs(request):
    # connect to our local mongodb
    db = Connection('localhost', 27017)
    # get a connection to our database
    dbconn = db.general
    dbCollection = dbconn['data']
    if request.method == 'GET':
        # get our collection
        items = []
        for r in dbCollection.find():
            post = App(r["phone_no"], r["name"], r["address"], r["categories"])
            items.append(post)
        serializedList = AppSerializer(items, many=True)
        return Response(serializedList.data)
Let's say you have identical tables in two different databases. We'd start by creating two db connections in settings.py. Let's say those are called db_a and db_b. We might model this as so:
class PhoneGenome(models.Model):
    phone_no = models.CharField(max_length=255)
    name = models.CharField(max_length=255)
    # and so on...

    class Meta:
        # if database pre-exists, may want to use managed=False
        managed = False
This gives us a model. Now we choose which database to pull from based on user input. In a view, you might have something like:
db_used = request.GET.get('database')
db_conns = {'a': 'db_a', 'b': 'db_b'}
if db_used in db_conns:
    records = PhoneGenome.objects.using(db_conns[db_used]).filter(user_criteria_here)
The using() method in your queryset is what allows you to select which database to run the query against.
There's a lot to manage here potentially, so this would be a good time to look at the docs: https://docs.djangoproject.com/en/1.7/topics/db/multi-db/
And if you haven't already, you really should work through the Django tutorial at the very least before going much further.
For our Django App, we'd like to get an AutoField to start at a number other than 1. There doesn't seem to be an obvious way to do this. Any ideas?
Like the others have said, this would be much easier to do on the database side than the Django side.
For Postgres, it'd be like so: ALTER SEQUENCE sequence_name RESTART WITH 12345; Look at your own DB engine's docs for how you'd do it there.
For MySQL, I created a signal that does this after syncdb:

from django.db.models.signals import post_syncdb
from project.app import models as app_models

def auto_increment_start(sender, **kwargs):
    from django.db import connection, transaction
    cursor = connection.cursor()
    cursor.execute("""
        ALTER TABLE app_table AUTO_INCREMENT = 2000
    """)
    transaction.commit_unless_managed()

post_syncdb.connect(auto_increment_start, sender=app_models)
After a syncdb, the ALTER TABLE statement is executed. This saves you from having to log into MySQL and issue it manually.
EDIT: I know this is an old thread, but I thought it might help someone.
A quick peek at the source shows that there doesn't seem to be any option for this, probably because it doesn't always increment by one; it picks the next available key: "An IntegerField that automatically increments according to available IDs" — djangoproject.com
Here is what I did:

def update_auto_increment(value=5000, app_label="xxx_data"):
    """Update our increments"""
    from django.conf import settings
    from django.db import connection, transaction, router
    from django.db.models import get_models

    models = [m for m in get_models() if m._meta.app_label == app_label]
    cursor = connection.cursor()
    for model in models:
        _router = settings.DATABASES[router.db_for_write(model)]['NAME']
        alter_str = "ALTER TABLE {}.{} AUTO_INCREMENT={}".format(
            _router, model._meta.db_table, value)
        cursor.execute(alter_str)
    transaction.commit_unless_managed()
I found a really easy solution to this! AutoField uses the previously inserted value to determine what the next assigned value will be, so if I insert a dummy row holding the starting value I want, the following insertions will increment from that value.
A simple example in a few steps:
1.) models.py:

class Product(models.Model):
    id = models.AutoField(primary_key=True)  # this is a dummy PK for now
    productID = models.IntegerField(default=0)
    productName = models.TextField()
    price = models.DecimalField(max_digits=6, decimal_places=2)

2.) Run makemigrations and migrate.
Once that is done, you will need to insert the initial row where "productID" holds a value of your desired AutoField start value. You can write a method or do it from django shell.
From a view, the insertion could look like this:

views.py

from app.models import Product

dummy = {
    'productID': 100000,
    'productName': 'Item name',
    'price': 5.98,
}
Product.objects.create(**dummy)
Once inserted you can make the following change to your model:
models.py
class Product(models.Model):
    productID = models.AutoField(primary_key=True)
    productName = models.TextField()
    price = models.DecimalField(max_digits=6, decimal_places=2)
All following insertions will get a "productID" incrementing starting at 100000...100001...100002...
The auto fields depend, to an extent, on the database driver being used.
You'll have to look at the objects actually created for the specific database to see what's happening.
I needed to do something similar. I avoided the complex stuff and simply created two fields:
id_no = models.AutoField(unique=True)
my_highvalue_id = models.IntegerField(null=True)

In views.py, I then simply added a fixed number to the id_no:

obj.my_highvalue_id = obj.id_no + 1200
I'm not sure if it helps resolve your issue, but I think you may find it an easy go-around.
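The idea reduces to deriving a display ID from the auto-assigned one, which can be sketched outside Django (the offset 1200 comes from the answer above; the function name is made up):

```python
OFFSET = 1200

def high_value_id(auto_id, offset=OFFSET):
    """Derive a high-value display ID by adding a fixed offset to the auto PK."""
    return auto_id + offset

print(high_value_id(1))   # 1201
print(high_value_id(57))  # 1257
```

Note that unlike a real sequence restart, the database PK itself still starts at 1; only the derived field carries the high value.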
In the model you can add this:
def save(self, *args, **kwargs):
    if not User.objects.count():
        self.id = 100
    else:
        self.id = User.objects.last().id + 1
    super(User, self).save(*args, **kwargs)
This works only if the database is currently empty (no objects): the first item will be assigned id 100, and subsequent inserts will follow with last id + 1.
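The numbering rule in that save() override can be simulated without a database (a sketch; `FakeTable` is a made-up stand-in for the model's table):

```python
START_ID = 100

class FakeTable:
    """Minimal stand-in for the model table, to show the id-assignment rule."""
    def __init__(self):
        self.rows = []

    def save(self):
        # Mirror the save() override: the first row gets START_ID,
        # every later row gets the last id + 1.
        new_id = START_ID if not self.rows else self.rows[-1] + 1
        self.rows.append(new_id)
        return new_id

t = FakeTable()
print([t.save() for _ in range(3)])  # [100, 101, 102]
```

Be aware that doing this in save() is racy under concurrent inserts, since two requests can both read the same "last id" before either saves.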
For those who are interested in a modern solution, I found it quite useful to run the following handler in a post_migrate signal.
Inside your apps.py file:
import logging

from django.apps import AppConfig
from django.db import connection
from django.db.models.signals import post_migrate

logger = logging.getLogger(__name__)

def auto_increment_start(sender, **kwargs):
    min_value = 10000
    with connection.cursor() as cursor:
        logger.info('Altering BigAutoField starting value...')
        cursor.execute(f"""
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplate"','id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplate";
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplatecollection"','id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplatecollection";
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplatecategory"','id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplatecategory";
        """)
    logger.info(f'BigAutoField starting value changed successfully to {min_value}')

class Apiv1Config(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'apiV1'

    def ready(self):
        post_migrate.connect(auto_increment_start, sender=self)
Of course the downside of this, as some already have pointed out, is that this is DB specific.