I currently have an MSSQL server that has about 60 databases in it.
Does anyone know a way to list the database names with a for loop, so that I am able to access each table?
The structure should be something like this:
for each Database in Server
MyList{} = MyList + DatabaseName
endfor
I can't seem to find any documentation that covers something like this.
You can query the system catalog views to get all the objects on your server.
To get all the databases, run the following query:
SELECT name, database_id FROM sys.databases
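If you are driving the loop from client code, one approach is to feed the results of that query into per-database queries. A minimal Python sketch (the database names and the helper function are illustrative; actually executing the queries would need a live connection, e.g. via pyodbc):

```python
def table_list_queries(database_names):
    """Build one query per database that lists its user tables.

    `database_names` is expected to come from:
        SELECT name FROM sys.databases
    """
    return ["SELECT name FROM [{0}].sys.tables".format(db)
            for db in database_names]

# Illustrative database names standing in for the real query results:
for query in table_list_queries(["SalesDB", "HRDB"]):
    print(query)
# SELECT name FROM [SalesDB].sys.tables
# SELECT name FROM [HRDB].sys.tables
```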
Related
I have a list of databases I want to drop under a certain condition. I know GridDB SQL is not like Python,
where you can write commands under a condition block, but I would like to know if I can delete
databases that have not been updated in the last two years.
Suppose I have 32 databases or more; I want to delete the old databases with an SQL command similar
to the one below, if possible:
// Condition
DROP DATABASE database;
or using Pandas, in Python:
sql_statement = ('DROP DATABASE database')
df = pd.read_sql_query(sql_statement, cont)
Maybe database is from a generator or something. How can I go about that?
I am using GridDB Python client on my Ubuntu machine. Thanks in advance for your help!
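In Python terms, the logic I'm after would look roughly like this (the (name, last-updated) pairs below are made up, and getting the last-update times out of GridDB is exactly the part I don't know):

```python
from datetime import datetime

def drop_statements(databases, cutoff):
    """Return DROP DATABASE statements for databases whose last
    update is older than `cutoff`.

    `databases` is a list of (name, last_updated) pairs.
    """
    return ["DROP DATABASE {0};".format(name)
            for name, last_updated in databases
            if last_updated < cutoff]

# Hypothetical metadata standing in for the real databases:
dbs = [("old_sensors", datetime(2019, 3, 1)),
       ("fresh_sensors", datetime(2023, 6, 1))]
print(drop_statements(dbs, datetime(2021, 1, 1)))
# ['DROP DATABASE old_sensors;']
```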
I'm trying to migrate the database for an existing application from access to SQLite. The application uses Autonumbers to generate unique IDs in Access and some tables reference rows from other tables by these unique IDs.
What's a good way to migrate these and keep this functionality intact?
From what I've read, SQLite uses Auto indexing for this. How would I create the links between the tables? Do I have to search the other tables for the row with that unique ID and replace the reference with the SQL generated ID?
example:
table 1, has a column linkedID with a row with the value {7F99297A-DE91-4BD6-9ED8-FC13D668CDA2}, which is linked to a row in table 2 with primaryKey {7F99297A-DE91-4BD6-9ED8-FC13D668CDA2}.
Well, there's not really an automated way to do this, but here is what I do to migrate data:
I set up a linked table in Access and double-check that the linked table works (you need to install the SQLite ODBC driver).
Assuming you have a working linked table, you can then do this in VBA to export the Access table to SQLite:
Dim LocalTable As String   ' name of local Access table to export
Dim ServerTable As String  ' name of target table in SQLite
LocalTable = "Table1"
ServerTable = "TableSLite"
Dim strCon As String
' Borrow a working connection string from an existing, valid linked
' table ("Test1" here) rather than hard-coding it
strCon = CurrentDb.TableDefs("Test1").Connect
Debug.Print strCon
DoCmd.TransferDatabase acExport, "ODBC Database", strCon, acTable, LocalTable, ServerTable
Debug.Print "done export of " & LocalTable
That will get you the table in SQLite. But SQLite has no DDL (data definition) command to THEN change that imported ID column from Access into a primary key with auto-increment.
However, assuming you have, say, DB Browser for SQLite, simply export the table(s) as per above. Then, in DB Browser, open the table, choose Modify, and check the AI (auto-increment) setting along with the PK setting; in fact, if you check AI, the PK usually gets selected for you. Do this after the export above (and consider closing Access first, since you had/have linked tables).
For just a few tables, the above is not really hard. However, the export (transfer) of data does not set the PK and auto-increment for you.
If you need to do this with code, and this is not a one-time export/transfer, then I don't have a good solution.
Unfortunately, SQLite does NOT allow an ALTER TABLE command to set the PK and auto-increment (if that were possible, then after an export you could execute that DDL command in SQLite, or send it from your client software, to make the alteration).
SQLite can spit out the CREATE TABLE command that exists for a given table (it is stored in the schema). So you might export the schema, get that DDL command, modify it, drop the table, re-run the CREATE TABLE command (with the correct PK and auto-increment), and THEN use an export or append query from Access.
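If you do want to script that drop-and-recreate step, here is a minimal sketch using Python's built-in sqlite3 module (the table and column names are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Pretend this is the table as it arrives from the Access export:
# an ID column that is NOT yet a primary key / auto-increment.
cur.execute("CREATE TABLE Table1 (ID INTEGER, Descrip TEXT)")
cur.executemany("INSERT INTO Table1 VALUES (?, ?)",
                [(1, "first"), (2, "second")])

# Recreate the table with a proper auto-increment primary key,
# copy the rows across, then swap the names.
cur.execute("""CREATE TABLE Table1_new (
                   ID INTEGER PRIMARY KEY AUTOINCREMENT,
                   Descrip TEXT)""")
cur.execute("INSERT INTO Table1_new (ID, Descrip) SELECT ID, Descrip FROM Table1")
cur.execute("DROP TABLE Table1")
cur.execute("ALTER TABLE Table1_new RENAME TO Table1")
con.commit()

# New rows now get IDs automatically:
cur.execute("INSERT INTO Table1 (Descrip) VALUES ('third')")
print(cur.execute("SELECT ID, Descrip FROM Table1 ORDER BY ID").fetchall())
# [(1, 'first'), (2, 'second'), (3, 'third')]
```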
But the transfer of the table(s) in question can be done quite easily as per above; the result just does not set or include the PK setting(s) for you.
However, if this is a one-time export, the above works well: it handles the export of the tables, and even the dirty work of figuring out the correct data types to use. You just have to open the tables in a tool like DB Browser afterwards and set the PK and auto-increment.
I do the above quite often to transfer Access tables to SQLite tables, but it does then require those extra steps to set up the PK and auto-increment.
Another possible way, if this has to be done more than once?
I would export as per above, and then add the PK (and auto-increment) once. I would then grab, say, the 8 CREATE TABLE commands from SQLite and save those in the client software.
Then you execute the correct CREATE TABLE command, and then do an append query from Access. So it really depends on whether this is a one-time export, or whether this process of creating the table(s) in SQLite is to occur over and over.
I am working on my first Heroku web app using Python and Flask. I have it connected to a SQLite database locally and a PostgreSQL database through Heroku. When running SQL commands in Heroku, I am able to query all of the data from another table, but when I try to access data from the "user" table, it says there is no column named "username", even though the explorer shows a column with that name:
Does this mean my table is empty and the users aren't being added? I'm not getting any errors when adding the users in the app. Any help is appreciated.
"user" is a reserved word in PostgreSQL, so when you're querying that user table, your query isn't going exactly where you expect. Specifying the full table name with its schema should work, which in your case is probably public.user.
So your query would look like this:
SELECT "username"
FROM public.user;
Another way of dealing with this is to surround "user" with quotation marks, like this:
SELECT "username"
FROM "user";
Is there a way to perform an SQL query that joins a MySQL table with a dict-like structure that is not in the database but instead provided in the query?
In particular, I regularly need to post-process data I extract from a database with the respective exchange rates. Exchange rates are not stored in the database but retrieved on the fly and stored temporarily in a Python dict.
So, I have a dict: exchange_rates = {'EUR': 1.10, 'GBP': 1.31, ...}.
Let's say some query returns something like: id, amount, currency_code.
Would it be possible to add the dict to the query so I can return: id, amount, currency_code, usd_amount? This would remove the need to post-process in Python.
This solution doesn't use a JOIN, but it does combine the data from Python into SQL via a CASE expression. You could generate the SQL you want in Python (as a string) that includes these values in a giant CASE expression.
You give no details and don't say which version of Python, so it's hard to provide useful code. But this works with Python 2.7 and assumes you have some connection to the MySQL db in Python:
exchange_rates = {'EUR': 1.10, 'GBP': 1.31, ...}
# create a long string of CASE branches (each branch needs the WHEN keyword)
er_case_statement = "\n".join("when mytable.currency = '{0}' then {1}".format(k, v) for (k, v) in exchange_rates.iteritems())
# build the sql with these case statements
sql = """select <some stuff>,
case {0}
end as exchange_rate,
other columns
from tables etc
where etc
""".format(er_case_statement)
Then send this SQL to MySQL.
I don't like this solution; you end up with a very large SQL statement, which can hit the maximum query size ( What is maximum query size for mysql? ).
Another idea is to use temporary tables in mysql. Again assuming you are connecting to the db in python, with python create the sql that creates a temporary table and insert the exchange rates, send that to MySQL, then build a query that joins your data to that temporary table.
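To make the temporary-table idea concrete, here is a rough sketch. I'm using Python's built-in sqlite3 as a stand-in for MySQL (with MySQL you'd use CREATE TEMPORARY TABLE and your own connector; the table and column names here are invented):

```python
import sqlite3

exchange_rates = {'EUR': 1.10, 'GBP': 1.31}

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Stand-in for the real data table
cur.execute("CREATE TABLE payments (id INTEGER, amount REAL, currency_code TEXT)")
cur.executemany("INSERT INTO payments VALUES (?, ?, ?)",
                [(1, 100.0, 'EUR'), (2, 50.0, 'GBP')])

# Temporary table holding the rates from the Python dict
cur.execute("CREATE TEMP TABLE rates (currency_code TEXT, rate REAL)")
cur.executemany("INSERT INTO rates VALUES (?, ?)", list(exchange_rates.items()))

# Join the data against the rates, entirely inside SQL
rows = cur.execute("""
    SELECT p.id, p.amount, p.currency_code,
           ROUND(p.amount * r.rate, 2) AS usd_amount
    FROM payments p
    JOIN rates r ON r.currency_code = p.currency_code
    ORDER BY p.id
""").fetchall()
print(rows)
# [(1, 100.0, 'EUR', 110.0), (2, 50.0, 'GBP', 65.5)]
```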
Finally, you say you don't want to post-process in Python, but you have a dict from somewhere, so I don't know which environment you are using. BUT if you can get these exchange rates from the web, say with curl, then you could also use the shell to insert these values into a MySQL temp table and do the join there.
Sorry this is general and not specific, but the question could use more specificity. Hope it helps someone give a more targeted answer.
I have an SQL query that runs on the Postgres database of my Django-based webapp. The query runs against the data stored by django-notifications (a reusable app) and returns a list of email addresses that have not opted out of a specific notice type.
What I would really like is to build an application that does this on demand, so I'm looking for an example of how to convert the SQL so it can run inside a Django view that will return a formatted email list. The SQL is currently:
gr_webapp=# select email from emailconfirmation_emailaddress where verified and user_id not in
(select user_id from notification_noticesetting s join notification_noticetype t on s.notice_type_id = t.id
where t.label = 'announcement' and not s.send);
You might have to make appropriate adjustments as far as model names go, since you didn't show them in your question:
users_to_exclude = Noticesetting.objects.filter(send=False, notice_type__label='announcement').values('user')
emails = Emailaddress.objects.exclude(user__in=users_to_exclude)