Calling python script in web2py framework over webserver

I have NGINX, uWSGI, and web2py installed on the server. The web2py application performs only one function: it accesses the database and prints the rows of a table.
def fetch():
    import psycopg2
    conn = psycopg2.connect(database="postgres",
                            user="postgres",
                            password="qwerty",
                            host="127.0.0.1")
    cur = conn.cursor()
    cur.execute("SELECT id, name from TEST")
    rows = cur.fetchall()
    conn.close()
    return rows
When the function is called locally, the table contents are returned.
But when I try to call the function from a remote machine, I get an internal error 500.
One more interesting thing: when the function looks like this:
def hello():
    return 'hello'
The string 'hello' is returned. As soon as I add an import directive to it, an error page is generated.
Can anyone please suggest the proper application syntax/logic?

My guess is that your MySQL service doesn't allow remote access. Could you check your MySQL configuration?
vim /etc/mysql/my.cnf
Comment out the following lines.
#bind-address = 127.0.0.1
#skip-networking
If there is no skip-networking line in your configuration file, just add it and comment it out.
And then restart the mysql service.
service mysql restart

Forgive the stupid question, but have you checked whether the module is available on your server?
When you say that the error appears in your hello function as soon as you try an import, is it the same import psycopg2 directive?
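One quick way to check is to try the import with the same Python interpreter that serves the app, for example from the web2py shell; a minimal sketch:
try:
    import psycopg2
    print("psycopg2 version:", psycopg2.__version__)
except ImportError as exc:
    print("psycopg2 is not importable:", exc)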
Try this:
Assuming that fetch() is defined in controllers/default.py,
open the folder views/default and create a new file called fetch.html,
and paste this inside:
{{extend 'layout.html'}}
{{=rows}}
fetch.html is a view, or a template if you prefer.
Modify fetch() to return a dictionary with rows for the view to print:
return dict(rows=rows)
This is very basic though; you can find more information about the basic steps in the book -> http://www.web2py.com/books/default/chapter/29/03/overview#Postbacks
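Putting it together, a minimal sketch of controllers/default.py, reusing the connection details from the question:
def fetch():
    import psycopg2
    conn = psycopg2.connect(database="postgres",
                            user="postgres",
                            password="qwerty",
                            host="127.0.0.1")
    cur = conn.cursor()
    cur.execute("SELECT id, name from TEST")
    rows = cur.fetchall()
    conn.close()
    # return a dict so views/default/fetch.html can render it
    return dict(rows=rows)
With the matching views/default/fetch.html in place, web2py renders the rows in the layout instead of handing back a bare 500.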

Import Excel file into MSSQL via Python, using an SQL Agent job

Task specs:
Import Excel file(s) into MSSQL database(s) using Python, but in a parametrized manner, and using SQL Server Agent job(s).
With the added requirement to set parameter values and/or run the job steps from SQL (query or SP).
And without using Access Database Engine(s) and/or any code that makes use of such drivers (in any wrapping).
First. Let's get some preparatory stuff out of the way.
We will need to set some PowerShell settings.
Run Windows PowerShell as Administrator and do:
Set-ExecutionPolicy -ExecutionPolicy Unrestricted
Second. Some assumptions for reasons of clarity.
And those are:
1a. You have at least one instance of SQL Server 2017 or later (Developer / Enterprise / Standard edition) installed and running on your machine.
1b. You have not bootstrapped the installation of this SQL instance so as to exclude Integration Services (SSIS).
1c. There is a SQL Server Agent running, bound to this SQL instance.
1d. You have some SSMS installed.
2a. There is at least one database attached to this instance (if not, create one – and please refrain from using in-memory filegroups for this exercise; I have not tested on those).
2b. There are no database-level DML triggers that log all data changes in a designated table.
3. There is no active Server Audit Specification for this database logging everything we do.
4. Replication is not enabled (I mean the proper MSSQL replication feature, not scripts by 3rd-party apps).
For 2b and 3 it's just because I have not tested with those enabled, but number 4 definitely won't work with replication on.
5. You are Windows-authenticated into the chosen SQL instance, and your instance login and DB mappings and privileges are sufficient for at least table creation and basic stuff.
Third.
We are going to need some kind of Python script to do this, right?
OK, let's make one.
import pandas as pd
import sqlalchemy as sa
import urllib
import sys
import warnings
import os
import re
import time

# COMMAND LINE PARAMETERS
server = sys.argv[1]
database = sys.argv[2]
ExcelFileHolder = sys.argv[3]
SQLTableName = sys.argv[4]
# END OF COMMAND LINE PARAMETERS

excel_sheet_number_left_to_right = 0
warnings.filterwarnings('ignore')

driver = "SQL Server Native Client 11.0"
params = "DRIVER={%s};SERVER=%s;DATABASE=%s;Trusted_Connection=yes;QuotedID=Yes;" % (driver, server, database)  # added the explicit "QuotedID=Yes;" to ensure no issues with column names
params = urllib.parse.quote_plus(params)  # urllib.parse.quote_plus for Python 3
engine = sa.create_engine("mssql+pyodbc:///?odbc_connect=%s?charset=utf8" % params)  # charset is cool to have here
conn = engine.connect()

def execute_sql_trans(sql_string, log_entry):
    with conn.begin() as trans:
        result = conn.execute(sql_string)
        if len(log_entry) >= 1:
            log.write(log_entry + "\n")  # 'log' is a file handle opened elsewhere in my full script
    return result

excelfilesCursor = {}

def process_excel_file(excelfile, excel_sheet_name, tableName, withPyIndexOrSQLIndex, orderByCandidateFields):
    withPyIndexOrSQLIndex = 0
    excelfilesCursor.update({tableName: withPyIndexOrSQLIndex})
    df = pd.read_excel(open(excelfile, 'rb'), sheet_name=excel_sheet_name)
    now = time.time()
    mlsec = repr(now).split('.')[1][:3]
    log_string = "Reading file \"" + excelfile + "\" to memory: " + str(time.strftime("%Y-%m-%d %H:%M:%S.{} %Z".format(mlsec), time.localtime(now))) + "\n"
    print(log_string)
    df.to_sql(tableName, engine, if_exists='replace', index_label='index.py')
    now = time.time()
    mlsec = repr(now).split('.')[1][:3]
    log_string = "Writing file \"" + excelfile + "\", sheet " + str(excel_sheet_name) + " to SQL instance " + server + ", into [" + database + "].[dbo].[" + tableName + "]: " + str(time.strftime("%Y-%m-%d %H:%M:%S.{} %Z".format(mlsec), time.localtime(now))) + "\n"
    print(log_string)

def convert_datetimes_to_dates(tableNameParam):
    sql_string = "exec [convert_datetimes_to_dates] '" + tableNameParam + "';"
    execute_sql_trans(sql_string, "")

process_excel_file(ExcelFileHolder, excel_sheet_number_left_to_right, SQLTableName, 0, None)
sys.exit(0)
OK, you may or may not notice that my script contains some extra defs; I sometimes use them for convenience, and you may as well ignore them.
Save the Python script somewhere nice, say C:\PythonWorkspace\ExcelPythonToSQL.py.
Also, needless to say, you will need some Python modules in your venv; pip install the ones you don't already have.
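If you want a quick sanity check that the venv has what the script needs, here is a small sketch (pandas and sqlalchemy are imported by the script itself; pyodbc is pulled in by the mssql+pyodbc engine URL):
import importlib.util

for mod in ("pandas", "sqlalchemy", "pyodbc"):
    print(mod, "OK" if importlib.util.find_spec(mod) else "MISSING")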
Fourth.
Connect to your db, SSMS, etc. and create a new Agent job.
Let's call it "ExcelPythonToSQL".
New step, let's call it "PowerShell parametrized script".
Set the Type to PowerShell.
And place this code inside it:
$pyFile="C:\PythonWorkspace\ExcelPythonToSQL.py"
$SQLInstance="SomeMachineName\SomeNamedSQLInstance"
#or . or just the computername or localhost if your SQL instance is a default instance i.e. not a named one.
$dbName="SomeDatabase"
$ExcelFileFullPath="C:\Temp\ExampleExcelFile.xlsx"
$targetTableName="somenewtable"
C:\ProgramData\Miniconda3\envs\YOURVENVNAMEHERE\python $pyFile $SQLInstance $dbName $ExcelFileFullPath $targetTableName
Save the step and the job.
Now let's wrap it in something easier to handle. Because remember, this job and step are not like an SSIS step, where you could potentially alter the parameter values in its configuration tab. You don't want to open the properties of the job and the step each time just to specify a different Excel file or target table.
So.
Ah, also, do me a solid and do this little trick. Make a small alteration in the code, anything, and then instead of clicking OK choose Script to New Query Window. That way we can capture the GUID of the job without having to query for it.
So now.
Create a SP like so:
use [YourDatabase];
GO
create proc [ExcelPythonToSQL_amend_job_step_params](
    @pyFile nvarchar(max),
    @SQLInstance nvarchar(max),
    @dbName nvarchar(max),
    @ExcelFileFullPath nvarchar(max),
    @targetTableName nvarchar(max)='somenewtable'
)
as
begin
    declare @sql nvarchar(max);
    set @sql = '
exec msdb.dbo.sp_update_jobstep @job_id=N''7f6ff378-56cd-4a8d-ba40-e9057439a5bc'', @step_id=1,
@command=N''
$pyFile="'+@pyFile+'"
$SQLInstance="'+@SQLInstance+'"
$dbName="'+@dbName+'"
$ExcelFileFullPath="'+@ExcelFileFullPath+'"
$targetTableName="'+@targetTableName+'"
C:\ProgramData\Miniconda3\envs\YOURVENVGOESHERE\python $pyFile $SQLInstance $dbName $ExcelFileFullPath $targetTableName''
';
    --print @sql;
    exec sp_executesql @sql;
end
But inside it you must replace two things. One, the global uniqueidentifier (the job_id) for the Agent job, which you found by doing the trick I described earlier, yes, the one with Script to New Query Window. Two, you must fill in the name of your Python venv, replacing the word YOURVENVGOESHERE in the code.
Cool.
Now, with a simple script we can play-test it.
Let's have in a new query window something like this:
use [YourDatabase];
GO
--to set parameters
exec [ExcelPythonToSQL_amend_job_step_params] @pyFile='C:\PythonWorkspace\ExcelPythonToSQL.py',
    @SQLInstance='.',
    @dbName='YourDatabase',
    @ExcelFileFullPath='C:\Temp\ExampleExcelFile.xlsx',
    @targetTableName='somenewtable';
--to execute the job
exec msdb.dbo.sp_start_job N'ExcelPythonToSQL', @step_name = N'PowerShell parametrized script';
--let's test that the table is there and drop it.
if object_id('YourDatabase..somenewtable') is not null
begin
    select 'Table was here!' [test: table exists?];
    drop table [somenewtable];
end
else select 'NADA!' [test: table exists?];
You can run the set-parameters part, then the execution; be careful to then wait a little bit, a few seconds, since calling sp_start_job like in this script is asynchronous. And then run the test script to clean up and make sure the data had gone in.
That's it.
Obviously lots of variations are possible.
For instance, in the job step we could instead call a batch file, or call a PowerShell .ps1 file and keep the parameters in there; there are lots and lots of other ways of doing it. I merely described one in this post.

Pytest-django: cannot delete db after tests

I have a Django application, and I'm trying to test it using pytest and pytest-django. However, quite often, when the tests finish running, I get the error that the database failed to be deleted: DETAIL: There is 1 other session using the database.
Basically, the minimum test code that I could narrow it down to is:
@pytest.fixture
def make_bundle():
    a = MyUser.objects.create(key_id=uuid.uuid4())
    return a

class TestThings:
    def test_it(self, make_bundle):
        all_users = list(MyUser.objects.all())
        assert_that(all_users, has_length(1))
Every now and again the tests will fail with the above error. Is there something I am doing wrong? Or how can I fix this?
The database that I am using is PostgreSQL 9.6.
I am posting this as an answer because I need to post a chunk of code and because this worked. However, it looks like a dirty hack to me, and I'll be more than happy to accept anybody else's answer if it is better.
Here's my solution: basically, add the raw SQL that kicks all other sessions out of the given DB to the method that destroys the DB, and do that by monkeypatching. To ensure that the monkeypatching happens before the tests, add it to the root conftest.py file as an autouse fixture:
import pytest
from django.db.backends.base import creation  # for BaseDatabaseCreation

def _destroy_test_db(self, test_database_name, verbosity):
    """
    Internal implementation - remove the test db tables.
    """
    # Remove the test database to clean up after
    # ourselves. Connect to the previous database (not the test database)
    # to do so, because it's not allowed to delete a database while being
    # connected to it.
    with self.connection._nodb_connection.cursor() as cursor:
        cursor.execute(
            "SELECT pg_terminate_backend(pg_stat_activity.pid) "
            "FROM pg_stat_activity "
            "WHERE pg_stat_activity.datname = '{}' "
            "AND pid <> pg_backend_pid();".format(test_database_name)
        )
        cursor.execute("DROP DATABASE %s"
                       % self.connection.ops.quote_name(test_database_name))

@pytest.fixture(autouse=True)
def patch_db_cleanup():
    creation.BaseDatabaseCreation._destroy_test_db = _destroy_test_db
Note that the kicking-out code may depend on your database engine, and the method that needs monkeypatching may be different in different Django versions.

Python sqlite3.OperationalError: no such table:

I am trying to store data about pupils at a school. I've done a few tables before, such as ones for passwords and teachers, which I will later bring together in one program.
I have pretty much copied the create-table function from one of these and changed the values for the pupils' information. It works fine in the other programs, but I keep getting:
sqlite3.OperationalError: no such table: PupilPremiumTable
when I try to add a pupil to the table. It occurs on the line:
cursor.execute("select MAX(RecordID) from PupilPremiumTable")
I look in the folder and there is a file called PupilPremiumTable.db, and the table has already been created before, so I don't know why it isn't working.
Here is some of my code; if you need more, feel free to tell me so. As I said, it worked before, so I have no clue why it isn't working or even what isn't working:
with sqlite3.connect("PupilPremiumTable.db") as db:
    cursor = db.cursor()
    cursor.execute("select MAX(RecordID) from PupilPremiumTable")
    Value = cursor.fetchone()
    Value = str('.'.join(str(x) for x in Value))
    if Value == "None":
        Value = int(0)
    else:
        Value = int('.'.join(str(x) for x in Value))
    if Value == 'None,':
        Value = 0
    TeacherID = Value + 1
    print("This RecordID is: ", RecordID)
You are assuming that the current working directory is the same as the directory your script lives in. It is not an assumption you can make. Your script is opening a new database in a different directory, one that is empty.
Use an absolute path for your database file. You can base it on the absolute path of your script:
import os.path
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
db_path = os.path.join(BASE_DIR, "PupilPremiumTable.db")
with sqlite3.connect(db_path) as db:
You can verify what the current working directory is with os.getcwd() if you want to figure out where you are opening the new database file instead; you probably want to clean up the extra file you created there.
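For example, a quick sketch:
import os

print(os.getcwd())      # where a relative name like "PupilPremiumTable.db" actually resolves
print(os.listdir('.'))  # look for a stray, empty .db file created here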
I had the same problem and here's how I solved it.
I killed the server by pressing Ctrl+C.
I deleted the __pycache__ folder. You'll find this folder in your project folder.
I deleted the SQLite DB.
I made migrations with python manage.py makemigrations <app_name>, where <app_name> is the specific app that contains the model that's causing the error. In my case it was the mail app, so I ran python manage.py makemigrations mail.
I migrated in the normal way.
Then I started the server and it was all solved.
I believe the issue is as Jorge Cardenas said:
Maybe you are loading views or queries to the database but you haven't granted enough time for Django to migrate the models to the DB. That's why the "table doesn't exist".
This solution is based on this YouTube video.
First, you need to check that the table 100% exists in the database. You can use an SQLite viewer for that: https://inloop.github.io/sqlite-viewer/.
If the table exists, then you can write your table name in quotes, for example:
Select * from 'TableName'
Whatever your query is; I am just using Select * as an example.
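If you would rather check from Python than from a viewer, here is a small sketch using SQLite's built-in sqlite_master catalog (the file name is the one from the question; note that connect() itself creates an empty file if it doesn't exist, so run this from the folder where the .db file lives):
import sqlite3

conn = sqlite3.connect('PupilPremiumTable.db')
tables = conn.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(tables)  # every table that actually exists in this file
conn.close()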
I had to face the same issue, and there are a couple of approaches, but here is the one I think is the most probable:
Maybe you are loading views or queries to the database but you haven't granted enough time for Django to migrate the models to the DB. That's why the "table doesn't exist".
Make sure you use this sort of initialization in your view's code:
from django import forms

class RegisterForm(forms.Form):
    def __init__(self, *args, **kwargs):
        super(RegisterForm, self).__init__(*args, **kwargs)
A second approach is to clean previous migrations, delete the database, and start the migration process over.
I had the same issue when I was following the Flask blog tutorial. I had initialized the database once, when it started giving me the sqlite3.OperationalError; then I tried to initialize it again, and it turned out I had lots of errors in my schema and db.py file. I fixed them and initialized again, and it worked.
Adding this worked for me:
import os.path
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
db_dir = (BASE_DIR + '\\PupilPremiumTable.db')
Note the need for \\ before PupilPremiumTable.db for the code to work.

UPDATE statement on Access database fails silently under pyodbc

I have a problem with a simple UPDATE statement. I wrote a Python tool which creates a lot of UPDATE statements, and after creating them I want to execute them on my Access database, but it doesn't work. This is one statement, for example:
UPDATE FCL_B_COVERSHEET_A SET BRANCH = 0 WHERE OBJ_ID = '1220140910132011062005';
The statement syntax is not the problem. I tested it and it works.
This next code snippet shows the initialization for the connect object.
strInputPathMDB = "C:\\Test.mdb"
DRV = '{Microsoft Access Driver (*.mdb)}'
con = pyodbc.connect('Driver={0};Dbq={1};Uid={2};Pwd={3};'.format(DRV, strInputPathMDB, "administrator", ""))
After that I wrote a method which executes one SQL statement:
def executeSQLStatement(conConnection, strSQL):
    arcpy.AddMessage(strSQL)
    cursor = conConnection.cursor()
    cursor.execute(strSQL)
    conConnection.commit()
and if I execute this code everything seems to work - no error message or anything like that - but the data is not updated, and I don't know what I'm doing wrong ...
for strSQL in sqlStateArray:
    executeSQLStatement(con, strSQL)
con.close()
I hope you understand what my problem is. Thanks for your help.
Chris
The issue here was that the .mdb file was in the root folder of the C: drive. Root folders often restrict normal users to read-only access so the database file was being opened as read-only. Moving the .mdb file to a public folder solved the problem.
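If you want to rule that out from Python before moving the file, a small sketch using only the standard library (the path is the one from the question):
import os

strInputPathMDB = "C:\\Test.mdb"
# False means the current user cannot write to the file, so the
# UPDATE statements cannot take effect even though they raise no error.
print(os.access(strInputPathMDB, os.W_OK))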

SQLite Insert command in Python script Doesn't work on web

I'm trying to use an SQLite insert operation in a Python script. It works when I execute it manually on the command line, but when I try to access it on the web it won't insert into the database. Here is my function:
def insertdb(unique_id,number_of_days):
    conn = sqlite3.connect('database.db')
    print "Opened database successfully";
    conn.execute("INSERT INTO IDENT (ID_NUM,DAYS_LEFT) VALUES (?,?)",(unique_id,number_of_days));
    conn.commit()
    print "Records created successfully";
    conn.close()
When it is executed on the web, it only shows the output "Opened database successfully" but does not seem to insert the value into the database. What am I missing? Is this a server configuration issue? I have checked the write permissions on the database and they are correctly set.
The problem is almost certainly that you're trying to create or open a database named database.db in whatever happens to be the current working directory, and one of the following is true:
The database exists and you don't have permission to write to it. So, everything works until you try to do something that requires write access (like committing an INSERT).
The database exists, and you have permission to write to it, but you don't have permission to create new files in the directory. So, everything works until sqlite needs to create a temporary file (which it almost always will when execute-ing an INSERT).
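Both cases can be tested directly from Python with nothing but the standard library; a sketch, assuming the database file already exists:
import os

db_path = os.path.abspath('database.db')  # the same relative name the script uses
print("resolves to:", db_path)
print("file writable:", os.access(db_path, os.W_OK))                        # first case
print("directory writable:", os.access(os.path.dirname(db_path), os.W_OK))  # second case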
Meanwhile, you don't mention what web server/container/etc. you're using, but apparently you have it configured to swallow all errors silently, which is a really, really bad idea for any debugging. Configure it to report errors in some way; otherwise you will never figure out what's going on when anything goes wrong.
If you don't have control over the server configuration, you can at least wrap all your code in a try/except and manually log exceptions to some file you have write access to (ideally via the logging module, or just open and write if worst comes to worst).
Or, you can just do that with dumb print statements, as you're already doing:
def insertdb(unique_id,number_of_days):
    conn = sqlite3.connect('database.db')
    print "Opened database successfully";
    try:
        conn.execute("INSERT INTO IDENT (ID_NUM,DAYS_LEFT) VALUES (?,?)",(unique_id,number_of_days));
        conn.commit()
        print "Records created successfully";
    except Exception as e:
        print e # or, better, traceback.print_exc()
    conn.close()
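The same idea with the logging module instead of print; a sketch, where the log file path is just an assumption, so pick one the web server's user can write to:
import logging
import sqlite3
import traceback

logging.basicConfig(filename='/tmp/insertdb.log', level=logging.INFO)

def insertdb(unique_id, number_of_days):
    conn = sqlite3.connect('database.db')
    logging.info("Opened database successfully")
    try:
        conn.execute("INSERT INTO IDENT (ID_NUM,DAYS_LEFT) VALUES (?,?)",
                     (unique_id, number_of_days))
        conn.commit()
        logging.info("Records created successfully")
    except Exception:
        logging.error(traceback.format_exc())
    finally:
        conn.close()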
