Run a SQL Server Agent Job from Python

I am trying to trigger a SQL Server Agent job (it takes a backup of the database and places it into a directory) from Python. Unfortunately, I haven't found anything about triggering a SQL Server Agent job from Python (only the other way around: a SQL Server Agent job triggering a Python script).
Once I have that backup, I want to restore the database onto a different SQL Server instance using the same Python script.
Thanks for any help!!

You can start the job with Transact-SQL, which you can execute from Python:
EXEC msdb.dbo.sp_start_job N'My Job Name';
GO
See the sp_start_job documentation for more information.
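If you want to run that T-SQL from Python, a minimal sketch using pyodbc might look like this. The connection string, driver, and job name are placeholders to adapt; only the fact that sp_start_job lives in msdb comes from SQL Server itself.

```python
# Sketch: start a SQL Server Agent job from Python via pyodbc.
# Connection string and job name below are placeholders.

def start_job_sql(job_name: str) -> str:
    """Build the T-SQL statement that starts an Agent job (sp_start_job lives in msdb)."""
    escaped = job_name.replace("'", "''")  # escape single quotes for T-SQL
    return f"EXEC msdb.dbo.sp_start_job N'{escaped}';"

def trigger_job(conn_str: str, job_name: str) -> None:
    import pyodbc  # requires the pyodbc package and an ODBC driver for SQL Server
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        conn.execute(start_job_sql(job_name))
```

Note that sp_start_job only queues the job and returns immediately; if the restore step must wait for the backup to finish, poll the job's status (e.g. via sp_help_job in msdb) before proceeding.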

Related

How do I connect to a MySQL Database in Python with a RaspberryPi

I'm a student and I'm trying to write some sensor values into a MySQL database.
As my IDE I'll be using IntelliJ.
First off, I started by installing the database plug-in.
This was done successfully.
Next I tried to connect to the database (see figure below).
[Figure: successful connection]
Now the next thing I want to do is use a MySQL connector.
Therefore I've installed MySQL Connector onto the Raspberry Pi and used the following code to implement it.
import mysql.connector

print("Step 1")
cnx = mysql.connector.connect(user='a21ib2a01',
                              password='secret',
                              host='mysql.studev.groept.be',
                              database='a21ib2a01')
print("Step 2")
When now I run my code the terminal will output:
Step1
For some reason I don't know, the connect function always times out with the following error:
mysql.connector.errors.InterfaceError: 2003: Can't connect to MySQL server on 'mysql.studev.groept.be:3306' (110 Connection timed out)
So does anyone know why the connection from the IDE succeeds but my script can't connect?
Long story short: what am I doing wrong and how do I fix it?
Thanks in advance!
Your timeout means the network on your Raspberry Pi cannot reach -- cannot find a route to -- your MySQL host mysql.studev.groept.be.
If you run traceroute mysql.studev.groept.be in a shell on the Pi, you may see what's wrong.
While in a shell on your Pi, can you ssh to any machine in your university's network? If so, you might be able to use ssh port-forwarding to get a route to the database server.
Do you run IntelliJ on the Pi directly, or on your laptop? If you run it on the laptop, it looks like the laptop can find a route to your server but the Pi cannot.
(If this were my project, I'd install a MySQL server on my laptop to reduce the network-engineering hassles of connecting through multiple hops involving a VPN.)
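The port-forwarding idea can be sketched in Python by assembling the ssh -L command to run; the gateway host, username, and local port below are placeholders for whatever machine you can actually reach.

```python
# Sketch: forward a local port to the MySQL server through a reachable
# ssh gateway. All hostnames and usernames are placeholders.

def build_tunnel_command(user: str, gateway: str, local_port: int,
                         db_host: str, db_port: int = 3306) -> list[str]:
    """ssh command that forwards local_port to db_host:db_port via gateway."""
    return ["ssh", "-N", "-L", f"{local_port}:{db_host}:{db_port}",
            f"{user}@{gateway}"]

# Start the tunnel (it blocks until killed), e.g. with subprocess.Popen,
# then point mysql.connector at host='127.0.0.1', port=local_port
# instead of the real database host.
```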

Linux Service running Python script does not update SQL database

I have a Python script that scans a stock market for transactions and saves them into an SQL database. The script works on its own: if I run it directly with python3 fetchTradesLuno24Hours.py, it updates the database. However, if I run it as a service, it stops updating the database. If I run systemctl status lunoDatabase.service, it shows that the service ran successfully. The service is triggered by lunoDatabase.timer, which runs it every few hours. If I run systemctl status lunoDatabase.timer or systemctl list-timers, I see that the timer works and the script is triggered successfully. The service reports that the Python script's run time is as expected (around 6 minutes) and that the code exited successfully.
Earlier I tried running the Python script in an infinite loop, and that worked fine: the database was updated correctly from the service. When I added the timer, it stopped updating the database. I would like the service to update the SQL database and be triggered by the timer. How can I fix that?
The problem was in the Python script. Since the Python file is invoked from the root folder, I should have specified the absolute path to the database in database.py:
db = sqlite3.connect('/home/user/bot/transactions.db')
and not
db = sqlite3.connect('transactions.db')
Thank you, everyone!
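A related defensive pattern (a sketch, not part of the original answer; the database filename is taken from the question) is to resolve the path relative to the script file itself, so it works no matter what working directory systemd uses:

```python
import sqlite3
from pathlib import Path

def db_path(anchor: str) -> Path:
    """Absolute path to transactions.db next to the given script file."""
    return Path(anchor).resolve().parent / "transactions.db"

def get_connection(anchor: str) -> sqlite3.Connection:
    # Call as get_connection(__file__) from the script.
    return sqlite3.connect(db_path(anchor))
```

Alternatively, setting WorkingDirectory= in the systemd unit file achieves the same effect without touching the script.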

Execute a Hadoop Job in remote server and get its results from python webservice

I have a Hadoop job packaged in a jar file that I can execute on a server from the command line, storing the results in the server's HDFS.
Now I need to create a web service in Python (Tornado) that must execute the Hadoop job and fetch the results to present them to the user. The web service is hosted on a different server.
I googled a lot for how to call the job from outside the server in a Python script, but unfortunately found no answers.
Does anyone have a solution for this?
Thanks
One option is to install the Hadoop binaries on your web-service server, using the same configuration as in your Hadoop cluster. You will need that to be able to talk to the cluster. You don't need to launch any Hadoop daemons there; at a minimum, configure HADOOP_HOME, HADOOP_CONF_DIR, and HADOOP_LIBS, and set the PATH environment variable properly.
You need the binaries because you will use them to submit the job, and the configuration files to tell the Hadoop client where the cluster is (the namenode and the resourcemanager).
Then, in Python, you can execute the hadoop jar command using subprocess: https://docs.python.org/2/library/subprocess.html
You can configure the job to notify your server when the job has finished using a callback: https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/
And finally, you can read the results from HDFS using WebHDFS (the HDFS web API) or a Python HDFS package such as: https://pypi.python.org/pypi/hdfs/
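The subprocess step above can be sketched as follows; the jar name, main class, and input/output paths are placeholders, and it assumes the hadoop binary is on PATH as described.

```python
import subprocess

def build_command(jar: str, main_class: str, *args: str) -> list[str]:
    """Assemble the `hadoop jar` invocation."""
    return ["hadoop", "jar", jar, main_class, *args]

def submit_job(jar: str, main_class: str, *args: str) -> str:
    """Run the job synchronously and return its stdout."""
    result = subprocess.run(build_command(jar, main_class, *args),
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(result.stderr)
    return result.stdout
```

For a long-running job, the Tornado handler should run this in a worker thread or use the callback mechanism from the article above rather than blocking the event loop.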

use psycopg with installed PostgreSQL Database

I would like to get started working with Python and PostgreSQL databases.
The script I have in mind will (at least at the beginning) run on hosts without a locally installed PostgreSQL database, and I want the script to connect to the database remotely.
In fact (for the start):
user hosts: run the script (read xls, convert, manipulate, etc) and write into remote DB
DB Host: this will host the db and gets connections from the user hosts and the "Sync Host"
Sync Host: a cloud service which will connect to the db server to read the databases and do some "magic" with it.
From what I have read, the best Python module for a PostgreSQL connection is psycopg, but this seems to require an installed PostgreSQL database, which is something I do not have (and don't want to install) on the user hosts.
At a later stage I will remove the "user hosts" and provide a webinterface for uploading the xls and do the conversion, etc on the db host, but for the beginning I wanted to start as mentioned above.
My questions:
Is my thinking totally wrong? Should I start with a central approach (web interface, etc.) right away?
If not, how can I implement a PostgreSQL connection in Python without installing a PostgreSQL database?
All User hosts are Mac OS X, so Python is already installed.
Thanks a lot in advance,
Andre
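For what it's worth, psycopg is only a client library: it needs libpq (which the psycopg2-binary wheels bundle), not a PostgreSQL server on the connecting host. A minimal remote-connection sketch, with all hostnames and credentials as placeholders:

```python
def build_dsn(host: str, dbname: str, user: str, password: str) -> str:
    """Assemble a libpq-style connection string."""
    return f"host={host} dbname={dbname} user={user} password={password}"

def connect_remote(dsn: str):
    import psycopg2  # pip install psycopg2-binary; no local PostgreSQL server needed
    return psycopg2.connect(dsn)
```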

Batch execution of SAS using a Telnet connection in Python

I have been interested in finding an alternative to the SAS UI for quite some time now. We license SAS on our server instead of our desktops, so we also have to launch a remote desktop application to execute code.
I was able to use a Telnet connection instead to connect to the server remotely and batch-execute SAS programs. Then I became interested in whether a Python script could be made to connect remotely and batch-execute code, and whether that script could be run in jEdit as a BeanShell script.
So far, I have Python code which successfully opens and closes the Telnet connection. It can do basic shell functions like calling "dir". However, when I pass the exact same line I use to execute SAS from the command prompt on the remote server over the Telnet connection in Python, nothing happens.
Is it possible the server is preventing me from executing code from a script? I use a "read_until" statement to wait for the prompt before running any code.
Here are a few ideas...
The issue you are having may be related to Local Security Policy settings in Windows (if it is running on a Windows server). I'm far from an expert on that stuff, but I remember older SAS/IntrNet installations required some rummaging around in there to get them working.
As an alternative to the approach you are trying above, you could also set up a SAS session on the server that listens for incoming socket requests, as per this article:
http://analytics.ncsu.edu/sesug/2000/p-1003.pdf
And finally... not sure if this helps or not, but I remotely execute SAS jobs using PsExec. A description of how I set it all up can be found here:
http://www.runsubmit.com/questions/260/hide-sas-batch-jobs-winxp
Good luck
This paper outlines how you can use a Python script to connect to a Unix server using SSH, copy the SAS program written locally onto the server, batch submit it, and download the results back to your local machine, all using a BeanShell macro script for jEdit.
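The SSH approach can be sketched from Python as follows; the hostname, username, and the exact remote sas invocation are placeholders to adapt to your installation.

```python
import subprocess

def build_ssh_command(user: str, host: str, remote_cmd: str) -> list[str]:
    """Assemble an ssh invocation that runs remote_cmd on host."""
    return ["ssh", f"{user}@{host}", remote_cmd]

def run_sas_batch(user: str, host: str, program: str) -> "subprocess.CompletedProcess":
    # On Unix servers, `sas program.sas` typically runs in batch mode;
    # adjust the command for your SAS install.
    return subprocess.run(build_ssh_command(user, host, f"sas {program}"),
                          capture_output=True, text=True)
```

With key-based SSH authentication set up, this avoids the interactive prompt handling that makes the Telnet approach fragile.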
