Running a Python function from an Ansible script

I have a Django project hosted on a remote server. This contains a file called tmp_file.py. There's a function called fetch_data() inside that file. Usually I follow the below approach to run that function.
# Inside Django Project
$ python manage.py shell
[Shell] from tmp_file import fetch_data
[Shell] fetch_data()
Also, the file doesn't contain an if __name__ == '__main__' block, so it can't be run as a standalone script. What's the best way to perform this task using Ansible? I couldn't find anything useful in the Ansible docs.

There's a --command switch for the django-admin shell command,
so you can try this in Ansible:
- name: Fetch data
  command: "django-admin shell --command='from tmp_file import fetch_data; fetch_data()'"
  args:
    chdir: /path/to/tmp_file
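If you'd rather not rely on the shell one-liner, another option (a minimal sketch, assuming the wrapper lives next to manage.py and that myproject.settings is a placeholder for your real settings module) is a small standalone script that boots Django itself, which Ansible can then run with the command or script module:
# run_fetch_data.py -- hypothetical wrapper placed next to manage.py
import os
import django

# The settings module name is an assumption; use your project's actual one.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()  # initialise the app registry so project imports work outside manage.py

from tmp_file import fetch_data

if __name__ == '__main__':
    fetch_data()
The corresponding Ansible task would then simply run python run_fetch_data.py with chdir set to the project directory.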

Related

Can't execute shell script in snakemake

I recently started using snakemake and would like to run a shell script in my snakefile. However, I'm having trouble accessing input, output and params. I would appreciate any advice!
Here are the relevant code snippets:
from my snakefile
rule ..:
    input:
        munged = 'results/munged.sumstats.gz'
    output:
        ldsc = 'results/ldsc.txt'
    params:
        mkdir = 'results/ldsc_results/',
        ldsc_sumstats = '/resources/ldsc_sumstats/',
    shell:
        'scripts/run_gc.sh'
and the script:
chmod 770 {input.munged}
mkdir -p {params.mkdir}
ldsc=$(ls {params.ldsc_sumstats})
for i in $ldsc; do
...
I get the following error message:
...
chmod: cannot access '{input.munged}': No such file or directory
ls: cannot access '{params.ldsc_sumstats}': No such file or directory
...
The {} placeholder syntax applies only to shell commands defined within the Snakefile, whereas in the example you provide the script is defined externally.
If you want to keep the script external, you will need to pass the relevant values as arguments (and parse them inside the shell script), as in the sketch below. Otherwise, it should be possible to copy-paste the script content into the shell directive and let Snakemake substitute the {} variables.
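For example (a sketch along those lines; the rule name run_ldsc and the argument order are assumptions), the rule could pass the values to the external script explicitly:
rule run_ldsc:
    input:
        munged = 'results/munged.sumstats.gz'
    output:
        ldsc = 'results/ldsc.txt'
    params:
        mkdir = 'results/ldsc_results/',
        ldsc_sumstats = '/resources/ldsc_sumstats/'
    # The {} placeholders are substituted here because this command is defined in the
    # Snakefile; run_gc.sh then reads the values as positional arguments.
    shell:
        'scripts/run_gc.sh {input.munged} {params.mkdir} {params.ldsc_sumstats}'
Inside run_gc.sh you would then replace {input.munged}, {params.mkdir} and {params.ldsc_sumstats} with $1, $2 and $3 respectively.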
From v7.14.0 Snakemake now supports executing external Bash scripts, with access to snakemake objects inside. See the docs for example usage.

Calling Matlab scripts from Django with Python's Popen class

I'm developing a Django app which runs Matlab scripts with Python's Popen class. The Python script that calls the Matlab scripts lives in the main folder of my Django app (with views.py). When I call the script from the command line, it runs like a charm, but when I make a request from the client to run the corresponding Python script, I receive the following warning:
"< M A T L A B (R) > Copyright 1984-2018 The MathWorks, Inc. R2018a (9.4.0.813654) 64-bit (glnxa64) February 23, 2018 To get started, type one of these: helpwin, helpdesk, or demo. For product information, visit www.mathworks.com. >> [Warning: Unable to create preferences folder in /var/www/.matlab/R2018a. Preferences folder location must be writable. Using a temporary preferences folder for this MATLAB session. See the preferences documentation for more details.] >>
My app uses a Python virtual environment and is deployed with the Apache web server.
Here is my Python script that calls the Matlab scripts:
import os
import subprocess as sp
import pymat_config

def pymat_run():
    pwd = pymat_config.pwd_config['pwd']
    cmd1 = "-r \"Arg_in = '/path/to/my/main/folder/input.txt'; Arg_out = '/path/to/my/main/folder/file.txt'; matlab_script1\""
    baseCmd1 = ['/usr/local/MATLAB/R2018a/bin/matlab', '-nodesktop', '-nosplash', '-nodisplay', '-nojvm', cmd1]
    os.chdir('/path/to/matlab_script1')
    sudo_cmd = sp.Popen(['echo', pwd], stdout=sp.PIPE)
    exec1 = sp.Popen(['sudo', '-S'] + baseCmd1, stdin=sudo_cmd.stdout, stdout=sp.PIPE, stderr=sp.PIPE)
    out, err = exec1.communicate()
    return out
Any suggestions?
Finally, I managed to find the solution to this issue myself. The problem came down to which user was calling the Matlab script. When I ran the above script from a Python interpreter or from the shell, it was my own user (with my user password) running the script, whereas when I called it from the client, the user was the web server's user: www-data.
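A quick way to confirm which account is actually invoking MATLAB (a small diagnostic sketch, not part of the original code) is to log the real and effective users from inside the Django view or the helper module:
import getpass
import os
import pwd

def report_invoking_user():
    # Under Apache/mod_wsgi this typically reports www-data rather than your login user.
    print('login user    :', getpass.getuser())
    print('real user     :', pwd.getpwuid(os.getuid()).pw_name)
    print('effective user:', pwd.getpwuid(os.geteuid()).pw_name)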
So first, to get rid of the above warning, I gave the www-data user ownership of the /var/www folder with the following command:
sudo chown -R www-data /var/www/
After that, the warning disappeared, but the script still didn't run, because sudo was internally asking for www-data's password while the script was supplying my own user's password from the pymat_config file.
To solve this, I edited the /etc/sudoers file so that www-data can call the Matlab executable without being asked for a password, by adding the following line:
www-data ALL=(ALL) NOPASSWD: /usr/local/MATLAB/R2018a/bin/matlab
and now it runs like a charm!

Referencing ini file fails in cron

I have a python script that queries a database. I run it from the terminal with python3 myscript.py
I've added a cron task for it in my crontab file
*/30 9-17 * * 1-5 python3 /path/to/my/python/script\ directory\ space/myscript.py
The script imports a function, from the same directory, that parses database login info from database.ini, also in the same directory. The database.ini file is:
[postgresql]
host=my-db-host-1-link.11.thedatabase.com
database=dbname
user=username
password=password
port=10898
But currently cron outputs to the file in my mail folder:
Section postgresql not found in the database.ini file
The section is clearly present in the database.ini file, so what am I missing here?
Instead of running python3 myscript.py in the directory where it is present, try running it from some other directory (like your home directory). Most likely you will see the same issue.
Note that cron's current working directory differs between systems, so the safest method is to explicitly switch to the directory where your script lives and run the command there:
cd /path/to/my/python/script\ directory\ space/ && python3 myscript.py
Try this:
import os
...
Change
filename = 'database.ini'
to
filename = os.path.dirname(__file__) + '/database.ini'
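Putting it together (a minimal sketch; the function name read_db_config is hypothetical and just mirrors the usual configparser pattern), resolving the path relative to the script file makes the lookup independent of cron's working directory:
import os
from configparser import ConfigParser

def read_db_config(section='postgresql'):
    # Build an absolute path so the ini file is found no matter where cron starts us.
    ini_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'database.ini')
    parser = ConfigParser()
    parser.read(ini_path)
    if not parser.has_section(section):
        raise Exception('Section {} not found in the {} file'.format(section, ini_path))
    return dict(parser.items(section))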

running python script with an ECS task

I have an ECS task set up which, when run with a command override of ls, produces the expected results in my CloudWatch log stream: test.py. My script test.py takes one parameter. I am wondering how I can execute this script with python3 (which exists in my container) using the command override. Essentially, I want to execute the command:
python3 test.py hello
How can I do this?
Here's how I did something similar:
In your Dockerfile, make the command you want to run the last instruction. In your case:
CMD python3 test.py hello
To make it more extensible, use environment variables. For instance, do something like:
CMD ["python3", "test.py"]
But make the parameter come from an environment variable that you pass in via the container definition in your task.
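On the script side, a minimal sketch (the variable name TASK_ARG is an assumption; set it in the container definition's environment section) could accept the parameter from either the command line or the environment:
# test.py -- hypothetical sketch of how the single parameter could be read
import os
import sys

def main():
    # Prefer a CLI argument (e.g. from a command override), fall back to the env var.
    arg = sys.argv[1] if len(sys.argv) > 1 else os.environ.get('TASK_ARG', 'hello')
    print('running with argument:', arg)

if __name__ == '__main__':
    main()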

How do I run a Python script that is part of an application I uploaded in an AWS SSH session?

I'm trying to run a Python script I've uploaded as part of my AWS Elastic Beanstalk application from my development machine, but can't figure out how to. I believe I've located the script correctly, but when I attempt to run it under SSH, I get an import error.
For example, I have a Flask-Migrate migration script as part of my application (pretty much the same as the example in the documentation), but after successfully SSHing to my EB instance with
> eb ssh
and locating the script with
$ sudo find / -name migrate.py
when I run in the directory (/opt/python/current) where I located it with
$ python migrate.py db upgrade
at the SSH prompt I get
Traceback (most recent call last):
File "db_migrate.py", line 15, in <module>
from flask.ext.script import Manager
ImportError: No module named flask.ext.script
even though my requirements.txt (present along with the rest of my files in the same directory) has flask-script==2.0.5.
On Heroku I can accomplish all of this in two steps with
> heroku run bash
$ python migrate.py db upgrade
Is there equivalent functionality on AWS? How do I run a Python script that is part of an application I uploaded in an AWS SSH session? Perhaps I'm missing a step to set up the environment in which the code runs?
To migrate your database, the best approach is to use container_commands: these are commands that run every time you deploy your application. There is a good example in the Elastic Beanstalk documentation (Step 6):
container_commands:
  01_syncdb:
    command: "django-admin.py syncdb --noinput"
    leader_only: true
The reason you're getting an ImportError is that Elastic Beanstalk installs your packages in a virtualenv. Before running arbitrary scripts from your application over SSH, first change to the directory containing your (latest) code with
cd /opt/python/current
and then activate the virtualenv
source /opt/python/run/venv/bin/activate
and set the environment variables (that your script probably expects)
source /opt/python/current/env
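As a quick sanity check (a tiny diagnostic sketch, not from the original answer), you can confirm the virtualenv is actually the interpreter in use before running the migration:
import sys

# After activating /opt/python/run/venv these should point into the venv,
# not at the system Python.
print(sys.prefix)
print(sys.executable)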
