For a machine learning task at school I wrote my own MLP network. The data set is quite big, and training takes forever. I was alerted to the option of running my script on the Google Cloud Compute Engine. I tried to set this up, but did not succeed (yet).
The steps I undertook were:
Create an account
Create a VM
Open the VM via the browser
Can anyone help me with importing and running my Python script on Google Cloud? Or does anyone have a clear tutorial on how to do this? I tried finding one myself, but had no success so far.
I finally figured this out, so I'll post the answer that worked for me here on my own question. I'm using Debian Stretch on my VM, and I'm assuming you have already uploaded your file(s) to the VM and are in the same directory as your script.
Make your script executable:
chmod +x myscript.py
Run the script with nohup so it keeps running after you log out, and add & at the end so it executes in the background. I've added the shebang line to my Python script, so there's no need to call python here:
nohup /path/to/script/myscript.py &
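For reference, the shebang mentioned above is just the first line of the script. A minimal sketch (myscript.py is only the example name used in this answer):
#!/usr/bin/env python3
# myscript.py -- the shebang above lets you run ./myscript.py directly once the file is executable
print("training started...")  # your actual training code goes here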
Log out from the shell if you want:
logout
Done! Now your script is up and running. You can log back in and make sure that your process is still alive by checking the output of this command:
ps -e | grep myscript.py
If anything went wrong, you can check out the nohup.out file to see the output of your script:
cat nohup.out
There is an even simpler approach to running code in the background on GCP, and in any Linux terminal: using the Linux screen utility.
Create a new background terminal window:
screen -S WRITE_A_NAME_OF_YOUR_CHOICE_HERE
Now you are in a background window in the terminal. Run your code:
python3 mycode.py
Detach from screen with the hotkeys and the job will keep running in the background:
Ctrl + A, then D
You can close all windows now. If you want to go back and see what's happening, log in to your terminal again and type the following:
screen -ls
This will give you the list of the created "windows". Now find yours and type:
screen -r WRITE_NAME_OF_YOUR_WINDOW
And there you have it :D
You can find more commands here
You can use the Google Cloud Platform tutorials themselves; they are very simple to follow. Links are given below.
Setting up Python
https://cloud.google.com/python/setup
Getting started
https://cloud.google.com/python/getting-started/hello-world
Please note that there is no free tier to run Python 3.x; the standard environment with a free tier only supports Python 2.x.
Edit: As per the latest update, Python 3.x is the default in the standard environment.
Just navigate to the directory where the script is placed.
python thenameofscript.py
I used Way Script, which is great and has a free plan to run every hour.
You can check this video for an explanation.
So, on a Raspberry Pi I'm using a camera app with a web interface. I wanted to add LED lighting by adding a NeoPixel. I have successfully done this and can now turn it on and off by running two Python scripts.
Explanation and question:
I have a python script in /usr/local/bin that is executable.
It is owned by 'root root'.
I have a shell script in /var/www/html/macros that is executable and has to run the python script in /usr/local/bin.
The shell script is owned by 'www-data'
When I manually run the python file, it executes the script.
When I manually run the shell script, it executes the python script.
When I run the shell script by clicking on a button on my webpage, it seems to execute the shell script correctly, however, it looks like it doesn't execute the python script.
What can I do to fix this?
I'm not that experienced with permissions, but I want to emphasize that this is a closed system that does not contain any sensitive information, so safety/best practice is not a concern. I just want to make this work.
I'm not an expert in this area, but I believe that to access /usr/local/bin/ you need root privileges, which explains why you're having success but Apache isn't.
Rather than give Apache root permissions, it's best to simply remove the requirement from the individual file you want to execute. This can be accomplished by
$ cd /usr/local/bin
$ sudo chmod 777 your_script.py
Now, after 11 hours and a group of people thinking along, we found a solution to the problem.
The problem turned out to be that the web interface can only execute as 'www-data', and the NeoPixel library that the Python script depends on needs to be executed as sudo/root.
These two factors make it so that there will never be a direct way of getting the scripts to work together.
However, the idea emerged to use some sort of pipe.
A brilliant user suggested that I use sshpass. This allows the password to be passed to ssh non-interactively, so the command can essentially be executed as a root user.
The data from the web interface is relayed through sshpass, and this successfully runs the needed scripts with the needed privileges.
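A rough sketch of what such a call could look like from the www-data shell script (the user name, password and script name here are placeholders; it also assumes the target user can run sudo without a password prompt, which is the Raspberry Pi default):
# hypothetical line inside the shell script in /var/www/html/macros
sshpass -p 'your_password' ssh -o StrictHostKeyChecking=no pi@localhost 'sudo python3 /usr/local/bin/neopixel_on.py'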
Special thanks to Minty Trebor and Falcounet from the RRF for LPC/STM Discord!
Is there any way to run a Python script on a Windows VM continuously? The script should keep running even if the computer restarts automatically.
What I am doing right now?
I have a script called FooBar.py; it contains an infinite while loop to execute the main() function continuously. I am running this script in PowerShell.
What is the problem with this approach?
Sometimes this VM restarts automatically, or the PowerShell window may close accidentally. These kinds of issues cause the script execution to fail.
What I have tried so far?
I tried pythonw.exe instead of python.exe to run the script but this does not resolve my problem.
Is there any way to run the FooBar.py script continuously? Is there any way in Windows Task Scheduler to restart the script even after the computer restarts?
You can use pm2 to schedule the startup of your script. You can find more information on how to use it here:
https://towardsdatascience.com/automate-your-python-script-with-pm2-463238ea0b65
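For reference, a rough sketch of the pm2 commands (this assumes Node.js and pm2 are already installed; FooBar.py is the script name from the question):
pm2 start FooBar.py --interpreter python --name foobar   # keeps the script running and restarts it if it crashes
pm2 save                                                 # remember the current process list
On Linux you would add pm2 startup to resurrect that list after a reboot; on Windows the boot hook needs an extra package such as pm2-windows-startup.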
You can use a Python module named "schedule", and you can run your code at whatever time you need!
Use pip install schedule to download the library or module.
As an example, job stands for the function you want to run; see the sketch below.
If you want it to run at a time interval in seconds, you can use
schedule.every(<duration>).seconds.do(<function>)
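A minimal sketch of how this could look (job is just a placeholder for whatever function you want to run):
import time
import schedule  # pip install schedule

def job():
    print("running my task...")

schedule.every(10).seconds.do(job)            # every 10 seconds
# schedule.every().day.at("10:30").do(job)    # or at a fixed time of day

while True:
    schedule.run_pending()
    time.sleep(1)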
thank you!
Create a task in Windows Task Scheduler with the trigger On a schedule and with the trigger option Repeat task every 1 min. Then, in the Settings tab, there is the dropdown menu If the task is already running, then the following rule applies:, where you can choose Do not start a new instance.
I am submitting a Python script to condor. When condor runs it, it gets an import error. Condor runs it as /var/lib/condor/execute/dir_170475/condor_exec.exe. If I manually copy the Python script to the execute machine, put it in the same place, and run it, it does not get an import error. I am wondering how to debug this.
How can I see the command line condor uses to run it? Can the file copied to /var/lib/condor/execute/dir_170475/condor_exec.exe be retained after the failure so I can see it? Any other suggestions on how to debug this?
You can simply run an interactive job (basically just a job with sleep or cat as the command) and use condor_ssh_to_job to get onto the execute machine and run it there.
Generally you need to set up your Python environment on the compute node; it is best to have a venv and activate it inside your start script.
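For illustration, a start script along these lines (a rough sketch; the venv path and script name are placeholders) activates the environment before running the actual code:
#!/bin/bash
# hypothetical start script submitted as the job's executable
source /path/to/venv/bin/activate
python3 my_script.py "$@"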
I have a Python script that should open my Linux terminal, browser, file manager and text editor on system startup. I decided crontab is a suitable way to automatically run the script. Unfortunately, it didn't go well: nothing happened when I rebooted my laptop. So, I captured the output of the script to a file in order to get some clues. It seems my script is only partially executed. I use Debian 8 (Jessie), and here's my Python script:
#!/usr/bin/env python3
import subprocess
import webbrowser
def action():
    subprocess.call('gnome-terminal')
    subprocess.call('subl')
    subprocess.call(('xdg-open', '/home/fin/Documents/Learning'))
    webbrowser.open('https://reddit.com/r/python')

if __name__ == '__main__':
    action()
here's the entry in my crontab file:
@reboot python3 /home/fin/Labs/my-cheatcodes/src/dsktp_startup_script/dsktp_startup_script.py > capture_report.txt
Here's the content of the capture_report.txt file (I trimmed several lines since it's too long; it only prints my folder structure, which seems to come from the 'xdg-open' line in the Python script):
Directory list of /home/fin/Documents/Learning/
Type Format Sort
[Tree ] [Standard] [By Name] [Update]
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
/
... the rest of my dir structures goes here
I have no other clue what could possibly be going wrong here. I really appreciate your advice, guys. Thanks.
No, cron is not suitable for this. The cron daemon has no connection to your user's desktop session, which will not be running at system startup, anyway.
My recommendation would be to hook into your desktop environment's login scripts, which are responsible for starting various desktop services for you when you log in, anyway, and easily extended with your own scripts.
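For example, on a GNOME/Debian desktop you could drop an XDG autostart entry into ~/.config/autostart/ (a hypothetical sketch; the script path is taken from the question):
[Desktop Entry]
Type=Application
Name=Desktop startup script
Exec=python3 /home/fin/Labs/my-cheatcodes/src/dsktp_startup_script/dsktp_startup_script.py
X-GNOME-Autostart-enabled=true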
I'd do as tripleee suggested, but your job might be failing because it requires an X session, since you're trying to open a browser. You should put export DISPLAY=:0; after the schedule in your cronjob, as in
@reboot export DISPLAY=:0; python3 /home/fin/Labs/my-cheatcodes/src/dsktp_startup_script/dsktp_startup_script.py > capture_report.txt
If this doesn't work, you could try replacing :0 with the output of echo $DISPLAY in a graphical terminal.
I just set up my first AWS server for a personal project I'm doing. I'm running Ubuntu Linux, and I have a Python script that accesses an SQLite database file in order to send email. I have these same files on my own Ubuntu machine and the script works fine. I'm having trouble, however, figuring out how to run my script from the terminal in my AWS VM. Normally I use IDLE to run my Python script on my Linux machine, so I'm trying to figure out how to run it from the terminal and it's giving me some trouble.
I tried
python script.py
which did nothing, so I converted it to an executable, ran it, and got
./script.py: line 1: import: command not found
...and so on
I realized that I had to add
#!/usr/bin/env python3
to my script, so I did that, converted to executable again, and ran it by entering
./script.py
which also did nothing. If the program had run, it would have delivered an email to my inbox. Is there any way I can tell if it's actually trying to run my script? Or am I trying to run it incorrectly?
You can modify the script to add verbose output that prints the status of the script to the console. Or, if you just want to know whether your script is running in the background, you can check whether the process is active using ps (the process name would be the name of the script):
ps aux | grep "script.py"
Anyway, the former is the better practice, since you can know the exact execution flow of your script.
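For instance, a minimal way to add that kind of verbosity is to log to a file you can inspect afterwards (a sketch; the log file name is just an example):
import logging

logging.basicConfig(filename='script.log', level=logging.INFO,
                    format='%(asctime)s %(levelname)s %(message)s')

logging.info('script started')
# ... your email-sending code ...
logging.info('email sent')
Running ./script.py and then checking cat script.log tells you how far the script actually got.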