How to attach to PyCharm debugger when executing python script from bash?

I know how to set up run configurations to pass parameters to a specific python script, but there are several entry points and I don't want a run configuration for each one. What I want to do instead is launch a python script from a command-line shell script and be able to attach the PyCharm debugger to the python script being executed, and have it stop at breakpoints.
I've tried using a pre-launch condition of a utility python script that sleeps for 10 seconds so I can attempt to "attach to process" of the python script. That didn't work. I also tried importing pdb and calling settrace to see if that would pause it for attaching to the process, but that looks to be specific to command-line debugging only. Any clues would be appreciated.
Thanks!

You can attach the debugger to a python process launched from the terminal:
Use the menu Tools --> Attach to Process, then select the python process to debug.
If you want to debug a file installed in site-packages, you may need to open the file from its original location.
You can also pause the program manually from the debugger and inspect the suspended thread to find your source file.
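Alternatively, if you have PyCharm Professional's remote debug server available, the script can connect to the debugger itself when it is launched from bash. A minimal sketch, assuming the pydevd-pycharm package is installed and a Python Debug Server run configuration is listening on port 5678 (the host and port here are placeholders):
import pydevd_pycharm

# Connect back to the PyCharm debug server before the interesting code runs;
# breakpoints set in PyCharm will then be hit in this process.
pydevd_pycharm.settrace('localhost', port=5678,
                        stdoutToServer=True, stderrToServer=True)

# ... rest of the script ...
You then launch the script from your shell script as usual; execution pauses at the settrace call until the debugger accepts the connection.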

Related

Run Python script 24x7 on windows VM

Is there any way to run a python script on a Windows VM continuously? The script should keep running even if the computer restarts automatically.
What am I doing right now?
I have a script called FooBar.py that contains an infinite while loop to execute its main() function continuously. I am running this script in powershell.
What is the problem with this approach?
Sometimes the VM restarts automatically, or the powershell window may be closed accidentally. These kinds of issues cause the script execution to fail.
What have I tried so far?
I tried pythonw.exe instead of python.exe to run the script, but this does not resolve my problem.
Is there any way to run the FooBar.py script continuously? Is there a way in the windows scheduler to restart the script execution even after the computer restarts?
You can use pm2 to schedule the startup of your script. You can find more information on how to use it here:
https://towardsdatascience.com/automate-your-python-script-with-pm2-463238ea0b65
You can use the python module named "schedule" to run your code at whatever time or interval you need.
Use "pip install schedule" to download the library.
The job you pass to it is simply the function you want to run. If you want it to run on a time interval in seconds, you can use:
schedule.every(#duration#).seconds.do(#declared function#)
Thank you!
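For illustration, a minimal sketch of the schedule module in use (the job function and the 10-second interval are just placeholders):
import time
import schedule

def job():
    # Whatever FooBar.py's main() should do on each run.
    print("running job")

# Run job() every 10 seconds.
schedule.every(10).seconds.do(job)

while True:
    schedule.run_pending()
    time.sleep(1)
Note that this keeps the scheduling loop inside a single python process, so it does not by itself survive a reboot or a closed window; it only replaces the hand-rolled while loop.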
Create a task in the windows scheduler with the trigger On a schedule and the trigger option Repeat task every 1 min. Then on the Settings tab there is the dropdown menu If the task is already running, then the following rule applies:, where you can choose Do not start a new instance.

Python Code to Automate Abaqus Run Script

(Screenshots omitted: the error with the subprocess addition, and the error regarding the abaqus module.)
I'm quite new to python, so please don't mind if this question seems silly.
I have a python file that opens an abaqus viewer, and I have another python file that describes the functions I want to run in the abaqus viewer.
I need a piece of code that can run the second script automatically, without me having to manually go to File > Run Script.
Script to Open Abaqus:
import os
import subprocess

# Open the Abaqus viewer via its launcher batch file.
os.startfile('Q:/win_apps/scripts/simulia/Abaqus/6.14-3/Use_these_if_not_working/abq6143_viewer.bat')
And then I have a python script that has the code regarding my output requests from abaqus viewer.
What line can I add to the above file to automatically take the second python script and run it?
When running Abaqus with the typical start-up scripts, you can pass Abaqus/Viewer a script to run from the command line:
abq6143 viewer noGUI=script.py
where you replace script.py with the name of your Python script. This will start up Abaqus/Viewer with no user interface, run the script, and then quit.
If you want the user interface to come up and automatically run your script, you can use the script= option instead of noGUI:
abq6143 viewer script=script.py
I see that you're using a custom batch file to start up Abaqus/Viewer. Without seeing its contents I couldn't say exactly how you would integrate the above, but you will probably need to add the noGUI or script option to the relevant line in the batch file.
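Without knowing what that batch file does, here is a rough sketch of how the launch could look from python, assuming the batch file simply forwards its arguments to the abaqus viewer command (my_output_requests.py is a placeholder for your second script):
import subprocess

# Path to the viewer batch file and the second script (placeholder name).
viewer_bat = 'Q:/win_apps/scripts/simulia/Abaqus/6.14-3/Use_these_if_not_working/abq6143_viewer.bat'
output_script = 'my_output_requests.py'

# Ask the viewer to run the script without the GUI; use script= instead of
# noGUI= if you want the interface to appear.
subprocess.call([viewer_bat, 'noGUI={}'.format(output_script)])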

Debugging htcondor issue running python script

I am submitting a python script to condor. When condor runs it, it gets an import error. Condor runs it as /var/lib/condor/execute/dir_170475/condor_exec.exe. If I manually copy the python script to the execute machine, put it in the same place and run it, it does not get an import error. I am wondering how to debug this.
How can I see the command line condor uses to run it? Can the file copied to /var/lib/condor/execute/dir_170475/condor_exec.exe be retained after the failure so I can see it? Any other suggestions on how to debug this?
You can simply run an interactive job (basically just a job with sleep or cat as the command) and use condor_ssh_to_job to get a shell in it.
Generally you need to set up your python environment on the compute node; it is best to have a venv and activate it inside your start script.
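To see why the imports differ between the condor run and the manual run, a small diagnostic script like the sketch below could be submitted as the job; it just prints the interpreter and module search path the job actually sees, so you can compare it with a manual run on the execute machine:
import os
import sys

# Which interpreter and search path does the condor-run process see?
print(sys.executable)
print(sys.version)
for entry in sys.path:
    print(entry)
print(os.environ.get('PYTHONPATH', '<PYTHONPATH not set>'))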

Running multiple Python scripts

I would like to create a simple Python program that will concurrently execute 2 independent scripts. For now, the two scripts just print a sequence of numbers but my intention is to use this program to concurrently run a few Twitter streaming programs in the future.
I suspect I need to use subprocess.Popen, but I cannot quite get my head around what arguments I should put in there. There was a similar question on StackOverflow, but the code provided there (pasted below) doesn't print anything. I would appreciate your help.
My files are:
thread1.py
thread2.py
import subprocess
subprocess.Popen(['screen', './thread1.py'])
subprocess.Popen(['screen', './thread2.py'])
Use supervisord
supervisord is a process control system built just for the purpose of running multiple command-line scripts.
It features:
multiple controlled processes
autorestarting failed runs
log stdout and stderr output
starting scripts in order (using priority)
command line utility to view latest log output, stop, start, restart the processes
This solution works only on *nix based systems; it is not available on Windows.
As wanderlust mentioned, why do you want to do it this way and not via the linux command line?
Otherwise, the solution you posted is doing what it is meant to, i.e. you are doing this at the command line:
screen ./thread1.py
screen ./thread2.py
This will open a screen session and run the program, with its output inside that screen session, so you will not see the output on your terminal directly. To troubleshoot your output, just execute the scripts without the screen call:
import subprocess
subprocess.Popen(['./thread1.py'])
subprocess.Popen(['./thread2.py'])
Content of thread1.py:
#!/usr/bin/env python
def countToTen():
    for i in range(10):
        print i

countToTen()
Content of thread2.py:
#!/usr/bin/env python
def countToHundreds():
    for i in range(10):
        print i*100

countToHundreds()
Then don't forget to do this on the command line:
chmod u+x thread*.py
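If you would rather not rely on the shebang lines and chmod, a sketch of the same idea that launches both scripts with the interpreter you are already running (the script names are the ones from this question):
import subprocess
import sys

# Popen returns immediately, so both scripts run concurrently.
procs = [subprocess.Popen([sys.executable, script])
         for script in ('thread1.py', 'thread2.py')]

# Optionally wait for both to finish before exiting.
for p in procs:
    p.wait()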
You can also just open several Command Prompt windows to run several Python programs at once, one in each of them:
In each Command Prompt window, go to the correct directory (such as C:/Python27) and then type 'python YourCodeNo1.py' in one Command Prompt window, 'python YourCodeNo2.py' in the next one, etc.
I'm currently running 3 scripts at one time this way, without slowing any of them down.

launch external shell python instance in shell from python

I'd like to call a separate non-child python program from a python script and have it run externally in a new shell instance. The original python script doesn't need to be aware of the instance it launches; it shouldn't block while the launched process is running, and it shouldn't care if that process dies. This is what I have tried, which returns no error but seems to do nothing...
import subprocess
python_path = '/usr/bin/python'
args = [python_path, '&']
p = subprocess.Popen(args, shell=True)
What should I be doing differently?
EDIT
The reason for doing this is that I have an application with a built-in version of python, and I have written some python tools that should be run separately alongside this application. There is no assurance that the user will have python installed on their system outside of the built-in version the application ships with. Because of this, I can get the python binary path from the built-in version programmatically, and I'd like to use it to launch an external copy of that built-in python. This eliminates the need for the user to install python themselves. So in essence I need a simple way to call an external python script, using my current running version of python, programmatically.
I don't need to capture any output in the original program; in fact, once launched, the child should have nothing to do with the original program.
EDIT 2
So it seems that my original question was very unclear, so here are more details; I think I was trying to oversimplify the question:
I'm running OSX, but the code should also work on windows machines.
The main application that has a built-in version of CPython is a compiled c++ application that ships with a python framework it uses at runtime. You can launch this embedded version of python by running the following in a Terminal window on OSX:
/my_main_app/Contents/Frameworks/Python.framework/Versions/2.7/bin/python
From my main application I'd like to be able to run a command, in the version of python embedded in the main app, that launches an external copy of a python script using the python version above, just as if I ran the following command in a Terminal window. The newly launched orphan process should have its own Terminal window so the user can interact with it.
/my_main_app/Contents/Frameworks/Python.framework/Versions/2.7/bin/python my_python_script
I would like the child python instance not to block the main application, and I'd like it to have its own terminal window so the user can interact with it. The main application doesn't need to be aware of the child in any way once it's launched. The only reason I am doing this is to automate launching an external application in a Terminal for the user.
If you're trying to launch a new terminal window to run a new Python in (which isn't what your question asks for, but from a comment it sounds like it's what you actually want):
You can't. At least not in a general-purpose, cross-platform way.
Python is just a command-line program that runs with whatever stdin/stdout/stderr it's given. If those happen to be from a terminal, then it's running in a terminal. It doesn't know anything about the terminal beyond that.
If you need to do this for some specific platform and some specific terminal program—e.g., Terminal.app on OS X, iTerm on OS X, the "DOS prompt" on Windows, gnome-terminal on any X11 system, etc.—that's generally doable, but the way to do it is by launching or scripting the terminal program and telling it to open a new window and run Python in that window. And, needless to say, they all have completely different ways of doing that.
And even then, it's not going to be possible in all cases. For example, if you ssh in to a remote machine and run Python on that machine, there is no way it can reach back to your machine and open a new terminal window.
On most platforms that have multiple possible terminals, you can write some heuristic code that figures out which terminal you're currently running under by just walking os.getppid() until you find something that looks like a terminal you know how to deal with (and if you get to init/launchd/etc. without finding one, then you weren't running in a terminal).
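For the OS X case described in EDIT 2, one common approach is to let the open utility run an executable script in a new Terminal window. This is only a sketch under that assumption, and it is macOS-specific; the tool path and launcher filename below are placeholders:
import os
import stat
import subprocess

embedded_python = '/my_main_app/Contents/Frameworks/Python.framework/Versions/2.7/bin/python'
tool_script = '/path/to/my_python_script'  # placeholder path

# Write a tiny launcher script, mark it executable, and let Terminal run it
# in its own window; the parent does not wait for it.
launcher = '/tmp/launch_my_tool.command'
with open(launcher, 'w') as f:
    f.write('#!/bin/sh\nexec "{}" "{}"\n'.format(embedded_python, tool_script))
os.chmod(launcher, os.stat(launcher).st_mode | stat.S_IEXEC)

subprocess.Popen(['open', '-a', 'Terminal', launcher])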
The problem is that you're running Python with the argument &. Python has no idea what to do with that. It's like typing this at the shell:
/usr/bin/python '&'
In fact, if you pay attention, you're almost certainly getting something like this through your stderr:
python: can't open file '&': [Errno 2] No such file or directory
… which is exactly what you'd get from doing the equivalent at the shell.
What you presumably wanted was the equivalent of this shell command:
/usr/bin/python &
But the & there isn't an argument at all, it's part of sh syntax. The subprocess module doesn't know anything about sh syntax, and you're telling it not to use a shell, so there's nobody to interpret that &.
You could tell subprocess to use a shell, so it can do this for you:
cmdline = '{} &'.format(python_path)
p = subprocess.Popen(cmdline, shell=True)
But really, there's no good reason to. Just opening a subprocess and not calling communicate or wait on it already effectively "puts it in the background", just like & does on the shell. So:
args = [python_path]
p = subprocess.Popen(args)
This will start a new Python interpreter that sits there running in the background, trying to use the same stdin/stdout/stderr as your parent. I'm not sure why you want that, but it's the same thing that using & in the shell would have done.
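If the goal is really a fire-and-forget child that shares nothing with the parent, a Python 3 sketch along these lines detaches it more cleanly (my_python_script.py is a placeholder name):
import subprocess
import sys

# Launch the script with the current interpreter, with its stdio redirected
# to /dev/null and in its own session, so the parent never blocks on it and
# doesn't care when it exits.
subprocess.Popen(
    [sys.executable, 'my_python_script.py'],
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True,  # POSIX only: put the child in a new session
)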
Actually, I think there might be a solution to your problem; I found a useful solution at another question here.
This way subprocess.Popen starts a new python shell instance and runs the second script from there. It worked perfectly for me on Windows 10.
You can try using the screen command.
With this command a new shell instance is created and the current instance keeps running in the background.
# screen; python script1.py
After running the above command, a new shell prompt will be seen where we can run another script, and script1.py will keep running in the background.
Hope it helps.
