I want to run cd and ls in python debugger. I try to use !ls but I get
*** NameError: name 'ls' is not defined
Simply use the "os" module and you will be able to easily execute any OS command from within pdb.
Start with:
(Pdb) import os
And then:
(Pdb) os.system("ls")
or even
(Pdb) os.system("sh")
The latter simply spawns a subshell; exiting from it returns you to the debugger.
Note: the "cd" command will have no effect when used as os.system("cd dir") since it will not change the cwd of the python process. Use os.chdir("/path/to/targetdir") for that.
pdb doesn't let you run shell commands, unfortunately. The reason for the error you are seeing is that ! tells pdb to execute a one-line Python statement, not a shell command. Quoting from the docs:
[!]statement
Execute the (one-line) statement in the context of the current stack frame. The exclamation point can be omitted unless the first word of the statement resembles a debugger command. To set a global variable, you can prefix the assignment command with a global command on the same line, e.g.:
(Pdb) global list_options; list_options = ['-l']
(Pdb)
Thus !ls means "evaluate the Python statement ls", which causes the NameError you observed, since no variable named ls exists.
pdb works very much like the normal Python console, so modules can be imported and used just as in an interactive Python session.
For the directory listing, use the os module (inside pdb, confirming each line with the Return/Enter key):
import os
os.listdir("/path/to/your/folder")
Or, if you want to do something more advanced such as starting new processes or capturing their output, have a look at the subprocess module.
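For instance, a minimal sketch (not from the original answer) that runs ls and captures its output as a string:
import subprocess
# Run ls and capture stdout as text; check=True raises CalledProcessError on a non-zero exit status.
result = subprocess.run(["ls", "-l"], capture_output=True, text=True, check=True)
print(result.stdout)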
Related
I want to execute another program, which I compiled earlier, from Python using the subprocess module.
However, Python reports that it cannot find the command. I suspect that subprocess cannot find my custom command because it does not know the PATH variable of my Ubuntu system. Is it possible to somehow execute the following code, where command is on my PATH?
import subprocess
subprocess.run("command -args")
Running this code leads to the error command not found.
You can either provide the explicit path to your command:
subprocess.run('/full/path/to/command.sh')
or else modify your PATH variable in your Python code:
import os
os.environ['PATH'] += ':'+'/full/path/to/'
subprocess.run('command.sh')
You can modify the environment variables, but be careful how you pass the arguments: the command and each argument should be separate elements of the list.
Try something like this:
import os
import subprocess
my_env = os.environ.copy()
my_env["PATH"] = "/usr/test/path:" + my_env["PATH"]
subprocess.run(["command", "-args"], env=my_env)
I can run one program by typing: python enable_robot.py -e in the command line, but I want to run it from within another program.
In the other program, I imported subprocess and had subprocess.Popen(['enable_robot', 'baxter_tools/scripts/enable_robot.py','-e']), but I get an error message saying something about a callback.
If I comment out this line, the rest of my program works perfectly fine.
Any suggestions on how I could change this line to get my code to work or if I shouldn't be using subprocess at all?
If enable_robot.py requires user input, it probably wasn't meant to be run from another Python script. You might want to import it as a module instead: import enable_robot, and call the functions you need from there.
If you want to stick with subprocess, you can pass input with communicate; note that stdin must be a pipe for this to work:
p = subprocess.Popen(['python', 'baxter_tools/scripts/enable_robot.py', '-e'], stdin=subprocess.PIPE)
p.communicate(input=b'whatever string\nnext line')
See the communicate documentation for details and an example.
Your program enable_robot.py should meet the following requirements:
The first line is a shebang indicating which program should interpret the script; in this case, the path to the Python interpreter.
The script should be executable.
A very simple example: we have two Python scripts, called.py and caller.py.
Usage: caller.py will execute called.py using subprocess.Popen().
File /tmp/called.py
#!/usr/bin/python
print("OK")
File /tmp/caller.py
#!/usr/bin/python
import subprocess
proc = subprocess.Popen(['/tmp/called.py'])
Make both executable:
chmod +x /tmp/caller.py
chmod +x /tmp/called.py
caller.py output:
$ /tmp/caller.py
OK
Consider that you are in the Windows command prompt or similar command line environment. How can you get info about a Python module from its docstring printed to the console?
Ideally you will want to load the module without executing it, which could have side effects. This is supported by Python’s ast module, which even has a helper for getting docstrings. Try this:
python3 -c"import ast, sys; a = ast.parse(open(sys.argv[1]).read()); print(ast.get_docstring(a))" "$1"
The Shortcut (Hack)
Generally (in 2.7):
python -c"print 'Hello world'"
(in 3.x):
python -c"print('Hello world')"
will output: Hello world
But if you pass -c as an argument to a script, something else happens.
For example, navigate to [your Python folder]\Tools\Scripts. If the script does not accept a -c parameter, a shortcut is simply to run:
python reindent.py -c
This will result in an error for the argument: "-c not recognized", but it will also return the docstring to the console. (One limitation is that the output cannot be routed to the clipboard using |clip.)
Generally, if your script myscript.py contains a docstring and expects no argument -c:
python myscript.py -c
returns
option -c not recognized
[__docstring__]
The Works
Once you are in the folder of reindent.py you can get an error-free docstring:
python -c"import reindent; print reindent.__doc__"
For a browsable help text, which prints both the docstring and lists the contained classes, functions, and global variables, use:
python -c"import reindent; help(reindent)"
To output to the clipboard only (Warning: Contents will be replaced!):
python -c"import reindent; help(reindent)"|clip
Deeper
Now that you have figured out what classes and functions are accessible (see above), you can retrieve the methods of a class and inner docstrings:
python -c"from reindent import Reindenter; help(Reindenter)"
If you mean print interactively, just start python without any arguments to get a REPL (read–eval–print loop).
python
import mypackage
help(mypackage)
dir(mypackage)
run one after another, and so on.
If you mean programmatically, see @noumenal's answer.
I'm writing a python (ver 2.7) script to automate the set of commands in this Getting Started example for INOTOOL.
Problem: When I run this entire script, I repeatedly encounter these errors:
Current Directory is not empty
No project is found in this directory
No project is found in this directory
But when I run the script only up to the marked line and manually type in the next three lines, or when I run the last three lines (starting from the "ino init -t blink" line) after manually cd'ing into the beep folder, the same commands execute successfully.
Is there a limitation with os.system() that I'm encountering?
My code:
import os, sys
def upload():
    os.system("cd /home/pi/Downloads")
    os.system("mkdir beep")
    os.system("cd beep")  # will refer to this code junction in question description
    os.system("ino init -t blink")
    os.system("ino build")
    os.system("ino upload")
    sys.exit(0)
Yes: when os.system() is used to run cd, it does not actually change the current directory of the Python process. From the documentation:
os.system(command)
Execute the command (a string) in a subshell. This is implemented by calling the Standard C function system(), and has the same limitations. Changes to sys.stdin, etc. are not reflected in the environment of the executed command.
So even though you change the directory in one os.system() call, the next os.system() call still runs in the same directory, which is what is causing your issue.
You should use os.chdir() to change the directory instead of os.system() calls.
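For instance, a minimal sketch of the upload() function from the question rewritten with os.chdir() (the paths and ino commands are taken from the question and are otherwise unverified):
import os, sys

def upload():
    os.makedirs("/home/pi/Downloads/beep", exist_ok=True)
    os.chdir("/home/pi/Downloads/beep")  # changes the cwd of the Python process itself
    os.system("ino init -t blink")
    os.system("ino build")
    os.system("ino upload")
    sys.exit(0)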
Best of all would be to use the subprocess module, as @PadraicCunningham explains in his answer.
You can use the subprocess module and os.mkdir to make the directory; you can pass the working directory cwd to check_call so you actually execute the command in that directory:
from subprocess import check_call
import os

def upload():
    d = "/home/pi/Downloads/beep"
    os.mkdir(d)
    check_call(["ino", "init", "-t", "blink"], cwd=d)
    check_call(["ino", "build"], cwd=d)
    check_call(["ino", "upload"], cwd=d)
A non-zero exit status will raise CalledProcessError, which you may want to catch; once the calls succeed, you know the commands all returned an exit status of 0.
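If you do want to catch the failure, a minimal sketch (reusing the hypothetical ino build command and path from above) could look like:
from subprocess import check_call, CalledProcessError

try:
    check_call(["ino", "build"], cwd="/home/pi/Downloads/beep")
except CalledProcessError as e:
    # e.returncode holds the non-zero exit status of the failed command.
    print("ino build failed with exit status", e.returncode)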
I want to implement a userland command that takes a path as one of its arguments and changes the directory to that path. After the program completes, I would like the shell to be in that directory. So I want to implement the cd command, but as an external program.
Can this be done in a Python script, or do I have to write a Bash wrapper?
Example:
tdi@bayes:/home/$> python cd.py tdi
tdi@bayes:/home/tdi$>
Others have pointed out that you can't change the working directory of a parent from a child.
But there is a way you can achieve your goal: if you cd from a shell function, it can change the shell's working directory. Add this to your ~/.bashrc:
go() {
cd "$(python /path/to/cd.py "$1")"
}
Your script should print the path to the directory that you want to change to. For example, this could be your cd.py:
#!/usr/bin/python
import sys, os.path
if sys.argv[1] == 'tdi':
    print(os.path.expanduser('~/long/tedious/path/to/tdi'))
elif sys.argv[1] == 'xyz':
    print(os.path.expanduser('~/long/tedious/path/to/xyz'))
Then you can do:
tdi@bayes:/home/$> go tdi
tdi@bayes:/home/tdi$>
That is not going to be possible.
Your script runs in a sub-shell spawned by the parent shell where the command was issued.
Any cding done in the sub-shell does not affect the parent shell.
cd is implemented as a shell built-in command precisely because an external program cannot change the parent shell's working directory.
As codaddict writes, what happens in your sub-shell does not affect the parent shell. However, if your goal is to present the user with a shell in a different directory, you could have Python change its own working directory with os.chdir and then launch a new shell from Python. This will not change the working directory of the original shell, but it will leave the user in a shell that starts in a different directory.
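A minimal sketch of that idea (the target directory is only an example):
import os
import subprocess

os.chdir("/path/to/targetdir")  # change this Python process's working directory
# Launch an interactive shell; it inherits the new working directory.
subprocess.call(os.environ.get("SHELL", "/bin/bash"))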
As explained by mrdiskodave in "Equivalent of shell 'cd' command to change the working directory?", there is a hack to achieve the desired behavior in pure Python.
I made some modifications to the answer from mrdiskodave to make it work in Python 3:
The pipes.quote() function has moved to shlex.quote().
To mitigate the issue of user input during execution, you can delete any previous user input with the backspace character "\x08".
So my adaptation looks like the following:
import fcntl
import shlex
import termios
from pathlib import Path

def change_directory(path: Path):
    quoted_path = shlex.quote(str(path))
    # Remove up to 32 characters entered by the user.
    backspace = "\x08" * 32
    cmd = f"{backspace}cd {quoted_path}\n"
    # Push each character of the command into the terminal's input queue,
    # so the interactive shell "types" the cd itself.
    for c in cmd:
        fcntl.ioctl(1, termios.TIOCSTI, c)
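You would call this as, for example, change_directory(Path("/tmp")). Note that TIOCSTI injects the characters into the controlling terminal's input queue, so this only works from an interactive terminal session, and some newer or hardened kernels may restrict TIOCSTI altogether.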
I shall try to show how to set a Bash terminal's working directory to whatever path a Python program wants in a fairly easy way.
Only Bash can set its working directory, so routines are needed for Python and Bash. The Python program has a routine defined as:
fob=open(somefile,"w")
fob.write(dd)
fob.close()
"Somefile" could for convenience be a RAM disk file. Bash "mount" would show tmpfs mounted somewhere like "/run/user/1000", so somefile might be "/run/user/1000/pythonwkdir". "dd" is the full directory path name desired.
The Bash file would look like:
#!/bin/bash
# pysync --- running ". pysync" will set the Bash working directory to what Python recorded
cd "$(cat /run/user/1000/pythonwkdir)"