Forcing -i from a command line script [duplicate] - python

I have a python script that I like to run with python -i script.py, which runs the script and then enters interactive mode so that I can play around with the results.
Is it possible to have the script itself invoke this option, such that I can just run python script.py and the script will enter interactive mode after running?
Of course, I can simply type the -i each time, or, if that is too much effort, write a shell script to invoke it.

From within script.py, set the PYTHONINSPECT environment variable to any nonempty string. Python will recheck this environment variable at the end of the program and enter interactive mode.
import os
# This can be placed at top or bottom of the script, unlike code.interact
os.environ['PYTHONINSPECT'] = 'TRUE'
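As a variation (my own sketch, not from the answer), you can set the variable conditionally, so the script only drops into a REPL when asked via a hypothetical flag:

```python
import os
import sys

result = 21 * 2  # whatever your script computes

# Only enter inspect mode when the user passes a (hypothetical) flag;
# Python re-reads PYTHONINSPECT at exit and starts a REPL if it is set.
if "--interactive" in sys.argv:
    os.environ["PYTHONINSPECT"] = "1"
```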

In addition to all the above answers, you can run the script as simply ./script.py by making the file executable and setting the shebang line, e.g.
#!/usr/bin/python -i
this = "A really boring program"
If you want to use this with the env command in order to get the system default python, you can try a shebang like @donkopotamus suggested in the comments:
#!/usr/bin/env PYTHONINSPECT=1 python
The success of this may depend on the version of env installed on your platform, however.
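On systems whose env supports option splitting (GNU coreutils 8.30+ and the BSDs provide an -S flag for this), the assignment-plus-interpreter form can be written so that env sees separate arguments; treat this as platform-dependent, not portable:

```shell
#!/usr/bin/env -S PYTHONINSPECT=1 python
```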

You could use an instance of code.InteractiveConsole to get this to work:
from code import InteractiveConsole
i = 20
d = 30
InteractiveConsole(locals=locals()).interact()
Running this with python script.py will launch the interactive interpreter as the final statement of the script, with the local names made visible via locals=locals().
>>> i
20
Similarly, a convenience function named code.interact can be used:
from code import interact
i = 20
d = 30
interact(local=locals())
This creates the instance for you, the only caveat being that the parameter is named local instead of locals.
In addition to this, as @Blender stated in the comments, you could also embed the IPython REPL by using:
import IPython
IPython.embed()
which has the added benefit of not requiring you to pass the namespace your script has populated via locals.
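If you want the script to work whether or not IPython is installed, a small fallback helper (my own pattern, not from the answer) keeps both options open:

```python
import code

def pick_repl():
    """Return IPython's embed if available, else the stdlib code.interact."""
    try:
        import IPython
        return IPython.embed
    except ImportError:
        return code.interact

repl = pick_repl()
# Call repl() at the point where you want to drop into the interpreter;
# for code.interact, pass local=locals() to expose your names.
```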

I think you're looking for this?
import code
foo = 'bar'
print(foo)
code.interact(local=locals())

I would simply accompany the script with a shell script that invokes it:
#!/bin/sh
exec python -i "$(dirname "$0")/script.py"

Related

Where is the Python interpreter startup script for putting a history function?

I want to add a function like this:
def history():
    import readline
    for i in range(readline.get_current_history_length()):
        print(readline.get_history_item(i + 1))
so that whenever I'm in a Python shell (like running python3) or if I hit a breakpoint using ipdb in Python code, that I can just call history().
Where do I add this code? (MacOS)
You might be able to achieve that using the PYTHONSTARTUP env var.
Create a script that contains this code, and point PYTHONSTARTUP at that file in your shell's profile.
https://docs.python.org/3/using/cmdline.html#envvar-PYTHONSTARTUP
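Concretely, assuming you save the function in ~/.pythonrc.py (the file name is just a convention, any path works):

```shell
# in ~/.bash_profile or ~/.zshrc
export PYTHONSTARTUP="$HOME/.pythonrc.py"
```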

How to get a python script to invoke "python -i" when called normally?


Run python script from shell and keeping defined modules/data

I am new to Python and I am wondering how I can keep the modules and data from a script I run in the Python shell.
For example: I have the script helloworld.py and it contains:
import numpy as donkey
a = 55
Then I want to run that script from my Python shell:
execfile('helloworld.py')
However, if I then try to access 'a' or 'donkey', they are not found.
How can I fix that?
I will assume you are using IDLE. IDLE is not a shell in the Unix sense, so there is no direct equivalent of the bash shell's . ~/.bash_profile.
Instead, I edit the file and then use F5 (Run Module). All my variables, functions, and classes get loaded, and I can use the interactive shell to manipulate them. However, if I run another module, the environment is reset and the first module's definitions are lost.
The execfile() approach is limited to Python 2 as noted here:
How to run a Python script from IDLE command line?
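In Python 3, where execfile() is gone, you can get the same effect with exec(); this sketch builds a stand-in for helloworld.py (using math instead of numpy so it runs anywhere):

```python
from pathlib import Path

# Stand-in for the question's helloworld.py.
Path("helloworld.py").write_text("import math as donkey\na = 55\n")

# Python 3 replacement for execfile(): the script's names land in the
# current namespace, so 'a' and 'donkey' remain usable afterwards.
exec(open("helloworld.py").read())

print(a)  # 55
```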

Set environment variable in one script and use this variable in another script

I am trying to set environment variable using python. And this variable is used in another script.
My code is:
#!/usr/bin/env python
import os
os.environ['VAR'] = '/current_working_directory'
After executing the above Python script, I execute a second script which uses the same variable VAR, but it is not working.
But when I do export VAR='/current_working_directory' and run the second script, it works fine. I tried putenv() also.
This depends on how the second Python script gets called.
If you have a shell, and the shell first runs the first Python script and then the second, it won't work. The reason is that the first Python script inherits its environment from the shell, and modifying os.environ[] or calling putenv() only modifies that inherited copy --- the one belonging to the first Python script's own process, not the shell's.
When the shell then runs the second Python script, it again inherits the environment from the shell ... and because the shell's environment was never modified, the second script cannot see the modification the first Python script made.
One way to achieve your goal is using a helper file:
#!/bin/bash
rm -f envfile
./first_pythonscript
test -f envfile && . envfile
rm -f envfile
./second_pythonscript
That code is crude, it won't work if two instances of the shell script run, so don't use it as-is. But I hope you get the idea.
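For completeness, a sketch of what first_pythonscript would write (the file name envfile matches the shell snippet above; everything else is assumption):

```python
# Publish the value for the parent shell by writing a sourceable file;
# the wrapper script then does `. envfile` to pick it up.
with open("envfile", "w") as f:
    f.write("export VAR=/current_working_directory\n")
```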
Even another way is to make your second_pythonscript not a program, but a Python module that the first_pythonscript can import. You can also make it a hybrid, library when imported, program when run via the if __name__ == "__main__": construct.
And finally you can use one of the os functions, e.g. os.spawnvpe.
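A modern equivalent of os.spawnvpe is subprocess with an explicit env mapping; the child sees the variable while the parent's environment stays untouched (my sketch, not from the answer):

```python
import os
import subprocess
import sys

# Copy the current environment and add VAR for the child only.
child_env = dict(os.environ, VAR="/current_working_directory")
result = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['VAR'])"],
    env=child_env, capture_output=True, text=True,
)
print(result.stdout.strip())  # /current_working_directory
```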
This code should provide the required environment to your 2nd script:
#!/usr/bin/env python
import os
os.environ['VAR'] = '/current_working_directory'
execfile("/path/to/your/second/script.py")  # Python 2; in Python 3 use exec(open(path).read())
A child process cannot change the environment of its parent process.
The general shell programming solution to this is to make your first script print out the value you need, and assign it in the calling process. Then you can do
#!/bin/sh
# Assign VAR to the output from first.py
VAR="$(first.py)"
export VAR
exec second.py
or even more succinctly
#!/bin/sh
VAR="$(first.py)" second.py
Obviously, if the output from first.py is trivial to obtain without invoking Python, that's often a better approach; but for e.g. having two scripts call different functions from a library and/or communicating with a common back end, this is a common pattern.
Using Python for the communication between two pieces of Python code is often more elegant and Pythonic, though.
#!/usr/bin/env python
from yourlib import first, second
value=first()
# maybe putenv VAR here if that's really, really necessary
second(value)

why does setting an initial environment using env stall the launch of my Python script on Ubuntu?

I have a test script, e.g. "test.py", and I want to make it so that it executes with a particular environment variable set before the script begins:
#!/usr/bin/env TEST=anything python
print("Hello, world.")
Running this normally works as expected:
$ python test.py
Hello, world.
However, if I run it as a program:
$ chmod +x test.py
$ ./test.py
The string is never printed, instead the execution just stalls and "top" reports a process called "test.py" which is using 100% CPU.
This only happens on my Ubuntu machine, seems to be fine on OS X.
The reason is that eventually I want to make a particular script always run in a 32-bit Python by setting:
#!/usr/bin/env VERSIONER_PYTHON_PREFER_32_BIT=yes python
at the top of the file. However this is a no-go if it means the script won't execute on Linux machines. I found there is a similar effect no matter what the specified environment variable is called. However, if there is no environment variable set:
#!/usr/bin/env python
print("Hello, world.")
The script runs just fine:
$ ./test.py
Hello, world.
Is this a bug in Python or in env, or am I doing something wrong?
On Linux,
#!/usr/bin/env TEST=anything python
passes the whole string TEST=anything python to env as a single argument, with the script's own path appended after it. Because that argument contains an =, env treats all of it as one variable assignment, and the next argument --- the script itself --- becomes the command to execute. The script's shebang then invokes env again, and so on: an infinite exec loop, which is why top shows the process pinned at 100% CPU.
The bottom line is that you can only put one word after env on the shebang line; everything else will at best be ignored.
From the Wikipedia entry on Shebang:
Another portability problem is the interpretation of the command arguments. Some systems, including Linux, do not split up the arguments; for example, when running the script with a first line like
#!/usr/bin/env python -c
python -c will be passed as one argument to /usr/bin/env, rather than as two arguments. Cygwin also behaves this way.
I doubt /usr/bin/env VERSIONER_PYTHON_PREFER_32_BIT=yes python is going to run properly.
Instead, try setting the environment variables with Python:
import os
os.environ['VERSIONER_PYTHON_PREFER_32_BIT'] = 'yes'
You'll probably need to forget about VERSIONER_PYTHON_PREFER_32_BIT, at least on Linux. On Mac, you could use a shell wrapper for it.
Then on Linux, you'll probably need to reinvent VERSIONER_PYTHON_PREFER_32_BIT using a small stub python script or bash script or something, hinging on something like the following:
>>> import platform
>>> platform.machine()
'x86_64'
>>> platform.processor()
'x86_64'
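A minimal version of such a stub might look like this (the list of 64-bit architecture names is an assumption; extend it for your platforms):

```python
import platform

def looks_64_bit():
    # platform.machine() reports the architecture string shown above.
    return platform.machine().lower() in ("x86_64", "amd64", "arm64", "aarch64")

print(platform.machine())
```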
