Python interpreters and aliases - python

Is there any mechanism similar to aliases (as in Bash) that can be used in IPython or the Python interpreter?
For instance, I find myself frequently doing something like:
var = urllib2.urlopen('http://programmers.stackexchange.com')
but I don't want to keep typing out those strings.
Is there any way of shortening the request persistently (surviving between sessions) other than writing a script for it?

No, but in your interpreter, write this:
def pse_url():
    global var
    var = urllib2.urlopen('http://programmers.stackexchange.com')
Then call pse_url() whenever you need to refresh the variable.
It would be cleaner to have the function return the value instead of setting a global:
def pse_url():
    return urllib2.urlopen('http://programmers.stackexchange.com')

var = pse_url()
If you have many such utilities, put them in your own module and load them once when you start the REPL.
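One persistent way to do that (sketched here with a hypothetical file name) is to point the PYTHONSTARTUP environment variable at your module of helpers, so the plain Python REPL runs it on every launch; IPython similarly runs any file placed in its profile's startup directory:

```python
# shortcuts.py -- hypothetical startup module; load it automatically with
#   export PYTHONSTARTUP=/path/to/shortcuts.py     (plain Python REPL)
# or drop it into ~/.ipython/profile_default/startup/  (IPython)
import urllib.request  # the question used urllib2, which is Python 2 only

def pse_url():
    """Shortcut for fetching the Programmers Stack Exchange front page."""
    return urllib.request.urlopen('http://programmers.stackexchange.com')
```

After that, pse_url() is available in every new interactive session without any explicit import.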

My guess is that for such one-line expressions you can create lambda functions (and regular functions for bigger ones, as #coredump suggested); see the code below:
se_open = lambda: urllib2.urlopen('http://programmers.stackexchange.com')
so_open = lambda: urllib2.urlopen('http://stackoverflow.com')
Now if you have to create a new variable, you simply run:
var_se = se_open()
var_so = so_open()
Also, you can create a script which contains all these shortcuts and start Python with that script imported:
$ python -i script.py
All functions defined in script.py would be available in your REPL.

Calling python functions without running from the editor

Please excuse what I know is an incredibly basic question that I have nevertheless been unable to resolve on my own.
I'm trying to switch my data analysis from MATLAB to Python, and I'm struggling with something very basic: in MATLAB, I write a function in the editor, and to use that function I simply call it from the command line or within other functions. The function I compose in the MATLAB editor is given a name at the function-definition line, and it's generally best for the function name to match the .m file name to avoid confusion.
I don't understand how functions differ in Python, because I have not been successful in translating the same approach there.
For instance, if I write a function in the Python editor (I'm using Python 2.7 and Spyder), simply saving the .py file and calling the function by its name from the Python terminal does not work; I get a "function not defined" error. However, if I execute the function within Spyder's editor (using the "run file" button), not only does the code execute properly, but from that point on the function is also callable directly from the terminal.
So...what am I doing wrong? I fully appreciate that using Python isn't going to be identical to Matlab in every way, but it seems that what I'm trying to do isn't unreasonable. I simply want to be able to write functions and call them from the python command line, without having to run each and every one through the editor first. I'm sure my mistake here must be very simple, yet doing quite a lot of reading online hasn't led me to an answer.
Thanks for any information!
If you want to use functions defined in a particular file in Python you need to "import" that file first. This is similar to running the code in that file. Matlab doesn't require you to do this because it searches for files with a matching name and automagically reads in the code for you.
For example,
myFunction.py is a file containing
def myAdd(a, b):
    return a + b
In order to access this function from the Python command line or another file I would type
from myFunction import myAdd
And then during this session I can type
myAdd(1, 2)
There are a couple of ways of using import, see here.
You need to add a check for __main__ to your Python script:
def myFunction():
    pass

if __name__ == "__main__":
    myFunction()
then you can run your script from the terminal like this:
python myscript.py
Also, if your function is in another file, you need to import it:
from myFunctions import myFunction
myFunction()
Python doesn't have MATLAB's "one function per file" limitation. You can have as many functions as you want in a given file, and all of them can be accessed from the command line or from other functions.
Python also doesn't follow MATLAB's practice of always automatically making every function it can find usable all the time, which tends to lead to function name collisions (two functions with the same name).
Instead, Python uses the concept of a "module". A module is just a file (your .py file). That file can have zero or more functions, zero or more variables, and zero or more classes. When you want to use something from that file, you just import it.
So say you have a file 'mystuff.py':
X = 1
Y = 2

def myfunc1(a, b):
    ...  # do something

def myfunc2(c, d):
    ...  # do something
And you want to use it, you can just type import mystuff. You can then access any of the variables or functions in mystuff. To call myfunc2, you can just do mystuff.myfunc2(z, w).
What basically happens is that when you type import mystuff, it just executes the code in the file, and makes all the variables that result available from mystuff.<varname>, where <varname> is the name of the variable. Unlike in MATLAB, Python functions are treated like any other variable, so they can be accessed just like any other variable. The same is true with classes.
There are other ways to import, too, such as from mystuff import myfunc.
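To see that import really does just execute the file and expose its names, here is a self-contained sketch (the temporary directory and the simplified mystuff.py contents are illustrative):

```python
import importlib
import os
import sys
import tempfile

# Write a simplified version of mystuff.py to a temporary directory.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "mystuff.py"), "w") as f:
    f.write("X = 1\n\ndef myfunc1(a, b):\n    return a + b\n")

# Importing runs the file once and binds its names as module attributes.
sys.path.insert(0, tmpdir)
mystuff = importlib.import_module("mystuff")

print(mystuff.X)              # 1
print(mystuff.myfunc1(2, 3))  # 5
```

Both the module-level variable and the function come out as ordinary attributes of the module object, which is exactly what `import mystuff` gives you for a file on the normal search path.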
You run Python programs from the command line with:
python program.py

From Python execute shell command and incorporate environment changes (without subprocess)?

I'm exploring using IPython as a shell replacement for a workflow that requires good logging and reproducibility of actions.
I have a few non-Python binary programs and bash shell commands to run during my common workflow that manipulate the environment variables affecting subsequent work, i.e. when run from bash, the environment changes.
How can I incorporate these cases into the Python/IPython interactive shell and modify the environment going forward in the session?
Let's focus on the most critical case.
From bash, I would do:
> sysmanager initialize foo
where sysmanager is a function:
> type sysmanager
sysmanager is a function
sysmanager ()
{
    eval `/usr/bin/sysmanagercmd bash $*`
}
I don't control the binary sysmanagercmd and it generally makes non-trivial manipulations of the environment variables. Use of the eval built-in means these manipulations affect the shell process going forward -- that's critical to the design.
How can I call this command from Python/IPython with the same effect? Does Python have anything equivalent to bash's eval built-in for non-Python commands?
Having not come across any built-in capability to do this, I wrote the following function, which accomplishes the broad intent. Environment variable modifications and changes of working directory are reflected in the Python shell after the function returns. Modifications of shell aliases or functions are not retained, but that could be handled too by enhancing this function.
#!/usr/bin/env python3
"""
Some functionality useful when working with IPython as a shell replacement.
"""
import subprocess
import tempfile
import os
def ShellEval(command_str):
    """
    Evaluate the supplied command string in the system shell.

    Operates like the shell eval command:
      - Environment variable changes are pulled into the Python environment
      - Changes in working directory remain in effect
    """
    temp_stdout = tempfile.SpooledTemporaryFile()
    temp_stderr = tempfile.SpooledTemporaryFile()
    # In broader use, this string insertion into the shell command should be
    # given more security consideration.
    subprocess.call("""trap 'printf "\\0`pwd`\\0" 1>&2; env -0 1>&2' exit; %s""" % (command_str,),
                    stdout=temp_stdout, stderr=temp_stderr, shell=True)
    temp_stdout.seek(0)
    temp_stderr.seek(0)
    all_err_output = temp_stderr.read()
    allByteStrings = all_err_output.split(b'\x00')
    command_error_output = allByteStrings[0]
    # Some risk in assuming index 1: what if the command sent a null char to its output?
    new_working_dir_str = allByteStrings[1].decode('utf-8')
    variables_to_ignore = ['SHLVL', 'COLUMNS', 'LINES', 'OPENSSL_NO_DEFAULT_ZLIB', '_']
    newdict = dict([tuple(bs.decode('utf-8').split('=', 1)) for bs in allByteStrings[2:-1]])
    for (varname, varvalue) in newdict.items():
        if varname not in variables_to_ignore:
            if varname not in os.environ:
                #print("New Variable: %s=%s"%(varname,varvalue))
                os.environ[varname] = varvalue
            elif os.environ[varname] != varvalue:
                #print("Updated Variable: %s=%s"%(varname,varvalue))
                os.environ[varname] = varvalue
    deletedVars = []
    for oldvarname in os.environ.keys():
        if oldvarname not in newdict.keys():
            deletedVars.append(oldvarname)
    for oldvarname in deletedVars:
        #print("Deleted environment Variable: %s"%(oldvarname,))
        del os.environ[oldvarname]
    if os.getcwd() != os.path.normpath(new_working_dir_str):
        #print("Working directory changed to %s"%(os.path.normpath(new_working_dir_str),))
        os.chdir(new_working_dir_str)
    # Display output of user's command_str. Standard output and error streams are not interleaved.
    print(temp_stdout.read().decode('utf-8'))
    print(command_error_output.decode('utf-8'))
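The core trick above is dumping the post-command environment NUL-separated with env -0 and folding it back into os.environ. A minimal self-contained sketch of just that step, assuming a POSIX shell and a GNU-style env that supports -0 (DEMO_VAR is a made-up variable):

```python
import os
import subprocess

# Run a command under the shell, then dump the resulting environment
# NUL-separated so values containing newlines survive the round trip.
out = subprocess.check_output("export DEMO_VAR=hello; env -0", shell=True)

for entry in out.split(b"\x00"):
    if entry:
        name, _, value = entry.decode().partition("=")
        if name == "DEMO_VAR":
            os.environ[name] = value  # fold the change back into Python

print(os.environ["DEMO_VAR"])  # hello
```

The full function generalizes this by copying every changed variable, deleting removed ones, and also restoring the working directory via the pwd written by the trap.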

set bash variable from python script

I'm calling a Python script inside my bash script, and I was wondering if there is a simple way to set my bash variables from within my Python script.
Example:
My bash script:
#!/bin/bash
someVar=""
python3 /some/folder/pythonScript.py
My python script:
anotherVar="HelloWorld"
Is there a way I can set my someVar to the value of anotherVar? I was thinking of printing properties to a file inside the Python script and then reading them from my bash script, but maybe there is another way. Also, I don't know whether it makes any difference, but I could give both variables the same name (someVar/someVar instead of someVar/anotherVar).
No. When you execute Python, you start a new process, and every process has access only to its own memory. Imagine what would happen if a process could influence another process's memory! Even for parent/child processes like this, it would be a huge security problem.
You can make python print() something and use that, though:
#!/usr/bin/env python3
print('Hello!')
And in your shell script:
#!/usr/bin/env bash
someVar=$(python3 myscript.py)
echo "$someVar"
There are, of course, many other IPC techniques you could use, such as sockets, pipes, shared memory, etc. But without more context, it's difficult to make a specific recommendation.
shlex.quote() in Python 3, or pipes.quote() in Python 2, can be used to generate code which can be eval'd by the calling shell. Thus, if the following script:
#!/usr/bin/env python3
import sys, shlex
print('export foobar=%s' % (shlex.quote(sys.argv[1].upper())))
...is named setFoobar and invoked as:
eval "$(setFoobar argOne)"
...then the calling shell will have an environment variable set with the name foobar and the value argOne.
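The quoting step is what keeps this safe: without it, shell metacharacters in the value would be interpreted by the eval. A quick sketch of what shlex.quote produces (the example value is made up):

```python
import shlex

# A value containing spaces and shell metacharacters...
value = "two words; $PATH"

# ...comes back wrapped in single quotes, so the calling shell's eval
# treats it as plain data rather than as code to run.
quoted = shlex.quote(value)
print("export foobar=%s" % quoted)  # export foobar='two words; $PATH'
```

Inside single quotes, the shell expands nothing, so even a value like `; rm -rf *` stays inert.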

Access a script's variables and functions in interpreter after runtime

So let's say I have a script script1. Is there a way to interact with script1's variables and functions like an interpreter after or during its runtime?
I'm using IDLE and Python 2.7, but I'm wondering if I could do this in any interpreter not just IDLE's.
Say in my script, get = requests.get("example.com"). I'd like to hit F5 or whatever to run my script, and then instead of the console unloading all of the variables from memory, I'd like to be able to access the same get variable.
Is this possible?
You might want to consult this page:
https://docs.python.org/2/using/cmdline.html#miscellaneous-options
Note the -i option: it makes the interpreter enter interactive mode after executing the given script.
You can do it like this:
# file: foo.py
import requests

def req():
    get = requests.get("http://example.com")  # requests needs the scheme
    return get
and then run the script from a console
import foo
get = foo.req()
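A further option, if you want the interactive session to start from inside the script itself rather than via the -i flag, is the standard-library code module: calling code.interact(local=locals()) at the end of a script opens a prompt that shares the script's variables. The sketch below drives an InteractiveConsole programmatically (the namespace contents are made up) to show that it works on exactly the namespace you hand it:

```python
import code

# A namespace standing in for a script's locals().
namespace = {"get": "response object"}
console = code.InteractiveConsole(namespace)

# Code pushed to the console reads and writes that same namespace,
# just as typing at the prompt opened by code.interact() would.
console.push("result = get.upper()")
print(namespace["result"])  # RESPONSE OBJECT
```

In a real script you would simply place `code.interact(local=locals())` wherever you want to drop into the interpreter, including mid-run for debugging.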

Can Lupa be used to run untrusted lua code in python?

Let's say I create LuaRuntime with register_eval=False and an attribute_filter that prevents access to anything except a few python functions. Is it safe to assume that lua code won't be able to do os.system("rm -rf *") or something like that?
From looking at the Lupa doc:
Restricting Lua access to Python objects
Lupa provides a simple mechanism to control access to Python objects. Each attribute access can be passed through a filter function as follows...
It doesn't say anything about preventing or limiting access to facilities provided by Lua itself. If no other modifications are done to the LuaRuntime environment then a lua script can indeed do something like os.execute("rm -rf *").
To control what kind of environment the lua script works in you can use the setfenv and getfenv to sandbox the script before running it. For example:
import lupa
L = lupa.LuaRuntime()
sandbox = L.eval("{}")
setfenv = L.eval("setfenv")
sandbox.print = L.globals().print
sandbox.math = L.globals().math
sandbox.string = L.globals().string
sandbox.foobar = foobar
# etc...
setfenv(0, sandbox)
Now doing something like L.execute("os.execute('rm -rf *')") will result in a script error.
