Interacting with objects in execfile/subprocess.call on a Python file with arguments - python

Python novice here.
I am trying to interact with the variables/objects in a Python file that requires arguments for its data. Suppose I can only get this data from command-line arguments, rather than writing a script that hard-codes the values that would be passed (which means I must use execfile or subprocess.call).
Let's say this was my python file, foo.py:
bar = 123
print("ran foo.py, bar = %d" % bar)
Of course, there would be more to the file to parse arguments, but that is irrelevant.
Now, in a python shell (either python or ipython in a terminal):
>>> import subprocess
>>> subprocess.call("python foo.py 'args'", shell=True)
ran foo.py, bar = 123
0
>>> bar
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'bar' is not defined
As the example output above shows, bar is not defined after stdout from foo.py.
Is there any way to interact with variables or data objects in a python file called by execfile or subprocess.call? Normally I would eval in a REPL or run in an IDE so I can test out values and objects after execution, but I cannot figure out how to do this with a python file that builds data from parsed arguments.
Does anyone know how I can do this? Thanks for your consideration, at any rate.

Is foo.py under your control? If yes, simply change it so that it can be run by being imported as a module and interacted with.
If not, you may need to capture the output as a string and rebuild your variable bar from it.
How to capture the output of a subprocess is answered here, for example:
Running shell command from Python and capturing the output
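If you go the capture route, here is a minimal sketch assuming foo.py prints bar exactly as shown above (the parsing of the output line is specific to this toy format, not a general technique):
import subprocess

# capture stdout instead of letting it print straight to the terminal
output = subprocess.check_output("python foo.py 'args'", shell=True)
output = output.decode()                  # bytes -> str on Python 3
bar = int(output.split("bar = ")[1])      # rebuild bar from the printed line
print(bar)  # 123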

Related

Using sys.argv to pass command line arguments to a python script

I know this question has been asked in various forms, but none of the answers seem to work for my specific case:
(btw all mentioned files are in my PATH)
I have a simple python script called test.py:
import sys

print('Hello world!')
print(sys.argv)

counter = 0
while True:
    counter += 1
The counter is just there to keep the command window open, so don't worry about that.
When I enter
test.py test test
into cmd I get the following output:
Hello world!
['C:\\Users\\path\\to\\test.py']
For some reason unknown to me, the two other arguments (sys.argv[1] and sys.argv[2]) are missing.
However when I create a .bat file like this:
@C:\Users\path\to\python.exe C:\Users\path\to\test.py %*
and call it in cmd
test.bat test test
I get my desired output:
Hello world!
['C:\\Users\\path\\to\\test.py', 'test', 'test']
I've read that the %* in the .bat file means that all command line arguments are passed to the python script, but why are exactly these arguments not passed to the python script when I explicitly call it in cmd?
From what I've read, all command line arguments entered after the script name should be passed to said script, but for some reason it doesn't work that way.
What am I overlooking here?
I'm guessing that you need to actually run the script through the command line with C:\Users\path\to\python.exe test.py test test.
I'm not sure how Windows handles just test.py test test, but it launches the script through the .py file association, and if that association is not configured to forward the extra arguments (%*) to the interpreter, the other two are never passed in as arguments.
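As a quick check, a small diagnostic script (check_args.py is a hypothetical name, not from the question) makes the difference visible:
# check_args.py - hypothetical helper: shows which interpreter the file
# association actually launched and which arguments survived
import sys

print(sys.executable)  # the interpreter that is actually running
print(sys.argv)        # script path plus whatever arguments got through
Running check_args.py foo bar versus python check_args.py foo bar should show immediately whether the file association is dropping the arguments.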

How to add a python dictionary as a command line argument when running another file?

I'm running a python file from another python file using the subprocess.call() method.
import subprocess

job_info = dict()
job_info['key1'] = 'v1'
job_info['key2'] = 'v2'
job_command = "python job_parser.py --job {0}"
subprocess.call(job_command.format(job_info).split())
Here I want to place this job_info dictionary into job_command; later I will parse this argument in job_parser.py and use it as a dictionary. I tried json.dumps() and converting to a raw string, but these methods are not working.
Thanks in advance.
The subprocess module is for calling external commands: if you want to run an OS command or any external program, use subprocess.
If you want to call a Python script from another Python script, you can import it as a module and call the required function from the imported module.
In your case you can define a function called entry_point, which acts as the entry point for the task you want to do in job_parser.py.
job_parser.py
def entry_point(input_dict):
    # do your stuff here
    pass
Then you call this function like this in your script:
import job_parser

job_info = dict()
job_info['key1'] = 'v1'
job_info['key2'] = 'v2'
job_parser.entry_point(job_info)
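If the two scripts really must stay separate processes, a minimal sketch of the JSON route the asker attempted could look like this (the --job flag and the argparse setup are assumptions for illustration, not the asker's actual code):
import json
import subprocess

job_info = {'key1': 'v1', 'key2': 'v2'}
# pass the dict as one JSON-encoded argument; an argument list avoids shell quoting issues
subprocess.call(['python', 'job_parser.py', '--job', json.dumps(job_info)])
and in job_parser.py:
import argparse
import json

parser = argparse.ArgumentParser()
parser.add_argument('--job')        # assumed flag name
args = parser.parse_args()
job_info = json.loads(args.job)     # JSON string back to a dict
print(job_info['key1'])             # v1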

python script from the command line, the arguments should be a list of options

I would like to run my python script from the command line, supplied with some arguments. However, one of the arguments should be a list of options specific to a segment of the script.
Example:
MODULES_TO_INSTALL = ['sale','purchase','account_accountant',]
How can I do this: python fichier.py liste_modules_to_install
I've done something similar in the past. It might be easier if, instead of sending them as a list literal, you call your script like so:
python script.py module1 module2 ... moduleN
Then a simple line of code to read these passed modules from the command line would be:
import sys
MODULES_TO_INSTALL = sys.argv[1:]
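For example, using the module names from the question:
python fichier.py sale purchase account_accountant
leaves MODULES_TO_INSTALL == ['sale', 'purchase', 'account_accountant'] inside the script.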

How to write an ipython alias which executes in python instead of shell?

We can define an alias in ipython with the %alias magic function, like this:
>>> d
NameError: name 'd' is not defined
>>> %alias d date
>>> d
Fri May 15 00:12:20 AEST 2015
This escapes to the shell command date when you type d into ipython.
But I want to define an alias to execute some python code, in the current interpreter scope, rather than a shell command. Is that possible? How can we make this kind of alias?
I work in the interactive interpreter a lot, and this could save me a lot of commands I find myself repeating often, and also prevent some common typos.
The normal way to do this would be to simply write a python function, with a def. But if you want to alias a statement, rather than a function call, then it's actually a bit tricky.
You can achieve this by writing a custom magic function. Here is an example, which effectively aliases the import statement to get, within the REPL.
from IPython.core.magic import register_line_magic

@register_line_magic
def get(line):
    code = f"import {line}"
    print("-->", code)
    exec(code, globals())

del get  # in interactive mode, remove from scope so the function doesn't shadow the magic
Edit: below is the previous code, for older versions of IPython.
from IPython.core.magic_arguments import argument, magic_arguments

@magic_arguments()
@argument('module')
def magic_import(self, arg):
    code = 'import {}'.format(arg)
    print('--> {}'.format(code))
    self.shell.run_code(code)

ip = get_ipython()
ip.define_magic('get', magic_import)
Now it is possible to execute get statements which are aliased to import statements.
Demo:
In [1]: get json
--> import json
In [2]: json.loads
Out[2]: <function json.loads>
In [3]: get potato
--> import potato
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<string> in <module>()
ImportError: No module named potato
In [4]:
Of course, this is extensible to arbitrary Python code, and optional arguments are supported as well.
I don't know since which version IPython has provided macros, but now you can simply do this:
ipy = get_ipython()
ipy.define_macro('d', 'date')
You can put this code into any file located in ~/.ipython/profile_default/startup/, and then this macro will be automatically available when you start IPython.
However, a macro doesn't accept arguments, so please keep this in mind before you choose to define a macro.
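A minimal sketch of the macro route (assuming a recent IPython; the macro body is ordinary Python source, replayed in the current namespace when you type the macro's name):
ipy = get_ipython()
# the second argument is plain Python code to replay
ipy.define_macro('hello', 'print("hello from a macro")')
Typing hello at the prompt then runs the stored code.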

Access previous bash command in python

I would like to be able to log the command used to run the current python script within the script itself. For instance this is something I tried:
# test.py
import os

# note: open() does not expand '~', so expand it explicitly
with open(os.path.expanduser('~/.bash_history'), 'r') as f:
    for line in f.readlines():
        continue  # after the loop, 'line' holds the last line of the file
with open('logfile', 'w') as f:  # must be opened for writing, not 'r'
    f.write('the command you ran: %s' % line.strip('\n'))
However, .bash_history does not seem to be in chronological order. What's the best recommended way to achieve the above for easy logging? Thanks.
Update: unfortunately sys.argv doesn't quite solve my problem, because I sometimes need to use process substitution for input variables.
e.g. python test.py <( cat file | head -3)
What you want to do is not universally possible. As devnull says, the history file in bash is not written for every command typed. In some cases it's not written at all (the user sets HISTFILESIZE=0, or uses a different shell).
The command as typed is parsed and processed long before your Python script is invoked. Your question is therefore not related to Python at all; whether what you want to do is possible is entirely up to the invoking shell, and bash does not provide what you want.
If you can control the caller's shell, you could try using zsh instead. There, if you setopt INC_APPEND_HISTORY, zsh will append to its history file for each command typed, so you can do the parse-the-history-file hack.
One option is to use sys.argv. It will contain a list of arguments you passed to the script.
import sys

print('Number of arguments:', len(sys.argv), 'arguments.')
print('Argument List:', str(sys.argv))
Example output:
>python test.py
Number of arguments: 1 arguments.
Argument List: ['test.py']
>python test.py -l ten
Number of arguments: 3 arguments.
Argument List: ['test.py', '-l', 'ten']
As you can see, the sys.argv variable contains the name of the script and then each individual parameter passed. It does miss the python portion of the command, though.
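If a rough reconstruction is enough, prepending sys.executable gets close; a minimal sketch (it still cannot recover process substitution, because the shell rewrites <( ... ) into /dev/fd/N paths before Python ever starts):
import shlex
import sys

# rebuild something close to the original invocation
cmd = ' '.join(shlex.quote(arg) for arg in [sys.executable] + sys.argv)
print('the command you ran (roughly):', cmd)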
