I wrote a Python program which will be executed on both the Primary Production server as well as on the Disaster Recovery server. There is a slight difference in behavior when the program is run on the Disaster Recovery server.
Therefore the program needs to determine which server it is running on.
We have many other ksh programs running on these servers which have the same requirement: they run on both servers, but with a slight difference in behavior on the DR server. All of these scripts 'dot' in an environment file, then check whether the environment variable $DR_SITE equals 1 to determine if they are running on the DR server.
I want to use the existing environment file from my Python program to determine if it is running on the DR server. I cannot simply read this environment file; it is actually a ksh script that itself has some logic prior to setting the DR_SITE variable.
Which brings me to the original question:
How do you 'dot' in (i.e. source) an environment file as described above from Python, in order to inherit the environment variables it sets?
For example, in ksh I would execute this:
. /path/env.set
I tried this, but it did not seem to work (I printed out the DR_SITE value before and after the os.system call, and it did not change):
os.system(". /appl/gfpd2/current/D2soe_set")
Environment changes made by os.system() happen in a child shell and are discarded when that shell exits, so they can never show up in the parent Python process. You could instead write a ksh wrapper that sources the environment setter and then invokes the Python program:
#!/usr/bin/env ksh
. /path/env.set
exec python /path/your_script.py
exec is used so the wrapper shell is replaced by the Python process rather than kept around, which saves some memory.
(I'm omitting the passing of variables to the Python script since I'm not familiar with ksh.)
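If you would rather stay entirely in Python, a common alternative is to have a ksh subshell source the file, dump the resulting environment, and copy it into os.environ. This is only a sketch (it assumes simple NAME=value output from env and will mishandle multi-line values), using the path from the question:
import os
import subprocess

def source_env(env_file="/appl/gfpd2/current/D2soe_set"):
    # Let ksh run the file's logic, then print the environment it produced.
    output = subprocess.check_output(["ksh", "-c", ". %s && env" % env_file])
    # Copy each NAME=value line into this process's environment.
    for line in output.decode().splitlines():
        name, _, value = line.partition("=")
        if name:
            os.environ[name] = value

source_env()
print(os.environ.get("DR_SITE"))
After this, the rest of the program can check os.environ.get("DR_SITE") == "1", just as the ksh scripts do.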
I'm new to python and enjoying learning the language. I like using the interpreter in real time, but I still don't understand completely how it works. I would like to be able to define my environment with variables, imports, functions and all the rest then run the interpreter with those already prepared. When I run my files (using PyCharm, Python 3.6) they just execute and exit.
Is there some line to put in my .py files like a main function that will invoke the interpreter? Is there a way to run my .py files from the interpreter where I can continue to call functions and declare variables?
I understand this is a total newbie question, but please explain how to do this or why I'm completely not getting it.
I think you're asking three separate things, but we'll count it as one question since it's not obvious that they are different things:
1. Customize the interactive interpreter
I would like to be able to define my environment with variables, imports, functions and all the rest then run the interpreter with those already prepared.
To customize the environment of your interactive interpreter, define the environment variable PYTHONSTARTUP. How you do that depends on your OS. It should be set to the pathname of a file (use an absolute path), whose commands will be executed before you get your prompt. This answer (found by Tobias) shows you how. This is suitable if there is a fixed set of initializations you would always like to do.
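For example, a startup file might look like the following (the file name and location are up to you; point PYTHONSTARTUP at its absolute path, e.g. by exporting it in your shell profile on Linux/macOS or via the environment variable dialog on Windows):
# ~/.pythonstartup -- hypothetical file referenced by PYTHONSTARTUP
import os
import sys
from pprint import pprint

def clear():
    # Small convenience helper available at every interactive prompt.
    os.system("cls" if os.name == "nt" else "clear")

print("Interactive session customized via PYTHONSTARTUP")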
2. Drop to the interactive prompt after running a script
When I run my files (using PyCharm, Python 3.6) they just execute and exit.
From the command line, you can execute a python script with python -i scriptname.py and you'll get an interactive prompt after the script is finished. Note that in this case, PYTHONSTARTUP is ignored: It is not a good idea for scripts to run in a customized environment without explicit action.
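A tiny illustration (scriptname.py here is a made-up example):
# scriptname.py
import math

radius = 2.0

def area(r=radius):
    return math.pi * r * r
After python -i scriptname.py you land at a >>> prompt where radius, area, and math are already defined, so you can call area() or experiment with variations interactively.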
3. Call your scripts from the interpreter, or from another script.
Is there a way to run my .py files from the interpreter where I can continue to call functions and declare variables?
If you have a file myscript.py, you can type import myscript in the interactive Python prompt, or put the same in another script, and your script will be executed. Your environment will then have a new module, myscript. You could use the following variant to import your custom definitions on demand (assuming a file myconfig.py where Python can find it):
from myconfig import *
Again, this is not generally a good idea; your programs should explicitly declare all their dependencies by using specific imports at the top.
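As a quick example (myscript.py is hypothetical):
# myscript.py
DEFAULT_NAME = "world"

def greet(name=DEFAULT_NAME):
    return "Hello, %s!" % name
Then, at the interactive prompt or in another script:
>>> import myscript
>>> myscript.greet()
'Hello, world!'
>>> myscript.DEFAULT_NAME
'world'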
You can achieve the result you intend by doing this:
Write a Python file with all the imports you want.
Call your script as python -i myscript.py.
Calling with -i runs the script then drops you into the interpreter session with all of those imports, etc. already executed.
If you want to save yourself the effort of calling Python that way every time, add this to your .bashrc file:
alias python='python -i /Users/yourname/whatever/the/path/is/myscript.py'
You can set the environment variable PYTHONSTARTUP, as suggested in this answer:
https://stackoverflow.com/a/11124610/1781434
I have an application that has environment variables which I cannot control; they are modified from the "default" environment variables that applications are given when started from cmd or Explorer (or whatever). My application allows me to run VBScripts, but these scripts take the parent application's environment variables, which is okay. I would like to use the VBScript to start Python with the "normal"/"default" environment variables most applications get when started from cmd or Explorer. I've attempted to use the VBScript to call a bat file that runs Python, however Python still maintains the environment variables of its grand-grand-parent.
How can I get a VBScript to call an application (Python) without passing its "modified" environment variables to the child? I've also tried using start /i, but because the VBScript's parent modified its variables before the VBScript started, that won't reset to a clean environment.
My VBScript looks something like this:
sub run
dim wshShell
Set wshShell = CreateObject( "WScript.Shell" )
Dim cmd
cmd = ""
cmd = cmd & chr(34) ' double quote character'
cmd = cmd & "startPython.bat"
cmd = cmd & chr(34)
wshShell.run( cmd )
end sub
My application will start the run subroutine with modified undesirable environment variables.
The startPython.bat file looks like this:
start /i python "python3file.py" %*
rem pause
The batch file is not required, but it seemed like a possible point where the environment chain could be broken; it does not seem to break it, though.
In the end, I'd like the VBScript to start Python using the "default"/"normal" environment variables that a new application started by the user via cmd or Explorer would be given. (I'm not picky about how old they are, whether they are the OS startup variables, or ones that are slightly modified during startup before any cmd or Explorer session; those modifications are acceptable. However, modifications made by my parent application should be gone.)
(It's also important that the session environment variables of the parent DO NOT change; though after pages of Stack Overflow and other reading, that seems to be difficult (and not recommended) anyway.)
(I understand there isn't a "default" or "normal" set of environment variables, but there are system-level and user-level ones, either of which would be acceptable in my situation, as long as I'm not using the session variables created by my parent application. Acceptable solutions would also include "stealing" a copy of the environment variables, but a fresh set of OS-generated environment variables would be most preferred, either of the system-level or user-level type.)
An ideal solution would be limited to vbscript (optionally via bat files) to start python with clean set of environment variables.
EDIT: Additional updates:
Based on Harry Johnston's suggestions:
I tried using ShellExecute by modifying the VBScript to look something like:
dim objShell
set objShell = CreateObject("shell.application")
objShell.ShellExecute cmd , "" , "blahBlahBlah"
set objShell = nothing
Based on the MSDN link suggestion, but I'm still getting a modified environment variable session.
(I also removed /i, and then start, from the batch script file, in case they were picking up the parent's session variables.)
You want the ShellExecute method of the shell.application object.
This causes the Windows shell (aka Windows Explorer, more or less) to open the specified application or file on your behalf. That means the newly launched application gets a fresh set of environment variables, just as if you'd launched it by double-clicking in Explorer.
Inheritance of environment variables is a built-in concept of Windows. However, anyone calling the CreateProcess() function can specify the environment variables for that process (lpEnvironment). Problem is: how to know what the default environment variables are?
The only workaround I see at this time is to create a task in Task Scheduler and have the command executed by Task Scheduler instead of your own application. This approach assumes that Task Scheduler is started with an original set of environment variables. Commands you'll need for that:
schtasks /create ...
schtasks /run ...
schtasks /delete ...
Setting it all up is a bit tricky. I can't tell whether that works, but maybe you can give it a try.
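Since the question says that "stealing" a copy of the system- and user-level variables would be acceptable, another possibility (just a rough sketch, not something from the answers above) is to rebuild an environment block from the registry once any Python process is running, and launch the real work in a child process with that explicit environment. Note the caveats in the comments: REG_EXPAND_SZ values such as PATH entries containing %SystemRoot% are left unexpanded, and the machine and user PATH are not concatenated the way Windows normally does.
import subprocess
import winreg

def read_env(root, subkey):
    # Collect the name/value pairs stored under the given registry key.
    env = {}
    with winreg.OpenKey(root, subkey) as key:
        value_count = winreg.QueryInfoKey(key)[1]
        for i in range(value_count):
            name, value, _type = winreg.EnumValue(key, i)
            env[name] = str(value)  # REG_EXPAND_SZ values are NOT expanded here
    return env

# Machine-level variables first, then the user-level ones layered on top.
# (This simple merge overwrites duplicates such as PATH instead of concatenating them.)
clean = read_env(winreg.HKEY_LOCAL_MACHINE,
                 r"SYSTEM\CurrentControlSet\Control\Session Manager\Environment")
clean.update(read_env(winreg.HKEY_CURRENT_USER, r"Environment"))
clean.setdefault("SystemRoot", r"C:\Windows")  # many programs expect this to exist

# Start the script from the question with the rebuilt environment
# instead of the one inherited from the parent application.
subprocess.Popen(["python", "python3file.py"], env=clean)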
Recently, I came across the Linux command source and then found this answer on what it does.
My understanding was that source executes the file that is passed to it, and it did work for a simple shell script. Then I tried using source on a Python script, but it did not work.
The Python script has a shebang (e.g. #!/usr/bin/python) and I am able to do a ./python.py, as the script has executable permission. If that is possible, source python.py should also be possible, right? The only difference is ./ executes in a new shell and source executes in the current shell. Why is it not working on a .py script? Am I missing something here?
You're still not quite on target in your understanding of what source does.
source does indeed execute commands from a file in the current shell process. It does this effectively as if you had typed them directly into your current shell.
This is necessary because, when you run a shell script without sourcing it, a subshell (a new process) is spawned. When that process exits, any changes made within the script are lost as you return to the shell from which it was spawned.
It follows, then, that you cannot source Python into a shell, because the Python interpreter is always a different process from your shell. Running a Python script spawns a brand-new process, and when that process exits, its state is lost.
Of course, if your shell is actually Python (which I would not recommend!), you can still "source" into it—by using import.
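To make that concrete: inside a Python interpreter, import plays roughly the role that source plays in a shell, because the imported file's top-level code runs in the current process. A small illustration with a made-up settings.py:
# settings.py -- its top-level code runs at import time, in the importing process
import os

os.environ["APP_MODE"] = "dev"
GREETING = "hello"
Then, in the interactive interpreter:
>>> import settings
>>> import os
>>> os.environ["APP_MODE"]
'dev'
>>> settings.GREETING
'hello'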
source executes the file and places whatever functions, aliases, and environment variables that script creates into the shell that called it. It does this by not spawning a new process, but instead executing the script in the current process.
The shebang is used by the shell to decide which interpreter to spawn the new process with, so for source it is ignored, and the file is interpreted in the language of the current process (bash in this case). This is why using source on a Python file failed for you.
I am trying to write what should be a super simple bash script: basically, activate a virtualenv and then change to the working directory. It's a task I do a lot, and condensing it to one command just made sense.
Basically ...
#!/bin/bash
source /usr/local/turbogears/pps_beta/bin/activate
cd /usr/local/turbogears/pps_beta/src
However, when it runs, it just dumps me back to the shell: I am still in the directory I ran the script from, and the environment isn't activated.
All you need to do is run your script with the source command. This is because the cd command is local to the shell that runs it. When you run a script directly, a new shell is spawned, and it terminates when it reaches the end of the script. By using the source command you tell your current shell to execute the script's instructions directly.
The effect of cd is local to the shell running the script, which exits when it falls off the end of the file.
What you are trying to do is not "super simple" because you want to override this behavior.
Look at exec for replacing the current process with the process of your choice.
For feeding commands into an interactive Bash, look at the --rcfile option.
I imagine you want your script to be dynamic; however, as a quick fix when working on a new system, I create an alias.
For example, say the env is called 'py1', located at ~/envs/py1/, with a repository location at ~/proj/py1/:
alias py1='source ~/envs/py1/bin/activate; cd ~/proj/py1/'
You can now access your project and virtualenv by typing py1 from anywhere in the CLI.
I know that this is nowhere near ideal and violates DRY and many other programming principles. It is just a quick and dirty way of getting your env and project accessible quickly without having to set up the variables.
I know that I'm late to the game here, but may I suggest using virtualenvwrapper? It provides a nice bash hook that appears to do exactly what you want.
Check out this tutorial: http://blog.fruiapps.com/2012/06/An-introductory-tutorial-to-python-virtualenv-and-virtualenvwrapper
My main goal is to get this up and running.
My hook gets called when I do the commit with TortoiseSVN, but it always exits when it gets to this line:
Python "%~dp0trac-post-commit-hook.py" -p "%TRAC_ENV%" -r "%REV%" || EXIT 5
If I try and replace the call to the python script with any simple Python script it still doesn't work so I'm assuming it is a problem with the call to Python and not the script itself.
I have tried setting the PYTHON_PATH variable and also set %PATH% to include Python.
I have trac up and running so Python is working on the server itself.
Here is some background info:
Python is installed on the Windows server and the script is called from a local machine, so
IF NOT EXIST %TRAC_ENV% EXIT 3
and
SET PYTHON_PATH=X:\Python26
IF NOT EXIST %PYTHON_PATH% EXIT 4
fail unless I set them to point at the mapped network drive (that is, point them at the X and Y drives, not the C and E drives).
Python scripts can be called from the command line anywhere on the server, regardless of the drive, so the PATH variable should be set correctly.
It appears to be an issue with calling Python scripts externally, but I'm not sure how to go about changing the permissions for this.
Thanks in advance.
Take the following things into account:
- Network drive mappings and subst mappings are user specific. Make sure the drives exist for the user account under which the SVN server is running.
- Subversion hook scripts are run without any environment variables being set, for security reasons, not even %PATH%. Call the Python executable with an absolute path, e.g. c:\python25\python.exe.