Using Python scripts in Subversion hooks on Windows

My main goal is to get the trac-post-commit-hook up and running.
My hook gets called when I commit with TortoiseSVN, but it always exits when it reaches this line: Python "%~dp0trac-post-commit-hook.py" -p "%TRAC_ENV%" -r "%REV%" || EXIT 5
If I replace the call to the Trac script with a call to any simple Python script, it still doesn't work, so I'm assuming the problem is with the call to Python and not with the script itself.
I have tried setting the PYTHON_PATH variable and also setting %PATH% to include Python.
I have Trac up and running, so Python is working on the server itself.
Here is some background info:
Python is installed on the Windows server and the script is called from a local machine, so
IF NOT EXIST %TRAC_ENV% EXIT 3
and
SET PYTHON_PATH=X:\Python26
IF NOT EXIST %PYTHON_PATH% EXIT 4
fail unless I point them at the mapped network drives (that is, at the X: and Y: drives rather than the C: and E: drives).
Python scripts can be called from the command line anywhere on the server, regardless of the drive, so the PATH variable should be set correctly.
It appears to be an issue with calling Python scripts externally, but I'm not sure how to go about changing the permissions for this.
Thanks in advance.

Take the following things into account:
Network drive mappings and subst mappings are user specific. Make sure the drives exist for the user account under which the SVN server is running.
Subversion hook scripts are run without any environment variables being set, for security reasons, not even %PATH%. Call the Python executable with an absolute path, e.g. c:\python25\python.exe.
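If you want to see exactly what environment the hook process gets, a throwaway diagnostic script can help; this is only a sketch, and the interpreter, hook and log paths are assumptions to adjust for your install:

# diag_env.py - dump the environment the Subversion hook process actually sees.
# Call it from post-commit.bat with an absolute interpreter path, e.g.:
#   C:\Python26\python.exe C:\Repositories\myrepo\hooks\diag_env.py
import os

with open(r"C:\temp\hook_env.log", "w") as log:
    if not os.environ:
        log.write("environment is empty\n")
    for name in sorted(os.environ):
        log.write("%s=%s\n" % (name, os.environ[name]))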

Related

How to prefix file with custom commands when debugging in PyCharm?

This question is very similar to this one, but for PyCharm.
I need to use aws-vault to access AWS resources in my script, but this seems to be impossible to accomplish in PyCharm's debugging mode. PyCharm lets you enter the script path, parameters and environment variables, and there is also the external tools functionality, but none of these work.
Here is the format that works in shell:
aws-vault exec ${AWS_PROFILE} -- script.py
I thought I had almost arrived at a solution by using external tools, setting the program to "aws-vault" and its arguments to "exec your-profile -- $FilePath$", but PyCharm runs the script in $FilePath$ first, waits for it to finish, and only then runs the debugged script (which is the same one as the one inserted by $FilePath$).
What I need is for the script to run in debug mode in conjunction with the external tool, so that the script becomes the argument of the external tool and the two run as one command.
There are ways to deal with this by launching PyCharm from the command line with aws-vault as a prefix, or by editing its .desktop file and writing the prefix directly into the Exec field, but then the app needs to be restarted whenever the AWS profile has to be changed.
Any ideas would be appreciated, thanks.
I was able to do this by installing the EnvFile plugin in PyCharm. This plugin can read in a .env file when starting a process. Basically I did the following:
Create a script that generates a .env file, envfile.env, and name the script generate.sh.
This generate.sh script is a shell script that basically does aws-vault exec $AWS_PROFILE -- env | grep AWS_ > envfile.env, so all the AWS credentials end up in envfile.env (a Python variant is sketched after these steps). Add other environment variables if you need them.
Execute the command above at least once.
Install the EnvFile plugin in PyCharm.
In the Run configuration, a new 'EnvFile' tab appears. In this tab, enable EnvFile and add the generated envfile.env (see the previous step).
Go to Tools / External Tools and create an external tool for generate.sh. This way you can execute the script from PyCharm.
Again in the Run configuration, add a Before Launch step that executes the external tool generate.sh.
Warning: the temporary AWS credentials sit in plain text in envfile.env.
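If you prefer to keep the generator in Python rather than shell, an equivalent sketch (assuming aws-vault is on PATH, AWS_PROFILE is set, and the same envfile.env name is used) could look like this:

# generate_envfile.py - Python variant of generate.sh: capture the AWS_*
# variables that aws-vault injects and write them to envfile.env.
import os
import subprocess

profile = os.environ["AWS_PROFILE"]
output = subprocess.check_output(["aws-vault", "exec", profile, "--", "env"])

with open("envfile.env", "w") as envfile:
    for line in output.decode().splitlines():
        if line.startswith("AWS_"):
            envfile.write(line + "\n")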

Add a new path to PYTHONPATH and use it from the program that sets it, using NSIS

Platform: Windows 7
Python: 2.7.3
StrCpy $NETWORK_PATH "\\someserver\network\path\here"
DetailPrint "$\n"
DetailPrint "Setting up paths required"
Push "SETX PYTHONPATH $NETWORK_PATH;$NETWORK_PATH\lib"
Call Execute
Push '"C:\Python27\python.exe" setup.py deploy'
Call Execute
Function Execute
Exch $0
# one possible implementation: run the command and report success or failure
nsExec::ExecToLog '$0'
Pop $1 # "error" if the command could not be started, otherwise its exit code
DetailPrint "Result: $1"
Pop $0
FunctionEnd
The above is compiled as an NSIS installer and run on multiple machines.
Problem
"C:\Python27\python.exe" setup.py deploy depends on that $NETWORK_PATH for successful execution.
First time when we run it, $NETWORK_PATH is appended to PYTHONPATH environmental variable, but "C:\Python27\python.exe" setup.py deploy fails as new PYTHONPATH will be effective only either in new command prompt or in next run.
Is there a way to make the appended PYTHONPATH effective in the same run itself?
Currently, we are running it twice - once for setting PYTHONPATH and accepting the failure, second time it runs successfully.
Another alternative approach we tried is - we made 2 executables, one for setting PYTHONPATH and another for Python Script to run. Then we put both of them in batch script to run.
But my preference is to achieve whole of this in one file and in one run.
You can update the installer's environment; it will be inherited by child processes:
System::Call 'Kernel32::SetEnvironmentVariable(t "PYTHONPATH", t "$NETWORK_PATH;$NETWORK_PATH\lib")i.r0' ; $0 will be != "0" on success
Push '"C:\Python27\python.exe" setup.py deploy'
Call Execute
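To confirm that the child process really inherits the updated value, you could temporarily push a small check script through the same Execute helper instead of setup.py deploy. A minimal sketch (the file name is made up):

# check_env.py - print the PYTHONPATH the child process inherited and the
# resulting sys.path entries, to verify the in-process update took effect.
import os
import sys

print("PYTHONPATH = " + os.environ.get("PYTHONPATH", "<not set>"))
for entry in sys.path:
    print(entry)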

How to 'dot' in an environment file in Python

I wrote a Python program which will be executed on both the Primary Production server and the Disaster Recovery (DR) server. There is a slight difference in behavior when the program is run on the DR server.
Therefore the program needs to determine which server it is running on.
We have many other ksh programs running on these servers with the same requirement: they run on both servers, but behave slightly differently on the DR server. All of these scripts 'dot' in an environment file, then check whether the environment variable $DR_SITE equals 1 to determine if they are running on the DR server.
I want to use the existing environment file from my Python program to determine if it is running on the DR server. I cannot just read this environment file; it is actually a ksh script that has some logic of its own before it sets the DR_SITE variable.
Which brings me to the original question:
How do you 'dot' in (source) an environment file as described above in Python, in order to inherit the environment variables it sets?
For example, in ksh I would execute this:
. /path/env.set
I tried this, but it did not seem to work (I printed out the DR_SITE value before and after the os.system call, and it did not change):
os.system(". /appl/gfpd2/current/D2soe_set")
You could write a ksh script that sources the environment setter and then invokes the Python program:
#!/usr/bin/env ksh
. /path/env.set
exec python /path/your_script.py
exec is used here to save some memory.
(I'm omitting the passing of arguments to the Python script since I'm not familiar with ksh.)
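If the variables have to end up inside the Python process itself rather than in a wrapper, another common technique is to have ksh source the file, print the resulting environment, and copy it back into os.environ. A minimal sketch, assuming the env file path from the question and simple single-line values:

import os
import subprocess

ENV_FILE = "/appl/gfpd2/current/D2soe_set"  # path taken from the question

def source_env(path):
    # Let ksh 'dot' in the file, then dump the environment it ends up with.
    output = subprocess.check_output(
        ["ksh", "-c", ". {0} >/dev/null 2>&1; env".format(path)])
    for line in output.decode().splitlines():
        if "=" in line:
            name, value = line.split("=", 1)
            os.environ[name] = value

source_env(ENV_FILE)
if os.environ.get("DR_SITE") == "1":
    print("running on the DR server")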

Running Ansible Playbooks under uWSGI not working

I have Ansible playbooks running just fine from the command line, since Ansible seems to use the executing application (in this case Python) as the command to invoke playbooks with.
The problem is that when you try to run Ansible playbooks under uWSGI, the command that attempts to run the playbook is /usr/bin/uwsgi.
Somehow Ansible is finding the command it is running under. Is there a way to change that?
UPDATE: I believe that the command to run is just sys.executable. Is this overridable?
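One way to check that assumption is to log sys.executable from inside the uWSGI worker itself; a minimal throwaway WSGI app (the file name is made up) would be enough:

# wsgi_check.py - deploy under uWSGI and request it once to see what Ansible
# would pick up as the interpreter it is running under.
import sys

def application(environ, start_response):
    body = "sys.executable = %r\nsys.argv = %r\n" % (sys.executable, sys.argv)
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body.encode("utf-8")]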
I didn't quite understand the overall picture, but does it help if you're able to specify the interpreter per remote host using a "behavioral inventory parameter"?
ansible_python_interpreter: The target host Python path. This is useful for systems with more than one Python or not located at "/usr/bin/python", such as *BSD, or where /usr/bin/python is not a 2.X series Python. We do not use the "/usr/bin/env" mechanism as that requires the remote user's path to be set right and also assumes the "python" executable is named python, where the executable might be named something like "python26".
e.g. this is how your inventory file would look if you specify it at the group level (you can also specify it at the host level of course, your choice):
# I think specifying ansible_ssh_host won't be needed, but if needed here is how it can be done.
# localhost ansible_ssh_host=127.0.0.1 ansible_python_interpreter=/usr/local/bin/python
localhost ansible_python_interpreter=/usr/local/bin/python
[rhel5-boxes]
rhelhost1
# ...
# other groups...
[rhel5-boxes:vars]
ansible_python_interpreter=/usr/bin/python2.6
[rhel6-boxes]
ansible_python_interpreter=/usr/bin/python
[iron-boxes:vars]
ansible_python_interpreter=/usr/bin/ipython

Getting unmodified environmental variables when starting a new process via a VBScript or BAT file call to Python

I have an application whose environment variables I cannot control; they are modified from the "default" environment variables that applications are given when started from cmd or Explorer. My application allows me to run VBScripts, and these scripts inherit the parent application's environment variables, which is okay. I would like to use the VBScript to start Python with the "normal"/"default" environment variables most applications get when started from cmd or Explorer. I've attempted to have the VBScript call a BAT file that runs Python, but Python still keeps the environment variables of its great-grandparent.
How can I get a VBScript to call an application (Python) without passing its "modified" environment variables to the child? I've also tried using start /i, but because the VBScript's parent modified its variables before the VBScript started, that does not reset to a clean environment.
My VBScript looks something like this:
sub run
dim wshShell
Set wshShell = CreateObject( "WScript.Shell" )
Dim cmd
cmd = ""
cmd = cmd & chr(34) ' double quote character'
cmd = cmd & "startPython.bat"
cmd = cmd & chr(34)
wshShell.run( cmd )
end sub
My application will start the run subroutine with modified undesirable environment variables.
The startPython.bat file looks like this:
start /i python "python3file.py" %*
rem pause
The batch file is not required, but it seemed like a possible point where the environment chain could be broken; however, it does not seem to break it.
In the end, I'd like the VBScript to start Python with the "default"/"normal" environment variables that a new application started by the user via cmd or Explorer would be given. (I'm not picky about how old they are, whether they are the OS startup variables, or ones slightly modified during startup before any cmd, Explorer, or user startup; those modifications are acceptable. However, modifications made by my parent application should be gone.)
(It's also important that the session environment variables of the parent DO NOT change; however, after pages of Stack Overflow and other reading, that seems to be difficult, and non-recommended, anyway.)
(I understand there isn't a "default" or "normal" set of environment variables, but there are system-level and user-level sets, either of which would be acceptable in my situation, as long as I'm not using the session variables created by my parent application. Acceptable solutions would also include "stealing" a copy of the environment variables, but a fresh set of OS-generated environment variables, of either the system-level or user-level type, would be most preferred.)
An ideal solution would be limited to VBScript (optionally via BAT files) to start Python with a clean set of environment variables.
EDIT Additional updates:
Based on Harry Johnston's suggestions:
I tried using ShellExecute by modifying the VBScript to look something like:
dim objShell
set objShell = CreateObject("shell.application")
objShell.ShellExecute cmd , "" , "blahBlahBlah"
set objShell = nothing
This is based on the MSDN link suggestion, but I'm still getting a modified environment variable session.
(I also removed /i, and then start, from the batch script file in case they were passing on the parent's session variables.)
You want the ShellExecute method of the shell.application object.
This causes the Windows shell (aka Windows Explorer, more or less) to open the specified application or file on your behalf. That means the newly launched application gets a fresh set of environment variables, just as if you'd launched it by double-clicking in Explorer.
Inheritance of environment variables is a built-in concept in Windows. However, anyone calling the CreateProcess() function can specify the environment variables for that process (lpEnvironment). The problem is: how do you know what the default environment variables are?
The only workaround I see at this time is to create a task in Task Scheduler and have the command executed by Task Scheduler instead of by your own application. This approach assumes that Task Scheduler is started with an original set of environment variables. The commands you'll need for that:
schtasks /create ...
schtasks /run ...
schtasks /delete ...
Setting it all up is a bit tricky. I can't tell whether that works, but maybe you can give it a try.
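If repairing the environment from inside Python once it has started is acceptable, another possible workaround is to rebuild os.environ from the machine-level and user-level environment blocks stored in the registry, discarding whatever the parent application injected. This is only a sketch; note that values containing %VAR% references are expanded against the current, possibly modified, environment:

# clean_env.py - replace this process's environment with the system-level
# and user-level variables defined in the registry.
import os
try:
    import winreg                    # Python 3
except ImportError:
    import _winreg as winreg         # Python 2

MACHINE_KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Environment"
USER_KEY = r"Environment"

def read_env(root, subkey):
    values = {}
    key = winreg.OpenKey(root, subkey)
    try:
        index = 0
        while True:
            try:
                name, data, _ = winreg.EnumValue(key, index)
            except OSError:          # no more values to enumerate
                break
            values[name] = os.path.expandvars(data)  # crude %VAR% expansion
            index += 1
    finally:
        winreg.CloseKey(key)
    return values

fresh = read_env(winreg.HKEY_LOCAL_MACHINE, MACHINE_KEY)
fresh.update(read_env(winreg.HKEY_CURRENT_USER, USER_KEY))

os.environ.clear()                   # affects this process and its children only
os.environ.update(fresh)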
