Ansible Tower - How to get a list of the environment variables - python

Within Tower there are a lot of options for adding environment variables before execution. I have set some variables that get pulled into a Python inventory script. However, the script is responding with an error. I think the Python code is either not getting the values or the values are not in the correct format.
I would like to see how the environment variables are being exposed to the Python script. Is there a way to get them added to the debug output in the job log?

The problem was that I wasn't executing a playbook; I was executing a custom Python inventory script, and I needed to see how Ansible was loading the variables in order to troubleshoot why the script wouldn't pick them up. I added some code to the Python script to send me an email with the list of environment variables. You can also write this to a file on disk, but if you are using Tower you have to expose the folder location under Admin Settings -> Jobs -> Paths to Expose. I decided it would be easier to just get an email while testing.
import os
import smtplib
import datetime
import time

ts = time.time()
st = datetime.datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')

# Collect every environment variable the process can see.
output = ""
output += 'Time Start {} \r\n '.format(st)
for a in os.environ:
    output += 'Var: {} Value: {} \r\n'.format(a, os.getenv(a))

def send_email(addr_from, addr_to, contents):
    svr = smtplib.SMTP('smtp.mail.local', 25)
    msg = 'Subject: Subject goes here.\n\n{0}'.format(contents)
    svr.sendmail(addr_from, addr_to, msg)

send_email('addr_from@mail.com', 'addr_to@mail.com', output)
(Screenshots of the Tower variable configuration and of the script accompanied the original post.) Even with the variables visible in Tower, that first version of the script still didn't load them correctly.
The problem was that when you query the environment variable in Python, if it's a dictionary it comes back with single quotes, and you have to convert those to double quotes and json.loads the result to get it to load as a dictionary.
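A minimal sketch of that conversion (the variable name MY_DICT_VAR is hypothetical, standing in for whatever name you configured in Tower):
import json
import os

# Hypothetical variable name standing in for the Tower-supplied variable.
raw = os.getenv('MY_DICT_VAR', '{}')

# The value comes back looking like a Python repr, e.g. "{'env': 'dev'}".
# json.loads() only accepts double quotes, so swap them before parsing.
parsed = json.loads(raw.replace("'", '"'))
print(parsed)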
There are multiple problems solved here. I hope this helps others who need to troubleshoot Ansible with Python.

Just use debug along with the 'env' lookup. In the example below, HOME is the environment variable.
- name: Show env variable
  debug:
    msg: "{{ lookup('env','HOME') }}"
https://docs.ansible.com/ansible/latest/plugins/lookup/env.html

Related

Using powershell variables in my python script?

I have made a user creation script in PowerShell, and I am almost done writing the website automation part of it in Python with Selenium. My problem lies in joining the two. I would like my Python script to use the new user credentials I entered in PowerShell.
Hopefully the PS script would run fully, but before exiting it would start my Python script and use the creds to build the user's website profiles as well. I have done quite a bit of research over the last couple of days and cannot figure this out.
Thank you!
You can solve it by passing them only once rather than saving the passwords in clear text. If you have PowerShell logging enabled, though, check what ends up in those logs.
$user1 = "test1"
$cred1 = "testpass1"
# you can also concatenate if necessary, adding all users/pws with some separators
$user2 = "test2"
$cred2 = "testpass2"
$users=$user1+","+$user2
$creds=$cred1+","+$cred2
PS > py .\path_to\create_web_profiles.py $users $creds # make sure you use py and not python / python3
create_web_profiles.py:
import sys

users = sys.argv[1]
passwords = sys.argv[2]

def getusers(users, passwords):
    users = users.split(",")
    passwords = passwords.split(",")
    print('Usernames: ', users, 'Passwords: ', passwords)
    for user, passw in zip(users, passwords):
        create_web_user(user, passw)

def create_web_user(user, passw):
    # your web functions come here
    print(user, passw)

getusers(users, passwords)
Output:
PS > py .\path_to\create_web_profiles.py $users $creds
Usernames: ['test', 'test2'] Passwords: ['tpass', 'tpass2']
test tpass
test2 tpass2

Using Jenkins variables/parameters in Python Script with os.path.join

I'm trying to learn how to use variables from Jenkins in Python scripts. I've already learned that I need to call the variables, but I'm not sure how to implement them in the case of using os.path.join().
I'm not a developer; I'm a technical writer. This code was written by somebody else. I'm just trying to adapt the Jenkins scripts so they are parameterized so we don't have to modify the Python scripts for every release.
I'm using inline Python scripts inside a Jenkins job. The Jenkins string parameters are "BranchID" and "BranchIDShort". I've looked through many questions that talk about how you have to establish the variables in the Python script, but in the case of os.path.join(), I'm not sure what to do.
Here is the original code. I added the part where we establish the variables from the Jenkins parameters, but I don't know how to use them in the os.path.join() function.
# Delete previous builds.
import os
import shutil

BranchID = os.getenv("BranchID")
BranchIDshort = os.getenv("BranchIDshort")

print "Delete any output from a previous build."
if os.path.exists(os.path.join("C:\\Doc192CS", "Output")):
    shutil.rmtree(os.path.join("C:\\Doc192CS", "Output"))
I expect output like: c:\Doc192CS\Output
I am afraid that if I do the following code:
if os.path.exists(os.path.join("C:\\Doc", BranchIDshort, "CS", "Output")):
    shutil.rmtree(os.path.join("C:\\Doc", BranchIDshort, "CS", "Output"))
I'll get: c:\Doc\192\CS\Output.
Is there a way to use the BranchIDshort variable in this context to get the output c:\Doc192CS\Output?
User @Adonis gave the correct solution as a comment. Here is what he said:
Indeed you're right. What you would want to do is rather:
os.path.exists(os.path.join("C:\\","Doc{}CS".format(BranchIDshort),"Output"))
(in short use a format string for the 2nd argument)
So the complete corrected code is:
import os
import shutil

BranchID = os.getenv("BranchID")
BranchIDshort = os.getenv("BranchIDshort")

print "Delete any output from a previous build."
if os.path.exists(os.path.join("C:\\Doc{}CS".format(BranchIDshort), "Output")):
    shutil.rmtree(os.path.join("C:\\Doc{}CS".format(BranchIDshort), "Output"))
Thank you, @Adonis!

Export environment variables at runtime with airflow

I am currently converting workflows that were previously implemented in bash scripts to Airflow DAGs. In the bash scripts, I was just exporting the variables at run time with
export HADOOP_CONF_DIR="/etc/hadoop/conf"
Now I'd like to do the same in Airflow, but haven't found a solution for this yet. The one workaround I found was setting the variables with os.environ[VAR_NAME]='some_text' outside of any method or operator, but that means they get exported the moment the script gets loaded, not at run time.
Now when I try to call os.environ[VAR_NAME] = 'some_text' in a function that gets called by a PythonOperator, it does not work. My code looks like this
def set_env():
    os.environ['HADOOP_CONF_DIR'] = "/etc/hadoop/conf"
    os.environ['PATH'] = "somePath:" + os.environ['PATH']
    os.environ['SPARK_HOME'] = "pathToSparkHome"
    os.environ['PYTHONPATH'] = "somePythonPath"
    os.environ['PYSPARK_PYTHON'] = os.popen('which python').read().strip()
    os.environ['PYSPARK_DRIVER_PYTHON'] = os.popen('which python').read().strip()

set_env_operator = PythonOperator(
    task_id='set_env_vars_NOT_WORKING',
    python_callable=set_env,
    dag=dag)
Now when my SparkSubmitOperator gets executed, I get the exception:
Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
The use case where this is relevant is the SparkSubmitOperator: I submit jobs to YARN, so either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment. Setting them in my .bashrc or any other config is sadly not possible for me, which is why I need to set them at runtime.
Preferably I'd like to set them in an Operator before executing the SparkSubmitOperator, but if there was the possibility to pass them as arguments to the SparkSubmitOperator, that would be at least something.
From what I can see in the SparkSubmitOperator, you can pass environment variables to spark-submit as a dictionary:
:param env_vars: Environment variables for spark-submit. It
supports yarn and k8s mode too.
:type env_vars: dict
Have you tried this?
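A minimal sketch along those lines, assuming the contrib SparkSubmitOperator and placeholder values for the application path and connection id:
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

# Placeholder application and conn_id; env_vars is the documented dict parameter.
submit_job = SparkSubmitOperator(
    task_id='submit_spark_job',
    application='/path/to/your_job.py',
    conn_id='spark_default',
    env_vars={
        'HADOOP_CONF_DIR': '/etc/hadoop/conf',
        'YARN_CONF_DIR': '/etc/hadoop/conf',
    },
    dag=dag)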

Can't I run a file in the temporary folder using subprocess?

I was trying to use the code below:
subprocess.Popen('%USERPROFILE%\\AppData\\Local\\Temp\\AdobeARM - Copy.log').communicate()
but I got an error message.
Can anyone help with this?
Since there's an environment variable in the path, you can add shell=True to force the command through the shell, which will expand environment variables:
subprocess.Popen('"%USERPROFILE%\\AppData\\Local\\Temp\\AdobeARM - Copy.log"',shell=True).communicate()
Note the protection with quotes since there are spaces. You can also drop the quotes if you pass a list containing one element to Popen, which is cleaner:
subprocess.Popen(['%USERPROFILE%\\AppData\\Local\\Temp\\AdobeARM - Copy.log'],shell=True).communicate()
Alternatively, if you just want to open the log file in its default application, there's a simpler way (which does not block the executing script, so it's slightly different):
p = os.path.join(os.getenv('USERPROFILE'),r"AppData\Local\Temp\AdobeARM - Copy.log")
os.startfile(p)
Maybe it can be even simpler since that may be the temporary directory you're trying to reach:
p = os.path.join(os.getenv('TEMP'),r"AdobeARM - Copy.log")
os.startfile(p)

Get environment variable changes from a shell script in Python

I have a Python script that runs a set_env.sh file.
Afterwards, can I get the environment variables that were changed in set_env.sh?
Is that possible?
set_env.sh
var1="value1"
var2="value2"
var3="value3"
get_env.py
import os
os.system("set_env.sh")
print os.environ['var1']
Result: KeyError: 'var1'
I know that I can use os.environ['var1'] = 'value' (or otherwise set an env var) inside the Python file, but I only have access to this environment info from the shell file. Can you help me?
My temporary solution was the following regex:
(?P<variable>[\w]*)[ ]*[\=][ ]*[\"|\']{0,1}(?P<value>[\w]*)[\"|\']{0,1}
import os
import re

enviroment_regex = r"(?P<variable>[\w]*)[ ]*[\=][ ]*[\"|\']{0,1}(?P<value>[\w]*)[\"|\']{0,1}"
match_object = re.search(enviroment_regex, open("set_env.sh", "r").read())
os.environ[match_object.group("variable")] = match_object.group("value")
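If you need every assignment in the file rather than just the first one, a sketch using re.finditer with the same regex (nothing else assumed) would look like this:
import os
import re

enviroment_regex = r"(?P<variable>[\w]*)[ ]*[\=][ ]*[\"|\']{0,1}(?P<value>[\w]*)[\"|\']{0,1}"

with open("set_env.sh", "r") as f:
    content = f.read()

# Walk every match in the file; the pattern can also produce empty matches,
# so skip those before touching os.environ.
for match in re.finditer(enviroment_regex, content):
    if match.group("variable"):
        os.environ[match.group("variable")] = match.group("value")

print(os.environ.get("var1"))  # value1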
In order to set an environment variable from Python, use:
os.environ["var1"] = "value1"
Note that Python gets its own copy of the environment, which means the env will contain var1 only as long as you're inside the Python program. If you type env in the shell after running the program, there will be no trace of var1.
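A small sketch illustrating that: a variable set via os.environ is inherited by child processes started from Python, but never propagates back to the shell that launched the script:
import os
import subprocess

os.environ["var1"] = "value1"

# Child processes spawned from here inherit the modified environment...
subprocess.call('echo "child sees: $var1"', shell=True)

# ...but once this script exits, the parent shell has no var1.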
For more info: https://www.inkling.com/read/programming-python-mark-lutz-4th/chapter-3/shell-environment-variables
