I am trying to get the response code for three sites using the Python snippet below, but I'm wondering how I can pass each element of the list into the curl call inside the for loop.
import os

servers = ["google", "yahoo", "nonexistingsite"]
for i in range(len(servers)):
    print(os.system('curl --write-out "%{http_code}\n" --silent --output'
                    ' /dev/null "https://servers[i].com"'))
With the above code, the value of servers[i] is never substituted into the URL.
You need to perform string formatting, like:
import os

servers = ["google", "yahoo", "nonexistingsite"]
for server in servers:
    print(os.system('curl --write-out "%{{http_code}}\\n" --silent --output /dev/null "https://{}.com"'.format(server)))
The above can however still go wrong, if for example the servers contain quotes, etc.
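If the shell-string route is kept anyway, shlex.quote is one way to defuse such values. A small sketch with a deliberately hostile, made-up server name:

```python
import shlex

server = 'goo"gle'  # made-up value containing a quote
url = 'https://{}.com'.format(server)
# shlex.quote wraps the URL so the shell treats it as a single literal word
cmd = 'curl --write-out "%{http_code}\\n" --silent --output /dev/null ' + shlex.quote(url)
print(cmd)
```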
It might be better here to use subprocess.run and pass it a list of parameters, like:
import subprocess

servers = ["google", "yahoo", "nonexistingsite"]
for server in servers:
    p = subprocess.run(
        [
            'curl',
            '--write-out',
            '%{http_code}\\n',
            '--silent',
            '--output',
            '/dev/null',
            'https://{}.com'.format(server)
        ],
        capture_output=True
    )
    print(p.stdout.decode())
Try using Python's string formatting, something like:
"This string uses an %s" % (argument) would become "This string uses an argument"
Something like this:
print(os.system('curl --write-out "%%{http_code}\n" --silent --output /dev/null "https://%s.com"' % servers[i]))
More here: https://powerfulpython.com/blog/python-string-formatting/
Just use the requests library instead of shelling out to run curl:
import requests

for s in servers:
    resp = requests.get('https://{}.com'.format(s))
    print(resp.status_code)
Since you don't care about the body of the response, only whether it responds, you could save bandwidth by using the head function instead of get to retrieve only the headers from the server.
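A sketch of that idea, assuming the servers list from the question and plain https://&lt;name&gt;.com URLs (the timeout and error handling are additions, not part of the original answer):

```python
import requests

servers = ["google", "yahoo", "nonexistingsite"]
for s in servers:
    url = "https://{}.com".format(s)
    try:
        # HEAD fetches only the status line and headers, not the body
        resp = requests.head(url, timeout=5, allow_redirects=True)
        print(url, resp.status_code)
    except requests.exceptions.RequestException as exc:
        # e.g. a DNS failure for nonexistingsite.com
        print(url, "request failed:", exc)
```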
I'm trying to write the response returned from a REST API URL into a CSV file, for a provided port number (interactive mode) and the users listed in users_list.txt (script mode). I have the below code to do the job.
import json
import csv
import urllib.request
import subprocess

portvalue = input("Please enter an Port Number:\n")
portvalue = int(portvalue)
print(f'You entered {portvalue}')
tooluser='admin'
toolpassword='password'
user = open('users-list.txt')
for line in user:
    bash_com = 'curl --user {tooluser}:{toolpassword} http://198.98.99.12:46567/{portvalue}/protects/{user} \
| jq --arg a_port {portvalue} --arg a_userid {user} 'map(.+{"userid":{user}}+{"port":{portvalue}})'' as url:
    subprocess.Popen(bash_com)
    output = subprocess.check_output(['bash','-c', bash_com])
    print(line)
myfile.close()
# with urllib.request.urlopen("curl --user admin:password http://198.98.99.12:46567/{port}/protects/{user} | jq") as url:
data = json.loads(url.read().decode())
fname = "output.csv"
with open(fname, "w") as file:
    csv_file = csv.writer(file, lineterminator='\n')
    csv_file.writerow(["depotFile","host","isgroup","line","perm","user","port","userid"])
    for item in data["raw_data"]:
        csv_file.writerow([item['depotFile'], item['host'], item['isgroup'], item['line'], item['perm'], item['user'], item['port'], item['userid']])
Curl with URL to get data for single user - curl --user admin:password http://198.98.99.12:46567/2324/protects/sanchez.ricardo | jq
users_list.txt consists of users in below format.
sanchez.ricardo
varun.sharma
daniel.vel
One of the JSON outputs is formatted as follows:
[
{
"depotFile": "//LIB/Include/...",
"host": "*",
"isgroup": "",
"line": "19",
"perm": "open",
"user": "5G",
"port": "2324",
"userid": "sanchez.ricardo"
},
...
]
Expected output csv file:-
Sno depotFile host isgroup line perm user port userid
1 //LIB/Include/... * 19 open 5G 2324 sanchez.ricardo
2 //LIB/... * 19 write 6G 2324 varun.sharma
3 //AND/RIO/... * 20 write AND 2324 daniel.vel
I'm unable to process the REST API URL in the above code. Please help me achieve this in Python. Thanks in advance for your help.
I like to use requests to grab stuff, as it is so easy now; it has improved drastically since its beginnings. It is so much easier than PycURL that I am refactoring old code to use it as I make improvements over time.
Here is a really cool website to convert a curl command to requests code:
https://curl.trillworks.com/
so:
curl --user admin:password http://198.98.99.12:46567/2324/protects/sanchez.ricardo
is:
import requests
response = requests.get('http://198.98.99.12:46567/2324/protects/sanchez.ricardo', auth=('admin', 'password'))
Now, it looks to me like your loop isn't going to work: it assigns the command to bash_com on each pass but then doesn't do anything with the results until the loop is done. Probably you accidentally unindented it, though. At any rate, here is a loop that should work, using f-strings as suggested in another answer.
I am not completely sure what you get back from the API; it seems you need to make a separate request for each user, so I wrote it that way. I am sure you will need to tweak some things, but this should get you closer.
import requests
import json
import csv

user_csv = '''sanchez.ricardo
varun.sharma
daniel.vel'''
# I cannot tell where you get the ports from, so I will put a list here to show the code working
# and you can fill it differently, maybe from the csv?? not sure
ports = [2324, 2324, 2324]
users = user_csv.split('\n')
fname = "output.csv"
with open(fname, "w") as file:
    csv_file = csv.writer(file, lineterminator='\n')
    csv_file.writerow(["depotFile", "host", "isgroup", "line", "perm", "user", "port", "userid"])
    for user, port in zip(users, ports):
        print(f'from csv and other method here is the user and port: {user}, {port}')
        url = f'http://198.98.99.12:46567/{port}/protects/{user}'
        print(f'grab data from rest api: {url}')
        # cannot tell if there is one user:password for the REST API, this code assumes so
        response = requests.get(url, auth=('admin', 'password'))
        # assuming the response returns json like this:
        # [
        #   {
        #     "depotFile": "//LIB/Include/...",
        #     "host": "*",
        #     "isgroup": "",
        #     "line": "19",
        #     "perm": "open",
        #     "user": "5G",
        #     "port": "2324",
        #     "userid": "sanchez.ricardo"
        #   }
        # ]
        data = json.loads(response.text)
        for item in data:
            csv_file.writerow(item.values())
In this case, an f-string is probably what you're looking for. Consider this:
user='sanchez.ricardo'
port=2324
url = f'http://198.98.99.12:46567/{port}/protects/{user}'
The second snippet you have only covers the part:
curl --user $USERID:$PASSWORD http://198.98.99.12:46567/$PORT/protects/$u \
| jq --arg a_port $PORT --arg a_userid .....
From what I understand you want both functionalities:
from user input - interactive mode.
from file - script mode.
Many things can be done, but what I would do is use argparse and maybe have a switch param -ulf|--users-list-file to pass a file, or simply no param for interactive mode.
And have the port be passed via an environment variable, read using os.environ, as well as a -p|--port optional param.
A quick'n dirty way is to use sys.argv to read passed arguments and write your conditions accordingly.
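A minimal argparse sketch of that layout (the flag names match the suggestion above, but the TOOL_PORT variable name and the empty-argv call are illustrative only):

```python
import argparse
import os

parser = argparse.ArgumentParser(description="Fetch protects per user")
parser.add_argument("-ulf", "--users-list-file",
                    help="file with one user per line (script mode)")
parser.add_argument("-p", "--port", type=int,
                    default=int(os.environ.get("TOOL_PORT", 0)) or None,
                    help="port, falling back to the TOOL_PORT env var")
args = parser.parse_args([])  # empty argv only so the sketch runs standalone

if args.users_list_file:
    with open(args.users_list_file) as fh:
        users = [line.strip() for line in fh if line.strip()]
else:
    users = []  # interactive mode: prompt with input() instead
```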
I have been going through your code; you can simply use requests, and use pandas to store the data as CSV. This makes producing the output easy.
import json
import pandas as pd

json_string = '{ "name":"John", "age":30, "car":"None" }'
a_json = json.loads(json_string)
print(a_json)
dataframe = pd.DataFrame([a_json])  # wrap the dict in a list, since its values are scalars
dataframe.to_csv("output.csv", index=False)
I run a curl command on windows command line prompt. It produces a json output. The command looks like this:
curl --data "action=details&user=user&project=project1&problemid=2021" https://website:9020/
I issue the same command in Python as follows:
import subprocess

output = subprocess.run(
    [
        "curl",
        "--data",
        "\"action=details&user=user&project=project1&problemid=2021\"",
        "https://website:9020/",
    ],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    shell=True,
)
print(output.stdout.decode("utf-8"))
The output is the following:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 199 100 64 100 123 64 123 0:00:01 --:--:-- 0:00:01 1254
{"status":400,"message":"Action parameter is missing"}
But the command line produces a json output. Yet the same command issued through subprocess.run produces this error. I also tried it with subprocess.Popen and subprocess.check_output. Same issue persists. What am I doing wrong here that is causing this error?
In some cases (e.g. if they require running command line as administrator), subprocess might not be the right choice to execute the command line commands.
You can use os.system() to see the output or os.popen() to read and store the output.
import os
import json

# to see the output
print(os.system("curl --data \"action=details&user=user&project=project1&problemid=2021\" https://website:9020/"))

# to read and store the output
output = os.popen("curl --data \"action=details&user=user&project=project1&problemid=2021\" https://website:9020/").read()
outputjson = json.loads(output)
then you can access the json information.
Did you try running your code without the quotation marks around the action parameter? Because the API is complaining about missing action parameter. I think it's just not recognising it due to the escaped quotation marks.
import subprocess

output = subprocess.run(
    [
        "curl",
        "--data",
        "action=details&user=user&project=project1&problemid=2021",
        "https://website:9020/",
    ],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    shell=True,
)
print(output.stdout.decode("utf-8"))
EDIT: I'm not sure if this is the case, but it could be that subprocess implicitly surrounds all its parameters with quotation marks in order to avoid code injection and globbing.
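A quick way to see what the child process actually receives (without shell=True) is to spawn a Python child that echoes back its first argument. Each list element arrives verbatim as one argv entry, escaped quotes included:

```python
import subprocess
import sys

# the value, quotes and all, that the question passed to --data
arg = '"action=details&user=user&project=project1&problemid=2021"'
# spawn a child that just echoes back its first argument
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", arg],
    capture_output=True, text=True,
).stdout.strip()
print(out)  # the escaped quotes arrive as a literal part of the value
```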
I was just googling again and found this answer, which states it quite well: why would you use curl, if you could use requests?
No guarantee, but I think something like this should work:
import requests

url = "https://website:9020/"
payload = {
    "action": "details",
    "user": "user",
    "project": "project1",
    "problemid": "2021"
}
# curl --data sends a POST, so use requests.post with data=
res = requests.post(url, data=payload)
I'm trying to write a script that imitates a cURL command I make to change data on the target web page:
curl -u username:password "https://website.com/update/" --data "simChangesList=%5B%7B%22simId%22%3A760590802%2C%22changeType%22%3A2%2C%22targetValue%22%3A%220003077%22%2C%22effectiveDate%22%3Anull%7D%5D" --compressed
As you can see above, I am POSTing a url-encoded string to the target web page.
The following code does not work:
import requests
import urllib
enc = urllib.quote('[{"simId":760590802,"changeType":2,"targetValue":000307,"effectiveDate":null}]')
simChangesList = 'simChangesList=' + enc
print simChangesList
auth = s.post(url, data=simChangesList)
print auth.text
I'm fairly certain the above code imitates my cURL command, but it obviously doesn't.
I am getting a Required List parameter 'simChangesList' is not present error.
What is the equivalent of the cURL command to POST a url-encoded string with the requests module in Python?
EDIT:
I've tried to make multiple dictionaries with simChangesList as the key, but I cannot seem to do it.
Here are my attempts:
simChangesList: [{"simId":760590802,"changeType":2,"targetValue":000307,"effectiveDate":null}]
data = {'simChangesList': ['simId': 760590802, 'changeType': 2, 'targetValue': '0003077', 'effectiveDate': null]}
data['simChangesList'] = ['simId': 760590802, 'changeType': 2, 'targetValue': '0003077', 'effectiveDate': null]
simChangesList:[{"simId":760590802,"changeType":2,"targetValue":"000307","effectiveDate":null}]
payload = {
'simChangesList':
[{'simId': '760590802',
'changeType': '2',
'targetValue': '0003077',
'effectiveDate': 'null'}]
}
I have already gone through a few existing StackOverflow links for this query; they did not help me.
I would like to run a few curl commands (4), each of which gives output. From that output, I would like to parse a few group ids for the next command.
curl --basic -u admin:admin -d \'{ "name" : "test-dev" }\' --header \'Content-Type: application/json\' http://localhost:8080/mmc/api/serverGroups
I have tried with as ,
#!/usr/bin/python
import subprocess
bash_com = 'curl --basic -u admin:admin -d '{ "name" : "test-dev" }' --header 'Content-Type: application/json' http://localhost:8080/mmc/api/serverGroups'
subprocess.Popen(bash_com)
output = subprocess.check_output(['bash','-c', bash_com]) # subprocess has check_output method
It gives me a syntax error, though I have changed from single quotes to double quotes for that curl command.
I have been trying with PycURL, but I have to look into that more. Is there any way we can run curl commands in Python, parse the output values, and pass them to the next curl command?
You can use os.popen with
fh = os.popen(bash_com, 'r')
data = fh.read()
fh.close()
Or you can use subprocess like this
import subprocess

cmds = ['ls', '-l']
try:
    output = subprocess.check_output(cmds, stderr=subprocess.STDOUT)
    retcode = 0
except subprocess.CalledProcessError as e:
    retcode = e.returncode
    output = e.output
print(output)
There you have to organize your command and params in a list.
Or you just go the easy way and use requests.get(...).
And do not forget: Using popen you can get shell injections via parameters of your command!
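Along those lines, the first curl call could be done with requests directly; this is a sketch only (the URL and admin:admin credentials are taken from the question, and no server is assumed reachable):

```python
import requests

url = "http://localhost:8080/mmc/api/serverGroups"
try:
    # json= serializes the body and sets Content-Type: application/json
    resp = requests.post(url, json={"name": "test-dev"},
                         auth=("admin", "admin"), timeout=5)
    group_id = resp.json().get("id")  # this id feeds the next call
except (requests.exceptions.RequestException, ValueError):
    group_id = None  # no server reachable while sketching
```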
Better output using os.popen(bash_com, 'r') and then fh.read():
python api.py
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
199 172 0 172 0 27 3948 619 --:--:-- --:--:-- --:--:-- 4027
{"href":"http://localhost:8080/mmc/api/serverGroups/39a28908-3fae-4903-adb5-06a3b7bb06d8","serverCount":0,"name":"test-dev","id":"39a28908-3fae-4903-adb5-06a3b7bb06d8"}
Am I right in understanding that fh.read() is what executed the curl command? Please correct me.
I am trying to redirect the curl command's output to a text file and then parse the file as JSON. All I am trying to get is the "id" from the above output.
fh = os.popen(bash_com, 'r')
data = fh.read()
newf = open("/var/tmp/t1.txt", 'w')
sys.stdout = newf
print data
with open("/var/tmp/t1.txt") as json_data:
    j = json.load(json_data)
    print j['id']
I have checked the file's content in JSONLint.com and it is valid JSON. Yet it throws "ValueError: No JSON object could be decoded" at the json.load line. Is there anything I need to do before parsing the redirected file?
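One likely cause of the ValueError is that the temp file is read back before it has been flushed and closed. The temp file is not needed at all, though: the string returned by fh.read() can be parsed directly. A sketch using the sample response quoted above in place of the live fh.read() result:

```python
import json

# sample response pasted from the earlier comment; in the real script
# this string would be the value returned by fh.read()
data = '{"href":"http://localhost:8080/mmc/api/serverGroups/39a28908-3fae-4903-adb5-06a3b7bb06d8","serverCount":0,"name":"test-dev","id":"39a28908-3fae-4903-adb5-06a3b7bb06d8"}'
j = json.loads(data)
print(j['id'])
```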
I am trying to launch a Jenkins parametrized job from a python script. Due to environment requirements, I can't install python-jenkins. I am using raw requests module.
This job I am trying to launch has three parameters:
string (let's call it payload)
string (let's call it target)
file (a file, optional)
I've searched and searched, without any success.
I managed to launch the job with two string parameters by launching:
import requests
url = "http://myjenkins/job/MyJobName/buildWithParameters"
target = "http://10.44.542.62:20000"
payload = "{payload: content}"
headers = {"Content-Type": "application/x-www-form-urlencoded"}
msg = {
    'token': 'token',
    'payload': [ payload ],
    'target': [ target ],
}
r = requests.post(url, headers=headers, data=msg)
However I am unable to send a file and those arguments in a single request.
I've tried the requests.post files argument and failed.
It turns out that sending both data and a file in a single request is possible, but it has to go as one multipart/form-data request rather than an ordinary form-encoded POST.
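A hedged sketch of such a multipart request with requests (the URL and parameter names reuse the ones from the question; the in-memory file content is a stand-in for a real file):

```python
import io

url = "http://myjenkins/job/MyJobName/buildWithParameters"
data = {
    "token": "token",
    "payload": "{payload: content}",
    "target": "http://10.44.542.62:20000",
}
# a placeholder in-memory file; a real run would open the actual file
files = {"file": ("upload.bin", io.BytesIO(b"file-content"))}
# with requests, passing data= and files= together yields a single
# multipart/form-data POST:
#   import requests
#   r = requests.post(url, data=data, files=files)
```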
Add the following imports in your Python script:
import jenkinsapi
from jenkinsHandler import JenkinsHandler
Then pass parameters to buildJob() (like <your JenkinsHandler object name>.buildJob()). The JenkinsHandler module has functions like init(), buildJob(), and isRunning(), which help in triggering the build.
Here is an example:
curl -vvv -X POST http://127.0.0.1:8080/jenkins/job/jobname/build \
  --form file0='@/tmp/yourfile' \
  --form json='{"parameter": [{"name":"file", "file":"file0"}, {"name":"payload", "value":"123"}, {"name":"target", "value":"456"}]}'