I'm editing a script for a Telegram bot, and I only want to add the HTML parse mode so that I can use bold, italic, etc.
I can't seem to find a way to add parse_mode: "HTML" to the curl line:
if [ -n "${TOKEN}" ]; then
    echo "Sending telegram...";
    # Telegram notification
    curl -s -X POST https://api.telegram.org/bot${TOKEN}/sendMessage -d chat_id=${CHAT_ID} -d text="${1}" > /dev/null
fi
parse_mode is just another parameter like text or chat_id. You can use -d!
curl -s -X POST https://api.telegram.org/bot${TOKEN}/sendMessage -d chat_id=${CHAT_ID} -d text="${1}" -d parse_mode=HTML > /dev/null
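Put together, a minimal sketch of the full notification snippet with HTML parsing enabled might look like this (TOKEN, CHAT_ID and the message argument are the placeholders from the original script; the <b> tags are only an illustration of the formatting that parse_mode=HTML allows):
if [ -n "${TOKEN}" ]; then
    echo "Sending telegram...";
    # Telegram notification; the text may now contain HTML tags such as <b>bold</b> and <i>italic</i>
    curl -s -X POST "https://api.telegram.org/bot${TOKEN}/sendMessage" \
        -d chat_id="${CHAT_ID}" \
        -d parse_mode=HTML \
        -d text="<b>${1}</b>" > /dev/null
fi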
I am attempting to query Confluence knowledge base pages using the API and failing.
I have researched the API and have the following:
Browsing Content
curl -u admin:admin http://localhost:8080/confluence/rest/api/content/ | python -mjson.tool
Read Content and Expand the Body:
curl -u admin:admin http://localhost:8080/confluence/rest/api/content/3965072?expand=body.storage | python -mjson.tool
What I actually want to do is dump the contents of a page or pages in a "Space" (as Confluence calls it) to a file, or to the screen.
This is an actual page:
"https://bob.atlassian.net/wiki/spaces/BKB/pages/1234567/BOB+KNOWLEDGE+BANK"
And this is an example I found of using the REST API to do what I want:
curl -u admin:admin -G "http://myinstance.address:8090/rest/api/content/search" --data-urlencode "cql=(type=page and space=low and title ~ "LOWERTOWN")" \ | python -m json.tool
How I am translating what I have researched:
curl -u "bob#bob.com:12345678" -G "https://bob.atlassian.net/rest/api/content/search" --data-urlencode "cql=(type=page and space=BKB and title ~ "BOB+KNOWLEDGE+BANK")" \ | python -m json.tool
Which results in this error:
curl: (3) URL using bad/illegal format or missing URL
No JSON object could be decoded
I have grabbed my logic here from this site:
https://community.atlassian.com/t5/Confluence-questions/How-to-get-Conflunece-knowledge-base-data-through-API/qaq-p/1030159
And I am assuming I have misunderstood this:
/rest/api/content/search
And where it belongs in my curl statement, and how to link it to the knowledge base. I am also unsure whether piping to python -mjson.tool is applicable in my case, or whether I actually have it installed and need to verify that.
Can you help me interpret this correctly?
You are almost there! You just need to pass cql as a query parameter to the service, as mentioned in the Atlassian documentation:
curl --request GET \
  --url 'https://your-domain.atlassian.net/wiki/rest/api/search?cql={cql}' \
  --header 'Authorization: Bearer <access_token>' \
  --header 'Accept: application/json'
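Applied to the instance in the question, a minimal sketch might look like the following. Assumptions: <email> and <api_token> are placeholders for basic-auth credentials, the site lives under the /wiki context path as in the page URL above, the endpoint is the one from the documentation snippet, and the inner double quotes from the original attempt are swapped for single quotes so the CQL string survives shell quoting intact:
curl -s -u "<email>:<api_token>" \
  -G "https://bob.atlassian.net/wiki/rest/api/search" \
  --data-urlencode "cql=(type=page and space=BKB and title ~ 'BOB KNOWLEDGE BANK')" \
  | python -m json.tool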
I have to send the curl command in exactly this form to get a response on the CLI:
curl -d '{"id":0,"params":["aa:aa:00:00:00:00",["mixer","volume","80"]],"method":"slim.request"}' http://localhost:9000/jsonrpc.js
curl -d '{"id":0,"params":["aa:aa:00:00:00:00",["playlist","play","/home/pi/mp3/File.mp3"]],"method":"slim.request"}' http://localhost:9000/jsonrpc.js
Running the command above in my Python script, I get a syntax error at the apostrophe:
File "./updateTimers.py", line 126
strVolume = curl -d '{"id":0,"params":["aa:aa:00:00:00:00",["mixer","volume","80"]],"method":"slim.request"}' http://localhost:9000/jsonrpc.js
^
SyntaxError: invalid syntax
If I swap the apostrophes and quotation marks, my Python script does not like the syntax either. I would be happy for any advice. At the very least I need to run both commands one after the other:
strVolume = curl -d '{"id":0,"params":["aa:aa:00:00:00:00",["mixer","volume","80"]],"method":"slim.request"}' http://localhost:9000/jsonrpc.js
strPlayMP3Command = curl -d '{"id":0,"params":["aa:aa:00:00:00:00",["playlist","play","/home/pi/adhan/mp3/Adhan-Makkah.mp3"]],"method":"slim.request"}' http://localhost:9000/jsonrpc.js
If I understand your question correctly, you want the curl -d ... command to be a string in Python. To do this, you will also have to wrap the whole curl command in quotes: strVolume = "curl -d '{\"id\": 0, ...}' ...". Make sure you escape the double quotes inside.
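A minimal sketch of that idea, assuming the goal is to actually run the command from updateTimers.py rather than just store it (the MAC address and URL are the ones from the question):
import subprocess

# the JSON payload keeps its double quotes, escaped inside the Python string
strVolume = "curl -d '{\"id\":0,\"params\":[\"aa:aa:00:00:00:00\",[\"mixer\",\"volume\",\"80\"]],\"method\":\"slim.request\"}' http://localhost:9000/jsonrpc.js"

# hand the string to a shell and capture the server's reply
result = subprocess.run(strVolume, shell=True, capture_output=True, text=True)
print(result.stdout)
The second (playlist) command can be run the same way, one after the other, by building a second string and calling subprocess.run again.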
I ran the script below in a terminal and it published all 4 topics, but when running it from crontab -e every minute it published only the pm25 topics. The aqi values are calculated with the ready-made Python module I downloaded from: python-aqi. What could be the cause of the problem?
Here is the script:
#!/bin/sh
# to get pm2.5 from Home Assistant Rest API xiaomi airpurifier in Mezzanine and Bedroom
# then calculate aqi by using python aqi module obtained from https://pypi.org/project/python-aqi/
# output 4 topics:
# xiaomi_airpurifier/bedroom/pm25 999
# xiaomi_airpurifier/bedroom/aqi 999
# xiaomi_airpurifier/mezzanine/pm25 999
# xiaomi_airpurifier/mezzanine/aqi 999
# crontab -e every minute
bedroom=$(curl -s -X GET \
-H "Authorization: Bearer eyJ0eXAiOiJKV1QiLC.....gEZfY" \
-H "Content-Type: application/json" \
http://localhost:8123/api/states/sensor.xiaomi_airpurifier_air_quality_pm25)
mezzanine=$(curl -s -X GET \
-H "Authorization: Bearer eyJ0eXAiOiJKV1QiLC.....gEZfY" \
-H "Content-Type: application/json" \
http://localhost:8123/api/states/sensor.xiaomi_airpurifier_air_quality_pm252)
bedroom_pm25=$(echo "$bedroom" | python3 -c "import sys, json; print(json.load(sys.stdin)['state'])")
bedroom_aqi=$(aqi aqi.algos.epa pm25:$bedroom_pm25)
mezzanine_pm25=$(echo "$mezzanine" | python3 -c "import sys, json; print(json.load(sys.stdin)['state'])")
mezzanine_aqi=$(aqi aqi.algos.epa pm25:$mezzanine_pm25)
#echo $bedroom_pm25 $bedroom_aqi $mezzanine_pm25 $mezzanine_aqi
# publish to mqtt
mosquitto_pub -h localhost -t xiaomi_airpurifier/bedroom/pm25 -m $bedroom_pm25
mosquitto_pub -h localhost -t xiaomi_airpurifier/bedroom/aqi -m $bedroom_aqi
mosquitto_pub -h localhost -t xiaomi_airpurifier/mezzanine/pm25 -m $mezzanine_pm25
mosquitto_pub -h localhost -t xiaomi_airpurifier/mezzanine/aqi -m $mezzanine_aqi
The xiaomi airpurifier bash script in crontab -e
I have this Python script:
users=['mark','john','steve']
text=''
for user in users:
    text+=str(user + " ")
print(text)
I want to pass that string text to a curl terminal command.
I tried:
curl -d "#python-scirpt.py" --insecure -i -X POST https://10.10.10.6/hooks/84kk9emcdigz8xta1bykiymn5e
and
curl --insecure -i -X POST -H 'Content-Type: application/json' -d '{"text": 'python /home/scripts/python-script.py'}' https://10.10.10.6/hooks/84kk9emcdigz8xta1bykiymn5e
or without the quotation marks in the text option.
Everything returns this error:
{"id":"Unable to parse incoming data","message":"Unable to parse incoming data","detailed_error":"","request_id":"fpnmfds8zifziyc85oe5eyf3pa","status_code":400}
How should I approach this? Any help is appreciated.
Another approach would be to call curl from inside Python, but I would need help with that too.
Thank you
Use command substitution (i.e. $(...)) to make the shell run the python code first.
So
curl -d "$(python-scirpt.py)" --insecure -i -X POST https://10.10.10.6/hooks/84kk9emcdigz8xta1bykiymn5e
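Note that the endpoint appears to expect a JSON body of the form {"text": "..."} (that is what the second attempt in the question tries to send), so the easiest fix is to have the script itself print valid JSON. A minimal sketch, assuming the file is /home/scripts/python-script.py:
import json

users = ['mark', 'john', 'steve']
text = ' '.join(users)

# emit a JSON object so the webhook receives {"text": "mark john steve"}
print(json.dumps({"text": text}))
That output can then be sent with the command-substitution form above, e.g. curl --insecure -i -X POST -H 'Content-Type: application/json' -d "$(python /home/scripts/python-script.py)" https://10.10.10.6/hooks/84kk9emcdigz8xta1bykiymn5e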
I'm trying to schedule a crawler on EC2 and have the output exported to a CSV file, cppages-nov.csv, while creating a jobdir in case I need to pause the crawl, but it is not creating any files. Am I using the correct feed exports?
curl http://awsserver:6800/schedule.json -d project=wallspider -d spider=cppages -d JOBDIR=/home/ubuntu/scrapy/sitemapcrawl/crawls/cppages-nov -d FEED_URI=/home/ubuntu/scrapy/sitemapcrawl/cppages-nov.csv -d FEED_FORMAT=csv
curl http://amazonaws.com:6800/schedule.json -d project=wallspider -d spider=cppages -d setting=FEED_URI=/home/ubuntu/scrapy/sitemapcrawl/results/cppages.csv -d setting=FEED_FORMAT=csv -d setting=JOBDIR=/home/ubuntu/scrapy/sitemapcrawl/crawl/cppages-nov
Use this feed exporter in your settings file:
FEED_EXPORTERS = {
    'csv': 'scrapy.contrib.exporter.CsvItemExporter',
}
FEED_FORMAT = 'csv'
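For completeness, a minimal sketch of the relevant settings.py block; FEED_URI is not part of the answer above and is only shown here as an assumption, using the output path from the question (the alternative is to keep passing FEED_URI/FEED_FORMAT per run with -d setting=... as in the second curl command):
# settings.py (sketch)
FEED_EXPORTERS = {
    'csv': 'scrapy.contrib.exporter.CsvItemExporter',
}
FEED_FORMAT = 'csv'
# where to write the exported items; path taken from the question
FEED_URI = 'file:///home/ubuntu/scrapy/sitemapcrawl/cppages-nov.csv'
In newer Scrapy versions the exporter class lives at scrapy.exporters.CsvItemExporter rather than scrapy.contrib.exporter.CsvItemExporter.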