devpi: manually upload toxresult.json - python

I currently have a Jenkins pipeline which builds and tests my python package using tox. If all unittests pass, it will be uploaded to my local devpi index.
Using devpi test <mypackage> I can attach the test results to the release file on the index.
But this will download the already built package again, repeat all of the already-passing test suites defined in the tox.ini file, and only then upload the results in the form of a toxresult.json.
Is there any way to directly upload the toxresult.json alongside the release files?
According to the quickstart and the documentation of the test command, there seems to be no command-line option for this, and the upload command doesn't offer one either.
Of course I could change my Jenkins pipeline to skip the tests before uploading and then build, upload and test the package using devpi. If the devpi test command fails I can remove the package from the index.
But I would rather not upload a package with failing tests in the first place.

Anonymous uploads
It's relatively easy if you allow the anonymous user to upload test results (which is the default setting, IIRC). Make a POST request to the URL of the uploaded dist, passing the tox results as the JSON payload. Example:
$ curl -i \
-H "content-type: application/json" \
-X POST \
--data-binary "#/tmp/toxreport.json" \
http://my-server/myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz
On success, you should get a result similar to
HTTP/1.1 100 Continue
HTTP/1.1 200 OK
Content-Length: 143
Content-Type: application/json
Date: Wed, 08 Jan 2020 15:48:32 GMT
Server: waitress
X-Devpi-Api-Version: 2
X-Devpi-Master-Uuid: d800735d04a14c2d9bde920149cb8dbc
X-Devpi-Serial: 42
X-Devpi-Server-Version: 5.3.1
X-Devpi-Uuid: d800735d04a14c2d9bde920149cb8dbc
{
"result": "myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz.toxresult-20200108154832-0",
"type": "toxresultpath"
}
You can find the target URL in the File column of the files table on the project page. Or query the JSON API and filter the results, e.g.
$ devpi getjson /myuser/myindex/mypkg | jq -r '[ .result[] | .["+links"][] | .href ]'
Authenticated uploads
devpi uses HTTP basic auth, so simply pass the base64-encoded credentials in an Authorization: Basic header (curl's --user flag does this for you). Example with curl again:
$ curl -i \
--user myuser:mypass \
-H "content-type: application/json" \
-X POST \
--data-binary "#/tmp/toxreport.json" \
http://my-server/myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz
If you need details on the test upload authentication, check out my other answer here.
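If you would rather do this from Python inside the Jenkins pipeline than shell out to curl, here is a minimal sketch using requests; the server, index, package and file names are the same placeholders as above, and the assumed JSON layout mirrors what the getjson call shows:
import requests

DEVPI = "http://my-server"
INDEX = "myuser/myindex"
AUTH = ("myuser", "mypass")  # omit for anonymous uploads

# Look up the href of the uploaded release file via the JSON API
project = requests.get(
    "{}/{}/mypkg".format(DEVPI, INDEX),
    headers={"Accept": "application/json"},
).json()
hrefs = [
    link["href"]
    for version in project["result"].values()
    for link in version["+links"]
]
dist_url = next(h for h in hrefs if h.endswith("mypkg-1.0.tar.gz"))

# POST the toxresult.json produced by the Jenkins tox run to that file
with open("/tmp/toxreport.json", "rb") as fh:
    resp = requests.post(
        dist_url,
        data=fh.read(),
        headers={"content-type": "application/json"},
        auth=AUTH,
    )
resp.raise_for_status()
print(resp.json())  # {"result": ".../mypkg-1.0.tar.gz.toxresult-...", "type": "toxresultpath"}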

Related

Attempts to use curl in conjunction with api give 404 error

I am attempting to query Confluence knowledge base pages using the API and failing.
I have researched the API and have the following:
Browsing Content
curl -u admin:admin http://localhost:8080/confluence/rest/api/content/ | python -mjson.tool
Read Content and Expand the Body:
curl -u admin:admin http://localhost:8080/confluence/rest/api/content/3965072?expand=body.storage | python -mjson.tool
What I actually want to do is dump the contents of a page or pages in a "Space" (as Confluence calls it) to a file, or to the screen.
This is an actual page:
"https://bob.atlassian.net/wiki/spaces/BKB/pages/1234567/BOB+KNOWLEDGE+BANK"
And this is an example I found of using the REST API to do what I want:
curl -u admin:admin -G "http://myinstance.address:8090/rest/api/content/search" --data-urlencode "cql=(type=page and space=low and title ~ "LOWERTOWN")" \ | python -m json.tool
How I am translating what I have researched:
curl -u "bob#bob.com:12345678" -G "https://bob.atlassian.net/rest/api/content/search" --data-urlencode "cql=(type=page and space=BKB and title ~ "BOB+KNOWLEDGE+BANK")" \ | python -m json.tool
Which results in this error:
curl: (3) URL using bad/illegal format or missing URL
No JSON object could be decoded
I have grabbed my logic here from this site:
https://community.atlassian.com/t5/Confluence-questions/How-to-get-Conflunece-knowledge-base-data-through-API/qaq-p/1030159
And I am assuming I have misunderstood this:
/rest/api/content/search
And where it belongs in my curl statement and linking it to the knowledge base. I am also unsure if applying the -mjson.tool is applicable in my case or if I actually have it installed / need to verify that.
Can you help me interpret this correctly?
You are almost there! You just need to pass cql as a query parameter to the service, as mentioned here in the Atlassian documentation:
curl --request GET \
--url 'https://your-domain.atlassian.net/wiki/rest/api/search?cql={cql}' \
--header 'Authorization: Bearer <access_token>' \
--header 'Accept: application/json'
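If you want to do the same from Python, here is a minimal sketch with requests against the content search endpoint from the question; the e-mail/API-token pair and the CQL string are placeholders, and requests takes care of the URL encoding that tripped up the shell quoting:
import requests

# Confluence Cloud accepts basic auth with your account e-mail and an API token
AUTH = ("bob@bob.com", "<api_token>")
BASE = "https://bob.atlassian.net/wiki"

resp = requests.get(
    "{}/rest/api/content/search".format(BASE),
    params={
        "cql": 'type=page and space=BKB and title ~ "BOB KNOWLEDGE BANK"',
        "expand": "body.storage",  # include the page body in the results
    },
    auth=AUTH,
)
resp.raise_for_status()
for page in resp.json().get("results", []):
    print(page["title"])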

Deploying haystack model/workflow

I'm trying to deploy a Haystack model for question answering for my application as a REST API. I want to query it and get my answers directly, and I need to do it soon, so I'm looking for a way to do it on Algorithmia. Any suggestions, tutorials, examples or other help are appreciated. Thanks!!
For reference, this could be an example model.
Not sure about Algorithmia, but here's a simple option to deploy a Haystack service, including a REST API, on any standard machine (e.g. an AWS EC2 instance):
# Clone haystack repo
git clone https://github.com/deepset-ai/haystack.git
cd haystack
# Start (demo) containers
docker-compose pull
docker-compose up
# Run a query
curl -X 'POST' \
'http://127.0.0.1:8000/query' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"query": "Who is the father of Arya Stark?",
"params": {}
}'
This basically spins up:
Haystack REST API using this docker image
Elasticsearch with some demo data (see comment in the docker-compose.yaml for how to replace this with an empty instance for your own data)
A simple streamlit-based UI (you can easily remove this from the docker-compose if you don't need it)
If you want to customize the pipeline being deployed in the API (e.g. change a model):
Edit the pipelines.yaml in your cloned repo (in haystack/rest_api/pipeline/)
Mount this folder as a volume into the container by uncommenting this part in the docker-compose.yaml
If you want to deploy on a GPU machine, just execute instead:
docker-compose -f docker-compose-gpu.yml pull
docker-compose -f docker-compose-gpu.yml up
For more details, see the official documentation of the REST API here.
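For completeness, the same query sent from Python instead of curl, assuming the demo containers above are running locally:
import requests

resp = requests.post(
    "http://127.0.0.1:8000/query",
    json={"query": "Who is the father of Arya Stark?", "params": {}},
)
resp.raise_for_status()
# The response contains the retrieved answers; print them for a quick check
for answer in resp.json().get("answers", []):
    print(answer)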

Send curl POST Request in Python without installing a lib

I want to send a curl POST request in Python, but I can't install any library like 'requests'. I can send the POST request like the following example:
curl -H "Content-Type: application/json" -X POST -d {\"username\":\"myusername\",\"password\":\"mypassword\"} https://example.com/login
I need equivalent code to the above in Python 2. Then, I must read what it returns. I'm working on Windows 10.
Write the curl command in a way that works in the shell, and then execute that command from Python.
This way you don't need the requests package.
import os
command = "curl -u {username}:{password} [URL]"
os.system(command)
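os.system only returns the exit status, so if you also need to read what the server sends back, a small variation of the above using only the Python 2.7 standard library (subprocess.check_output) works; the URL and credentials are the placeholders from the question:
import json
import subprocess

payload = json.dumps({"username": "myusername", "password": "mypassword"})

# Passing the arguments as a list sidesteps the quote-escaping from the shell example
output = subprocess.check_output([
    "curl", "-s",
    "-H", "Content-Type: application/json",
    "-X", "POST",
    "-d", payload,
    "https://example.com/login",
])
print(output)  # the raw response body returned by the server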

How to programmatically set the github repo inside jenkins job configuration page

I want to set the GitHub repository URL on the http://host-ip:8080/job/my_project/configure Jenkins page via code each time I spawn a new Jenkins container.
I read that this can be done with python-jenkins's reconfig_job function by replacing the config.xml.
How would I do that?
You have some clues in "How can I update a jenkins job using the api?"
For instance, since you spawn a new Jenkins container, you can docker cp an updated config.xml to the container (at the right path for that job)
(the OP Kostas Demiris confirms in the comments it works, when run in git bash)
You can also use one of the Jenkins API libraries, but first check whether a simple curl is enough:
#Get the current configuration and save it locally
curl -X GET http://user:password@jenkins.server.org/job/myjobname/config.xml -o mylocalconfig.xml
#Update the configuration via posting a local configuration file
curl -X POST http://user:password@jenkins.server.org/job/myjobname/config.xml --data-binary "@mymodifiedlocalconfig.xml"
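If you prefer the python-jenkins route mentioned in the question, the same get-then-reconfigure round trip looks roughly like this; the host, credentials, job name and repository URLs are placeholders:
import jenkins

server = jenkins.Jenkins("http://host-ip:8080", username="user", password="password")

# Fetch the job's current config.xml, swap in the new GitHub URL, and push it back
config_xml = server.get_job_config("my_project")
config_xml = config_xml.replace(
    "https://github.com/old-org/old-repo.git",
    "https://github.com/my-org/my-repo.git",
)
server.reconfig_job("my_project", config_xml)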
The updated Jenkins documentation mentions (for updating just one parameter in an existing job config):
Simple example - sending "String Parameters":
curl -X POST JENKINS_URL/job/JOB_NAME/build \
--data token=TOKEN \
--data-urlencode json='{"parameter": [{"name":"id", "value":"123"}, {"name":"verbosity", "value":"high"}]}'
Another example - sending a "File Parameter":
curl -X POST JENKINS_URL/job/JOB_NAME/build \
--user USER:PASSWORD \
--form file0=@PATH_TO_FILE \
--form json='{"parameter": [{"name":"FILE_LOCATION_AS_SET_IN_JENKINS", "file":"file0"}]}'

Upload File using Django Rest Framework

I am new to Django. Can anybody help me? How can I upload a file using the REST Framework API?
I have tried following this page:
http://www.django-rest-framework.org/api-guide/parsers/#fileuploadparser
File uploading in Django REST framework is the same as uploading files via multipart/form-data in Django.
To test it you can use curl:
curl -X POST -H "Content-Type:multipart/form-data" -u {username}:{password} \
-F "{field_name}=#{filename};type=image/jpeg" http://{your api endpoint}
Other fields are just like normal form fields in Django.
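For reference, a minimal sketch of what the view side could look like with a multipart parser; the field name file and the save path are just illustrative:
from rest_framework.parsers import MultiPartParser
from rest_framework.response import Response
from rest_framework.views import APIView


class FileUploadView(APIView):
    parser_classes = [MultiPartParser]

    def post(self, request, format=None):
        uploaded = request.FILES["file"]  # must match the -F field name used in curl
        # Write the upload to disk in chunks so large files are not read into memory at once
        with open("/tmp/" + uploaded.name, "wb") as fh:
            for chunk in uploaded.chunks():
                fh.write(chunk)
        return Response({"received": uploaded.name})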
The answer above is pretty good. Besides, you can add some parameters in order to see the response headers as well. Take this one for example:
curl -X PATCH --dump-header - -H "Content-Type:multipart/form-data" -u jorge:123456 -F "image=@/home/oscar/Pictures/dgnest/_MG_6445.JPG;type=image/jpeg" http://localhost:8000/api/project/3/
