How to programmatically set the GitHub repo in the Jenkins job configuration page - Python

I want to set the GitHub repository URL on the http://host-ip:8080/job/my_project/configure Jenkins page via code each time I spawn a new Jenkins container.
I read that this can be done with python-jenkins's reconfig_job function by replacing the config.xml.
How exactly would I do that?

You have some clues in "How can I update a jenkins job using the api?".
For instance, since you spawn a new Jenkins container, you can docker cp an updated config.xml into the container (at the right path for that job).
(The OP Kostas Demiris confirms in the comments that this works when run in Git Bash.)
You can also use one of the Jenkins API libraries, but first check whether a simple curl is enough:
# Get the current configuration and save it locally
curl -X GET http://user:password@jenkins.server.org/job/myjobname/config.xml -o mylocalconfig.xml
# Update the configuration via posting a local configuration file
curl -X POST http://user:password@jenkins.server.org/job/myjobname/config.xml --data-binary "@mymodifiedlocalconfig.xml"
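If you prefer to stay in Python, here is a minimal sketch using the python-jenkins library mentioned in the question (host, credentials, job name, and the repo URLs are placeholders to adapt):
# Minimal sketch with python-jenkins (pip install python-jenkins).
# Host, credentials, job name, and repo URLs are placeholders.
import jenkins

server = jenkins.Jenkins('http://host-ip:8080', username='user', password='password')

# Fetch the job's current config.xml, swap the GitHub URL, and push it back.
config_xml = server.get_job_config('my_project')
config_xml = config_xml.replace('https://github.com/old/repo.git',
                                'https://github.com/new/repo.git')
server.reconfig_job('my_project', config_xml)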
The updated Jenkins documentation also shows how to submit individual parameters when triggering an existing parameterized job:
Simple example - sending "String Parameters":
curl -X POST JENKINS_URL/job/JOB_NAME/build \
--data token=TOKEN \
--data-urlencode json='{"parameter": [{"name":"id", "value":"123"}, {"name":"verbosity", "value":"high"}]}'
Another example - sending a "File Parameter":
curl -X POST JENKINS_URL/job/JOB_NAME/build \
--user USER:PASSWORD \
--form file0=@PATH_TO_FILE \
--form json='{"parameter": [{"name":"FILE_LOCATION_AS_SET_IN_JENKINS", "file":"file0"}]}'
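The string-parameter case can also be triggered from Python; a hedged sketch with python-jenkins (job name, token, and values copied from the curl example above; file parameters still need the raw HTTP form):
# Trigger a parameterized build, mirroring the "String Parameters" curl above.
import jenkins

server = jenkins.Jenkins('http://host-ip:8080', username='user', password='password')
server.build_job('JOB_NAME',
                 parameters={'id': '123', 'verbosity': 'high'},
                 token='TOKEN')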

Deploying haystack model/workflow

I'm trying to deploy a Haystack model for question answering in my application as a REST API. I want to query it and get my answers directly, and I need to do it soon, so I'm looking for a way to do it on Algorithmia. Any suggestions, tutorials, examples or any help is appreciated. Thanks!!
For reference, this could be an example model.
Not sure about Algorithmia, but here's a simple option to deploy a Haystack service including a REST API on any standard machine (e.g. an AWS EC2 instance):
# Clone haystack repo
git clone https://github.com/deepset-ai/haystack.git
cd haystack
# Start (demo) containers
docker-compose pull
docker-compose up
# Run a query
curl -X 'POST' \
'http://127.0.0.1:8000/query' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"query": "Who is the father of Arya Stark?",
"params": {}
}'
This basically spins up:
Haystack REST API using this docker image
Elasticsearch with some demo data (see comment in the docker-compose.yaml for how to replace this with an empty instance for your own data)
A simple Streamlit-based UI (you can easily remove this from the docker-compose if you don't need it)
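If you would rather query the API from Python than curl, here is a minimal requests sketch against the default local endpoint shown above:
# Query the Haystack REST API from Python (pip install requests).
import requests

resp = requests.post(
    "http://127.0.0.1:8000/query",
    json={"query": "Who is the father of Arya Stark?", "params": {}},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())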
If you want to customize the pipeline being deployed in the API (e.g. change a model):
Edit the pipelines.yaml in your cloned repo (in haystack/rest_api/pipeline/)
Mount this folder as a volume into the container by uncommenting this part in the docker-compose.yaml
If you want to deploy on a GPU machine, just execute instead:
docker-compose -f docker-compose-gpu.yml pull
docker-compose -f docker-compose-gpu.yml up
For more details, see the official documentation of the REST API here.

Incrementally add function to Azure Function Host without overwriting already deployed functions

I'm working with multiple teams that develop & test Azure Functions independently from each other but want to deploy all functions to a centralized Azure Function host.
The publishing methods I know of overwrite the existing content on the host, which is not what we want; we are aiming for an incremental update (similar to this question, with the only difference that we use Python on a Linux-based host instead of C#).
My question is: what is the easiest way to do this (assuming that host.json and the function settings are the same for both projects)?
If team A runs
curl -X POST -u <user> --data-binary #"./func1-2.zip" https://<funcname>.scm.azurewebsites.net/api/zipdeploy
in their release pipeline and afterwards team B runs
curl -X POST -u <user> --data-binary #"./func3-4.zip" https://<funcname>.scm.azurewebsites.net/api/zipdeploy
func1 and func2 from team A are gone. Using PUT on the https://<funcname>.scm.azurewebsites.net/api/zip/ endpoint as indicated here didn't seem to publish the functions at all. When using FTP, I don't see any files in site/wwwroot/ at all, even after already publishing functions.
You need to use continuous deployment:
First, create a repo on GitHub and configure it in the function app's Deployment Center.
Then use git to upload your local function app to GitHub:
echo "# xxxxxx" >> README.md
git init
git add .
git commit -m "first commit"
git branch -M main
git remote add origin <git-url>
git push -u origin main
To push later updates, you only need to commit and push again:
git add .
git commit -m "something"
git push
Refer to this official doc:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-continuous-deployment

Curl command line to import xml file to Xray

Good afternoon,
I'm using robotframework to run some tests.
After I run them I have an output.xml file with the results.
I searched ways to import these results to Xray and found these links:
https://docs.getxray.app/display/XRAY/Testing+using+Robot+Framework+integration+in+Python+or+Java
https://docs.getxray.app/display/XRAY/Import+Execution+Results+-+REST#ImportExecutionResultsREST-RobotFrameworkXMLresults
So I created a .sh file with this command line:
#!/bin/bash
PROJECT=myproject
TESTPLAN=mytestplan
curl -X POST -H "Content-Type: multipart/form-data" -u myuser:mypassword -F "file=output.xml" "https://myserver/rest/raven/1.0/import/execution/robot?projectKey=$PROJECT&testPlanKey=$TESTPLAN"
It fails with the error "Forbidden (403)".
Do you know how to solve this?
I guess you're using Xray on Jira Server/Data Center and not Jira Cloud, correct?
If so, it should be something like:
curl -H "Content-Type: multipart/form-data" -u admin:admin -F "file=#output.xml" "http://<jira_base_url>/rest/raven/1.0/import/execution/robot?projectKey=ROB&testPlanKey=ROB-12&testEnvironments=$BROWSER"
Note that sometimes <jira_base_url> is something like http://<some_ip>/jira; could that be your case perhaps?
Note: In this tutorial, you can find a concrete example for Xray on Jira server/DC. A similar tutorial for Xray on Jira Cloud can be found here.
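For completeness, here is a hedged Python sketch of the same multipart import using requests (server URL, credentials, and keys reuse the placeholders from the question):
# Import Robot Framework results into Xray (Jira Server/DC) from Python.
import requests

with open("output.xml", "rb") as f:
    resp = requests.post(
        "https://myserver/rest/raven/1.0/import/execution/robot",
        params={"projectKey": "myproject", "testPlanKey": "mytestplan"},
        files={"file": f},
        auth=("myuser", "mypassword"),
    )
print(resp.status_code, resp.text)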

devpi: manually upload toxresult.json

I currently have a Jenkins pipeline which builds and tests my Python package using tox. If all unit tests pass, it is uploaded to my local devpi index.
Using devpi test <mypackage> I can attach the test results to the release file on the index.
But this will download the already built package again, repeat all of the already passed test suites defined in the tox.ini file, and only then upload the results in the form of a toxresult.json.
Is there any way to directly upload the toxresult.json alongside the release files?
According to the quickstart and the documentation of the test command, there seems to be no command-line option for this, nor is there one for the upload command.
Of course I could change my Jenkins pipeline to skip the tests before uploading and then build, upload and test the package using devpi. If the devpi test command fails I can remove the package from the index.
But I would rather not upload a package with failing tests in the first place.
Anonymous uploads
It's relatively easy if you allow the anonymous user to upload test results (which is the default setting IIRC). Make a POST request to the URL of the uploaded dist, passing tox results as JSON payload. Example:
$ curl -i \
-H "content-type: application/json" \
-X POST \
--data-binary "#/tmp/toxreport.json" \
http://my-server/myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz
On success, you should get a result similar to
HTTP/1.1 100 Continue
HTTP/1.1 200 OK
Content-Length: 143
Content-Type: application/json
Date: Wed, 08 Jan 2020 15:48:32 GMT
Server: waitress
X-Devpi-Api-Version: 2
X-Devpi-Master-Uuid: d800735d04a14c2d9bde920149cb8dbc
X-Devpi-Serial: 42
X-Devpi-Server-Version: 5.3.1
X-Devpi-Uuid: d800735d04a14c2d9bde920149cb8dbc
{
"result": "myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz.toxresult-20200108154832-0",
"type": "toxresultpath"
}
You can find the target URL in the File column of the files table on the project page. Or query the JSON API and filter the results, e.g.
$ devpi getjson /myuser/myindex/mypkg | jq -r '[ .result[] | .["+links"][] | .href ]'
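The same lookup can be scripted in Python; a sketch of fetching the project JSON and collecting the hrefs, assuming the server exposes the usual devpi JSON API at that path:
# Collect the release-file hrefs for a project from the devpi JSON API.
import requests

resp = requests.get(
    "http://my-server/myuser/myindex/mypkg",
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
hrefs = [
    link["href"]
    for version in resp.json()["result"].values()
    for link in version.get("+links", [])
]
print(hrefs)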
Authenticated uploads
devpi uses HTTP basic auth, so simply pass the credentials in an Authorization: Basic header (curl's --user flag takes care of the base64 encoding for you). Example, with curl again:
$ curl -i \
--user myuser:mypass \
-H "content-type: application/json" \
-X POST \
--data-binary "#/tmp/toxreport.json" \
http://my-server/myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz
If you need details on the test upload authentication, check out my other answer here.
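And the same authenticated upload from Python, as a minimal requests sketch (server, credentials, and paths reuse the placeholders above):
# POST a toxresult.json to the release file's URL on a devpi index.
import requests

dist_url = "http://my-server/myuser/myindex/+f/19b/d3544d03b1716/mypkg-1.0.tar.gz"
with open("/tmp/toxreport.json", "rb") as f:
    resp = requests.post(
        dist_url,
        data=f,
        headers={"content-type": "application/json"},
        auth=("myuser", "mypass"),
    )
resp.raise_for_status()
print(resp.json())  # expect a "toxresultpath" result as in the example above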

Upload File using Django Rest Framework

I am new to Django. Can anybody help me? How can I upload a file using the REST framework API?
I have tried following this page:
http://www.django-rest-framework.org/api-guide/parsers/#fileuploadparser
File uploading in Django REST framework works the same as uploading files via multipart/form-data in plain Django.
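As an illustration only (the view class, route, and field name are assumptions, not from the question), a minimal DRF view accepting such an upload could look like this:
# Hedged sketch: a DRF view that accepts multipart file uploads.
from rest_framework.parsers import FormParser, MultiPartParser
from rest_framework.response import Response
from rest_framework.views import APIView

class FileUploadView(APIView):
    parser_classes = [MultiPartParser, FormParser]

    def post(self, request, format=None):
        uploaded = request.FILES["file"]  # key must match the -F field name
        # ... save `uploaded`, e.g. assign it to a model's FileField ...
        return Response({"received": uploaded.name}, status=201)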
To test it you can use curl:
curl -X POST -H "Content-Type:multipart/form-data" -u {username}:{password} \
-F "{field_name}=#{filename};type=image/jpeg" http://{your api endpoint}
Other fields are just like normal form fields in Django.
The answer above is pretty good. Besides that, you can add some parameters in order to see the response. Take this one for example:
curl -X PATCH --dump-header - -H "Content-Type:multipart/form-data" -u jorge:123456 -F "image=@/home/oscar/Pictures/dgnest/_MG_6445.JPG;type=image/jpeg" http://localhost:8000/api/project/3/
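The same PATCH can be sent from Python with requests, as a hedged sketch (endpoint, credentials, and field name copied from the curl example above):
# Multipart PATCH mirroring the curl example above.
import requests

with open("/home/oscar/Pictures/dgnest/_MG_6445.JPG", "rb") as f:
    resp = requests.patch(
        "http://localhost:8000/api/project/3/",
        files={"image": ("_MG_6445.JPG", f, "image/jpeg")},
        auth=("jorge", "123456"),
    )
print(resp.status_code)
print(resp.text)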
