I'm trying to use OpenAPI for a Python project.
I've previously used OpenAPI with Java, where it was really easy: you configure a plugin in pom.xml, write a YAML file, and get a generated interface that you can implement in your controller.
I'm now working in Python and I'm trying to do something similar: write a YAML file and get an interface, or something like it, that you can use.
I've tried openapi-generator-cli generate, but it creates a lot of bloat files because it generates a whole server, while I only need a single file that I can use further.
Is there something similar for Python as it is for Java?
Thanks in advance
You may want to try flask_restx.
Then visit the top-level URL of your Flask app to see the Swagger details.
https://flask-restx.readthedocs.io/en/latest/swagger.html#swaggerui
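A minimal sketch of how flask_restx builds the Swagger page from the resources you register (route, title and payload below are illustrative, and note this is code-first, so the spec is generated from your code rather than from a YAML file):

from flask import Flask
from flask_restx import Api, Resource

app = Flask(__name__)
api = Api(app, title="My API", version="1.0")  # Swagger UI is served at the API root

@api.route("/hello")
class Hello(Resource):
    def get(self):
        """A trivial endpoint so the generated spec has something to document."""
        return {"message": "hello"}

if __name__ == "__main__":
    app.run()  # then open http://localhost:5000/ for the Swagger UI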
EDIT
Consider using apispec.
$ pip install -U 'apispec[yaml]'
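A small sketch of building a spec with apispec (title, path and responses are placeholders); apispec is also code-first, emitting YAML from metadata you declare in Python:

from apispec import APISpec

spec = APISpec(
    title="My Service",       # placeholder title
    version="1.0.0",
    openapi_version="3.0.2",
)

# Register a path with a minimal GET operation.
spec.path(
    path="/hello",
    operations={"get": {"responses": {"200": {"description": "OK"}}}},
)

print(spec.to_yaml())  # to_yaml() needs the 'apispec[yaml]' extra installed above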
You can also try flask-toolkits; it is very easy to set up and will generate your OpenAPI/Swagger spec for you. Once your code is done, start the app with app.run() and then access http://localhost:5000/openapi.yaml
I would like to type in my Google Cloud function:
from my_google_cloud_project import another_google_cloud_func
another_google_cloud_func()
I don't want to invoke that function via HTTP request. How can I just import it?
Both functions are Google Cloud Functions, not just plain Python code!
You can create a code.py file and write your code there, then read the content of that file:
with open('code.py') as f:
    content = f.read()
and execute that content using:
exec(content)
Please read the official documentation regarding:
Packaging local dependencies
You can also package and deploy dependencies alongside your function.
This approach is useful if your dependency is not available via the
pip package manager or if your Cloud Functions environment's internet
access is restricted. For example, you might use a directory structure
such as the following:
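One possible layout (main.py, localpackage and script.py are illustrative names, not a prescribed structure):

myfunction/
    main.py
    localpackage/
        __init__.py
        script.py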
You can then use code as usual from the included local dependency, localpackage. You can use this approach to bundle
any Python packages with your deployment.
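A hypothetical sketch of what the entry point could look like with such a bundled package (localpackage, script and do_work are placeholder names):

# main.py of the Cloud Function, importing the bundled local dependency.
from localpackage import script  # 'localpackage' and 'script' are placeholders

def my_cloud_function(request):
    # Call whatever the local module exposes instead of invoking it over HTTP.
    return script.do_work()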
I want to run a Python script as part of a Jenkins pipeline triggered from a GitHub repo. If I store the script directly in the repo itself, I can just do sh 'python path/to/my_package/script.py', which works perfectly. However, since I want to use this from multiple pipelines in multiple repos, I want to put it in a Jenkins shared library.
I found this question, which suggested storing the Python file in the resources directory and copying it to a temp file before use. That only works if the script is one standalone file. Unfortunately, mine is a package with multiple Python files and imports between them, so that's a no-go. I also tried to copy the entire folder containing the Python package, following the answer to this question, which suggests getting the location of the library with
import groovy.transform.SourceURI
import java.nio.file.Path
import java.nio.file.Paths
class ScriptSourceUri {
    @SourceURI
    static URI uri
}
but it gives me the following error:
Scripts not permitted to use staticMethod java.net.URI create java.lang.String. Administrators can decide whether to approve or reject this signature.
It seems that some additional permissions are required, which I don't think I'll be able to acquire (it's a shared machine).
So, does anyone know how I can run a Python package from a Jenkins shared library? Right now the only solution I can think of is to manually recreate the directory structure of the Python package, which is obviously very messy and non-generic.
PS: There is no particular reason for using the Python script over writing the same script in Groovy. It's just that the Python script is well tested, well understood and well supported. Rewriting the whole thing in Groovy just isn't feasible right now.
You can go to the http://host:8080/jenkins/scriptApproval/ page of your Jenkins installation and approve the pending signature for your scripts.
See the Jenkins in-process script approval documentation for more information.
Atmosphere.js lacks a Skulpt package, so I'm unsure how to proceed. If Meteor doesn't support Skulpt, is there a way to let users type and run Python code on my Meteor app (and if so, can you provide a small example of how to implement it)?
I'm currently using the Meteor-Blaze stack.
I originally tried using trinket.io. Then I learned of the same-origin policy (https://en.wikipedia.org/wiki/Same-origin_policy), which prevents me from grabbing what users type from the trinket or inserting my own content.
Thanks!
As Skulpt is written in JavaScript, there is no reason why you shouldn't be able to use it in your Meteor app. There is an npm package, so you should be able to do
npm install skulpt
and
import skulpt from 'skulpt'
in your code where you want to use it.
I haven't tried this myself; I just found the npm package, which is a good sign for you.
I have a requirement to upload a file to Artifactory using Python. I've searched all over the internet but couldn't find any help.
Please share a code snippet or something that can help me achieve this.
Any help is greatly appreciated here.
The artifactory Python module can be used to upload artifacts to Artifactory.
https://pypi.python.org/pypi/artifactory/0.1.17
Here is an example from that page that uploads a file to Artifactory:
from artifactory import ArtifactoryPath

# Point at the target repository path (repository/group/version).
path = ArtifactoryPath("http://my-artifactory/artifactory/libs-snapshot-local/myapp/1.0")
path.mkdir()                            # create the folder if it does not exist yet
path.deploy_file('./myapp-1.0.tar.gz')  # upload the local file to that path
The defend against fruit project provides the integration with Artifactory and Python you are looking for.
http://teamfruit.github.io/defend_against_fruit/
Artifactory exposes a REST API; see the documentation's section on "Deploy Artifact".
Basically you will need to build a REST client. There might already be one for Artifactory. If you need to write it yourself, there is a WADL file that might make things easier (see also wadllib).
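A rough sketch of the "Deploy Artifact" call using the requests library (the URL, repository name, file name and credentials are all placeholders):

import requests

# PUT the file to the full repository path where it should live.
with open("myapp-1.0.tar.gz", "rb") as f:
    response = requests.put(
        "http://my-artifactory/artifactory/libs-snapshot-local/myapp/1.0/myapp-1.0.tar.gz",
        data=f,
        auth=("user", "password"),  # or an API key header, depending on your setup
    )

response.raise_for_status()
print(response.json())  # Artifactory returns JSON describing the deployed artifact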
I would like to build an admin section into my App Engine app that will allow me to download and install python modules dynamically.
Any ideas on the most efficient way to accomplish this, without the ability to write to the file system?
Are there any examples of this being done?
Does python27 support of setuptools make this easier?
Edit:
My initial thought is that this could be accomplished by downloading an egg or zip file dynamically, saving it to the blobstore, and loading it from there.
Is this possible?
What kind of performance issues would this create?
On GAE you have no access to the file system; that's why you can't install any third-party packages on your instance at runtime, you can only distribute them with your own code.
In your app directory create a folder called 'lib'. For any module you want to add, just unzip it and add it to lib. Then redeploy your application using the console or the Google App Engine Launcher. Repeat every time you want to add a new module.
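On the Python 2.7 standard runtime this usually also needs an appengine_config.py that puts the lib folder on the import path; a minimal sketch, assuming the folder is named 'lib':

# appengine_config.py, at the root of the app.
from google.appengine.ext import vendor

# Make the third-party packages copied into ./lib importable.
vendor.add('lib')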
I'm not 100% sure about this, but it seems to me that if you don't want a manual process involved, then I would suggest dynamically adding the module content as a blobstore entry and loading the module at runtime.
But the trick, as the previous answer states, is that to use a package, its code needs to be present in your app.
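A hypothetical sketch of that idea for a single-file module: read the source from wherever you stored it (read_blob below stands in for your blobstore lookup, it is not a real API) and build a module object at runtime:

import sys
import types

def load_module_from_source(name, source):
    """Create a module from a source string and register it so later imports find it."""
    module = types.ModuleType(name)
    exec(source, module.__dict__)   # run the module body in its own namespace
    sys.modules[name] = module
    return module

# source = read_blob('my_module.py')                         # placeholder blobstore read
# my_module = load_module_from_source('my_module', source)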