Python server exposing both an XML-RPC and a JSON-RPC interface - python

As the title says, is there a way to expose a function to both the JSON-RPC and the XML-RPC interface? Preferably one server running on a single port would answer to both types of requests.
Thanks!

One of the usual ways of returning different formats is to specify the type you want in the URL in some way. The most common patterns are:
http://example.com/some/page.<format>
or
http://example.com/some/page?output=<format>
And then your returned object should be transformed into the wanted format:
# somewhere at the end of the method handling the request
return Formatter(format_param).format(response_object)
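For the concrete question of serving both protocols on a single port, here is a minimal sketch of one possible approach (Python 3), dispatching on the request path. The paths /xmlrpc and /jsonrpc, the add() function and the port are illustrative assumptions, and the JSON-RPC handling is deliberately stripped down (no error objects, batching or notifications):

# One HTTP server answering both XML-RPC and JSON-RPC requests on a single
# port, choosing the protocol from the request path.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from xmlrpc.server import SimpleXMLRPCDispatcher

dispatcher = SimpleXMLRPCDispatcher(allow_none=True)

def add(a, b):
    return a + b

dispatcher.register_function(add)  # exposed over both protocols

class DualRPCHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers['Content-Length']))
        if self.path == '/xmlrpc':
            # The dispatcher parses the XML-RPC call and marshals the reply.
            response = dispatcher._marshaled_dispatch(body)
            content_type = 'text/xml'
        else:
            # Minimal JSON-RPC 2.0 handling on /jsonrpc.
            call = json.loads(body)
            result = dispatcher._dispatch(call['method'],
                                          tuple(call.get('params', [])))
            response = json.dumps({'jsonrpc': '2.0', 'result': result,
                                   'id': call.get('id')}).encode()
            content_type = 'application/json'
        self.send_response(200)
        self.send_header('Content-Type', content_type)
        self.send_header('Content-Length', str(len(response)))
        self.end_headers()
        self.wfile.write(response)

if __name__ == '__main__':
    HTTPServer(('localhost', 8080), DualRPCHandler).serve_forever()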

Related

Protobuf how to use Any type with homebrew proto message

I'm currently building a Python gRPC server that serializes tons of different proto messages into JSON to store them in a NoSQL DB. I'd like to simplify extending this server so that we can add new types without rewriting the gRPC server and redeploying. Ideally, we would like to define a new message, put it in a proto file and update only the client. The server should accept any type at first, but would know a .proto file or folder to look in when it comes to serializing/deserializing.
I've read about the Any type and I'm exploring whether this is my way to do this. There is some documentation on it but very few examples to work with. One thing that I don't quite get is how to store/retrieve the type of an "Any" field.
All the documentation uses an HTTPS URL as the type of an Any field (e.g. type.googleapis.com/google.protobuf.Duration), which is also the default. What would it look like if I used the local file system instead? How would I store this in the proto message on the client side?
How can I retrieve the type on the server side?
Where can I find a similar example?
Apologies, this is only a partial answer.
I've recently begun using Any in a project and can provide some perspective. I have a similar (albeit simpler) requirement to what you outline: enveloped message content, but in my case clients are required to ship a descriptor to the server and identify a specific method to help it (un)marshal, etc.
I've been using Google's new Golang APIv2 and am only familiar with it from Golang and Rust (not Python). The documentation is lacking, but the Golang documents will hopefully help:
anypb
protoregistry
I too struggled with understanding the concept (implementation) of the global registry and so I hacked the above solution. The incoming message metadata provides sufficient context to the server that it can construct the message type and marshal the bytes into it.
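For the Python side of the original question, the protobuf runtime exposes the equivalent operations on Any (Pack, Unpack, TypeName, Is). A minimal sketch, using Duration purely as an example payload:

# Client side: wrap a concrete message in an Any envelope.
from google.protobuf import any_pb2, duration_pb2, symbol_database

payload = duration_pb2.Duration(seconds=42)
envelope = any_pb2.Any()
envelope.Pack(payload)  # type_url becomes type.googleapis.com/google.protobuf.Duration

# Server side: inspect the stored type and unpack into the right class.
print(envelope.TypeName())              # 'google.protobuf.Duration'
if envelope.Is(duration_pb2.Duration.DESCRIPTOR):
    out = duration_pb2.Duration()
    envelope.Unpack(out)
    print(out.seconds)                  # 42

# If the server must not hard-code message classes, the type name can be
# resolved dynamically through the generated symbol database:
msg_cls = symbol_database.Default().GetSymbol(envelope.TypeName())
dynamic = msg_cls()
envelope.Unpack(dynamic)

Note that the runtime never dereferences the type_url over the network; it is only used as a lookup key, so a custom prefix (file system or otherwise) just changes what your own resolution code has to expect.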

Querying objects from mysql with python

Since I can't explain clearly what I don't understand, I'll use an example.
Let's say I have a client application and a server application. The server waits, and when the client sends some keyword the server knows what should be queried. And let's say that the client requests a product object, so the server queries the database and gets back the row that the client needs as a set object. So every time I need some object, I need to send it to the client in the form of a string and then instantiate it?
Am I missing something? Isn't it expensive to instantiate objects on every query?
TIA!
Your question is very vague and doesn't really ask anything specific, but I'll try to give you a generic answer on how a server and client interact.
When a user requests an item in the client, you should provide the client with an API to the server, something like http://example.com/search?param=test. The client will use this API in either an AJAX call or a direct call.
The server should parse the param, connect to the database, retrieve the requested item and return it to the client. The most common data formats for this exchange are JSON and plain text.
The client will then parse either of these formats, generate an object from the data if required, and finally show the user the requested data.
If this is not what you need, please update your question to describe the specific issue you have, maybe with some code where you have the problem, and I'll update my answer accordingly.
MySQL Server uses a custom protocol over TCP. If you don't want to use any library, you will have to parse TCP messages yourself. MySQL Connector/Python does exactly that - you can look at its source code if you wish.
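As a rough illustration of the flow described above (the table, column and credential values are made up), using Flask on the server and MySQL Connector/Python for the query; the client just fetches the URL and parses the JSON:

# Server-side sketch: parse the param, query MySQL, return JSON.
import mysql.connector
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route('/search')
def search():
    keyword = request.args.get('param', '')
    conn = mysql.connector.connect(user='app', password='secret', database='shop')
    cur = conn.cursor(dictionary=True)   # rows come back as plain dicts
    cur.execute('SELECT id, name, price FROM products WHERE name = %s', (keyword,))
    rows = cur.fetchall()
    conn.close()
    return jsonify(rows)                 # the client parses this JSON

# Client-side sketch:
#   import requests
#   products = requests.get('http://example.com/search?param=test').json()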

How to send an HTTPS request with a client certificate using the Python programming language

I have two JKS files, truststore.jks and keystore.jks, that I use when sending REST requests with a Java-based client. Now I want to use Python, but I haven't found a way to use them to authenticate. How can I use them in Python?
You didn't provide much info (e.g. what you tried before), so my answer won't be precise.
I think what you are looking for is urllib2.urlopen() (probably using a Request object to tune request properties); note the SSL-related function parameters. But first you'll probably need to convert the JKS files to a format accepted by Python (I guess it's the OpenSSL format).
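For what it's worth, a sketch of one way this can look once the stores are converted (file names are made up, and the snippet uses the Python 3 equivalents ssl and urllib.request rather than urllib2):

# Convert the Java key stores to PEM first, e.g.:
#   keytool -importkeystore -srckeystore keystore.jks \
#           -destkeystore keystore.p12 -deststoretype PKCS12
#   openssl pkcs12 -in keystore.p12 -nodes -out client.pem
#   (export the trusted CA certificates from truststore.jks the same way)
import ssl
import urllib.request

context = ssl.create_default_context(cafile='truststore.pem')  # verify the server
context.load_cert_chain('client.pem')                          # client cert + key

req = urllib.request.Request('https://example.com/api/resource')
with urllib.request.urlopen(req, context=context) as resp:
    print(resp.status, resp.read())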

Attach a custom view to spyne

I am using Spyne to provide a SOAP interface to my backend, but I need one or more custom views accessible at certain URLs, for example to show some statistics in HTML.
How do I do that?
Please have a look at the multiple protocols example here: https://github.com/arskom/spyne/tree/master/examples/multiple_protocols
If you want to return raw data via HTTP, you must set your output protocol to HttpRpc, your output type in your service to ByteArray or String, and ctx.transport.mime_type to whatever type you're returning.
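A rough sketch of what such a service can look like (service name, namespace and markup are illustrative):

# Spyne service returning raw HTML over HttpRpc, in the spirit of the
# multiple_protocols example linked above.
from spyne import Application, ServiceBase, Unicode, rpc
from spyne.protocol.http import HttpRpc
from spyne.server.wsgi import WsgiApplication

class StatsService(ServiceBase):
    @rpc(_returns=Unicode)
    def stats(ctx):
        ctx.transport.mime_type = 'text/html'
        return u'<html><body><h1>Some statistics</h1></body></html>'

stats_app = WsgiApplication(Application(
    [StatsService], tns='example.stats',
    in_protocol=HttpRpc(), out_protocol=HttpRpc()))

# Mount stats_app next to your SOAP application, e.g. by dispatching on the
# request path in a small WSGI wrapper or in the web server's configuration.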

Python: Sending a large dictionary to a server

I have an application that should communicate status information to a server. This information is effectively a large dictionary with string keys.
The server will run a web application based on Turbogears, so the server-side method called accepts an arbitrary number of keyword arguments.
In addition to the actual data, some data related to authentication (id, password..) should be transmitted. One approach would be to simply urlencode a large dictionary containing all this and send it in a request to the server.
urllib.urlencode(dataPlusId)
But actually, the method doing the authentication and accepting the data set does not have to know much about the data. The data could be transmitted and accepted transparently and handed over to another method working with the data.
So my question is: What is the best way to transmit a large dictionary of data to a server in general? And, in this specific case, what is the best way to deal with authentication here?
I agree with all the answers about avoiding pickle, if safety is a concern (it might not be if the sender gets authenticated before the data's unpickled -- but, when security's at issue, two levels of defense may be better than one); JSON is often of help in such cases (or, XML, if nothing else will do...!-).
Authentication should ideally be left to the webserver, as SpliFF recommends, and SSL (i.e. HTTPS) is generally good for that. If that's unfeasible, but it's feasible to let client and server share a "secret", then sending the serialized string in encrypted form may be best.
I think the best way is to encode your data in an appropriate transfer format (you should not use pickle, as it's not safe, but it can be binary) and transfer it as a multipart POST request.
What I do not know is whether you can make it work with repoze.who. If it does not support signing in and calling the function in one step, you'll perhaps have to verify the credentials yourself.
If you can wrap your data in xml you could also use XML-RPC.
Why don't you serialize the dictionary to a file, and upload the file? This way, the server can read the object back into a dictionary.
Do a POST of your python data (use binary as suggested in other answers) and handle security using your webserver. Apache and Microsoft servers can both do authentication using a wide variety of methods (SSL client certs, Password, System accounts, etc...)
Serialising/deserialising to text or XML is probably overkill if you're just going to turn it back into a dictionary again.
I'd personally use SimpleJSON at both ends and just post the "file" (it would really just be a stream) over as multipart data.
But that's me. There are other options.
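For illustration, a sketch of that kind of JSON POST with the web server enforcing HTTP Basic auth (URL, credentials and payload are made up; urllib.request is the Python 3 counterpart of urllib2):

import json
import urllib.request

status = {'host': 'worker-3', 'jobs_done': 1712, 'uptime_s': 86400}

req = urllib.request.Request(
    'https://example.com/api/status',
    data=json.dumps(status).encode('utf-8'),
    headers={'Content-Type': 'application/json'})

# Let the web server handle authentication; the application code only
# supplies the credentials and json-decodes the body on arrival.
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, 'https://example.com/', 'client-id', 'secret')
opener = urllib.request.build_opener(
    urllib.request.HTTPBasicAuthHandler(password_mgr))

with opener.open(req) as resp:
    print(resp.status, resp.read())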
Have you tried using pickle on the data?
