I am trying to send XML data to a .NET web service from an Asterisk Linux box, using Python.
The problems I am running into are:
-- if I use HTTPlib, I can send the XML data, but I can't authenticate first.
-- if I use HTTPlib2, I can authenticate, but I can't get it to send the data.
In the end, I just need to periodically send data to a client's web service. On my end, I'm not concerned with the response, just sending the data.
Any thoughts on this? Maybe I'm pursuing the wrong path using HTTPlib? Thanks for any assistance at all. I'm not opposed to using a different language; Python just seemed the simplest when this project started.
A very popular module for this is Requests:
http://docs.python-requests.org/en/latest/
It works great for me for all sorts of requests, with and without authentication.
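For the XML-to-.NET case above, a minimal sketch with Requests might look like the following, assuming the service takes HTTP basic auth (the URL, credentials, and payload are placeholders):

    import requests

    xml_payload = "<data><value>42</value></data>"
    response = requests.post(
        "https://example.com/service.asmx",    # the client's .NET endpoint
        data=xml_payload,
        headers={"Content-Type": "text/xml"},
        auth=("username", "password"),         # HTTP basic auth
    )
    # The response isn't needed here, but the status code is cheap to check.
    print(response.status_code)

If the service uses digest auth instead of basic, requests.auth.HTTPDigestAuth can be swapped in for the auth tuple.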
Related
I’ve got a standard client-server set-up with ReScript (ReasonML) on the front-end and a Python server on the back-end.
The user is running a separate process on localhost:2000 that I’m connecting to from the browser (UI). I can send requests to their server and receive responses.
Now I need to issue those requests from my back-end server, but cannot do so directly. I’m assuming I need some way of doing it through the browser, which can talk to localhost on the user’s computer.
What are some conceptual ways to implement this (ideally with GraphQL)? Do I need to have a subscription or web sockets or something else?
Are there any specific libraries you can recommend for this (perhaps as examples from other programming languages)?
I think the easiest solution with GraphQL would indeed be to use Subscriptions. The most common ReScript GraphQL clients already have such a feature; at least ReasonRelay, Reason Apollo Hooks, and Reason-URQL have it.
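Outside of a specific GraphQL client, the underlying relay can be sketched in Python: the back-end pushes a command to the connected browser over a WebSocket, the browser performs the localhost:2000 request, and returns the result on the same socket. This sketch uses the third-party websockets package (version 10+, single-argument handler), and the message format is invented for illustration:

    import asyncio
    import json
    import websockets

    async def handler(ws):
        # Push a command telling the browser what to fetch from the
        # user's local process (the browser does the actual request).
        await ws.send(json.dumps(
            {"action": "fetch", "url": "http://localhost:2000/status"}))
        # The browser replies with whatever localhost:2000 returned.
        reply = await ws.recv()
        print("result from the user's machine:", reply)

    async def main():
        async with websockets.serve(handler, "0.0.0.0", 8765):
            await asyncio.Future()  # run until cancelled

    asyncio.run(main())

A GraphQL subscription is essentially this same push channel with a typed schema on top.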
I have only been told to create a Pythonic web service. At the end of the day, I need to offer an HTTPS endpoint that receives JSON objects (from a POST request) and can process and send JSON objects back to another web service.
To be able to receive POST requests from other services, what kind of information do I need?
I have seen some examples using httplib2, such as sending HTTP GET and POST requests when given a website like www.something.com. But in my case, since I do not know the IP address/URL of the data source, should I create a listener waiting for the incoming data? How can I achieve this?
I am really new to building Python web servers, and the requirement I've been given is really vague. Thank you in advance for helping me break down this problem.
Take a look at the Flask framework; it can do everything you want and then some. I can especially recommend the Quickstart: A Minimal Application and the JSON Support pages.
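As a rough sketch of what the quickstart boils down to for this use case (the endpoint name and processing are placeholders, not anything prescribed):

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/process", methods=["POST"])
    def process():
        payload = request.get_json()       # parse the incoming JSON body
        result = {"received": payload}     # replace with real processing
        return jsonify(result)             # send JSON back to the caller

    if __name__ == "__main__":
        # app.run() is for development; put a real server and HTTPS
        # termination (e.g. a reverse proxy) in front of it in production.
        app.run(debug=True)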
Enabling the built-in debugger will help you a great deal as well.
All services listen for incoming connections, so you are right about that :-)
Good luck!
I am currently working on a project to create a simple file uploader site that will update the user on the progress of an upload.
I've been attempting this in pure Python (with CGI) on the server side, but to get the progress of the file I obviously need to send requests to the server continually. I was looking to use AJAX to do this, but I was wondering how hard it would be to write my own web server for receiving the XML HTTP requests, instead of changing to some other framework (web.py, for instance).
My main problem is that sending the request is done from HTML and JavaScript, so it all seems like magic trickery at the moment.
Can anyone advise me as to the best way to go about receiving these requests on the server?
EDIT: It seems that a framework would be the way to go. Would web.py be a good route to take?
I would recommend using a microframework like Sinatra for Ruby. There seem to be some equivalents for Python; see "What Python equivalent of Sinatra would you recommend?".
Such a framework allows you to simply map a single method to a route.
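For instance, with Bottle (one Python microframework in the Sinatra mold), a route is just a decorated function; the route and lookup here are invented for illustration:

    from bottle import get, request, run

    @get("/progress")
    def progress():
        # In a real app you'd look up the actual progress for this ID.
        upload_id = request.query.get("id", "unknown")
        return "progress for upload %s" % upload_id

    run(host="localhost", port=8080)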
Writing a very basic HTTP server won't be very hard (see http://docs.python.org/library/simplehttpserver.html for an example), but you will be missing many features that are provided by real servers and web frameworks.
For your project, I suggest you pick one of the many Python web frameworks and run your application behind Apache/mod_wsgi.
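For a sense of what "basic" means here, a bare-bones handler using only the standard library (http.server in Python 3, BaseHTTPServer in Python 2) that simply echoes how much data an XMLHttpRequest POSTed might look like:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ProgressHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)   # raw body sent by the XHR
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"received %d bytes" % len(body))

    HTTPServer(("", 8000), ProgressHandler).serve_forever()

Everything beyond this (routing, multipart upload parsing, concurrency) is what the frameworks give you.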
There's absolutely no need to write your own web server. Plenty of options exist, including lightweight ones like nginx.
You should use one of those, and either your own custom WSGI code to receive the request, or (better) one of the microframeworks like Flask or Bottle.
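The "custom WSGI code" option is smaller than it sounds; a minimal callable, served here with the standard library's wsgiref for testing:

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        length = int(environ.get("CONTENT_LENGTH") or 0)
        body = environ["wsgi.input"].read(length)   # raw XHR body
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"got %d bytes" % len(body)]

    make_server("", 8000, app).serve_forever()

The same callable runs unchanged behind nginx + uWSGI or Apache/mod_wsgi.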
I've written a Python application that makes web requests using the urllib2 library and then scrapes the data. I could deploy it as a web application, which means all urllib2 requests would go through my web server. That raises the danger of the server's IP being banned due to the high volume of web requests made on behalf of many users. The other option is to create a desktop application, which I don't want to do. Is there any way I could deploy my application so that the web requests are made from the client side? One idea was to use Jython to create an applet, but I've read that Java applets can only make web requests to the server they are deployed on, and the only way to circumvent this is to create a server-side proxy, which leads back to the problem of the server's IP getting banned.
This might sound like an impossible situation, and I'll probably end up creating a desktop application, but I thought I'd ask if anyone knew of an alternate solution.
Thanks.
You can use a signed Java applet; signed applets can use the Java security mechanism to gain access to any site.
This tutorial explains exactly what you have to do: http://www-personal.umich.edu/~lsiden/tutorials/signed-applet/signed-applet.html
The same might be possible with a Flash applet. JavaScript is likewise restricted to the originating site and, AFAIK, doesn't support signing or security exceptions like this.
You can probably use AJAX requests made from client-side JavaScript:
-- use server → client communication to deliver commands and the data necessary to make a request,
-- then use AJAX from the client to the third-party server.
This depends on the form of "scraping" you intend to do:
You might run into problems making an AJAX call to a third-party site; see "Screen scraping through AJAX and javascript".
An alternative would be to do it server-side, but to cache the results so that you don't hit the third-party server unnecessarily.
Check out diggstripper on google code.
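The caching idea is simple enough to sketch with nothing but a dict and a TTL (module names match the question's urllib2; the TTL value is an arbitrary choice):

    import time
    import urllib2

    _cache = {}       # url -> (fetch_time, body)
    CACHE_TTL = 300   # seconds; tune to how fresh the data must be

    def fetch(url):
        now = time.time()
        cached = _cache.get(url)
        if cached and now - cached[0] < CACHE_TTL:
            return cached[1]              # serve from cache
        body = urllib2.urlopen(url).read()
        _cache[url] = (now, body)
        return body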
I have a requirement to build a client for Shopify's API, building it in Python & Django.
I've never done it before and so I'm wondering if someone might advise on a good starting point for the kinds of patterns and techniques needed to get a job like this done.
Here's a link to the Shopify API reference
Thanks.
Your question is somewhat open-ended, but if you're new to Python or API programming, then you should get a feel for how to do network programming in Python, using either the urllib2 or httplib modules that come with more recent versions of Python. Learn how to initiate a request for a page and read the response into a file.
Here is an overview of the httplib module in Python documentation:
http://docs.python.org/library/httplib.html
After you've managed to make page requests using the GET HTTP verb, learn about how to make POST requests and how to add headers, like Content-Type, to your request. When communicating with most APIs, you need to be able to send these.
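A sketch of such a POST with httplib (http.client in Python 3); the host, path, and body are placeholders rather than anything Shopify-specific:

    import httplib

    conn = httplib.HTTPSConnection("example.com")
    headers = {"Content-Type": "application/xml"}
    body = "<order><id>1</id></order>"
    conn.request("POST", "/api/orders", body, headers)
    response = conn.getresponse()
    print(response.status)    # e.g. 200 or 201
    print(response.read())    # the response body
    conn.close()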
The next step would be to get familiar with the XML standard and how XML documents are constructed. Then, play around with different XML libraries in Python. There are several, but I've always used the xml.dom.minidom module. In order to talk to an API, you'll probably need to know how to create XML documents (to include in your requests) and how to parse content out of them (to make use of the API's responses). The minidom module allows a developer to do both. For your reference:
http://docs.python.org/library/xml.dom.minidom.html
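Both directions look roughly like this with minidom; the element names and the response document are invented for illustration:

    from xml.dom.minidom import Document, parseString

    # Build a request document.
    doc = Document()
    root = doc.createElement("product")
    title = doc.createElement("title")
    title.appendChild(doc.createTextNode("T-shirt"))
    root.appendChild(title)
    doc.appendChild(root)
    xml_payload = doc.toxml()    # serialize for the request body

    # Parse a (hypothetical) response.
    parsed = parseString("<product><id>42</id></product>")
    product_id = parsed.getElementsByTagName("id")[0].firstChild.data
    print(product_id)            # -> 42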
Your final solution will likely put both of these together, where you create an XML document, submit it as content to the appropriate Shopify REST API URL, and then have your application deal with the XML response the API sends back to you.
If you're sending any sensitive data, be sure to use HTTPS over port 443, and NOT HTTP over port 80.
I have been working on a project for the last few months using Python and Django integrating with Shopify, built on Google App Engine.
Shopify has a valuable wiki resource, http://wiki.shopify.com/Using_the_shopify_python_api. This is what I used to get a good handle on the Shopify Python API that was mentioned, https://github.com/Shopify/shopify_python_api.
It will really depend on what you are building, but these are good resources to get you started. Also, understanding the Shopify API will help when using the Python API for Shopify.
Shopify has now released a Python API client: https://github.com/Shopify/shopify_python_api
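Usage is compact; this sketch is based on the repository's README at the time, and the private-app URL form is an assumption, so check the repo for the current session setup:

    import shopify

    # Private-app style: API key and password embedded in the site URL.
    # These values are placeholders.
    shopify.ShopifyResource.set_site(
        "https://API_KEY:PASSWORD@SHOP_NAME.myshopify.com/admin")

    # Resources then behave like ActiveResource-style models.
    for product in shopify.Product.find():
        print(product.title)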
I think you can find some inspiration by taking a look at this:
http://bitbucket.org/jespern/django-piston/wiki/Home
Although it is the direct opposite of what you want to do (Piston is for building APIs, while you want to consume one), it can give you some clues on common topics.
I could mention, of course, reading obvious sources like the Shopify developers forum:
http://forums.shopify.com/categories/9
But I guess you already had it in mind :)
Cheers,
H.