Are there any python programs which will grab detailed data from NOAA, particularly the hourly data and the 7-day day/night forecasts?
I found python-weather-api, python-weather, and python-noaa. However, they are either no longer supported or lack the data that I'd like.
If not, then I'll probably make one myself although I'd prefer not to reinvent the wheel!
It looks like the National Digital Forecast Database (NDFD) Simple Object Access Protocol (SOAP) web service is what you're looking for. You can use SOAP to request NDFD XML and parse it with Python quite easily, with no third-party libraries needed. If you prefer RESTful services, you can do that too.
They even have a great graphic on how SOAP requests work.
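If it helps, here is a rough sketch of hitting the REST ("browser interface") side of NDFD with nothing but the standard library. The endpoint, parameter names, and DWML element names below are taken from NOAA's sample documentation and should be verified against the current NDFD docs before you rely on them:

```python
import urllib
import urllib2
import xml.etree.ElementTree as ET

# Hypothetical request against the NDFD "browser interface" REST client.
# Verify the endpoint and parameter names against NOAA's current NDFD docs.
BASE_URL = ("https://graphical.weather.gov/xml/sample_products/"
            "browser_interface/ndfdXMLclientByDay.php")

params = {
    "lat": "38.99",         # point of interest
    "lon": "-77.01",
    "format": "12 hourly",  # "12 hourly" should return day/night periods
    "numDays": "7",
}

response = urllib2.urlopen(BASE_URL + "?" + urllib.urlencode(params))
tree = ET.parse(response)

# DWML nests forecast series under <data>; the element names below come from
# NOAA's sample output and may need adjusting for your particular request.
for series in tree.iter("temperature"):
    values = [v.text for v in series.findall("value")]
    print("%s: %s" % (series.get("type"), values))
```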
How does e-commerce usually handle integrations with ERP software?
We are working on a project for a client, who previously planned to use an ERP system that had a REST API.
This API allowed us to:
Place orders
Inform the ERP if the order was paid for
Get order status
Get all of the items available
Check item availability
Get user data
That would allow us to build a fairly complex online store with a lot of features.
Now the client wants to use another ERP system:
http://www.netsuite.com/portal/platform.shtml
I researched it, and the difficulty of integration surprised me. No REST API, some weird SOAP protocol to communicate with the system, and you have to write a lot of logic using SuiteScript. A whole different programming language just to build an integration with an online store? Why not just give developers access to an API to place orders and fetch items? And there are absolutely no docs available online for the thing. People on forums say the system lacks documentation and you have to figure it out yourself as you go.
Magento and Shopify integration is done by third parties and looks dodgy. Same thing with SAP ERP. Am I missing something? Why is such a basic thing as a REST API for e-commerce not available for those systems?
We are developing the back end with Python/Django and the front end with React.js. What is the right way to integrate them with the ERP system?
NetSuite does have a REST API and web services. "You have to write a lot of logic using SuiteScript" is true, but it's just JavaScript, and there are many talented developers out there.
I'm not sure there is a "right way" but there are many ways to connect to the data.
My suggestion would be to contact a partner company, such as SWK Technologies. http://swktech.com
NetSuite has two main APIs, SuiteTalk and SuiteScript.
SuiteTalk is the Web Services API, which is SOAP based and allows for pulling data from and updating NetSuite. The SuiteScript API is JavaScript based and allows you to customize accounts and export data at the appropriate event during your business process. The term "SuiteCloud" encompasses all APIs and integration tools.
As for documentation, this is mostly only available to clients and partners. If you have a client who provides you with access to their account, you will gain access to the NetSuite Help Center and all relevant documentation.
Your options for integrating with the e-commerce platform depend on the exact platform, and range from webhooks to plain HTTP requests.
I wouldn't say NetSuite limits developers in any way; it depends on how you look at it. As I see it, NetSuite provides two main methods for developers: SuiteTalk and SuiteScript. With these, a developer can create their own API and define what kind of access it should have.
SuiteTalk is SOAP based.
I would suggest using SuiteScript to create your own API using either NS RESTlet or NS Suitelet.
Both expose an external URL. By sending requests to that external URL you can trigger your own custom functions written in SuiteScript, so you can create your own API and define your own functions. In other words, the developer is in full control.
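To make that concrete, here is a rough sketch of calling a deployed RESTlet from Python. The external URL, account values, and the older NLAuth header are placeholders; token-based authentication (OAuth 1.0) is the other common option:

```python
import json
import urllib2

# Placeholder external URL copied from the RESTlet's deployment record.
RESTLET_URL = ("https://rest.netsuite.com/app/site/hosting/restlet.nl"
               "?script=123&deploy=1")

# Older NLAuth header style shown for brevity; NetSuite also supports
# token-based authentication, which is generally preferred.
headers = {
    "Authorization": ("NLAuth nlauth_account=YOUR_ACCOUNT,"
                      "nlauth_email=you@example.com,"
                      "nlauth_signature=YOUR_PASSWORD,"
                      "nlauth_role=3"),
    "Content-Type": "application/json",
}

# Hypothetical payload; your SuiteScript function defines what it expects.
payload = json.dumps({"orderId": "SO12345", "status": "paid"})

request = urllib2.Request(RESTLET_URL, data=payload, headers=headers)
response = urllib2.urlopen(request)   # POST, because data is supplied
print(response.read())                # whatever your SuiteScript function returns
```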
The only problem I see with NetSuite is its higher barrier to entry. There is no way to access the NetSuite Help Centre without a client/partner/test account.
But obviously, those who need some kind of integration with NetSuite will have an NS account.
I started writing an application against the TradeMe API, but my request for API access was not approved, so I'm looking for alternatives.
Are there any NZ property-for-sale APIs out there? I have seen realestate.co.nz, which, according to the GitHub repo, might provide something in PHP and Ruby, but the Ruby repo hasn't been touched in several years. A Google API perhaps?
I'm specifically interested in obtaining geo-location information for the properties on sale.
The sandbox should let you access TradeMe without needing to hit the main server.
realestate.co.nz seems to have both JavaScript and Ruby APIs. I'm going to investigate the possibility of building a Python port, as their code is on GitHub (github/realestate.co.nz).
I have no financial interest in either TradeMe or realestate.co.nz, for the record. Just a guy trying to avoid screen scraping.
This year a friend and I have to build a project for our final year of university. The plan is to build a proxy/server that can store ontologies and RDF data; this way the data is "chained" to a website, so you can request that site and the proxy will send you the homepage along with its metadata.
We have been thinking of using Python and RDFLib, but we don't know which web framework would be best. We thought of Django, but it seems too big for our purpose, so we decided web.py or web2py might be a better option.
We don't have any Python coding experience; this will be our very first time. We have always programmed in C++ and Java.
So, taking everything we've mentioned into account, our question is: which web framework would be best for our project, and will RDFLib work well with it?
Thanks :)
I have developed several web applications with Python frameworks consuming RDF data. The choice always depends on the performance you need and the amount of data you'll have to handle.
If the number of triples you'll handle is on the order of a few thousand, then you can easily put something together with RDFLib + Django. I have used this combination for toy applications, but as soon as you have to deal with lots of data you'll realise that it simply doesn't scale. That's not because of Django; the main problem is RDFLib's triple store implementation, which is not great.
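As a rough illustration of that small-scale setup (the file name and query below are invented for the example), an RDFLib graph can simply sit behind a Django view:

```python
# views.py -- a minimal sketch of serving RDF data with RDFLib behind Django.
import json

from django.http import HttpResponse
from rdflib import Graph

graph = Graph()
graph.parse("ontology.rdf")   # loads the RDF/XML file into RDFLib's in-memory store

def labels(request):
    # Return every rdfs:label in the graph as JSON.
    query = """
        SELECT ?s ?label
        WHERE { ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label }
    """
    rows = [{"subject": str(s), "label": str(label)}
            for s, label in graph.query(query)]
    return HttpResponse(json.dumps(rows), content_type="application/json")
```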
If you're familiar with C/C++, I recommend having a look at the Redland libraries. They are written in C and have Python bindings, so you can still develop your web layer with Django and pull RDF data out with Python. We do this quite a lot and it normally works. This option will scale a bit further, but it won't be great either.
If your data grows to millions of triples, then I recommend going for a scalable triple store. You can access these through SPARQL over HTTP. My choice is always 4store; it has a Python client for issuing queries and asserting/removing data (the 4store Python client).
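Because it speaks the standard SPARQL protocol over HTTP, you can also query 4store from Python without any client library at all. A minimal sketch, assuming a local 4s-httpd endpoint (adjust host and port for your setup):

```python
import json
import urllib
import urllib2

# Assumes a local 4store SPARQL endpoint; change host/port to match your 4s-httpd.
ENDPOINT = "http://localhost:8080/sparql/"

query = """
    SELECT ?s ?p ?o
    WHERE { ?s ?p ?o }
    LIMIT 10
"""

# Standard SPARQL protocol: POST the query as a form parameter, ask for JSON results.
request = urllib2.Request(ENDPOINT, data=urllib.urlencode({"query": query}))
request.add_header("Accept", "application/sparql-results+json")

results = json.load(urllib2.urlopen(request))
for row in results["results"]["bindings"]:
    print("%s %s %s" % (row["s"]["value"], row["p"]["value"], row["o"]["value"]))
```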
I have a requirement to build a client for Shopify's API, using Python and Django.
I've never done it before and so I'm wondering if someone might advise on a good starting point for the kinds of patterns and techniques needed to get a job like this done.
Here's a link to the Shopify API reference
Thanks.
Your question is somewhat open-ended, but if you're new to Python or API programming, then you should get a feel for how to do network programming in Python, using either the urllib2 or httplib modules that come with more recent versions of Python. Learn how to initiate a request for a page and read the response into a file.
Here is an overview of the httplib module in Python documentation:
http://docs.python.org/library/httplib.html
After you've managed to make page requests using the GET HTTP verb, learn about how to make POST requests and how to add headers, like Content-Type, to your request. When communicating with most APIs, you need to be able to send these.
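To make that concrete, here is a minimal httplib sketch of a POST with headers; the host and path are placeholders rather than a specific Shopify endpoint:

```python
import httplib

# Placeholder host and path -- substitute the real API host and resource URL.
connection = httplib.HTTPSConnection("example.myshopify.com")

body = "<order><note>test order</note></order>"
headers = {
    "Content-Type": "application/xml",
    "Accept": "application/xml",
}

# Send the request and read the response.
connection.request("POST", "/admin/orders.xml", body, headers)
response = connection.getresponse()
print("%s %s" % (response.status, response.reason))
print(response.read())
connection.close()
```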
The next step would be to get familiar with the XML standard and how XML documents are constructed. Then, play around with different XML libraries in Python. There are several, but I've always used the xml.dom.minidom module. In order to talk to an API, you'll probably need to know how to create XML documents (to include in your requests) and how to parse content out of them (to make use of the API's responses). The minidom module allows a developer to do both. For your reference:
http://docs.python.org/library/xml.dom.minidom.html
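For instance, building a small XML fragment for a request and pulling values back out of a response with minidom might look like this (the element names are invented for the example):

```python
from xml.dom import minidom

# Build a small XML document to send in a request body.
doc = minidom.Document()
product = doc.createElement("product")
title = doc.createElement("title")
title.appendChild(doc.createTextNode("Example widget"))
product.appendChild(title)
doc.appendChild(product)
request_body = doc.toxml()

# Parse an XML response and pull out the values you care about.
response_xml = ("<products><product><title>Widget A</title></product>"
                "<product><title>Widget B</title></product></products>")
parsed = minidom.parseString(response_xml)
titles = [node.firstChild.data for node in parsed.getElementsByTagName("title")]
print(titles)
```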
Your final solution will likely put both of these together, where you create an XML document, submit it as content to the appropriate Shopify REST API URL, and then have your application deal with the XML response the API sends back to you.
If you're sending any sensitive data, be sure to use HTTPS over port 443, and NOT HTTP over port 80.
I have been working on a project for the last few months using Python and Django, integrating with Shopify and built on Google App Engine.
Shopify has a valuable wiki resource, http://wiki.shopify.com/Using_the_shopify_python_api. This is what I used to get a good handle on the Shopify Python API that was mentioned, https://github.com/Shopify/shopify_python_api.
It will really depend on what you are building, but these are good resources to get you started. Also, understanding the Shopify API will help when using the Python API for Shopify.
Shopify has now released a Python API client: https://github.com/Shopify/shopify_python_api
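Usage is roughly along the lines below; the authentication details have changed between versions, so treat this as a sketch (with placeholder credentials and a made-up order id) and check the project's README for the current pattern:

```python
import shopify

# Placeholder credentials. With an older private-app setup, the API key and
# password were embedded in the site URL; newer releases use shopify.Session
# and an OAuth token instead, so check the README for your version.
shopify.ShopifyResource.set_site(
    "https://API_KEY:PASSWORD@yourshop.myshopify.com/admin")

for product in shopify.Product.find():    # list products
    print(product.title)

order = shopify.Order.find(450789469)     # fetch a single order by id (made-up id)
print(order.total_price)
```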
I think you can find some inspiration by taking a look at this:
http://bitbucket.org/jespern/django-piston/wiki/Home
Although it is directly opposite what you want to do (Piston is for building APIs, and what you want is to use an API) it can give you some clues on common topics.
I could mention, of course, reading obvious sources like the Shopify developers forum:
http://forums.shopify.com/categories/9
But I guess you already had it in mind :)
Cheers,
H.
Has anybody created a nice wrapper around Yahoo's geo webservice "GeoPlanet" yet?
After a brief amount of Googling, I found nothing that looks like a wrapper for this API, but I'm not quite sure a wrapper is even necessary for GeoPlanet.
According to Yahoo's documentation for GeoPlanet, requests are made as HTTP GET messages, which can very easily be sent using Python's httplib module, and responses can take one of several forms, including XML and JSON. Python can very easily parse these formats. In fact, Yahoo! itself even offers libraries for parsing both XML and JSON with Python.
I know it sounds like a lot of libraries, but all the hard work has already been done for the programmer. It would just take a little "gluing together" and you would have yourself a nice interface to Yahoo! GeoPlanet using the power of Python.
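To make that concrete, a thin "wrapper" can be as small as the sketch below. The endpoint pattern and response field names are drawn from Yahoo!'s GeoPlanet documentation, so double-check them before relying on them, and you'll need your own appid:

```python
import json
import urllib
import urllib2

APP_ID = "YOUR_APP_ID"  # register an application with Yahoo! to get one

def find_places(place_name):
    # Endpoint pattern and JSON layout follow Yahoo!'s GeoPlanet docs;
    # verify the exact format before relying on it.
    url = ("http://where.yahooapis.com/v1/places.q('%s')?format=json&appid=%s"
           % (urllib.quote(place_name), APP_ID))
    return json.load(urllib2.urlopen(url))

if __name__ == "__main__":
    data = find_places("Springfield, IL")
    for place in data["places"]["place"]:
        print("%s (WOEID %s)" % (place["name"], place["woeid"]))
```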