I am having a hard time deploying an R model and exposing it as a web service using azuremlsdk for R. The Python side of Azure Machine Learning appears to be more mature, as Python was evidently a higher priority for Microsoft. Anyway, I was wondering whether one can score an R model, persisted as an .rds file, from Python. I understand R can talk to Python via reticulate. Any input from Python experts would be very much appreciated. Thanks.
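One dependency-light way to score an .rds model from Python is to shell out to Rscript and let R itself call `predict()`; rpy2 is the in-process alternative. A sketch, assuming R is installed and on PATH, with illustrative file names:

```python
# Sketch: scoring an R model saved as an .rds file from Python by shelling
# out to Rscript. Assumes R is installed and on PATH; file names are
# illustrative, and the model must be predictable with R's predict().
import subprocess

# Tiny R scoring script: read the model and a CSV of features, print
# one prediction per line.
R_SCORE_SNIPPET = (
    "model <- readRDS(commandArgs(TRUE)[1]); "
    "newdata <- read.csv(commandArgs(TRUE)[2]); "
    "cat(predict(model, newdata), sep='\\n')"
)

def build_scoring_command(model_path, csv_path):
    """Return the Rscript command line that scores csv_path with model_path."""
    return ["Rscript", "-e", R_SCORE_SNIPPET, model_path, csv_path]

def score(model_path, csv_path):
    """Run R and parse the predictions (requires R to be installed)."""
    out = subprocess.run(build_scoring_command(model_path, csv_path),
                         capture_output=True, text=True, check=True)
    return [float(line) for line in out.stdout.splitlines()]
```

The subprocess route keeps the two runtimes decoupled; rpy2 avoids the process launch per call but adds a binary dependency on R from the Python side.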
Related
I am implementing the backend for a social networking app. I am using Node.js for development, but one feature of my app is a feed, like Instagram's following and explore feeds.
I have implemented the backend in Node.js, but for the feed algorithm I use Python. How do I integrate a Python function into Node? I would also like to know whether using two different languages in a single application is a good approach. I use Python because we plan to use some machine learning in the future, and as we all know, no language is better than Python for machine learning.
Please explain to me the right way to do it.
Let us assume that I have a machine learning model which uses the TensorFlow library and performs a function.
And of course, the function requires input and provides some output, which can easily be done from my terminal.
Here comes the part where I need help: I/O in the terminal is not what I want. I want to create a web application that has multiple functionalities. Considering only the part where my ML model has to work: when I click a button after entering the inputs, the application should run the ML model in the background and give me an output that I can display on the page.
I can develop websites with Node.js. Is there any possibility of integrating the ML model with my web application made with Node.js?
I tried packages like child_process and python-shell, but nothing works; it gives me an error when it reaches the package imports.
Please give me suggestions on integrating the ML model with a Node.js web application that uses MongoDB as its database. All I want is for my web application to run my ML model wherever necessary to obtain results, which I can then display through the web application.
Also, please provide information about technologies that can be easily integrated with a machine learning model.
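For the child_process/python-shell route, the usual contract is: Node spawns a Python process, writes a JSON request to its stdin, and reads a JSON response from its stdout. A sketch of the Python half of that bridge, with the prediction logic stubbed out (the real code would load the TensorFlow model instead):

```python
# Sketch of the Python half of a Node <-> Python bridge: Node spawns this
# script (via child_process or python-shell), writes one JSON request per
# line to stdin, and reads one JSON response per line from stdout.
# The predict() body is a placeholder for the real model call.
import json
import sys

def predict(features):
    # Placeholder: a real implementation would run the loaded model here.
    return {"score": sum(features) / max(len(features), 1)}

def handle(line):
    """Turn one JSON request line into one JSON response line."""
    request = json.loads(line)
    return json.dumps(predict(request["features"]))

if __name__ == "__main__":
    for line in sys.stdin:            # one JSON object per line
        print(handle(line), flush=True)
```

Keeping the protocol to line-delimited JSON makes the Node side trivial (`child.stdin.write(...)`, listen for `data` on stdout) and makes the Python side testable without Node at all.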
I'm also encountering similar problems.
Further to the previous reply, I would go for a microservices architecture.
E.g. use a Node.js app to serve the front-end requests, and a Python server only to serve the machine learning tasks.
You need to build a Python web application around your model to serve responses (your model’s output) to the client (the web page your Node.js app is serving).
Flask and Django are the major players there; REST is the most common style for the API.
Once you’ve built a REST API around your model, you just query it like any other resource via HTTP (XMLHttpRequest or fetch) from the JavaScript on your web page.
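The shape of such a scoring endpoint can be shown with nothing but the standard library; in practice Flask or Django would replace this boilerplate, and the model call below is a stub standing in for the real prediction:

```python
# Minimal sketch of a JSON scoring endpoint using only the Python standard
# library. Flask or Django replace this boilerplate in practice; predict()
# is a stub standing in for the real model.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    return {"prediction": sum(features)}   # stand-in for the real model

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        result = predict(json.loads(body)["features"])
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

# To serve: HTTPServer(("", 8000), ScoreHandler).serve_forever()
```

The Node.js front end then POSTs `{"features": [...]}` to this service and renders the JSON it gets back; the two processes only share the HTTP contract.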
Or you can try to pickle your model and load it into JavaScript to query from your Node.js app.
Easier than either of those would be to use IBM Cloud Functions or AWS Lambda to expose your script/model. Very easy and cost-effective.
We have an app created with Ionic, and we use Firebase as the backend. Authentication and a couple of other things are handled by Firebase. We have two major things that are done in Python. First, we scrape the web for some data and can put it into Firebase as a plain JSON file. Second, we have to take some data from Firebase, run it through some machine learning library methods, and store the resulting new data back in Firebase.
Question
How can I add Python code that runs periodically every 24 hours inside Firebase, for both the scraping and the machine learning scripts?
Confusion
I am not really familiar with the server side; I have no clue whether I would run the Python scripts on an Apache server or not. Since we started with Firebase, I need to know how it is done there. The Python code is actually ready; I checked python-firebase as well, and I have even put my machine learning data into Firebase. But I do not know where to put this Python code. The only thing I have accomplished is that Firebase can take JSON files with just one button. I do not have enough knowledge about server-side scripts either.
Conclusion
If someone can enlighten me, I would deeply appreciate that.
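Since Firebase itself does not host arbitrary Python, one common pattern (an assumption about the setup, not Firebase-specific advice) is to run the ready-made scripts on any small, always-on Linux machine under cron and push the results to Firebase over its REST API. Illustrative crontab entries, with made-up paths:

```shell
# Run the scraper at 02:00 and the ML job at 03:00 every day (paths are illustrative).
0 2 * * * /usr/bin/python3 /home/app/scraper.py >> /var/log/scraper.log 2>&1
0 3 * * * /usr/bin/python3 /home/app/ml_job.py  >> /var/log/ml_job.log  2>&1
```

Each cron field is minute, hour, day-of-month, month, day-of-week; the scripts themselves stay exactly as they are and just talk to Firebase from the VM.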
I'm researching a good way to get my SPSS model logic into my website in real time. Currently we have a shoddy Python script that mimics what the SPSS model does with the data. The problem is that whenever we make an update or optimization to the SPSS model, we have to go in and adjust the Python code by hand. How do people usually solve this?
I've had a suggestion to create a config file for all the frequently updated functions in SPSS to carry over to the current Python script. We're also open to completely generating the Python script from the SPSS model, if there's a way to do that.
I've looked into the cursor method, but its main value seems to be in automating SPSS with Python, which isn't really what we need.
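The config-file suggestion can be sketched like this: keep the frequently tuned numbers in a JSON file that analysts update when the SPSS model changes, so the Python code itself never needs editing. The linear form and coefficient names below are illustrative assumptions, not the actual SPSS model:

```python
# Sketch of config-driven scoring: the tunable parts of the model live in a
# JSON file, so updating the SPSS model means editing config, not code.
# The linear form and coefficient names are illustrative assumptions.
import json

def load_model(path):
    """Read model parameters, e.g. {"intercept": 0.5, "weights": {"age": 0.1}}."""
    with open(path) as f:
        return json.load(f)

def score(model, row):
    """Linear score: intercept + sum of weight * feature value."""
    total = model["intercept"]
    for name, weight in model["weights"].items():
        total += weight * row.get(name, 0.0)
    return total
```

The website then calls `score(load_model("model.json"), row)`; when the SPSS side changes, only `model.json` is regenerated.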
You may want to look into the use of the Watson Machine Learning service on IBM Bluemix.
https://console.bluemix.net/catalog/services/machine-learning?taxonomyNavigation=data
The SPSS Streams Service that is part of this allows you to take your stream, deploy it, and then call it via REST API.
Note the Streams Service does not support legacy R nodes, Extension nodes (R or Python code), or Predictive Extensions. However, it does support Text Analytics.
The service allows up to 5000 calls a month for free and then can be purchased on a per prediction basis.
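Calling such a REST scoring endpoint from Python is only a few lines; the URL, token, and payload field names below are placeholders, not the actual Watson ML request shape, so consult the service docs for the real schema:

```python
# Sketch of calling a REST scoring endpoint with the standard library only.
# The URL, token, and payload field names are placeholders; the real request
# shape is defined by the service's documentation.
import json
import urllib.request

def build_request(url, token, fields, values):
    payload = json.dumps({"fields": fields, "values": values}).encode()
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + token},
        method="POST",
    )

# To send: json.load(urllib.request.urlopen(build_request(...)))
```

Because the deployed stream is just an HTTP resource, the same call works from any language the website is built in.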
This year a friend and I have to do a project for our final year of university. The plan is to make a proxy/server that can store ontologies and RDF data; this way the data is "chained" to a web page, so you can make a request for that page and the proxy will send you the homepage with its metadata.
We have been thinking of using Python and rdflib, but we don't know which framework is best for the web part. We thought of Django, but it seems too big for our purpose, so we decided that webpy or web2py might be a better option.
We don't have any Python coding experience; this will be our very first time. We have always programmed in C++ and Java.
So, taking into account everything we've mentioned, our question is: which would be the best web framework for our project, and will rdflib fit well with that framework?
Thanks :)
I have developed several web applications with Python frameworks consuming RDF data. The choice always depends on the performance needed and the amount of data you'll have to handle.
If the number of triples you'll handle is in the magnitude of a few thousand, then you can easily put together a stack with RDFlib + Django. I have used this combination for toy applications, but as soon as you have to deal with lots of data you'll realise that it simply doesn't scale. This is not because of Django; the main problem is RDFlib's implementation of a triple store, which is not great.
If you're familiar with C/C++, I recommend you have a look at the Redland libraries. They are written in C, and there are Python bindings, so you can still develop your web layer with Django and pull out RDF data with Python. We do this quite a lot and it normally works. This option will scale a bit further, but won't be great either.
In case your data grows to millions of triples, then I recommend you go for a scalable triple store, which you can access through SPARQL over HTTP. My choice is always 4store; there is a Python client to issue queries and assert/remove data (the 4store Python client).
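Because such stores speak SPARQL over plain HTTP, you don't strictly need a special client at all; the endpoint URL below is an assumption about a local 4store install, so adjust it for your setup:

```python
# Sketch: querying a SPARQL endpoint over plain HTTP with the standard
# library. The endpoint URL is an assumption; adjust for your installation.
import json
import urllib.parse
import urllib.request

QUERY = "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10"

def build_sparql_request(endpoint, query):
    """POST the query form-encoded, asking for JSON results."""
    data = urllib.parse.urlencode({"query": query}).encode()
    return urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )

# To send:
# with urllib.request.urlopen(
#         build_sparql_request("http://localhost:8000/sparql/", QUERY)) as r:
#     rows = json.load(r)["results"]["bindings"]
```

This keeps the web framework choice independent of the triple store: whichever of web.py, web2py, or Django you pick only needs to issue HTTP requests.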