I am trying to run through the following tutorial on Bluemix:
https://www.ibm.com/watson/developercloud/doc/retrieve-rank/get_start.shtml
However, I am not able to install Python locally on my system due to security policies. Is there a way I can run this tutorial by hosting my code in IBM DevOps Services and using the Python runtime on Bluemix?
I am not sure whether the Bluemix Python runtime can be used like a natively installed Python and accept the command-line instructions from:
Stage 4: Create and train the ranker.
Any feedback would be greatly appreciated!
An alternative to the manual Python and curl commands in that tutorial is to use the web interface.
I've written about it in some detail on my blog, which also includes a video walkthrough of how the web tool works. (You can also find the official documentation on ibm.com.)
But to sum up, it'd mean you could do everything through a web browser and not have to install or run anything locally - including training a ranker.
There is a small wrinkle in this plan right now, unfortunately. The Solr schema used in the Python/curl tutorial you've followed isn't compatible with the web tool, but we're working on that. This means that if you use the web tool, you'll need to start again with a new cluster and collection. On the upside, you could start again with your own documents, content, and training questions instead of having to use the Cranfield test data - so hopefully this is a good thing!
I'm building a website with React and Firebase that utilizes an algorithm I wrote in Python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data between Node.js and my Python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the Python script. This almost works; I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the Python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI; instead, use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run Python in Cloud Functions.
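For illustration, here is a minimal sketch of what such a function might look like on the Python runtime. The function name, the deploy command, and the payload shape are all hypothetical examples, not anything from your project:

```python
# main.py - a minimal HTTP Cloud Function on the Python runtime,
# deployed with the gcloud CLI instead of the Firebase CLI, e.g.:
#   gcloud functions deploy run_algorithm --runtime python37 --trigger-http
# "run_algorithm" and the "values" payload field are hypothetical.
import json

import numpy as np  # declared in requirements.txt next to main.py


def run_algorithm(request):
    """Runs the algorithm on a JSON payload posted by the client."""
    data = request.get_json(silent=True) or {}
    values = np.asarray(data.get("values", []), dtype=float)
    result = float(values.mean()) if values.size else None
    return json.dumps({"mean": result})
```

Because the dependencies are declared in requirements.txt, the Python runtime installs them for you at deploy time, which sidesteps the copied-site-packages problem entirely.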
I have a Python script that is quite heavy; my computer can't run it efficiently, so I want to run it in the cloud.
Please tell me how to do this. Is there any step-by-step tutorial available?
Thanks!
Based on my experience I would recommend Amazon Web Services: https://aws.amazon.com/.
I would suggest looking into creating an EC2 instance and running your code there. An EC2 instance is basically a kind of server, and you can automate your Python script on it as well.
There's a tutorial that helped me a lot to get a clearer picture of running a Python script on AWS (specifically EC2): https://www.youtube.com/watch?v=WE303yFWfV4.
For further information about Amazon's cloud services and products, see https://aws.amazon.com/products/.
You can try Heroku. It's free and has its own tutorials, but it's only good enough if you will use it for studying. AWS, Azure, or Google Cloud are much better for production.
Can Google Cloud Functions handle Python with packages like sklearn, pandas, etc.? If so, can someone point me toward resources on how to do so?
I've been searching for a while and it seems like this is impossible; all I've found are resources for deploying the base Python language to Google Cloud.
Python 3.7 is supported now.
Steps to create one via the Google Cloud console:
1. Go to Google Cloud Functions in the Google Cloud console and click on "Create function".
2. Specify the function's properties.
3. Select a trigger.
4. Change the runtime to Python 3.7.
5. Enter your Cloud Function logic and entry point (see the sketch below).
6. Enter your Python dependencies in requirements.txt.
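For reference, a minimal sketch of what steps 5 and 6 could look like (hello_world is a hypothetical entry point, not a required name):

```python
# main.py - the Cloud Function logic; the entry point configured in
# the console must match this function's name.
def hello_world(request):
    """HTTP Cloud Function; `request` is a Flask Request object."""
    name = request.args.get("name", "World")
    return "Hello, {}!".format(name)
```

Any third-party packages then go into requirements.txt, one per line, and are installed automatically when the function is deployed.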
EDIT: As of July 2018 there is now a Python runtime (3.7) available for Google Cloud Functions!
OLD ANSWER: Google Cloud Functions (GCF) are written in JavaScript (executed in a Node.js runtime), so there is no way for them to actually handle Python at this moment. There is a Python module on GitHub that you might have come across; it can be used to write and deploy a GCF with one of three trigger types: HTTP, Pub/Sub, and bucket. The module takes care of translating your Python logic to JavaScript code that is later run inside Google Cloud Platform.
When it comes to other packages like pandas, no one has prepared that JavaScript 'translation' for them, AFAIK. If you really don't like the idea of jumping into JavaScript and writing the Cloud Function code on your own (with the logic you intended to use in a Python script), you have a possible workaround: you can invoke your Python script from inside a Cloud Function written in JS - the idea was discussed in this topic. Another way is using Object Change Notifications or Pub/Sub Notifications, as explained here.
As of 19th July 2018, Google Cloud Functions supports Python 3.7.
Kindly check the runtime environment documentation to find the Python 3.7 runtime and a sample script (based on Flask).
--UPDATED--
Official Documentation for the Google Cloud Functions - Python 3.7 support Beta Release.
This is a beta release of the Python runtime for Google Cloud Functions. This feature might be changed in backward-incompatible ways and is not subject to any SLA or deprecation policy.
scikit-learn and NumPy are supported in Google Cloud Functions. I've also run a sample test to confirm the availability of pandas, and it's working fine:
https://github.com/mkanchwala/google-functions-python-example
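For illustration, a pandas-backed function might look roughly like this (the entry point name and the toy data are hypothetical, not taken from that repo):

```python
# main.py - sketch of an HTTP Cloud Function that uses pandas.
# pandas must be listed in requirements.txt next to this file.
import pandas as pd


def describe_data(request):
    """Returns summary statistics for a toy DataFrame as JSON."""
    df = pd.DataFrame({"x": [1, 2, 3], "y": [4.0, 5.0, 6.0]})
    return df.describe().to_json()
```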
Hope this helps to all the "Py" lovers.
You can use AWS Lambda as well if you want a workaround and still use Python as your main language. Some modules/packages will need to be imported via a zip file with AWS Lambda, but it has a broader range of usable languages than GCF.
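For example, a minimal Lambda handler looks roughly like this ("lambda_handler" is the conventional default handler name; third-party packages get vendored into the deployment zip alongside the file):

```python
# lambda_function.py - minimal AWS Lambda handler (Python runtime).
import json


def lambda_handler(event, context):
    # `event` carries the trigger payload (e.g. an API Gateway request).
    return {"statusCode": 200, "body": json.dumps({"message": "hello"})}
```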
I have a web-crawling Python script that takes hours to complete and is infeasible to run in its entirety on my local machine. Is there a convenient way to deploy this to a simple web server? The script basically downloads webpages into text files. How would this best be accomplished?
Thanks!
Since you said that performance is a problem and you are doing web scraping, the first thing to try is the Scrapy framework - a very fast and easy-to-use web-scraping framework. The scrapyd tool would allow you to distribute the crawling: you can have multiple scrapyd services running on different servers and split the load between them. See:
Distributed crawls
Running Scrapy on Amazon EC2
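To give a feel for how little code a crawler like this needs, here is a minimal spider sketch; the start URL is a placeholder for your own targets:

```python
# page_spider.py - a minimal Scrapy spider that saves each fetched
# page's raw body to a text file.
import scrapy


class PageSpider(scrapy.Spider):
    name = "pages"
    start_urls = ["https://example.com/"]  # placeholder

    def parse(self, response):
        # Derive a filename from the URL and dump the raw page body.
        filename = response.url.rstrip("/").split("/")[-1] or "index"
        with open(filename + ".txt", "wb") as f:
            f.write(response.body)
```

You can run it standalone with `scrapy runspider page_spider.py`, and the same spider works unchanged under scrapyd or Scrapy Cloud.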
There is also a Scrapy Cloud service out there:
Scrapy Cloud bridges the highly efficient Scrapy development environment with a robust, fully-featured production environment to deploy and run your crawls. It's like a Heroku for Scrapy, although other technologies will be supported in the near future. It runs on top of the Scrapinghub platform, which means your project can scale on demand, as needed.
As an alternative to the solutions already given, I would suggest Heroku. You can easily deploy not only a website but also scripts for bots to run.
The basic account is free and pretty flexible.
This blog entry, this one and this video contain practical examples of how to make it work.
There are multiple places where you can do that. Just google "Python in the cloud" and you will come up with a few, for example https://www.pythonanywhere.com/.
In addition, there are several cloud IDEs that essentially give you a small VM for free, where you can develop your code in a web-based IDE and also run it in the VM; one example is http://www.c9.io.
In 2021, Replit.com makes it very easy to write and run Python in the cloud.
If you have a Google e-mail account, you have access to Google Drive and its utilities. Choose Colaboratory (or find it under the "More..." options first). This "Colab" is essentially a Python notebook on your Google Drive, with full access to the files on your Drive and also to your GitHub. So, in addition to your local work, you can edit your GitHub scripts as well.
I have developed a few python programs that I want to make available online.
I am new to web services, and I am not sure what I need to do to create a service where somebody makes a request to a URL (for example) and the URL triggers a Python program that displays something in the user's browser, or where a set of inputs is given to the program via the browser and Python then does whatever it is supposed to do.
I was playing with Google App Engine, which runs fine with the tutorial, and I was planning to use it because it looks easy, but the problem with GAE is that it does not work well (or does not work at all) with some libraries that I plan to use.
I guess what I am trying to do is some sort of API using my WebFaction account.
Can anybody point me in the right direction? What choices do I have with WebFaction? What are the easiest tools available?
Thank you very much for your help in advance.
Cheers
Well, your question is a little bit generic, but here are a few pointers/tips:
- Webfaction allows you to install pretty much anything you want (you need to compile it or ask the admins to install a CentOS package for you).
- They provide a default Apache server with mod_wsgi, so you can run web2py, Django, or any other WSGI framework (see the minimal sketch below).
- Most popular Python web frameworks have installers available in Webfaction (web2py, Django...), so I would recommend going with one of them.
- I would also install supervisord to keep your service running after a reboot/crash/problem.
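To show how simple the WSGI contract underneath all of those frameworks is, here is a minimal raw WSGI application; "application" is the callable name mod_wsgi looks for by default:

```python
# A bare WSGI app - any framework (Django, web2py, Flask...) ultimately
# exposes a callable like this one to the server.
def application(environ, start_response):
    body = b"Hello from Python on the web!"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```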
I would be glad to help you if you have any specific question...