Python resources for cloud computing learning? - python

Is there a book or resource for learning cloud computing in Python or Scala? I know Django and App Engine, but I am not that interested in learning more about a client framework; I'm interested in learning the core concepts.

Steve Marx published a blog post describing a Python sample running in Windows Azure with the Rocket web server. The code is on GitHub.
It will show you some interesting elements of setting up a Python app in Windows Azure, including startup tasks. You'll still want to take a look at the Windows Azure Platform Training Kit to get a deeper understanding of Windows Azure.
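To make the setup concrete, here is a minimal sketch of the core idea (a plain WSGI app served by Rocket), assuming Rocket's (interfaces, method, app_info) constructor; it is not Steve Marx's actual sample:

```python
# Minimal sketch: serve a plain WSGI app with the Rocket web server.
# The port is arbitrary; on Azure a startup task would launch this script.
from rocket import Rocket

def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from Python on Windows Azure']

if __name__ == '__main__':
    # Assumed constructor form: Rocket(interfaces, method, app_info)
    Rocket(('0.0.0.0', 8080), 'wsgi', {'wsgi_app': app}).start()
```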

Related

Anaconda enterprise connection to cloud vs. 'offline'

On my company PC, I do not have full permissions to install Python packages (usually this has to be requested for approval from IT, which is very painful and takes a very long time).
I am thinking of asking my manager to invest in Anaconda Enterprise so that the security aspect of open-source Python use is no longer an issue. There is also something else to consider: my boss is looking to move to the cloud, and I was wondering whether Anaconda Enterprise can be used interchangeably on-premise (offline from the cloud, i.e. no use of cloud storage or cloud compute resources) and, when needed for big data processing, switched to 'cloud mode' by connecting to any of AWS, GCP or Azure to rent GPU instances? Any advice welcome.
Yes, that can be a good approach for your company. I have used it in many projects on GCP and IBM Cloud, on Debian 7, 8 and 9, and it works well. Depending on your needs, you can also create a package channel with the Enterprise version and manage the permissions on your packages. It also has a deployment tool where you can manage and audit deployments for different projects and APIs, track them, and assign them to owners.
You can move your server nodes to different servers, or add and remove nodes as needed. Depending on your environment this can be difficult at the beginning, but it works well once implemented.
Below are some links where you can see more information about what I'm talking about:
using-anaconda-enterprise
conda-offline-install-update
server-nodes
Depending on your preferences, it may not be necessary to use Anaconda Enterprise on GCP. If your boss is looking to move to the cloud, then GCP has some great options for analyzing big data. Using the AI Platform you can deploy a new instance and choose R, Python, CUDA, TensorFlow, etc. Once the instance is deployed you can start your data preprocessing: install whatever libraries you need (NumPy, SciPy, Pandas, Matplotlib, etc.) and start your data manipulation.
If you use something like Jupyter Notebooks, you can prepare your work offline before moving onto the GCP platform to run the model training.
Oh, also GCP has many labs to test out their Data Science platform.
https://www.qwiklabs.com/quests/43
GCP has many free promos these days; below is a link to one.
GCP - Build your cloud skills for free with Google Cloud
Step by step usage for AI Platform
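For a sense of the kind of preprocessing you might run in such a notebook, here is a minimal Pandas/NumPy sketch; the file name and column names are made up:

```python
# Hypothetical preprocessing step in an AI Platform (or local Jupyter) notebook.
import numpy as np
import pandas as pd

df = pd.read_csv('sales.csv')                 # made-up input file
df = df.dropna(subset=['price'])              # drop rows missing the target column
df['log_price'] = np.log1p(df['price'])       # simple feature transform
train = df.sample(frac=0.8, random_state=42)  # 80/20 train/validation split
valid = df.drop(train.index)
print(train.shape, valid.shape)
```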

Register Python App with Spring Cloud Data Flow

I need to make use of Python libraries to do some NLP and ML. However, I would also like to use Spring Cloud Data Flow to register these Python script apps.
I am using Kafka as the messaging middleware for Spring Cloud Stream. Ideally, I would code the business logic in Java and package it as a jar. However, I need Python's gensim to get NLP results back.
Is there any way I can solve this problem? Thanks
We have a fleet of deployment options to run Python workloads in SCDF. There's, in fact, a recipe for each of the deployment options in the Microsite.
Feel free to try out the desired option and let us know if you have any feedback through issues in the Microsite repo.
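As a rough illustration of what such a Python app can look like (the recipes in the Microsite are the authoritative reference), here is a sketch of a processor that reads from one Kafka topic and writes to another using kafka-python; the topic names, broker address, and the gensim step are placeholders:

```python
# Hypothetical SCDF-style processor: consume from an input topic, transform,
# publish to an output topic. Topic names and broker address are assumptions.
import json
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

consumer = KafkaConsumer(
    'nlp-input',                                   # assumed input binding
    bootstrap_servers='localhost:9092',
    value_deserializer=lambda b: json.loads(b.decode('utf-8')),
)
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda d: json.dumps(d).encode('utf-8'),
)

for message in consumer:
    text = message.value.get('text', '')
    # Placeholder for the real gensim call (similarity, topic inference, etc.)
    result = {'text': text, 'tokens': text.split()}
    producer.send('nlp-output', result)            # assumed output binding
```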

How to run python code on cloud

I have a Python script that is quite heavy and my computer can't run it efficiently, so I want to run the code in the cloud.
Please tell me how to do it. Is there a step-by-step tutorial available?
Thanks
Based on my experience I would recommend Amazon Web Services: https://aws.amazon.com/.
I would suggest you look into creating an EC2 instance and running your code there. An EC2 instance is basically a kind of server, and you can automate your Python script there as well.
There's a tutorial that helped me a lot to get a clearer picture of running a Python script on AWS (specifically EC2): https://www.youtube.com/watch?v=WE303yFWfV4.
For further information about Amazon's cloud services and products, see https://aws.amazon.com/products/.
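If you prefer to script the setup instead of clicking through the console, something along these lines should work with boto3; the AMI ID and key pair name are placeholders, and it assumes your AWS credentials are already configured:

```python
# Hedged sketch: launch an EC2 instance with boto3, then run your script on it.
import boto3

ec2 = boto3.resource('ec2', region_name='us-east-1')
instances = ec2.create_instances(
    ImageId='ami-xxxxxxxx',        # placeholder: choose an AMI with Python
    InstanceType='t2.micro',
    MinCount=1,
    MaxCount=1,
    KeyName='my-key-pair',         # placeholder: an existing EC2 key pair
)
print('Launched instance:', instances[0].id)
# Once the instance is running, copy your script over with scp and run it via
# ssh, or put the command in UserData so it starts automatically on boot.
```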
You can also try Heroku. It's free and has its own tutorials, but it's really only good enough for studying. AWS, Azure or Google Cloud are much better for production.

Watson Retrieve and Rank: Python Bluemix runtime

I am trying to run through the following tutorial on Bluemix:
https://www.ibm.com/watson/developercloud/doc/retrieve-rank/get_start.shtml
However, I am not able to install Python locally onto my system due to security policies. Is there a way I can run this tutorial through hosting my code in IBM DevOps Services using the Python runtime on Bluemix?
I am not sure if the Bluemix Python runtime can be leveraged like a natively installed Python and accept the command line instructions from:
Stage 4: Create and train the ranker.
Any feedback would be greatly appreciated!
An alternative to the manual Python and curl commands in that tutorial is to use the web interface.
I've written about it in some detail on my blog which also includes a video walkthrough of how the web tool works. (You can also find the official documentation on ibm.com).
But to sum up, it'd mean you could do everything through a web browser and not have to install or run anything locally - including training a ranker.
There is a small wrinkle in this plan right now, unfortunately. The Solr schema used in the Python/curl tutorial you've followed isn't compatible with the web tool, but we're working on that. This means that if you use the web tool, you'll need to start again with a new cluster and collection. But this means that you could start again using your own documents, content and training questions, instead of having to use the cranfield test data - so hopefully this is a good thing!
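If you do end up scripting stage 4 from the Bluemix Python runtime rather than using local curl, a plain HTTP call with requests should be enough; the endpoint URL and the training_data/training_metadata field names here are recalled from the Retrieve and Rank tutorial, so double-check them against the official docs:

```python
# Hedged sketch: create and train a ranker over HTTP instead of via local curl.
# The endpoint and form field names are assumptions; verify against the docs.
import json
import requests

url = 'https://gateway.watsonplatform.net/retrieve-and-rank/api/v1/rankers'
with open('training_data.csv', 'rb') as f:
    response = requests.post(
        url,
        auth=('SERVICE_USERNAME', 'SERVICE_PASSWORD'),   # service credentials
        files={'training_data': f},
        data={'training_metadata': json.dumps({'name': 'my-ranker'})},
    )
print(response.status_code, response.json())
```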

running hadoop on Google app engine?

Is it possible to run map reduce jobs on Google app engine?
Any reference or tutorial would help
Thanks
Sort of.
You can't use the actual MapReduce framework - the architecture is too incompatible with AppEngine.
However, there is an equivalent system built specifically for GAE: appengine-mapreduce. That site is a bit confusing, as the first version of the code only supported mappers, without the subsequent reduce step. They have since released a version with full MapReduce support, but some of the documentation still refers to the earlier mapper-only version.
The best introduction is the Google I/O talk by Mike Aizatsky.
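To give a flavour of the mapper-only API, here is a sketch that touches every entity of a hypothetical datastore kind; the handler would be wired up through a mapreduce.yaml entry (with DatastoreInputReader as the input reader) rather than run directly, and the model is made up:

```python
# Sketch of an appengine-mapreduce mapper (old mapper-only API).
# The Greeting model is hypothetical; the handler is referenced from
# mapreduce.yaml, which names the input reader and the entity kind.
from google.appengine.ext import db
from mapreduce import operation as op

class Greeting(db.Model):
    content = db.StringProperty()

def touch(entity):
    # Called once per Greeting entity; yielding a Put writes it back.
    entity.content = (entity.content or '').strip()
    yield op.db.Put(entity)
```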
You cannot run Hadoop on App Engine (there is no filesystem access, for one thing).
You may want to check out AWS Elastic MapReduce. It's a cloud-based platform for running MapReduce jobs.
ElasticMapreduce
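If you want to stay in Python, the mrjob library (my suggestion, not mentioned above) lets you write a job once and run the same script locally or on Elastic MapReduce with the -r emr switch:

```python
# word_count.py: a classic MRJob word count.
# Run locally:  python word_count.py input.txt
# Run on EMR:   python word_count.py -r emr input.txt  (needs AWS credentials)
from mrjob.job import MRJob

class MRWordCount(MRJob):
    def mapper(self, _, line):
        for word in line.split():
            yield word.lower(), 1

    def reducer(self, word, counts):
        yield word, sum(counts)

if __name__ == '__main__':
    MRWordCount.run()
```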
Here is the full documentation: https://developers.google.com/appengine/docs/python/dataprocessing/overview
Saw this Google Cloud Platform advertisement:
Hadoop on Google Compute Engine virtual machines
https://cloud.google.com/solutions/hadoop
