Running Hadoop on Google App Engine? - python

Is it possible to run MapReduce jobs on Google App Engine?
Any reference or tutorial would help.
Thanks

Sort of.
You can't use the actual Hadoop MapReduce framework - its architecture is too incompatible with App Engine.
However, there is an equivalent system built specifically for GAE: appengine-mapreduce. That site is a bit confusing, as the first version of the code only supported mappers, without the subsequent reduce step - they have since released a version with full MapReduce support, but some of the documentation still refers to the earlier mapper-only release.
The best introduction is the Google I/O talk by Mike Aizatsky.
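For a flavor of the mapper-style API, here is a minimal, hedged sketch of a mapper function for appengine-mapreduce (the entity kind and its visits property are made up for illustration; the function is wired up via mapreduce.yaml rather than called directly):

    # Minimal mapper sketch for the appengine-mapreduce library.
    # The entity's `visits` property is hypothetical.
    from mapreduce import operation as op

    def touch_entity(entity):
        # Mutate the datastore entity and yield a write operation;
        # the framework batches the puts for you.
        entity.visits = (entity.visits or 0) + 1
        yield op.db.Put(entity)
        # Counters show up in the MapReduce status UI.
        yield op.counters.Increment("entities_touched")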

You cannot run Hadoop on App Engine (there is no filesystem access, among other restrictions).
You may want to check AWS Elastic MapReduce (EMR), a cloud-based platform for running MapReduce jobs.

Here is the full documentation: https://developers.google.com/appengine/docs/python/dataprocessing/overview

I saw this Google Cloud Platform advertisement:
Hadoop on Google Compute Engine virtual machines
https://cloud.google.com/solutions/hadoop

Related

Can I use Google Dataflow with native Python?

I'm trying to build a Python ETL pipeline in Google Cloud, and Google Cloud Dataflow seemed like a good option. When I explored the documentation and the developer guides, I saw that Apache Beam is always attached to Dataflow, since Dataflow is based on it.
I may run into issues processing my dataframes in Apache Beam.
My questions are:
If I want to build my ETL script in native Python with Dataflow, is that possible? Or is it necessary to use Apache Beam for my ETL?
Was Dataflow built solely for the purpose of running Apache Beam? Is there any serverless Google Cloud tool for building a Python ETL? (Google Cloud Functions has a 9-minute execution limit, which may cause issues for my pipeline; I want to avoid hitting an execution limit.)
My pipeline aims to read data from BigQuery, process it, and save it back to a BigQuery table. I may use some external APIs inside my script.
Concerning your first question, it looks like Dataflow was primarily written to be used with the Apache Beam SDK, as can be checked in the official Google Cloud documentation on Dataflow. So it is likely a requirement to use Apache Beam for your ETL.
Regarding your second question, this tutorial gives you guidance on how to build your own ETL pipeline with Python and Google Cloud Functions, which are actually serverless. Could you please confirm whether this link has helped you?
Regarding your first question, Dataflow needs to use Apache Beam. In fact, before Apache Beam there was something called the Dataflow SDK, which was Google proprietary and was later open-sourced as Apache Beam.
The Python Beam SDK is rather easy once you put a bit of effort into it, and the main processing operations you'd need are very close to native Python.
If your end goal is to read, process and write to BQ, I'd say Beam + Dataflow is a good match.
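To give a feel for the effort involved, here is a minimal, hedged sketch of a Beam pipeline that reads from BigQuery, applies a plain Python function, and writes the result back to another table (the project, dataset, table names, and schema are placeholders, not real resources):

    # Minimal Apache Beam sketch: BigQuery -> Python transform -> BigQuery.
    # Project, dataset, bucket, and table names below are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def enrich(row):
        # Rows arrive as plain Python dicts, so ordinary Python logic
        # (including calls to external APIs) works here.
        row["name_upper"] = row["name"].upper()
        return row

    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" to test locally
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (p
         | "Read" >> beam.io.ReadFromBigQuery(
               query="SELECT id, name FROM `my-project.my_dataset.src`",
               use_standard_sql=True)
         | "Transform" >> beam.Map(enrich)
         | "Write" >> beam.io.WriteToBigQuery(
               "my-project:my_dataset.dst",
               schema="id:INTEGER,name:STRING,name_upper:STRING",
               write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))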

How to run Python code on the cloud

I have some Python code which is quite heavy; my computer can't run it efficiently, therefore I want to run the code on the cloud.
Please tell me how to do it. Is any step-by-step tutorial available?
Thanks
Based on my experience I would recommend Amazon Web Services: https://aws.amazon.com/.
I would suggest that you look into creating an EC2 instance and running your code there. An EC2 instance is essentially a server in the cloud, and you can automate your Python script on it as well.
Now, there's this tutorial that helped me a lot to get a clearer picture of running a Python script using AWS (specifically EC2): https://www.youtube.com/watch?v=WE303yFWfV4.
For further information about Amazon's cloud services and products, see: https://aws.amazon.com/products/.
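If you'd rather script the instance creation than click through the console, a rough boto3 sketch might look like the following (the AMI ID, key pair, and S3 path are placeholders, not real resources):

    # Hypothetical sketch: launch a small EC2 instance with boto3 and run a
    # script on boot via user data. AMI ID, key pair, and bucket are placeholders.
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")
    instances = ec2.create_instances(
        ImageId="ami-0abcdef1234567890",   # placeholder Amazon Linux AMI
        InstanceType="t3.micro",
        MinCount=1, MaxCount=1,
        KeyName="my-key-pair",             # placeholder key pair name
        UserData="#!/bin/bash\n"
                 "yum install -y python3\n"
                 "aws s3 cp s3://my-bucket/heavy_job.py /home/ec2-user/\n"
                 "python3 /home/ec2-user/heavy_job.py\n",
    )
    print(instances[0].id)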
You can try Heroku. It's free and has its own tutorials, but it's only really good enough if you're using it for studying. AWS, Azure, or Google Cloud are much better for production.

Python in Google Cloud Functions

Can Google Cloud Functions handle Python with packages like sklearn, pandas, etc.? If so, can someone point me in the direction of resources on how to do so?
I've been searching for a while and it seems like this is impossible; all I've found are resources for deploying the base Python language to Google Cloud.
Python 3.7 is supported now.
Steps to create one via the Google Cloud console:
1. Go to Cloud Functions in the Google Cloud console and click on "Create function".
2. Specify the function's properties.
3. Select a trigger.
4. Change the runtime to Python 3.7.
5. Enter your Cloud Function logic and entry point.
6. Enter your Python dependencies in requirements.txt (a rough sketch of steps 5 and 6 follows below).
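As an illustration of steps 5 and 6, a minimal HTTP-triggered function plus its requirements.txt might look like this (the name hello_http and the pandas usage are illustrative; whatever name you pick must match the entry point set in the console):

    # main.py - minimal HTTP-triggered Cloud Function for the Python 3.7 runtime.
    # The function name hello_http is illustrative; set it as the entry point.
    import pandas as pd

    def hello_http(request):
        # `request` is a Flask Request object.
        name = request.args.get("name", "world")
        df = pd.DataFrame({"greeting": ["Hello, {}!".format(name)]})
        return df.to_json(orient="records")

    # requirements.txt (one dependency per line):
    # pandas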
EDIT: As of July 2018 there is now a Python runtime (3.7) available for Google Cloud Functions!
OLD ANSWER: Google Cloud Functions (GCF) are written in JavaScript (executed in a Node.js runtime), so there is no way for them to actually handle Python at the moment. There is a Python module on GitHub that you might have come across; it can be used to write and deploy GCF with one of three trigger types: HTTP, Pub/Sub, and bucket. The module takes care of translating your Python logic into JavaScript code that is later run inside Google Cloud Platform.
When it comes to other packages like pandas, that "translation" into JavaScript has not been prepared by anyone, AFAIK. If you really don't like the idea of jumping into JavaScript and writing the Cloud Function code on your own (with the logic you intended to use in a Python script), there is a possible workaround: you can invoke your Python script from inside a Cloud Function written in JS - the idea was discussed in this topic. Another way is using Object Change Notifications or Pub/Sub notifications, as explained here.
As of 19th July 2018, Google Cloud Functions supports Python 3.7.
Kindly check the runtime environment documentation to find the Python 3.7 runtime and a sample script (based on Flask).
--UPDATED--
Official Documentation for the Google Cloud Functions - Python 3.7 support Beta Release.
This is a beta release of the Python runtime for Google Cloud Functions. This feature might be changed in backward-incompatible ways and is not subject to any SLA or deprecation policy.
scikit-learn and NumPy are supported in Google Cloud Functions. I've also run a sample test to confirm that pandas is available, and it works fine.
https://github.com/mkanchwala/google-functions-python-example
Hope this helps all the "Py" lovers.
You can use AWS Lambda as well if you want a workaround and still want Python as your main language. Some modules/packages will need to be uploaded via a zip file with AWS Lambda, but it has a broader range of usable languages than GCF.

App Engine applications running on PyPy

Is it possible to deploy Python services on Google's App Engine using PyPy as the runtime environment?
Being a JIT-compiled implementation with good use cases in web apps, it seems like it could be integrated with App Engine.
I have found out that custom Python environments can be created in Google App Engine (flexible environment). These require only a Dockerfile defining the project.
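As a hedged sketch of what that could look like for PyPy on the flexible environment (the base image, module name, and server command are assumptions, not a tested configuration):

    # app.yaml - tells App Engine's flexible environment to build from a Dockerfile
    runtime: custom
    env: flex

    # Dockerfile - hypothetical PyPy-based image serving a WSGI app on port 8080,
    # which is the port the flexible environment routes traffic to.
    FROM pypy:3-slim
    WORKDIR /app
    COPY requirements.txt .
    RUN pypy3 -m pip install -r requirements.txt
    COPY . .
    CMD exec gunicorn -b :8080 main:app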

Deployment on Google App Engine - Django, Vagrant, Ansible

I want to deploy a Django project on Google App Engine.
The current situation is as follows:
I have the code on GitHub.
The Django project was set up using Vagrant, Ansible, and VirtualBox.
I am completely new to cloud-based deployments.
I need help to achieve this.
I checked the Google docs, but there are a couple of options for Django-related deployment, and I am not sure which to pick given Vagrant and Ansible.
Your question is a bit too generic as it stands - answering here rather than in a comment for clarity.
If you're talking about deploying to GAE (Google App Engine), then most likely you cannot reuse the Ansible scripts you've written for Vagrant. While it may be possible to use Ansible to deploy on GAE, most people I know use the standard Google procedure to deploy their app.
If you plan to use GCE (Google Compute Engine, a layer down in the infrastructure), you would be able to use your existing Ansible provisioning scripts (maybe with slight modifications), following the Ansible documentation.
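For the GAE route, the standard procedure mostly comes down to adding an app.yaml next to your Django project and deploying with gcloud app deploy; a minimal, hedged sketch (the runtime version, the mysite module, and the static paths are assumptions for illustration):

    # app.yaml - hypothetical minimal descriptor for a Django app on the App Engine
    # standard environment; mysite.wsgi must expose the WSGI `application` object.
    runtime: python39
    entrypoint: gunicorn -b :$PORT mysite.wsgi

    handlers:
    - url: /static
      static_dir: static/
    - url: /.*
      script: auto

With this approach, Vagrant and Ansible are not involved in the deployment itself; they remain purely local development tooling.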
