I'm trying to learn the boto API and I noticed that there are two major versions/packages for Python: boto and boto3.
What is the difference between the AWS boto and boto3 libraries?
The boto package is the hand-coded Python library that has been around since 2006. It is very popular and fully supported by AWS, but because it is hand-coded and there are so many services available (with more appearing all the time), it is difficult to maintain.
So, boto3 is a new version of the boto library, built on botocore. All of the low-level interfaces to AWS are driven by JSON service descriptions that are generated automatically from the canonical descriptions of the services, so the interfaces are always correct and always up to date. There is a resource layer on top of the client layer that provides a nicer, more Pythonic interface.
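For instance, here is a minimal sketch of the two layers against S3 (the bucket name is a placeholder, and credentials are assumed to be configured in the usual way):

    import boto3

    # Low-level client: methods map one-to-one onto the JSON-described API operations.
    client = boto3.client('s3')
    for bucket in client.list_buckets()['Buckets']:
        print(bucket['Name'])

    # Resource layer: an object-oriented, more Pythonic view of the same service.
    s3 = boto3.resource('s3')
    bucket = s3.Bucket('my-bucket')  # placeholder bucket name
    for obj in bucket.objects.all():
        print(obj.key)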
The boto3 library is being actively developed by AWS and is the one I would recommend people use if they are starting new development.
I am looking at using the Google Cloud Python SDKs to manage my resources, but I am not able to find a compute module in the Python SDK.
Python Doc here: https://googlecloudplatform.github.io/google-cloud-python/latest/
However, a compute module is available in the Node.js SDK.
Node.js Doc here: https://googlecloudplatform.github.io/google-cloud-node/#/docs/google-cloud/0.56.0/compute
Can I get information on whether this module (compute) is available in Python?
If not, is this being planned, and when can I expect it?
The google-cloud-python project you link to consists of hand-crafted, Pythonic libraries for the GCP APIs. There is not yet one for Compute.
Instead, you will have to use the auto-generated Python client library. See https://cloud.google.com/compute/docs/tutorials/python-guide for an example.
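As a rough sketch of that approach (assuming Application Default Credentials are configured; the project and zone names are placeholders):

    from googleapiclient import discovery
    from oauth2client.client import GoogleCredentials

    credentials = GoogleCredentials.get_application_default()
    compute = discovery.build('compute', 'v1', credentials=credentials)

    # List the instances in one zone of a project.
    result = compute.instances().list(
        project='my-project', zone='us-central1-a').execute()
    for instance in result.get('items', []):
        print(instance['name'])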
I searched for a Python API to interact with Google BigQuery and found two packages that provide similar APIs: the Google BigQuery client (part of the Google API Client package, googleapiclient) and the Google Cloud package, gcloud.
Here is the documentation on using these two APIs for BigQuery:
Google API Client: googleapiclient
https://developers.google.com/resources/api-libraries/documentation/bigquery/v2/python/latest/index.html
https://cloud.google.com/bigquery/docs/reference/v2/
Google Cloud package: gcloud
http://googlecloudplatform.github.io/gcloud-python/stable/bigquery-usage.html
Both packages are from Google and provide similar functionality for interacting with BigQuery. I have the following questions:
It seems both packages include a wide range of Google Cloud Platform functionality. In my view, gcloud also provides a command-line tool and local environment setup. Generally, what are the differences between these two packages?
In terms of the Python modules, what are the differences in their usage?
Is there any relationship between these two packages?
Which is more suitable for accessing BigQuery?
What kinds of jobs are they each suited for?
The googleapiclient client is generated directly from the raw API definition (the definition is a JSON file, hosted here).
Because it is automatically generated, it is not what any sane python programmer would do if they were trying to write a python client for BigQuery. That said, it is the lowest-level representation of the API.
The gcloud client, on the other hand, was what a group of more-or-less sane folks at Google came up with when they tried to figure out what a client should look like for BigQuery. It is really quite nice, and lets you focus on what's important rather than converting results from the strange f/v format used in the BigQuery API into something useful.
Additionally, the documentation for the gcloud API was written by a doc writer. The documentation for the googleapiclient was, like the code, automatically generated from a definition of the API.
My advice, having used both (and having, mostly unsuccessfully, helped design the BigQuery API to try to make the generated client behave reasonably), is to use the gcloud client. It will handle a bunch of low-level details for you and generally make your life easier.
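To make the contrast concrete, here is a hedged sketch of the same query in both clients. The project ID is a placeholder, and the gcloud method names (run_sync_query in particular) reflect the library at the time of writing and may differ in later releases:

    from googleapiclient import discovery
    from gcloud import bigquery

    QUERY = 'SELECT corpus FROM publicdata:samples.shakespeare GROUP BY corpus'

    # googleapiclient: raw API shapes; rows come back in the f/v format.
    service = discovery.build('bigquery', 'v2')
    response = service.jobs().query(
        projectId='my-project', body={'query': QUERY}).execute()
    for row in response.get('rows', []):
        print([cell['v'] for cell in row['f']])  # unwrap f/v cells by hand

    # gcloud: the client unwraps rows into plain Python values for you.
    client = bigquery.Client(project='my-project')
    query = client.run_sync_query(QUERY)
    query.run()
    for row in query.rows:
        print(row)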
I'm new to Azure. I see that a new API/abstraction called Azure Resource Manager has been released, which will supersede the older Azure Service Management.
I was trying to spin up an instance using the Azure Python SDK, but I see the SDK still uses the older concepts (affinity groups, virtual networks, hosted services). There is no mention of resource groups. Is this supported? If not, when will it be added to the SDK?
In my experience, the Python SDK doesn't currently provide packages for the new Azure Resource Manager APIs; it only provides components such as ServiceManagement, Storage, and ServiceBus. Please feel free to submit your ideas and feedback at https://github.com/Azure/azure-sdk-for-python/issues.
You might want to take a look at the azure-mgmt collection of packages, which use the new Resource Manager APIs.
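As a rough sketch of what that looks like (module paths and credential classes have shifted across releases, so treat this as illustrative; all IDs and secrets below are placeholders):

    from azure.common.credentials import ServicePrincipalCredentials
    from azure.mgmt.resource import ResourceManagementClient

    credentials = ServicePrincipalCredentials(
        client_id='CLIENT_ID', secret='SECRET', tenant='TENANT_ID')
    client = ResourceManagementClient(credentials, 'SUBSCRIPTION_ID')

    # Resource groups are the ARM concept missing from the old
    # Service Management API.
    client.resource_groups.create_or_update(
        'my-resource-group', {'location': 'westus'})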
I am looking for a Python library that can be used to access the vSphere Web Services SDK. I have come across two that have non-GPL licenses:
psphere - https://github.com/jkinred/psphere
pysphere - https://code.google.com/p/pysphere/
Has anybody used these in production? I do not want them for test automation but for a product which could scale up to 25K VMs.
I saw the post Python - VMWare vSphere (WEB SDK) - SUDS, but there he seems to be using it for test automation only. Also, I am not only looking for VM operations but also for other objects like Host, Cluster, PortGroup, vDS, etc.
Regards,
Litty
VMware published an initial release of their vSphere SDK for Python two weeks ago: pyVmomi.
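A minimal sketch of connecting and walking the inventory with pyVmomi (the host and credentials are placeholders):

    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    si = SmartConnect(host='vcenter.example.com', user='admin', pwd='secret')
    content = si.RetrieveContent()

    # A container view walks the inventory; the same mechanism works for
    # vim.HostSystem, vim.ClusterComputeResource, vim.DistributedVirtualSwitch, etc.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    for vm in view.view:
        print(vm.name)

    Disconnect(si)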
I don't know psphere or pysphere, but I've used Suds to access the vSphere Web Services, and it worked pretty well.
The vSphere WS API is SOAP-based and, to the best of my knowledge, exposes everything that's possible via vCenter. It's a bit tricky sometimes, but you can do it.
I don't know what you're trying to achieve, but you should be able to do it with Suds. Of course, you'd have to familiarize yourself with the API: vSphere Web Services SDK
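To give a flavor of the Suds approach, here is a hedged sketch of the standard login handshake (the host and credentials are placeholders; this mirrors the pattern used in the post linked above):

    from suds.client import Client

    # The WSDL ships with vCenter itself.
    url = 'https://vcenter.example.com/sdk'
    client = Client(url + '/vimService.wsdl', location=url)

    # Build a ManagedObjectReference to the ServiceInstance singleton.
    si = client.factory.create('ns0:ManagedObjectReference')
    si._type = 'ServiceInstance'
    si.value = 'ServiceInstance'

    # RetrieveServiceContent exposes managers for hosts, clusters, networks,
    # and so on, not just VMs.
    content = client.service.RetrieveServiceContent(si)
    client.service.Login(content.sessionManager, 'user', 'password')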
At the moment, we're playing around a bit with vCenter Orchestrator. It's a nice tool (since 5.0). Maybe that's an option for you, too.
I'm looking for a good Django custom storage backend for use with Amazon S3.
I've been googling around and found a lot of blog posts with code snippets or half-baked gist.github.com one-off jobs. But I can't seem to find a solid, well-tested one.
Is there a widely accepted standard Amazon S3 Django custom storage backend out there? It doesn't particularly matter to me what Python backend library it uses--i.e., either S3.py or boto are fine.
Have you checked out django-storages? I would lean towards the boto backend, as I have had good experiences with boto.
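Wiring it up is just a few settings; a minimal sketch assuming the boto backend (the keys and bucket name are placeholders):

    # settings.py: minimal django-storages + boto configuration
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = 'YOUR_ACCESS_KEY'
    AWS_SECRET_ACCESS_KEY = 'YOUR_SECRET_KEY'
    AWS_STORAGE_BUCKET_NAME = 'my-bucket'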