I am trying to rewrite some bash scripts in Python, and specifically I am trying to rewrite a line that executes gsutil -m rsync -r /local/path/to/data gs://path/to/data. However, I am not able to find any reference to rsync functionality in the Python client library documentation here.
If anyone has solved this, please let me know. If this functionality is not currently implemented in the client library, does anyone know why?
gsutil is a command-line tool with application-level logic beyond what the client library provides, so not all of gsutil's features are available in the client library. gsutil does not presently consume the google-cloud-python client library, as that library was developed later.
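If you only need the behavior rather than a native API, one pragmatic workaround is to shell out to gsutil itself from Python (a minimal sketch, reusing the exact command from the question):

    import subprocess

    # Run the same gsutil invocation the bash script used; check=True
    # raises CalledProcessError if the sync fails.
    subprocess.run(
        ["gsutil", "-m", "rsync", "-r", "/local/path/to/data", "gs://path/to/data"],
        check=True,
    )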
Related
I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works; I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run python programs.
If you need to run a function that's primarily written in python, you should not use the Firebase CLI and should instead use the Google Cloud tools to target the python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run python in Cloud Functions.
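On the python runtime, dependencies such as numpy are declared in a requirements.txt file deployed alongside the function, so nothing has to be vendored by hand. A minimal sketch of what such an HTTP function could look like, assuming the functions-framework package (the function name and payload shape are placeholders):

    # main.py; requirements.txt would list functions-framework and numpy
    import functions_framework
    import numpy as np

    @functions_framework.http
    def run_algorithm(request):
        # Parse the JSON body sent by the client; "values" is a made-up field.
        data = request.get_json(silent=True) or {}
        values = np.array(data.get("values", []))
        return {"mean": float(values.mean())} if values.size else {"mean": None}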
I was trying to automate the process of getting/copying a file from a pod to the jump node of an OpenShift cluster (not using the rsync command). I tried to find a suitable Ansible module for this but failed, and I also tried to create a Python script, but I am not sure how I will manage the authentication part. So if you have any example repo/website from which I can get an idea of how to write a Python script for this, or any official Ansible module that is suitable for this task, please let me know. Thanks.
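For illustration, one way to sketch this in Python without rsync is to shell out to oc exec and capture the file's contents; the pod name and paths below are hypothetical, and authentication simply reuses whatever session oc login has already established on the jump node:

    import subprocess

    POD = "my-pod"                    # hypothetical pod name
    REMOTE_PATH = "/var/log/app.log"  # hypothetical file inside the pod
    LOCAL_PATH = "app.log"

    # `oc exec <pod> -- cat <path>` prints the file to stdout; we redirect
    # that stream into a local file on the jump node.
    with open(LOCAL_PATH, "wb") as out:
        subprocess.run(["oc", "exec", POD, "--", "cat", REMOTE_PATH],
                       stdout=out, check=True)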
The Google documentation is a little generic on this topic, and I find it hard to get around the different APIs and terms they're using, so I'm wondering if someone could point me in the right direction.
I'm looking for a way to call the gcloud command directly from Python. I've installed gcloud in my Python environment and as an example to follow, I'd like to know how to do the following from Python:
gcloud compute copy-files [source directory or file name] [destination directory or file name]
You should check out gcloud:
https://pypi.python.org/pypi/gcloud
There's nothing magic about uploading files to a Compute Engine VM. I ended up using paramiko to upload files.
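A minimal paramiko sketch of that approach over SFTP (the host, username, and key path are placeholders):

    import os
    import paramiko

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("vm-external-ip", username="me",
                key_filename=os.path.expanduser("~/.ssh/google_compute_engine"))

    # SFTP ships with paramiko, so no extra dependency is needed.
    sftp = ssh.open_sftp()
    sftp.put("local_file.txt", "/home/me/local_file.txt")
    sftp.close()
    ssh.close()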
You can of course call gcloud from Python directly and not care about the implementation details, or you can try to see what gcloud does:
Try running gcloud compute copy-files with the --dry-run flag. That will expose the scp command it uses underneath and with what arguments. Knowing what scp params you need, you can recreate them programmatically using paramiko_scp in Python. More information on this here: How to scp in python?
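A sketch of that recreation using the scp package on top of paramiko (host, user, and paths are again placeholders):

    import paramiko
    from scp import SCPClient

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("vm-external-ip", username="me")

    # SCPClient wraps an existing paramiko transport and speaks scp,
    # mirroring what gcloud compute copy-files runs underneath.
    with SCPClient(ssh.get_transport()) as scp:
        scp.put("local_file.txt", "/home/me/local_file.txt")
    ssh.close()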
You can use the subprocess.run function in Python to execute commands from your terminal/shell/bash. That is what I have done to execute gcloud commands from Python, rather than using the Python SDK.
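For example (the instance name and zone are placeholders; newer gcloud releases spell the command compute scp rather than compute copy-files):

    import subprocess

    # Run the same command the shell would; check=True raises on failure.
    result = subprocess.run(
        ["gcloud", "compute", "scp", "local_file.txt", "my-vm:/tmp/",
         "--zone", "us-central1-a"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)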
Other Q&As on Stack Overflow have already addressed how to launch scp using Python and its standard library. How do I determine that an scp transfer is stalling, using Python, so that I can react to it?
Context
I have access to Python 2.6 and its standard library. I am unable to use additional packages in my working environment. rsync is also unavailable, and we're forced to use scp (no FTP either).
I'm trying to write a script that identifies a stalled transfer, ends it, and restarts it using a different node.
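To make it concrete, this is the kind of stdlib-only watchdog I have in mind; the paths, poll interval, and stall window are placeholder guesses:

    import os
    import subprocess
    import time

    SRC = "user@remote:/data/big.file"  # placeholder source
    DEST = "/tmp/big.file"              # placeholder local destination
    STALL_SECONDS = 60                  # give up if the file stops growing

    proc = subprocess.Popen(["scp", "-q", SRC, DEST])
    last_size = -1
    last_change = time.time()

    while proc.poll() is None:
        time.sleep(5)
        size = os.path.getsize(DEST) if os.path.exists(DEST) else 0
        if size != last_size:
            last_size, last_change = size, time.time()
        elif time.time() - last_change > STALL_SECONDS:
            proc.kill()  # stalled: kill scp so we can retry via another node
            break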
Thanks
How can we call CLI executable commands using Python?
For example, I have 3 Linux servers at a remote location, and I want to execute some commands on those servers, like finding the version of the operating system or executing other commands. How can we do this in Python? I know this is done through some sort of web service (SOAP or REST) or API, but I am not sure. Could you all please guide me?
It depends on how you want to design your software.
You could run stand-alone scripts as servers listening for requests on specific ports,
or you could use a webserver that runs Python scripts, so you just have to access a URL.
REST is one option for implementing the latter.
You should then look for frameworks for REST development with Python, or, if it's simple logic with not many possible requests, you can do it on your own as a web-script, as in the sketch below.
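A sketch of that do-it-yourself web-script option, using only the standard library (the /version route and the uname command are illustrative assumptions):

    import json
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class CommandHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/version":
                # Run a local command and return its output as JSON.
                out = subprocess.run(["uname", "-a"], capture_output=True, text=True)
                body = json.dumps({"version": out.stdout.strip()}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    # One such server would run on each remote box; clients just hit the URL.
    HTTPServer(("0.0.0.0", 8000), CommandHandler).serve_forever()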
Maybe you should take a look at Pushy, which allows you to connect to remote machines through SSH and have them execute various Python functions. I like using it because there are no server-side dependencies except the SSH server and a Python interpreter, and it is therefore really easy to deploy.
Edit: But if you wish to code this yourself, I think SOAP is a nice solution; the SOAPpy module is great and very easy to use.
You can use Twisted; it makes it easy to create SSH clients or servers.
Examples:
http://twistedmatrix.com/documents/current/conch/examples/