Identifying when scp is stalling using Python

Other Q&As on Stack Overflow have already addressed how to launch scp from Python using its standard library. How do I determine that an scp transfer is stalling, using Python, so that I can react to it?
Context
I have access to Python 2.6 and its standard library. I am unable to use additional packages in my working environment. Rsync is also unavailable and we're forced to use scp (no ftp either).
I'm trying to write a script that identifies a stalled transfer, kills it, and restarts the copy from a different node.
Thanks
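
A minimal sketch of one way to do this with only the standard library, assuming the copy is toward the local machine so progress can be measured as growth of the destination file (paths and node names below are hypothetical):

    import os
    import subprocess
    import time

    STALL_TIMEOUT = 60   # seconds with no file growth before declaring a stall
    POLL_INTERVAL = 5

    def scp_with_stall_detection(remote_spec, local_path):
        """Run scp; return True on success, False if the transfer stalled."""
        proc = subprocess.Popen(['scp', remote_spec, local_path])
        last_size = -1
        last_change = time.time()
        while proc.poll() is None:
            time.sleep(POLL_INTERVAL)
            if os.path.exists(local_path):
                size = os.path.getsize(local_path)
            else:
                size = 0
            if size != last_size:
                last_size = size
                last_change = time.time()
            elif time.time() - last_change > STALL_TIMEOUT:
                proc.kill()          # Popen.kill() is new in Python 2.6
                proc.wait()
                return False
        return proc.returncode == 0

    # hypothetical node list: on a stall, retry the copy from the next node
    for node in ['node1', 'node2', 'node3']:
        if scp_with_stall_detection('user@%s:/data/big.tar' % node, '/tmp/big.tar'):
            break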

Related

Firebase Cloud Functions running a python script - needs dependencies

I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to pass data between NodeJS and my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works. I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the nodejs runtime targeted by the Firebase CLI doesn't have everything you need to run python programs.
If you need to run a function that's primarily written in python, you should not use the Firebase CLI; instead, use the Google Cloud tools to target the python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run python in Cloud Functions.
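
For what it's worth, copying site-packages by hand tends to fail for packages like numpy because they ship compiled, platform-specific extensions. In the python runtime you instead list dependencies in a requirements.txt that is installed at deploy time. A minimal sketch (the function name, request fields, and runtime version are hypothetical):

    # main.py -- deployed alongside a requirements.txt listing e.g. numpy
    import json
    import numpy as np

    def run_algorithm(request):
        # In the python runtime, HTTP functions receive a Flask request.
        data = request.get_json()
        result = np.mean(data['values'])
        return json.dumps({'result': float(result)})

    # Deployed with something like:
    #   gcloud functions deploy run_algorithm --runtime python37 --trigger-http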

How to avoid 'paramiko' which restricts me to limited python usage on remote machine?

I have a distributed system to test (e.g. Hadoop), so my test cluster has 10 to 20 machines. We have developed a test automation suite which we trigger from outside the test cluster. Because the system is remote, we cannot leverage Python modules other than 'paramiko' for remote calls; as a result, we always issue Linux commands as part of test execution.
What shall I do to leverage different python modules on remote machine?
Is python not meant for the distributed system?
paramiko definitely won't help you leverage the Python modules present on remote machines.
Python is definitely meant for distributed systems. For this, one solution I can think of is to use a Remote Procedure Call (RPC) mechanism.
There are several Python modules with which you can achieve RPC (a minimal XML-RPC sketch follows the links below):
XML-RPC (In-Built)
RPyC
PyRo
Stackoverflow link with multiple Python RPC solutions.
Some links comparing the three:
Pyro and RPyC
XML-RPC vs PyRo
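
As a minimal sketch of the built-in XML-RPC route (host name, port, and the exposed function are hypothetical; module names are the Python 2 ones):

    # on each remote machine: expose Python functions over XML-RPC (stdlib only)
    import os
    from SimpleXMLRPCServer import SimpleXMLRPCServer

    def disk_free_mb(path):
        st = os.statvfs(path)
        return (st.f_bavail * st.f_frsize) // (1024 * 1024)

    server = SimpleXMLRPCServer(('0.0.0.0', 8000))
    server.register_function(disk_free_mb)
    server.serve_forever()

The machine driving the tests then calls it like a local function:

    import xmlrpclib
    proxy = xmlrpclib.ServerProxy('http://node1:8000/')
    print proxy.disk_free_mb('/')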

Forbid Python from writing anything to disk

Are there any command-line options or configurations that forbid Python from writing to disk?
I know I could patch the built-in open, but that doesn't sound very safe.
I've hosted some Python tutorials I wrote myself on my website for friends who want to learn Python, and I want them to have access to a Python console so they can try as they learn. This is done by creating a Python subprocess from the http server.
However, I do not want them to accidentally or intentionally damage my server, so I need to forbid the Python process from writing anything to disk.
Also I'm running the server on Ubuntu Linux so doing it Python-wise or system-wise are both OK.
I doubt there's a way to do this in the interpreter itself: there are way too many things to patch (open, subprocess, os.system, file, and probably others). I'd suggest looking into a way of containerizing the python runtime via something like Docker. The containerization gives some guarantees restricting access, though not as much as virtualization. See here for more discussion about the security implications.
Running a jupyter/ipython notebook in the docker container would probably be the easiest way to expose a web-frontend. jupyter provides a collection of docker containers for this purpose: see https://github.com/jupyter/tmpnb and https://github.com/jupyter/docker-stacks
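
To make that concrete, here is a minimal sketch of how the HTTP server might spawn the interpreter inside a read-only container instead of as a bare subprocess (the image tag and flags are one plausible configuration, not a hardened setup):

    import subprocess

    def spawn_sandboxed_python():
        cmd = [
            'docker', 'run', '--rm', '-i',
            '--read-only',        # container filesystem mounted read-only
            '--tmpfs', '/tmp',    # writable scratch space lives in memory only
            '--network', 'none',  # no network access from inside the sandbox
            'python:2.7', 'python', '-i',
        ]
        return subprocess.Popen(cmd, stdin=subprocess.PIPE,
                                stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT)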

Odd message and processing hangs

I have a large project that runs on an application server. It does pipelined processing of large batches of data and works fine on one Linux system (the old production environment) and one Windows system (my dev environment).
However, we're upgrading our infrastructure and moving production to a new Linux system based on the same image used for the existing production system (we use AWS). The Python version (2.7) and libraries should therefore be identical; we're also verifying this ourselves using file hashes.
Our issue is that when we attempt to start processing on the new server, a strange message, "Removing descriptor: [some number]", is written to standard out, after which the server hangs. I cannot duplicate this on the dev machine.
Has anyone ever encountered behavior like this in python before? Besides modules in the python standard library we are also using eventlet and beautifulsoup. In the standard library we lean heavily on urllib2, re, cElementTree, and multiprocessing (mostly the pools).
wberry was correct in his comment: I was running into a max-descriptors-per-process issue. The limit seems highly dependent on the operating system. Reducing the size of the batches each process handled to below the file descriptor limit of the process solved the problem.
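
For anyone hitting the same thing, the per-process descriptor limit can be inspected (and raised, up to the hard limit) from the standard library; a quick sketch:

    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print 'fd limit: soft=%d hard=%d' % (soft, hard)

    # Keep the number of simultaneously open sockets/files per process
    # below `soft`, or raise the soft limit toward the hard limit:
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))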

Kerberos authentication with python

I need to write a script in Python to check a webpage which is protected by Kerberos. Is it possible to do this from within Python, and how? The script is going to be deployed on a Linux environment with Python 2.4.something installed.
There's python-krbV, and most Linux distributions also have a python-kerberos package; Debian, for example, has one of the same name. Here's the documentation on it.
Extract from link:
"This Python package is a high-level wrapper for Kerberos (GSSAPI)
operations. The goal is to avoid having to build a module that wraps
the entire Kerberos.framework, and instead offer a limited set of
functions that do what is needed for client/server Kerberos
authentication based on http://www.ietf.org/rfc/rfc4559.txt. "
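
A minimal sketch of using functions from that package to fetch a SPNEGO-protected page with urllib2 (the host and URL are hypothetical, error handling is omitted, and this assumes a valid ticket already exists in the credential cache):

    import urllib2
    import kerberos

    host = 'protected.example.com'           # hypothetical host
    _, ctx = kerberos.authGSSClientInit('HTTP@' + host)
    kerberos.authGSSClientStep(ctx, '')      # produce the initial token
    token = kerberos.authGSSClientResponse(ctx)

    req = urllib2.Request('http://%s/secret/page' % host)
    req.add_header('Authorization', 'Negotiate ' + token)
    print urllib2.urlopen(req).read()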
