I was trying to automate getting/copying a file from a pod to the jump node of an OpenShift cluster (without using the rsync command). I tried to find a suitable Ansible module for this but failed, and I also tried to write a Python script, but I'm not sure how I would manage the authentication part. If you know of any example repo/website where I can get an idea of how to write a Python script for this, or any official Ansible module suitable for this task, please let me know. Thanks.
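For reference, here is a rough sketch of the kind of thing I'm imagining with the official kubernetes Python client, exec-ing cat inside the pod and writing the output to the jump node (the pod, namespace, and path names are placeholders, and it still assumes an existing kubeconfig/token for authentication; I've also seen a kubernetes.core.k8s_cp module mentioned in the kubernetes.core collection, which may be worth checking):

# Sketch: read a file out of a pod via exec (placeholder names; assumes a valid kubeconfig).
from kubernetes import client, config
from kubernetes.stream import stream

config.load_kube_config()  # or load_incluster_config() / token-based configuration
v1 = client.CoreV1Api()

# Run `cat` inside the pod and capture its stdout.
contents = stream(
    v1.connect_get_namespaced_pod_exec,
    "my-pod", "my-namespace",
    command=["cat", "/path/in/pod/file.txt"],
    stderr=True, stdin=False, stdout=True, tty=False,
)

# Write the captured text to the jump node (binary files would need e.g. base64 encoding).
with open("/tmp/file.txt", "w") as f:
    f.write(contents)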
I would like to create a self-contained .exe file that launches a JupyterLab server as an IDE on a physical server that doesn't itself have Python installed.
The idea is to deploy it as part of an ETL workflow tool, so that it can be used to view notebooks that will contain the ETL steps in a relatively easily digestible format (the notebooks will be used as pipelines via papermill and scrapbook - not really relevant here).
While I can use PyInstaller to bundle JupyterLab as a package, there isn't a way (that I can see) to launch it on the Python-less server, and I can't figure out a way to do it using Python code alone.
Is it possible to package JupyterLab this way so that I can run the .exe on the server and then connect to 127.0.0.1:8888 on the server to view a notebook?
I have tried using the link below as a starting point, but I think I'm missing something, as no server seems to start using this code alone, and I'm not sure how I would execute this via a Tornado server etc.:
https://gist.github.com/bollwyvl/bd56b58ba0a078534272043327c52bd1
I would really appreciate any ideas, help, or somebody to tell me why this idea is impossible madness!
Thanks!
Phil.
P.S. I should add that Docker isn't an option here :( I've done this before using Docker and it's extremely easy.
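To make the goal concrete, the kind of entry-point script I'm hoping to bundle with PyInstaller looks roughly like this (a sketch only; the exact module path and the data files PyInstaller needs vary between JupyterLab versions):

# launcher.py - sketch of a PyInstaller entry point (untested; module paths differ by JupyterLab version)
from jupyterlab.labapp import LabApp

if __name__ == "__main__":
    # Equivalent to running `jupyter lab --ip=127.0.0.1 --port=8888 --no-browser`.
    LabApp.launch_instance(argv=["--ip=127.0.0.1", "--port=8888", "--no-browser"])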
So I'm trying to get my IntelliJ to see the Apache Airflow that I downloaded. The steps I've taken so far:
I've downloaded the most recent Apache Airflow release and saved Apache Airflow 2.2.3 onto my desktop. I'm trying to get it to work with my IntelliJ: I've tried adding the Apache Airflow folder under both Libraries and Modules, and both attempts come back with errors stating it's not being utilized. I've tried looking up documentation within Airflow, but I can't find anything on how to set it up in your own IDE so you can write Python scripts for DAGs and other items.
How would I go about doing this? I'm at a complete loss as to how to get IntelliJ to register Apache Airflow as a library for Python code so I can write DAG files correctly within the IDE itself.
Any help would be much appreciated as I've been stuck on this aspect for the past couple of days searching for any kind of documentation to make this work.
Airflow is both an application and a library. In your case you are not trying to run the application, only to write DAGs, so you need it just as a library.
You should just create a virtual environment (preferably) and run:
pip install apache-airflow
Then you can write DAGs using the library, and IntelliJ will let you know if you are using wrong imports or deprecated objects.
When your DAG file is ready, deploy it to the DAG folder on the machine where Airflow is running.
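For example, a minimal DAG that IntelliJ can then inspect and autocomplete against might look like this (a sketch using Airflow 2.x imports; the dag_id, schedule, and task are arbitrary):

# Minimal example DAG (arbitrary names and schedule, just to show the imports IntelliJ resolves).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_dag",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    say_hello = BashOperator(task_id="say_hello", bash_command="echo hello")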
I am using the Docker Python SDK (docker-py) to create a script that allows starting one or multiple containers (depending on a program argument, e.g. script.py --all or script.py --specific_container), and it has to be possible to start each container with its own configuration (image, container_name, etc.), just like in typical docker-compose.yml files.
So basically, I'm trying to do the same thing docker-compose does, just with the Python Docker SDK.
I've read that some people try to stick with docker-compose by calling it via subprocess, but that is not recommended and I would like to avoid it.
I am searching for existing libraries for this, but I haven't found anything just yet. Do you know of anything I could use?
Another option would be to store configuration files for the "specific_container" profiles and for the "all" profile as JSON, parse them, and use them to populate the Docker SDK's containers.run method, which lets you pass all the options that you can also set in the docker-compose file (rough sketch below).
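To illustrate that JSON idea, something along these lines is what I have in mind (a rough, untested sketch; the file name and keys are made up and simply mirror keyword arguments accepted by containers.run):

# Sketch: start containers from JSON "profiles" (hypothetical file and keys mirroring containers.run kwargs).
import json
import docker

client = docker.from_env()

with open("containers.json") as f:  # hypothetical config file
    profiles = json.load(f)  # e.g. {"all": [{...}, {...}], "specific_container": [{...}]}

def start_profile(name):
    for cfg in profiles[name]:
        # cfg might look like {"image": "redis:7", "name": "cache", "ports": {"6379/tcp": 6379}}
        client.containers.run(detach=True, **cfg)

start_profile("all")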
Maybe someone knows another, better solution?
Thanks in advance guys.
I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works; I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the nodejs runtime targeted by the Firebase CLI doesn't have everything you need to run python programs.
If you need to run a function that's primarily written in python, you should not use the Firebase CLI and should instead use the Google Cloud tools to target the python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it will be the right way to run python in Cloud Functions.
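For instance, in the python runtime an HTTP-triggered function is just a module-level function that receives the request (a sketch; the function name, file name, and deploy flags are only illustrative):

# main.py - sketch of a minimal HTTP function for the Cloud Functions python runtime.
# Deploy with something like:
#   gcloud functions deploy my_function --runtime python39 --trigger-http

def my_function(request):
    """HTTP entry point; `request` is a flask.Request."""
    data = request.get_json(silent=True) or {}
    name = data.get("name", "world")
    return f"hello {name}"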
I have an image in my Firebase account, and I want to run a python script on this image and get a result back.
Can someone suggest me a simple way to do this?
I tried hosting the python file on Heroku, but fetching the image from Firebase to Heroku and running the python script would be an overhead.
Is there a simpler way to run the python script in Firebase itself?
There are a handful of Python wrappers for Firebase, but some have not been updated in a while. Try this Getting Started with Pyrebase, then try this tutorial for Getting Started on Heroku with Python.
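As a rough illustration of the fetch-then-process flow (shown here with the official firebase-admin SDK rather than Pyrebase; the bucket name, key file, and object paths are placeholders):

# Sketch: download an image from Firebase Storage, process it, and upload a result.
# Placeholder bucket and paths; firebase-admin needs a service-account key.
import firebase_admin
from firebase_admin import credentials, storage

cred = credentials.Certificate("serviceAccountKey.json")  # placeholder key file
firebase_admin.initialize_app(cred, {"storageBucket": "my-project.appspot.com"})

bucket = storage.bucket()
bucket.blob("images/input.jpg").download_to_filename("/tmp/input.jpg")  # placeholder object path

# ... run the python script / image processing on /tmp/input.jpg here ...

bucket.blob("results/output.jpg").upload_from_filename("/tmp/output.jpg")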