Prevent users from importing the os module in a Jupyter notebook - python

I am trying to set up a Jupyter notebook server so that a few members can have access and run analyses on it. But there are several API credentials stored as environment variables that I don't want users to have access to. Basically, I want to prevent users from importing the os module in the notebook, since os.environ lists all environment variables on the server. What would be a proper way to do this?
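For example, anyone using the notebook could read a credential with nothing more than this (MY_API_KEY standing in for one of the real variable names):

import os

print(os.environ.get("MY_API_KEY"))  # a single credential
print(dict(os.environ))              # or simply dump every variable on the server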

You could try running the Jupyter notebook server as a Docker container. That way, the host's environment variables are isolated from the container and won't be visible to notebook users. IPython/Jupyter has an official Docker image available, so you only need to install Docker if this approach works for you.
Installing Ipython Docker Image
If you need to pass environment variables to the Docker container, refer to this question: Passing env variables to docker
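As an example, here is a minimal sketch using the Docker SDK for Python that starts a Jupyter container which only sees the variables you explicitly pass in (the image name and variable are placeholders; a plain docker run -e on the command line works the same way):

import docker  # the Docker SDK for Python: pip install docker

client = docker.from_env()
client.containers.run(
    "jupyter/base-notebook",               # placeholder image name
    environment={"SHARED_VAR": "value"},   # only what you list here exists inside
    ports={"8888/tcp": 8888},
    detach=True,
)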

Related

Set environment variables for FastAPI on Cloud Run

I'm trying to set environment variables using Docker (for FastAPI), but Cloud Run doesn't seem to see them. I have tried many solutions; what would be a good way? Note that I use the Docker image on Cloud Run.
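For reference, a minimal sketch of the kind of lookup that comes back empty when no variable is configured on the service (the names are placeholders):

import os
from fastapi import FastAPI

app = FastAPI()

@app.get("/debug/env")
def debug_env():
    # False unless API_KEY is set in the container's runtime environment
    return {"api_key_set": os.getenv("API_KEY") is not None}

Note that variables passed only to a local docker run or docker-compose do not carry over to Cloud Run; they have to be set on the Cloud Run service itself or baked into the image.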

installing python in a docker container

I'm new to coding and have been fiddling around with Docker containers and services.
I have installed a temporary VS Code server on my Raspberry Pi and deployed it on my local LAN so I can access it from various machines.
Now I've been trying to create a Flask app and run it from the container, and I'm trying to figure out how to publish and run the Flask web server, since I can't work out which IP I should host it on (the default I always used was host=127.0.0.1, port=8080, but that would point me to the local machine I'm visiting it from).
While I was troubleshooting to understand what to do with exposed ports etc., I stopped the container and changed the docker-compose file (I have a path set for the config's permanent storage, so my VS Code settings are actually saved and persist between deployments).
But I'm having the problem that every time I stop and redeploy the container I lose my Python 3 installation, and I have to re-run apt update, apt upgrade, apt install python3-pip, and every Python package I need for the project.
Where am I going wrong?
Silly question, but where does Python get installed, and why isn't it persistent since I have my config path set? (A quick check is sketched below.)
I read that Python gets installed in /usr/local/lib; should I also map those directories to the persistent storage folder?
How should I do that?
Thanks
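A quick way to check where the interpreter and its packages actually live is to run something like this inside the container (the exact paths depend on the base image):

import sys
import sysconfig

print(sys.executable)                    # path of the interpreter itself
print(sysconfig.get_paths()["purelib"])  # where pip-installed packages end up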

Will my virtual environment folder still work if I shift it from my local machine to a shared folder?

Currently, I have a Flask app waiting to be shifted to a shared drive on the company's local network. My plan is to create a virtual environment inside a folder locally and install all the necessary dependencies like Python 3, Flask, Pandas, etc. This is to ensure that my Flask app can still reference the dependencies it needs to run.
As I can't access the shared drive via the command prompt, my plan is to create the virtual environment locally and then shift it to the shared folder together with all the scripts required by my Flask app. Will the app be able to run on the shared drive in this manner?
The standard virtualenv hardcodes the path of the environment into some of the files it creates, and it stops working if you rename or move it. So no, you'll probably have to recreate the env on the destination server.
Perhaps your app's startup script could take care of creating the env if it's missing.
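If you go that route, here is a minimal sketch of such a startup check (the .venv directory and requirements.txt names are assumptions):

# startup_env.py - recreate the environment next to the app if it is missing
import os
import subprocess
import venv

BASE = os.path.dirname(os.path.abspath(__file__))
ENV_DIR = os.path.join(BASE, ".venv")                 # assumed env location
REQUIREMENTS = os.path.join(BASE, "requirements.txt")

if not os.path.isdir(ENV_DIR):
    venv.EnvBuilder(with_pip=True).create(ENV_DIR)
    pip = os.path.join(ENV_DIR, "Scripts" if os.name == "nt" else "bin", "pip")
    subprocess.check_call([pip, "install", "-r", REQUIREMENTS])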

vscode run / debug python in docker instance

I'm using Jupyter notebooks to prototype, and I write the majority of my code as Python packages using VS Code, installed like so:
pip install -e .
This works well, as I can rapidly prototype in Jupyter but still maintain reusable/testable code by keeping most of the heavy lifting in the package(s).
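For context, an editable install only needs a minimal packaging file at the project root; a sketch with a placeholder package name:

# setup.py - minimal sketch; "mylib" is a placeholder, not the real package name
from setuptools import find_packages, setup

setup(
    name="mylib",
    version="0.1.0",
    packages=find_packages(),
)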
I'd like to move my python/jupyter environment to docker. Is there any way to configure vscode to work well with a "remote" development environment running in a docker container?
Since May 2019 (version 1.35), the VS Code Remote Development feature has been available in the stable release. It splits VS Code in two:
a server part that runs on a remote computer, in a container, or in a WSL environment
a client part, mainly the GUI, that runs locally
When properly configured, debugging/linting/... operations are executed inside the container. To answer your specific question, you can get a debugging experience identical to that of an uncontainerized setup.
See here for a quick overview of this feature. You can find a VS Code-issued tutorial on how to set up VS Code with Docker here.
If you expose the Jupyter instance running in the container to your machine, you may be able to specify it as a remote Jupyter server.

Pass %%local variables from dotenv run in jupyter to Azure HDInsight pyspark cluster

Intro
This link details how to install Jupyter locally and work against an Azure HDInsight cluster. This works well for getting things set up.
However:
Not all Python packages that we have available locally are available on the cluster.
Some local processing may need to be done before 'submitting' a cell to the cluster.
I'm aware that Python packages that are not installed can be installed via script actions and %%configure; however, given our use of dotenv locally, these don't seem to be viable solutions.
Problem
Source control with git: Git repos are local on dev machines.
We store configuration/sensitive environment variables in .env files locally (they are not checked into git).
The dotenv package is used to read sensitive variables and set them locally for execution.
Blob storage account names and keys are examples of these variables.
How can these locally set variables be passed to a pyspark cell?
Local cell example
Followed by pyspark cell
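A hedged sketch of such a pair of cells, assuming the python-dotenv package is installed locally and that sparkmagic provides the %%local and %%send_to_spark magics (the variable names are placeholders):

%%local
from dotenv import load_dotenv
import os
load_dotenv()  # reads the untracked .env file in the working directory
storage_account = os.getenv("BLOB_ACCOUNT_NAME")

A follow-up local cell can then copy the value into the remote Spark session, if your sparkmagic version ships this magic:

%%send_to_spark -i storage_account -t str -n storage_account

After that, an ordinary pyspark cell on the cluster can reference storage_account directly.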
