Connection killed Azure container - python

I am trying to access an Azure container to download some blobs with my Python code.
My code works perfectly on Windows, but when I execute it on my Debian VM I get this error message:
<azure.storage.blob._container_client.ContainerClient object at 0x7f0c51cafd10>
Killed
admin_bbt#vm-bbt-cegidToAZ:/lsbCodePythonCegidToAZ/fuzeo_bbt_vmLinux_csvToAZ$
The blob I am trying to access is not mine, but I do have the SAS key.
My code fails after this line:
container = ContainerClient.from_container_url(sas_url)
What I have tried:
moving my VM to another location
opening port 445 on my VM
installing cifs-utils
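
For reference, here is a minimal version of the flow I am attempting (account, container, and blob names are placeholders):

from azure.storage.blob import ContainerClient

# Placeholder SAS URL; the real one points to a container I do not own
sas_url = "https://<account>.blob.core.windows.net/<container>?<sas-token>"

# This is the line after which the process is killed on Debian
container = ContainerClient.from_container_url(sas_url)

# Intended usage: list the blobs and download each one
for blob in container.list_blobs():
    downloader = container.download_blob(blob.name)
    with open(blob.name, "wb") as f:
        f.write(downloader.readall())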

Usually this issue occurs when the VM has not been enabled for managed identities for Azure resources. These MS Docs helped me enable it successfully (MSDocs1, MSDocs2).
We also need to check the storage account's network access rules, as below:
Go to the storage account you want to secure.
Select the settings menu called Networking.
To deny access by default, choose to allow access from Selected networks. To allow traffic from all networks, choose to allow access from All networks.
Select Save to apply your changes.
Along with these setting changes, we need to ensure users can access blob storage; you might also need to add VNet integration.
Check this MS Docs page for an understanding of Azure Storage firewall rules.
We can use MSI to authenticate from the VM.
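As a minimal sketch (assuming the azure-identity and azure-storage-blob packages, a placeholder account URL, and that the VM's identity has been granted a role such as Storage Blob Data Reader), MSI authentication from the VM could look like this:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# DefaultAzureCredential picks up the VM's managed identity automatically
credential = DefaultAzureCredential()

# Placeholder account URL; replace with the target storage account
service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=credential,
)

container = service.get_container_client("<container>")
for blob in container.list_blobs():
    print(blob.name)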

Related

How to simulate AWS services and env locally using Python moto?

Is it practically possible to simulate an AWS environment locally using Moto and Python?
I want to write an AWS Glue job that will fetch records from my local database and upload them to an S3 bucket for data quality checks, and later trigger a Lambda function for a cron-job run, using the Moto library's moto.lock_glue decorator. Any suggestion or document would be highly appreciated, as I don't see much of a clue on this. Thank you in advance.
AFAIK, moto is meant to patch boto modules for testing.
I have experience working with LocalStack, a Docker container you can run locally that acts as a live service emulator for most AWS services (some are only available to paying users).
https://docs.localstack.cloud/getting-started/
You can see here which services are supported by the free version.
https://docs.localstack.cloud/user-guide/aws/feature-coverage/
In order to use it, you need to change the endpoint URL to point to the local service running in Docker.
As it's a Docker container, you can incorporate it into remote tests as well, e.g. if you're using k8s or a similar orchestrator.
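As a minimal sketch (assuming LocalStack's default edge port 4566; the bucket and object names are made up, and the dummy credentials are required by boto3 but not validated by LocalStack):

import boto3

# Point the client at LocalStack instead of the real AWS endpoint
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",
    region_name="us-east-1",
    aws_access_key_id="test",
    aws_secret_access_key="test",
)

# Exercise the emulated service exactly as you would the real one
s3.create_bucket(Bucket="data-quality-check")
s3.put_object(Bucket="data-quality-check", Key="records.csv", Body=b"id,value\n1,42\n")
print(s3.list_objects_v2(Bucket="data-quality-check").get("Contents", []))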

Is it safe to work with confidential data in Colab?

After having worked with it for a while, I would like to understand how Colab really works and whether it is safe to work with confidential data in it.
A bit of context. I understand the differences between Python, IPython and Jupyter Notebook described here, and I would summarize them by saying: Python is a programming language and can be installed like any other application (with sudo apt-get). IPython is an interactive command-line terminal for Python and can be installed with pip, the standard package manager for Python, which lets you install and manage additional packages written in Python that are not part of the Python standard library. Jupyter Notebook adds a web interface and can use several kernels or backends, IPython being one of them.
What about Colab? It is my understanding that when using Colab, I get a VM from Google with Python pre-installed, as well as many other libraries (aka packages) like pandas or matplotlib. These packages are all installed in the base Python installation.
Colab VMs come with some ephemeral storage. This is equivalent to instance storage in AWS, so it will be lost when the VM runtime is interrupted, i.e. our VM is stopped (or would you rather say...terminated?) by Google. I believe that if I were to upload my confidential data there, it would not be in my private subnet...
Mounting our Drive is hence the equivalent of using an EBS volume in AWS. An EBS volume is a network-attached drive, so the data in it will persist even if the VM runtime is interrupted. EBS volumes can, however, be attached to only one EC2 instance... but I can mount my Drive to several Colab sessions. It's not exactly clear to me what these sessions are...
Some users would like to create virtual environments in Colab and it looks like mounting the drive is a way to get around it.
When mounting our Drive to Colab, we need to authenticate because we are giving the IP of the Colab VM access to our private subnet. Hence, if we had some confidential data, by using Colab the data would not be leaving our private company subnet...?
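For reference, the mount-and-authenticate step I am describing is the standard one (the file path is a placeholder; on older runtimes the folder may be named "My Drive"):

from google.colab import drive

# Triggers an authorization flow for the Google account, not an IP check
drive.mount('/content/drive')

# After mounting, Drive files appear under a local filesystem path
with open('/content/drive/MyDrive/example.txt') as f:
    print(f.read())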
IIUC, the question being asked is: "Can I use IP-based authentication to restrict access to data in Colab?"
The answer is no: network address filtering cannot provide meaningful access restrictions in Colab.
Colab is a service rather than a machine. Colab backends do not have fixed IP addresses or a fixed IP address range. By analogy, there's no list of IP addresses for restricting access to a particular set of Google Drive users since, of course, Google Drive users don't have a fixed IP address. Colab users and backends are similar.
Instead of attempting to restrict access to IPs, you'll want to restrict access to particular Google accounts, perhaps using typical Drive file ACLs.
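As a hedged sketch of that account-based approach (assuming the google-api-python-client package; the credentials file, file ID, and email address are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder service-account key with Drive scope
creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/drive'],
)
service = build('drive', 'v3', credentials=creds)

# Grant read access to one specific Google account instead of an IP
service.permissions().create(
    fileId='your-file-id',
    body={'type': 'user', 'role': 'reader', 'emailAddress': 'colleague@example.com'},
).execute()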

appengine set up local host with datastore for testing

I have tried to follow Google's documentation on how to set up local development using a database (https://cloud.google.com/appengine/docs/standard/python/tools/using-local-server#Python_Using_the_Datastore). However, I do not have the experience level to follow along, and I am not even sure if that was the right guide. The application is a Django project that uses Python 2.7. To run the local host, I usually type dev_appserver.py --host 127.0.0.1.
My questions are:
How do I download the Datastore database from Google Cloud? I do not want to download the entire database, just enough data to populate localhost so I can run tests.
Once the database is downloaded, what do I need to do to connect it to localhost? Do I have to change a parameter somewhere?
Do I need to download the Datastore at all? Can I just make a duplicate in the cloud and then connect to that Datastore?
When I run localhost, should it not already be connected to the Datastore, since the site works when it is running in the cloud? Where can I find the connection URI?
Thanks for the help
The development server is meant to simulate the whole App Engine environment. If you examine the output of the dev_appserver.py command you'll see something like Starting Cloud Datastore emulator at: http://localhost:PORT. Your code will interact with that bundled Datastore automatically, pushing and retrieving data according to the code you wrote. Your data will be saved to a file in local storage and will persist across different runs of the development server unless it's explicitly deleted.
This option doesn't provide facilities to import data from your existing Cloud Datastore instance, although it's a ready-to-go solution if your testing procedures can afford populating the local database with mock data through a custom script that does so programmatically. If you decide on this approach, just write the data creation script and execute it before running the tests.
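For example, a minimal seeding sketch (the Report model and values are made up; run it inside the development server, e.g. from a temporary handler or the interactive console at localhost:8000, so the writes go to the bundled emulator):

from google.appengine.ext import ndb

class Report(ndb.Model):
    title = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

def seed():
    # Under dev_appserver.py these puts land in the local Datastore
    for i in range(10):
        Report(title='mock-report-%d' % i).put()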
Now, there is another option to simulate a local Datastore using the Cloud SDK, which comes with handy features for your purposes. You can find the available information under the Running the Datastore Emulator documentation page. This emulator has support for importing entities downloaded from your production Cloud Datastore, as well as for exporting them into files.
Back to your questions:
Export data from the Cloud instance into a GCS bucket following this, then download the data from the bucket to your filesystem following this, and finally import the data into the emulator with the command shown here (see the sketch after this list).
To use the emulator you need to first run gcloud beta emulators datastore start in a Cloud Shell and then in a separate tab run dev_appserver.py --support_datastore_emulator=true --datastore_emulator_port=8081 app.yaml.
The development server uses one of the two aforementioned emulators; in both cases it is not connected to your Cloud Datastore. You might instead create another project for development purposes with a copy of your database and deploy your application there, so you don't use the emulator at all.
Requests to Datastore are made through the endpoint https://datastore.googleapis.com/v1/projects/project-id, although this is not related to how the emulators manage connections in your local server.
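The import in step 1 is just an HTTP call against the emulator; a sketch in Python (the emulator port, project ID, and export path are placeholders):

import requests

# Path to the downloaded export's metadata file
payload = {'input_url': '/home/user/export/export.overall_export_metadata'}

# The emulator exposes an import endpoint on its local port
resp = requests.post(
    'http://localhost:8081/v1/projects/project-id:import',
    json=payload,
)
resp.raise_for_status()
print(resp.json())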
Hope this helps.

How to create a connection with google cloud instance with python code?

I'm working entirely in Python and need to connect my Cloud SQL instance to my Python project (a sort of software). What I need is to make the connection using only Python, without cloud_sql_proxy, so that the client does not need to install the Google Cloud SDK.
I used cloud_sql_proxy; I need a way to do this without the Google SDK:
cloud_sql_proxy -instances=Instance-Name:tcp:3306
I expect that, without installing the Google SDK and by using only Python, the client can access the database.
If you really need to do this:
Expose your Cloud SQL instance to the IP address that the Python code runs on. Do this in the console under Cloud SQL -> Connections -> Authorized networks.
Connect via the IP address of the instance using your chosen database connection tool. Looking at your snippet, you are using Postgres, so I would suggest psycopg2.
Otherwise, if your Python code is also running in GCP, you can use the internal IP (provided that they are in the same network).
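A minimal sketch with psycopg2 (all connection values are placeholders; remember to add the client's IP under Authorized networks first):

import psycopg2

# Host is the instance's public IP, or the internal IP from within GCP
conn = psycopg2.connect(
    host="203.0.113.10",
    port=5432,
    dbname="mydb",
    user="postgres",
    password="secret",
)

cur = conn.cursor()
cur.execute("SELECT version();")
print(cur.fetchone())
cur.close()
conn.close()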

Python- Is it possible to upload a local file to a server

I have created a program on Elastic Beanstalk which will be used internally in my office. It is easier to host it than to run it locally behind firewalls.
On the office network, we have a server that holds reports to be processed by my program, but they are around 12 clicks away through a file prompt. I have tried using a Transport Adapter, to no avail. Is there a way to do this using a POST request without needing to actually choose the file?
Thanks for the help! (I am new at this)
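If the reports are reachable from a machine on the office network and your Beanstalk app exposes an HTTP endpoint that accepts uploads, a plain multipart POST with requests avoids the file prompt entirely; a sketch with a placeholder URL and path:

import requests

# Placeholder endpoint and report path
url = "https://my-app.elasticbeanstalk.com/upload"
path = r"\\office-server\reports\report.xlsx"

# Standard multipart/form-data upload; no file dialog involved
with open(path, "rb") as f:
    resp = requests.post(url, files={"file": f})
resp.raise_for_status()
print(resp.status_code)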
