Azure - Reading local files from Static Web Page - python

I am trying out the Static Web Apps service from Azure, but I need to access a list of files (for example, images stored on the server) from the client side. I am trying to implement a Function that iterates over the files in a particular folder and returns a list of their names.
The problem is that I simply cannot access those files. The Function can't find them from the context in which it runs, as if they were stored on another machine.
I am now using this function to print the folders and files accessible to the Python code.
This is the function I use, called GetResources:
import json
import logging
import os
import sys

import azure.functions as func

def showFolderTree(path):
    show_files = True
    indentation = 1
    tree = []
    result = ""
    if not show_files:
        for root, dirs, files in os.walk(path):
            level = root.replace(path, '').count(os.sep)
            indent = ' ' * indentation * level
            tree.append('{}{}/'.format(indent, os.path.basename(root)))
    if show_files:
        for root, dirs, files in os.walk(path):
            level = root.replace(path, '').count(os.sep)
            indent = ' ' * indentation * level
            tree.append('{}{}/'.format(indent, os.path.basename(root)))
            for f in files:
                subindent = ' ' * indentation * (level + 1)
                tree.append('{}{}'.format(subindent, f))
    for line in tree:
        result += line + "\n"
    return result

def main(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    # logging.info('Python HTTP trigger function processed a request.') --> where is this logged?
    try:
        errors = context.function_directory + "\n"
    except Exception as e:
        errors = "context error\n"
    try:
        errors += os.path.dirname(os.path.realpath(__file__)) + "\n"
        errors += os.getcwd() + "\n"
        errors += showFolderTree("/")
    except Exception as e:
        errors += str(e)
    return func.HttpResponse(errors, status_code=200)
This function returns:
/home/site/wwwroot/GetResources
/home/site/wwwroot/GetResources
/home/site/wwwroot
/
app/
.bash_logout
.bashrc
.profile
site/
wwwroot/
.funcignore
requirements.txt
proxies.json
.gitignore
host.json
GetResources/
function.json
sample.dat
__init__.py
__pycache__/
__init__.cpython-38.pyc
.python_packages/
lib/
site-packages/
azure/
functions/
...
but I cannot find my files in the list.
Observations:
The Python code runs in a Linux environment.
I tried similar code with C# in a Windows environment running on Azure.
I tried placing a folder of assets with pictures in the api folder in which the Function resides; I still could not find the files.
What am I doing wrong?
Note: I have a student subscription
Side issue: I cannot find the logs generated by the Function anywhere (the output of the logging calls).

While Azure Static Web Apps acts as a proxy to Azure Functions for APIs, the Functions themselves are hosted separately from the Static Web App and don't share its filesystem.
So, instead of pushing these files alongside the Web App, they should be stored in Blob Storage, and from the Functions you could leverage Blob Storage bindings to read/write them. To list the files in a container, you need to bind to BlobContainerClient and use a method like BlobContainerClient.GetBlobs.
Duplicate of my answer on Microsoft Q&A
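For illustration, here is a minimal Python sketch of such a Function using the azure-storage-blob SDK directly (rather than a binding). The app setting name STORAGE_CONNECTION and the container name assets are assumptions for this example, not part of the original answer:
import json
import os

import azure.functions as func
from azure.storage.blob import ContainerClient

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Connect to the container that holds the images (assumed setting and container names).
    container = ContainerClient.from_connection_string(
        os.environ["STORAGE_CONNECTION"],   # storage connection string app setting (assumed)
        container_name="assets",            # assumed container name
    )
    # List the blob names and return them as JSON.
    names = [blob.name for blob in container.list_blobs()]
    return func.HttpResponse(json.dumps(names), mimetype="application/json")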

Related

creating/deleting folders in runtime using heroku/django

I have developed a Django app where I upload a file and do some processing using a project folder named media.
Process:
The user uploads a CSV file, and the Python code processes the CSV data by creating temp folders in the media folder. After processing is complete, these temp folders are deleted and the processed data is downloaded through the browser.
I am using the lines below to make and delete the temp folder after processing:
temp = 'media/temp3'
os.mkdir(temp)
shutil.copyfile('media/' + file_name, temp + '/' + file_name)
shutil.rmtree(temp, ignore_errors=True)
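Not part of the original question, but the same make/copy/clean-up flow can be sketched with the standard-library tempfile module, so the scratch folder lives outside MEDIA_ROOT and is removed automatically even if processing fails; file_name is the uploaded file from the snippet above:
import os
import shutil
import tempfile

with tempfile.TemporaryDirectory() as temp:
    # Copy the uploaded file into the scratch folder and process it there.
    shutil.copyfile('media/' + file_name, os.path.join(temp, file_name))
    # ... process the CSV inside temp ...
# The directory and everything in it is deleted when the with-block exits.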
To set the media root, I use the lines below in settings.py; I am sure they are not used in other parts of the code.
MEDIA_ROOT = os.path.join(BASE_DIR, 'media/')
MEDIA_URL = "/media/"
Everything works fine when I run the app on localhost, but as soon as I deployed it to Heroku, these folders were apparently not created or could not be found.
I am looking for:
either a solution to create, read and delete folders/files at runtime on Heroku,
or
a better way to manage files/folders at runtime.

Automating running Python code using Azure services

Hi everyone on Stackoverflow,
I wrote two Python scripts. One picks up local files and sends them to GCS (Google Cloud Storage). The other does the opposite: it takes files that were uploaded to GCS and saves them locally.
I want to automate process using Azure.
What would you recommend to use? Azure Function App, Azure Logic App or other services?
I'm now trying to use a Logic App. I made an .exe file using PyInstaller and am looking for a connector in the Logic App that will run my program (the .exe file). I have a trigger in the Logic App ("When a file is added or modified"), but now I am stuck selecting the next step (connector).
Kind regards,
Anna
Adding code as requested:
from google.cloud import storage
import os
import glob
import json

# Finding path to config file that is called "gcs_config.json" in directory C:/
def find_config(name, path):
    for root, dirs, files in os.walk(path):
        if name in files:
            return os.path.join(root, name)

def upload_files(config_file):
    # Reading 3 Parameters for upload from JSON file
    with open(config_file, "r") as file:
        contents = json.loads(file.read())
        print(contents)

    # Setting up login credentials
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = contents['login_credentials']
    # The ID of GCS bucket
    bucket_name = contents['bucket_name']
    # Setting path to files
    LOCAL_PATH = contents['folder_from']

    for source_file_name in glob.glob(LOCAL_PATH + '/**'):
        # For multiple files upload
        # Setting destination folder according to file name
        if os.path.isfile(source_file_name):
            partitioned_file_name = os.path.split(source_file_name)[-1].partition("-")
            file_type_name = partitioned_file_name[0]
            # Setting folder where files will be uploaded
            destination_blob_name = file_type_name + "/" + os.path.split(source_file_name)[-1]
            # Setting up required variables for GCS
            storage_client = storage.Client()
            bucket = storage_client.bucket(bucket_name)
            blob = bucket.blob(destination_blob_name)
            # Running upload and printing confirmation message
            blob.upload_from_filename(source_file_name)
            print("File from {} uploaded to {} in bucket {}.".format(
                source_file_name, destination_blob_name, bucket_name
            ))

config_file = find_config("gcs_config.json", "C:/")
upload_files(config_file)
config.json:
{
    "login_credentials": "C:/Users/AS/Downloads/bright-velocity-___-53840b2f9bb4.json",
    "bucket_name": "staging.bright-velocity-___.appspot.com",
    "folder_from": "C:/Users/AS/Documents/Test2/",
    "folder_for_downloaded_files": "C:/Users/AnnaShepilova/Documents/DownloadedFromGCS2/",
    "given_date": "",
    "given_prefix": ["Customer", "Account"]
}
Currently, there is no built-in connector in Logic Apps for interacting with Google Cloud services; however, Google Cloud Storage does provide a REST API that you can call from your Logic App or Function App.
But my suggestion is to use an Azure Function for this, because a Function is more flexible for writing your own flow to do the task.
Refer to how to run your .exe file in an Azure Function, whether you are using a local EXE or an EXE in a cloud environment.
Refer here for more information.
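As a hedged illustration of that suggestion, a timer-triggered Function could call the existing upload code directly instead of packaging it as an .exe. The schedule, the module name gcs_sync and the config location are assumptions for this sketch, not anything from the question:
import logging

import azure.functions as func

# The asker's find_config/upload_files functions, assumed to be importable as a module.
from gcs_sync import find_config, upload_files

def main(mytimer: func.TimerRequest) -> None:
    logging.info("Scheduled GCS upload starting.")
    # Look for the config file in a writable data folder (assumed path).
    config_file = find_config("gcs_config.json", "/home/data")
    if config_file:
        upload_files(config_file)
    else:
        logging.warning("gcs_config.json not found.")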

Python Boto3 - how to drop a 'folder' and rename another one?

I have these paths in S3:
s3://mykey/mytest/file1.txt
s3://mykey/mytest/file2.txt
s3://mykey/mytest/file3.txt
and
s3://mykey/mytest_temp/file4.txt
s3://mykey/mytest_temp/file5.txt
s3://mykey/mytest_temp/file6.txt
Want to drop s3://mykey/mytest/ (and all files in it) and THEN rename s3://mykey/mytest_temp/ to s3://mykey/mytest/ while keeping all files in there (file4, file5, file6).
Final result should be - only 1 folder:
s3://mykey/mytest/file4.txt
s3://mykey/mytest/file5.txt
s3://mykey/mytest/file6.txt
How to achieve this using Python Boto3?
Thanks.
The AWS API only allows an operation on one object at a time. Also, there is no "move" command, so you would need to do a Copy and a Delete.
The easiest way to do what you ask is to use the AWS Command-Line Interface (CLI) because it has some higher-level commands that can do this easily:
aws s3 rm s3://mykey/mytest/ --recursive
aws s3 mv s3://mykey/mytest_temp/ s3://mykey/mytest/ --recursive
If you didn't want to use the AWS CLI, you could code this operation using boto3, but you would need to loop through each object and process it individually.
To do this purely from Python using boto3, you would need to do the following:
Deleting existing 'folder'
Call list_objects_v2(), passing in a Prefix to obtain a listing of the directory
Take the results and pass the object names into a delete_objects() call
Please note that each of these API calls handles up to 1000 objects. If you have more than 1000 objects, you will need to paginate the results by calling them again.
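A minimal boto3 sketch of those deletion steps could look like this; the bucket name mykey is taken from the question's paths and may actually be a bucket/prefix split in your setup:
import boto3

s3 = boto3.client("s3")
bucket = "mykey"   # assumed bucket name, taken from the question's paths

# list_objects_v2 via a paginator, so prefixes with more than 1000 objects also work
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix="mytest/"):
    objects = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
    if objects:
        # delete_objects accepts up to 1000 keys per call; each page holds at most 1000
        s3.delete_objects(Bucket=bucket, Delete={"Objects": objects})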
'Renaming' objects
Amazon S3 does not have a 'rename' command. Instead, it will be necessary to copy each object to a new key, then delete the original object.
Call list_objects_v2(), passing in a Prefix to obtain a listing of the directory
Loop through each object and:
Call copy_object(), specifying a full path in the destination Key
Call delete_object() after the object has been copied
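And a matching sketch for the 'rename': copy each object under the new prefix, then delete the original (same assumed bucket name as above):
import boto3

s3 = boto3.client("s3")
bucket = "mykey"   # assumed bucket name

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix="mytest_temp/"):
    for obj in page.get("Contents", []):
        old_key = obj["Key"]
        new_key = "mytest/" + old_key[len("mytest_temp/"):]
        # Copy to the new key, then remove the original object.
        s3.copy_object(Bucket=bucket, Key=new_key,
                       CopySource={"Bucket": bucket, "Key": old_key})
        s3.delete_object(Bucket=bucket, Key=old_key)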
I have a Django project where I needed the ability to rename a folder but still keep the directory structure intact, meaning empty folders would need to be copied and stored in the renamed directory as well.
The aws cli is great, but neither cp nor sync nor mv copied empty folders (i.e. keys ending in '/') over to the new folder location, so I used a mixture of boto3 and the aws cli to accomplish the task.
More or less, I find all folders in the renamed directory, then use boto3 to put them in the new location, then cp the data with the aws cli and finally remove the old prefix.
import threading
import os

from django.conf import settings
from django.contrib import messages
from django.core.files.storage import default_storage
from django.shortcuts import redirect
from django.urls import reverse

def rename_folder(request, client_url):
    """
    :param request:
    :param client_url:
    :return:
    """
    current_property = request.session.get('property')
    if request.POST:
        # name the change
        new_name = request.POST['name']
        # old full path with www.[].com?
        old_path = request.POST['old_path']
        # remove the query string
        old_path = ''.join(old_path.split('?')[0])
        # remove the .com prefix item so we have the path in the storage
        old_path = ''.join(old_path.split('.com/')[-1])
        # remove empty values, this will happen at end due to these being folders
        old_path_list = [x for x in old_path.split('/') if x != '']
        # remove the last folder element with split()
        base_path = '/'.join(old_path_list[:-1])
        # # now build the new path
        new_path = base_path + f'/{new_name}/'
        # remove empty variables
        # print(old_path_list[:-1], old_path.split('/'), old_path, base_path, new_path)
        endpoint = settings.AWS_S3_ENDPOINT_URL

        # # recursively add the files
        copy_command = f"aws s3 --endpoint={endpoint} cp s3://{old_path} s3://{new_path} --recursive"
        remove_command = f"aws s3 --endpoint={endpoint} rm s3://{old_path} --recursive"

        # get_creds() is nothing special, it simply returns the elements needed via boto3
        client, resource, bucket, resource_bucket = get_creds()

        path_viewing = f'{"/".join(old_path.split("/")[1:])}'
        directory_content = default_storage.listdir(path_viewing)

        # loop over folders and add them by default, aws cli does not copy empty ones
        # so this is used to accommodate
        folders, files = directory_content
        for folder in folders:
            new_key = new_path + folder + '/'
            # we must remove bucket name for this to work
            new_key = new_key.split(f"{bucket}/")[-1]
            # push this to new thread
            threading.Thread(target=put_object, args=(client, bucket, new_key,)).start()
            print(f'{new_key} added')

        # # run command, which will copy all data
        os.system(copy_command)
        print('Copy Done...')
        os.system(remove_command)
        print('Remove Done...')
        # print(bucket)
        print(f'Folder renamed.')
        messages.success(request, f'Folder Renamed to: {new_name}')

    return redirect(request.META.get('HTTP_REFERER', f"{reverse('home', args=[client_url])}"))
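The put_object and get_creds helpers referenced above are not shown in the answer; a minimal hypothetical version of put_object, assuming get_creds() returns a boto3 S3 client as its first element, could be:
def put_object(client, bucket, key):
    # Create a zero-byte object so the empty 'folder' key exists in S3.
    client.put_object(Bucket=bucket, Key=key)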

Error in azure functions for python whenever trying to create new directory

I am trying to create a new directory using Azure Functions for Python, but I am not able to create a new directory or file there. I get the error below.
Whenever I execute the Function locally it works fine, but not on Azure.
Error: -
Error in folder creation: [Errno 30] Read-only file system: './HttpTrigger/logs'
I am trying to create a new logs folder in the HttpTrigger function, but I get the above error.
Please check the code below:
import logging
import struct
import sys
import azure.functions as func
import os

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    try:
        if not os.path.exists('./HttpTrigger/logs'):
            logging.info('Inside Folder Creation')
            os.makedirs('./HttpTrigger/logs')
        f = open("test.txt", "w+")
        for i in range(10):
            logging.info('Inside For')
            f.write("This is line %d\r\n" % (i + 1))
        logging.info('Outside For')
        f.close()
        return func.HttpResponse("Created", status_code=200)
    except Exception as e:
        return func.HttpResponse(f"Error in folder creation : {e}", status_code=400)
Is there any way to create a new directory in azure functions for python? Please let me know if there is any way.
If you need to do some temporary file processing, Azure Functions provides a temporary directory.
temporary directory in azure functions
Here is a code snippet.
import tempfile
from os import listdir
tempFilePath = tempfile.gettempdir()
fp = tempfile.NamedTemporaryFile()
fp.write(b'Hello world!')
filesDirListInTemp = listdir(tempFilePath)
For reference.
The point of Azure Functions (and, more generally, serverless functions) is to be triggered by a specific event, execute some logic and then exit. It's not like a regular server where you have access to the file system and can read/write files. In fact, you can't even be sure it will always be executed by the same physical machine; Azure abstracts all these concepts for you (hence the name "serverless").
Now, if you really need to write files, you should have a look at Blob storage. It's a cloud-native service where you can actually download and upload files. From your Azure function, you'll have to use the Blob storage API to manipulate the files.
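For example, a minimal sketch that writes the text to a blob instead of the local filesystem; the use of the AzureWebJobsStorage connection string and the container name logs are assumptions for this example:
import os

from azure.storage.blob import BlobClient

# Build a client for one blob in an assumed "logs" container.
blob = BlobClient.from_connection_string(
    os.environ["AzureWebJobsStorage"],  # storage connection string available to the Function app
    container_name="logs",              # assumed container name
    blob_name="test.txt",
)
lines = "".join("This is line %d\r\n" % (i + 1) for i in range(10))
blob.upload_blob(lines, overwrite=True)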
Your actual app folder is run from a zip package, so it won't allow you to create folders or files. However, you should be able to create temporary directories, e.g. under the /tmp directory. That said, you shouldn't rely on them being there, and they are (as the name implies) intended to be temporary. So I would stick with @frankie567's advice on using something like Azure Storage to store artifacts you expect to pull later.
You could create a file or directory in the temp folder or in the function execution folder. Because the content of the temp folder isn't kept all the time, you could create the directory in the execution directory instead; you can get that directory through the Context binding, using function_directory. For more information, refer to this doc: Python developer guide.
Below is my test code; it creates the folder and file, then returns the file names in the response.
import logging
import os
import time
import datetime
import json
import azure.functions as func

def main(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    t = datetime.datetime.now().strftime("%H-%M-%S-%f")
    foldername = context.function_directory + "/newfolder" + t
    os.makedirs(foldername)
    suffix = ".txt"
    newfile = t + suffix
    os.getcwd()
    os.chdir(foldername)
    if not os.path.exists(newfile):
        f = open(newfile, 'w')
        f.write("test")
        f.close()
    data = []
    for filename in os.listdir(context.function_directory):
        print(filename)
        d1 = {"filename": filename}
        data.append(d1)
    jsondata = json.dumps(data)
    return func.HttpResponse(jsondata)
Here is the result: you can see that the folder and file were created.

Content from folder on a server

I am working with a python script on a server with the following hierarchy:
DreamteamPy (folder)
    pictest.py
    assets (folder)
        pictures (folder)
            31.jpg
            picture2.jpg
            picture3.jpg
The complete path of the python file is
http://www.cytosine.nl/~owe4_pg3/Rogier/DreamteamPy/pictest.py
And one of the pictures:
http://www.cytosine.nl/~owe4_pg3/Rogier/DreamteamPy/assets/pictures/31.jpg
How can I get all of the files in the pictures folder?
I've tried things like
import os

def index():
    filelist = []
    path = "http://www.cytosine.nl/~owe4_pg3/Rogier/DreamteamPy/assets/pictures"
    for filename in os.listdir(path):
        filelist.append(filename)
but to no avail.
What you call path isn't a path but an HTTP URL. os.listdir() needs a path on the local file system, which in your case is most likely something like /home/owe4_pg3/html/Rogier/DreamteamPy/assets/pictures, unless it is not a typical Linux installation where the web server is configured to map ~username at the start of the URL path to /home/username/html.
os.listdir() already returns a list, so it doesn't make much sense to copy the elements one by one into another list.
import os

PICTURES_PATH = '/home/owe4_pg3/html/Rogier/DreamteamPy/assets/pictures'

def index():
    filenames = os.listdir(PICTURES_PATH)
    return filenames
