Store images temporarily in Google App Engine? - python

I'm writing an app in Python that checks a website (let's call it A) for updates every 2 hours. If there are new posts, it downloads the images in the post, posts them to another website (call it B), and then deletes those images.
Site B provides an API for uploading images with a description, something like:
upload(image_path, description), where image_path is the path of the image on your computer.
Now I've finished the app and I'm trying to make it run on Google App Engine (because my computer won't run 24/7), but it seems that GAE won't let you write files to its file system.
How can I solve this problem? Or are there other free Python hosting options that provide a cron-job feature?

GAE has a Blobstore API, which can work pretty much as file storage, but it's probably not what you want. Actually, the right answer depends on what kind of API you're using: it may support file-like objects (so you could pass it a urllib response object), accept URLs, or offer other useful options.

You shouldn't need to use temporary storage at all - just download the image with urlfetch into memory, then use another urlfetch to upload it to the destination site.
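For illustration, a minimal sketch of that in-memory round trip. The endpoint URL, form field names, and filename for site B are hypothetical placeholders; on classic App Engine the same pattern works with google.appengine.api.urlfetch in place of requests.

```python
import requests


def mirror_image(source_url, description):
    # Download the image; resp.content keeps the bytes in memory only,
    # so nothing is ever written to the local file system.
    resp = requests.get(source_url, timeout=30)
    resp.raise_for_status()

    # Re-upload the in-memory bytes to site B as a multipart POST.
    upload = requests.post(
        "https://site-b.example.com/api/upload",   # hypothetical endpoint
        files={"image": ("post.jpg", resp.content)},
        data={"description": description},
        timeout=60,
    )
    upload.raise_for_status()
    return upload.status_code
```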

Related

Share media between multiple Django servers (VMs)

We have deployed a Django server (nginx/gunicorn/Django), but to scale it there are multiple instances of the same Django application running.
Here is the architecture (the diagram is not reproduced here); each blue rectangle is a virtual machine.
HAProxy sends all requests for example.com/admin to Server 3; other requests are load-balanced between Server 1 and Server 2.
Old Problem:
Each machine has a media folder, and when the admin uploads something, the uploaded media ends up only on Server 3 (normal users can't upload anything).
We solved this by routing all requests for example.com/media/* to Server 3, where nginx serves all static files and media.
Problem right now
We are also using sorl-thumbnail.
When a request comes in for example.com/, sorl-thumbnail tries to access the media file, but it doesn't exist on that machine because it's on Server 3.
So all requests handled by that machine (Server 1 or 2) get a 404 for that media file.
One solution that comes to mind is to create a shared partition between all three machines and use it for media.
Another solution is to sync all media folders after each upload, but that has a problem: we get almost 2000 requests per second, and sometimes the sync isn't fast enough, so sorl-thumbnail creates a database record pointing at an empty file and the 404 happens anyway.
Thanks in advance and sorry for long question.
You should use an object store to save and serve your user-uploaded files. django-storages makes the implementation really simple.
If you don't want to use a cloud-based store such as AWS S3 (or equivalent), you can host your own on-prem S3-compatible object store with minio.
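For illustration, a minimal django-storages configuration along those lines. The bucket name, endpoint, and credentials are placeholders, and the endpoint setting is only needed when pointing at a self-hosted MinIO store:

```python
# settings.py -- minimal sketch of an S3-compatible media backend
INSTALLED_APPS = [
    # ... your existing apps ...
    "storages",
]

DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-media-bucket"             # placeholder
AWS_S3_ENDPOINT_URL = "https://minio.example.com:9000"  # MinIO only; drop for AWS S3
AWS_ACCESS_KEY_ID = "..."
AWS_SECRET_ACCESS_KEY = "..."
AWS_QUERYSTRING_AUTH = False  # plain public URLs instead of signed ones
```

With this in place, every FileField/ImageField save goes to the object store, so all three VMs see the same media regardless of which one handled the upload.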
With your current setup I don't see any easy way to fix this when the number of VMs is dynamic depending on load.
If you have deployment automation, then maybe try rsync so that each VM takes care of syncing files with the other VMs.
Question: What was the problem?
We got 404s on the other machines because normal requests (requests rendering a template) would hit a 404 on the thumbnail media.
The real problem was with the sorl-thumbnail template tags.
Here is what we ended up doing:
In the models that needed a thumbnail, we added functions to create that specific thumbnail.
Then, using a post-save signal on the admin machine, we called all those functions to make sure every thumbnail was created after save and sorl-thumbnail's table was filled.
Now, in templates, instead of calling the sorl-thumbnail template tags, we call a function on the model, as sketched below.
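A rough sketch of that approach, with hypothetical model and field names; the post_save receiver pre-generates the thumbnail on the upload machine, and the model method is what templates call instead of the template tag:

```python
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver
from sorl.thumbnail import get_thumbnail


class Item(models.Model):
    photo = models.ImageField(upload_to="photos/")

    def photo_thumb_url(self):
        # By the time a web machine calls this, the thumbnail record
        # already exists in sorl-thumbnail's key-value store, so the
        # original file is never needed locally.
        return get_thumbnail(self.photo, "300x200", crop="center").url


@receiver(post_save, sender=Item)
def build_thumbnails(sender, instance, **kwargs):
    # Runs on the admin/upload machine right after save, creating the
    # thumbnail file and filling sorl-thumbnail's table up front.
    if instance.photo:
        get_thumbnail(instance.photo, "300x200", crop="center")
```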

Flask - Serving user-uploaded images to the webpage

So I am working on a Flask application which is pretty much a property manager that involves allowing users to upload images of their properties. I am new to Flask and have never had to deal with images before. From a lot of Googling I understand that there are various ways to manage static files like images.
One way is to allow users to upload images directly to the file system, and then display them by referencing the file location in the static folder using something like:
<img src="static/images/filename.jpg">
However, is this really efficient, since it means generating and storing the location of each image in the database? Especially when it comes to deploying the application? Another way I discovered was base64-encoding the image and storing it directly in the database, which doesn't sound very efficient either.
Another way, which I think might be the best way to go about this, is to use an AWS S3 bucket. The user would then be able to upload an image directly to that bucket and be assigned a URL for that image. This URL is stored in the database and can then be used to display the image, similarly to the file system method. Is my understanding of this correct? Is there a better way to go about this? And is there something similar to django-storages that can be used to connect Flask to S3?
Any input or pointing me in the right direction would be much appreciated. Thank you!
If you want to store the images on the web server, then the best approach is to put nginx in front of Flask as a reverse proxy and let nginx serve the static folder for all the images.
Nginx is more than enough for a small website. Don't try to serve the files from Flask itself; it is too slow.
If you want to store the images in S3, then you just need to store the name of the object (its key) in the bucket in the database. You can tell Flask to use the S3 bucket as the static folder, and you can use the boto3 library in Python to access S3.
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html
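As a rough illustration of that route, a sketch of a Flask view that streams an upload straight to S3 with boto3; the bucket name and key prefix are placeholders, and only the key needs to go in the database:

```python
import uuid

import boto3
from flask import Flask, request

app = Flask(__name__)
s3 = boto3.client("s3")
BUCKET = "my-property-images"  # placeholder


@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["image"]
    key = "properties/%s-%s" % (uuid.uuid4().hex, f.filename)
    # Stream the file object straight to S3; nothing is written locally.
    s3.upload_fileobj(f, BUCKET, key, ExtraArgs={"ContentType": f.mimetype})
    # Persist `key` in your database; the public (or CloudFront) URL can
    # be rebuilt from it when rendering the page.
    return {"key": key}
```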
If you are concerned about exposing the S3 bucket to users, you can use a CloudFront distribution. It is cheaper to serve from and also hides your bucket.

Django Google App Engine Upload files greater than 32mb

I have a Django REST Framework project that I've integrated with django-storages to upload files to GCS. Everything works locally. However, Google App Engine imposes a hard limit of 32 MB on the size of each request, so I cannot upload any files greater than this limit.
I looked into many posts here on Stack Overflow and on the internet. Some of the solutions listed the use of the Blobstore API; however, I cannot find a way to integrate it into Django. Another solution describes the use of django-filetransfers, but that plugin is obsolete.
I would appreciate it if someone can point me towards an approach I can take to fixing this problem.
PS: I would like to point out that the current setup works like this: a POST request sends the file up to the server, which then handles storing the file in Google Cloud Storage. Since Google App Engine restricts the request size to 32 MB, I can't even receive the file. So my issue is: how can I go about uploading these large files?
According to the official documentation [1], Cloud Storage can manage files of up to 5 TB. Nevertheless, it is recommended to take a look at the best-practices document [2], and there is an example of how to upload objects using Python here [3].
[1] https://cloud.google.com/storage/docs/json_api/v1/objects/insert
[2] https://cloud.google.com/storage/docs/best-practices#uploading
[3] https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python
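For reference, a minimal sketch along the lines of the upload example linked in [3], using the google-cloud-storage client with placeholder bucket and object names. Note that this alone does not get around App Engine's 32 MB request limit, since the file still has to reach your handler first:

```python
from google.cloud import storage


def upload_to_gcs(local_path, bucket_name, destination_name):
    # bucket_name and destination_name are placeholders for your values.
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_name)
    blob.upload_from_filename(local_path)
    return blob.public_url
```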

Setting Up S3 with Heroku and Django with Images

So I currently have my static files (JS and CSS) just stored on Heroku, which is no biggie. However, I have objects that I need to store multiple images for, and I need to be able to fetch those images on request. How would I store a reference to those images?
I was planning to use S3 direct file upload following these Heroku steps here. Is this also going to be the best way to do so?
Thank you in advance.
I don't think setting up static files (CSS, JS, etc.) or media (images, videos) to be stored on S3 has anything to do with Heroku or where you deploy. Rather, it's just a matter of making sure Django knows where to save the files and where to fetch them from. I would definitely not follow that link, because it seems confusing and not helpful when working with Django.
This tutorial has really helped me, as it shows how to set all of that up. I have gone through these steps and can confirm it does the trick: https://simpleisbetterthancomplex.com/tutorial/2017/08/01/how-to-setup-amazon-s3-in-a-django-project.html
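For illustration, a rough sketch of how those image references are commonly modeled once the S3 storage backend from the tutorial is in place; the model names here are hypothetical:

```python
from django.db import models


class Listing(models.Model):
    title = models.CharField(max_length=200)


class ListingImage(models.Model):
    listing = models.ForeignKey(
        Listing, related_name="images", on_delete=models.CASCADE
    )
    # With the S3 storage backend active, .url resolves to the S3 (or
    # CloudFront) URL, so templates can just use {{ image.file.url }}.
    file = models.ImageField(upload_to="listing_images/")
```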
While I've gone this route in the past, I've recently opted to use DigitalOcean's one-click app, Dokku. It's based on Herokuish. I then use Dokku's persistent storage to take advantage of the 25 GB of storage on DO's smallest $5/month plan. I wrote a guide to this here.

User upload to my S3 bucket

I would like a user, without having to have an Amazon account, to be able to upload multi-gigabyte files to an S3 bucket of mine.
How can I go about this? I want to enable a user to do this by giving them a key, or perhaps through an upload form, rather than making the bucket world-writable, obviously.
I'd prefer to use Python on my server side, but the idea is that a user would need nothing more than their web browser, or perhaps a terminal and built-in executables.
Any thoughts?
You are attempting to proxy the file through your Python backend to S3, and large files at that. Instead, you can configure S3 to accept files from the user directly (without proxying through your backend code).
It is explained here: Browser Uploads to S3 using HTML POST Forms. This way your server doesn't have to handle any of the upload load at all.
If you also want your users to use an external identity (Google/Facebook, etc.) for this workflow, that is possible too. They will be able to upload these files to a sub-folder (path) in your bucket without exposing other parts of your bucket. This is detailed here: Web Identity Federation with Mobile Applications. Though it says mobile, you can apply the same to web apps.
Having said all that, as @Ratan points out, large file uploads can break partway when attempted from a browser, and the browser can't retry "only the failed parts". This is where a dedicated app comes in. Another option is to ask your users to keep the files in their Dropbox/Box.com account and have your server read from there; those services already take care of large-file uploads, with retries, through their apps.
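As a rough sketch of that browser-upload approach, boto3 can generate the POST policy and form fields for you with generate_presigned_post; the bucket name and size limit below are placeholders (a single S3 POST upload is capped at 5 GB):

```python
import boto3

s3 = boto3.client("s3")


def presigned_upload_form(key, max_bytes=5 * 1024 ** 3):
    # Returns a dict with 'url' plus 'fields' to embed as hidden inputs
    # in an HTML <form method="post" enctype="multipart/form-data">.
    return s3.generate_presigned_post(
        Bucket="my-upload-bucket",  # placeholder
        Key=key,
        Conditions=[["content-length-range", 1, max_bytes]],
        ExpiresIn=3600,  # the form is valid for one hour
    )
```

The bytes then go straight from the user's browser to S3; your backend only hands out the form fields.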
This answer is relevant to .NET as the language.
We had such a requirement, so we created an executable. The executable internally called a web method, which validated whether the app was authorized to upload files to AWS S3 or not.
You can do this through a web browser too, but I would not suggest it if you are targeting big files.
