Python Flask server on Elastic Beanstalk can't save files

I am trying to upload and save a file to my EC2 instance so that I can do some file manipulation.
My upload looks like this:
import os

from flask import Flask, request
from flask_cors import CORS
from werkzeug.utils import secure_filename

UPLOAD_FOLDER = os.getcwd() + '/uploads'

application = Flask(__name__)
CORS(application)
application.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER

@application.route("/test-upload", methods=["POST"])
def test_upload():
    file = request.files['file']
    filename = secure_filename(file.filename)
    file.save(os.path.join(application.config['UPLOAD_FOLDER'], filename))
    # return redirect(url_for('download_file', name=filename))
    return str(os.path.join(application.config['UPLOAD_FOLDER'], filename))
If I remove the file.save() line, the function returns without errors, and as you can see I return the path, so the correct path comes back. But I get a 500 error from the server when I try to save the file.
I'm not sure what I'm missing here.
Locally this works fine when I test it with Postman: the file saves to the right location. I do have the upload size limit on EC2 set to 100 MB, but I am testing with a 50-byte file, so size can't be the issue. I know for sure the file is being uploaded to the EC2 web server.
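A likely culprit is the uploads directory itself: under Elastic Beanstalk the process's working directory is not necessarily the one containing your app, so os.getcwd() + '/uploads' may point at a path that does not exist or is not writable. A minimal sketch, assuming that is the cause, which anchors the folder to the module's location, creates it up front, and logs the real exception instead of an opaque 500:

import logging
import os

from flask import Flask, request
from werkzeug.utils import secure_filename

application = Flask(__name__)
# Anchor the folder to the module's own location rather than os.getcwd(),
# and create it if it is missing.
UPLOAD_FOLDER = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'uploads')
os.makedirs(UPLOAD_FOLDER, exist_ok=True)
application.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER

@application.route("/test-upload", methods=["POST"])
def test_upload():
    file = request.files['file']
    filename = secure_filename(file.filename)
    target = os.path.join(application.config['UPLOAD_FOLDER'], filename)
    try:
        file.save(target)
    except OSError:
        # Surface the real failure in the Elastic Beanstalk logs.
        logging.exception("could not save upload to %s", target)
        raise
    return target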

Related

Video Upload in Cloud Run Flask Application is Giving No such file or directory

I have a web application running locally on my laptop. Below is the Python code snippet that takes the video from the HTML form and uploads it to Cloud Storage:
from flask import Flask, flash, request
from google.cloud import storage

app = Flask(__name__)

@app.route('/', methods=['POST', 'GET'])
def index():
    if request.method == 'POST':
        video = request.form['video']
        video2 = request.form['video2']
        if not video:
            flash('please upload your answer first')
        if not video2:
            flash('please upload your answer first')
        else:
            # store videos to GCS
            video_list = [video, video2]
            for i in range(len(video_list)):
                upload_video_to_gcs(video_list[i], i)

def upload_video_to_gcs(video, video_num):
    # Setting credentials using the downloaded JSON file
    client = storage.Client.from_service_account_json(json_credentials_path='sa-credentials.json')
    # Creating bucket object
    bucket = client.get_bucket('bucket_name')
    # Name of the destination file in the bucket
    gcs_file_name = "".join(("video", "_", str(video_num)))
    object_name_in_gcs_bucket = bucket.blob(gcs_file_name)
    object_name_in_gcs_bucket.upload_from_filename(video)
    return 1
It works fine when running locally on my laptop, where the video is located in the same folder as the Python file.
However, when I deployed this web application on GCP Cloud Run (where the video no longer exists in the same folder as the Python file), I got the error below:
FileNotFoundError: [Errno 2] No such file or directory: 'sample_video.mp4'
Do you have any idea how to upload a video (located anywhere on my laptop) through the web service hosted on Cloud Run on GCP?
The Python FileNotFoundError: [Errno 2] No such file or directory error is often raised by the os library. This error tells you that you are trying to access a file or folder that does not exist. To fix this error, check that you are referring to the right file or folder in your program.
Here is a similar case that was fixed by providing the correct path.
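In this case the form field only carries the file name as a string, and upload_from_filename() then looks for that path on the Cloud Run container's filesystem, where it does not exist. A minimal sketch of one way around this, assuming the HTML form submits the video as a file input named video: read the upload from request.files and stream it to GCS with upload_from_file() instead:

from flask import Flask, request
from google.cloud import storage

app = Flask(__name__)

@app.route('/', methods=['POST'])
def index():
    # The browser sends the file bytes themselves, not just a path,
    # so the server never needs the file to exist on its own disk.
    video = request.files['video']
    client = storage.Client.from_service_account_json(
        json_credentials_path='sa-credentials.json')
    bucket = client.get_bucket('bucket_name')
    blob = bucket.blob('video_0')
    # upload_from_file reads from the file-like upload stream directly
    blob.upload_from_file(video)
    return 'uploaded'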

Problem with directory in Heroku with Flask

I'm having a problem with my project on Heroku. When I download a video from YouTube on localhost, I use code that takes the user's name and puts the file in their Downloads directory, but on Heroku it doesn't end up there. I suspect the problem is with finding the directory location.
import os
import getpass

import pytube
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def getvalue():
    if request.method == 'POST':
        name = request.form['url']
        try:
            url = name
            youtube = pytube.YouTube(url)
            video = youtube.streams.get_highest_resolution()
            audio = youtube.streams.get_audio_only()
            video_path = r'C:/Users/' + getpass.getuser() + '/Downloads/Youtube_Download'
            audio_path = r'C:/Users/' + getpass.getuser() + '/Downloads/Youtube_Download/Audio'
            if not os.path.exists(video_path):
                os.makedirs(video_path)
                os.makedirs(audio_path)
            video.download(video_path)
            audio.download(audio_path)
            return render_template('index.html')
        except Exception:
            raise
The problem is that there's no C:/Users/ on a Heroku dyno, since it is a Linux environment.
The folder structure there will match your project folder structure.
But I don't actually recommend storing any user data inside a Heroku dyno, since it will be erased every time the app restarts (e.g. when deploying a new version).
You can use a Storage Bucket like Amazon S3 for this instead.
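As a rough sketch of that approach (assuming a bucket named my-bucket and AWS credentials configured in the environment), you could download to a temporary directory on the dyno and push the result to S3:

import os
import tempfile

import boto3
import pytube

def download_to_s3(url, bucket_name='my-bucket'):
    s3 = boto3.client('s3')
    youtube = pytube.YouTube(url)
    stream = youtube.streams.get_highest_resolution()
    # Temporary storage on a dyno is ephemeral, which is fine here:
    # the file only needs to live long enough to be uploaded.
    with tempfile.TemporaryDirectory() as tmp_dir:
        file_path = stream.download(output_path=tmp_dir)
        key = os.path.basename(file_path)
        s3.upload_file(file_path, bucket_name, key)
    return key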

Uploading file to AWS S3 through Chalice API call

I'm trying to upload a file to my S3 bucket through Chalice (I'm playing around with it currently, still new to this). However, I can't seem to get it right.
I have AWS set up correctly; doing the tutorial successfully returns me some messages. Then I try to do some upload/download, and the problem shows up.
s3 = boto3.resource('s3', region_name=<some region name, in this case oregon>)
BUCKET = 'mybucket'
UPLOAD_FOLDER = os.path.abspath('')  # the file I wanna upload is in the same folder as my app.py, so I simply get the current folder name

@app.route('/upload/{file_name}', methods=['PUT'])
def upload_to_s3(file_name):
    s3.meta.client.upload_file(UPLOAD_FOLDER + file_name, BUCKET, file_name)
    return Response(body='upload successful',
                    status_code=200,
                    headers={'Content-Type': 'text/plain'})
Please don't worry about how I set my file path, unless that's the issue, of course.
I got the error log:
No such file or directory: ''
in this case file_name is just mypic.jpg.
I'm wondering why the UPLOAD_FOLDER part is not being picked up. Also, for reference, it seems like using an absolute path will be troublesome with Chalice (while testing, I've seen the code being moved to /var/task/).
Does anyone know how to set it up correctly?
EDIT:
the complete script
from chalice import Chalice, Response
import boto3

app = Chalice(app_name='helloworld')  # I'm just modifying the script I used for the tutorial
s3 = boto3.client('s3', region_name='us-west-2')
BUCKET = 'chalicetest1'

@app.route('/')
def index():
    return {'status_code': 200,
            'message': 'welcome to test API'}

@app.route('/upload/{file_name}', methods=['PUT'], content_types=['application/octet-stream'])
def upload_to_s3(file_name):
    try:
        body = app.current_request.raw_body
        temp_file = '/tmp/' + file_name
        with open(temp_file, 'wb') as f:
            f.write(body)
        s3.upload_file(temp_file, BUCKET, file_name)
        return Response(body='upload successful',
                        headers={'Content-Type': 'text/plain'},
                        status_code=200)
    except Exception as e:
        app.log.error('error occurred during upload %s' % e)
        return Response(body='upload failed',
                        headers={'Content-Type': 'text/plain'},
                        status_code=400)
I got it running and this works for me as app.py in an AWS Chalice project:
from chalice import Chalice, Response
import boto3

app = Chalice(app_name='helloworld')

BUCKET = 'mybucket'  # bucket name
s3_client = boto3.client('s3')

@app.route('/upload/{file_name}', methods=['PUT'],
           content_types=['application/octet-stream'])
def upload_to_s3(file_name):
    # get raw body of PUT request
    body = app.current_request.raw_body

    # write body to tmp file
    tmp_file_name = '/tmp/' + file_name
    with open(tmp_file_name, 'wb') as tmp_file:
        tmp_file.write(body)

    # upload tmp file to s3 bucket
    s3_client.upload_file(tmp_file_name, BUCKET, file_name)

    return Response(body='upload successful: {}'.format(file_name),
                    status_code=200,
                    headers={'Content-Type': 'text/plain'})
You can test this from the command line with curl and its --upload-file option:
curl -X PUT https://YOUR_API_URL_HERE/upload/mypic.jpg --upload-file mypic.jpg --header "Content-Type:application/octet-stream"
To get this running, you have to manually attach a policy that allows writing to S3 to the role of your Lambda function. This role is auto-generated by Chalice. Attach the policy (e.g. AmazonS3FullAccess) manually in the AWS IAM web interface, alongside the existing policy on the role created by your Chalice project.
Things to mention:
You cannot write to the working directory /var/task/ of the Lambda functions, but you have some space at /tmp/, see this answer.
You have to specify the accepted content type 'application/octet-stream' for the @app.route (and upload the file accordingly via curl).
HTTP PUT puts a file or resource at a specific URI, so to use PUT this file has to be uploaded to the API via HTTP.
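If you prefer to test from Python rather than curl, a small sketch with the requests library would look like this (the URL is the same placeholder as in the curl example; substitute the endpoint printed by chalice deploy):

import requests

url = 'https://YOUR_API_URL_HERE/upload/mypic.jpg'

with open('mypic.jpg', 'rb') as f:
    response = requests.put(
        url,
        data=f,  # raw bytes become app.current_request.raw_body
        headers={'Content-Type': 'application/octet-stream'},
    )
print(response.status_code, response.text)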

Loading a user-uploaded Excel file into a pandas DataFrame in a Flask app running with mod_wsgi and Apache

I have a Flask app where a user uploads files to an upload folder. Then I want to take those files and read them into pandas DataFrames for further processing. The process works fine using app.run() on my localhost. I am trying to get it to work on AWS with mod_wsgi and Apache.
@app.route('/uploader', methods=['POST'])
def upload_file():
    if request.method == 'POST':
        filenames = []
        uploaded_files = request.files.getlist("file[]")
        for file in uploaded_files:
            filename = secure_filename(file.filename)
            file.save(os.path.join(app.root_path, app.config['UPLOAD_FOLDER'], filename))
            filenames.append(filename)
        plotfiles = parse_all(filenames)

def parse_all(filenames):
    folder_path = os.path.join(app.root_path, app.config['UPLOAD_FOLDER'])
    for f in filenames:
        f = send_from_directory(folder_path, f)
        excel_file = pandas.ExcelFile(f)
        # do more stuff
I get the error ValueError: Must explicitly set engine if not passing in buffer or path for io.
The file is uploaded to the upload folder correctly, but obviously it is not fetched correctly into the f variable. The type of f is <class 'flask.wrappers.Response'> and f.__dict__ returns
{'_on_close': [], 'response': [], 'headers': Headers([('X-Sendfile', u'/var/www/html/cluster_app/data/filename.xlsx'), ('Content-Length', u'82668'), ('Content-Type', u'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'), ('Cache-Control', u'public, max-age=43200'), ('Expires', u'Tue, 07 Jun 2016 22:59:11 GMT'), ('ETag', u'"1465297151.54-82668-755509703"')]), '_status_code': 200, '_status': '200 OK', 'direct_passthrough': True}
When running on localhost on my machine, there was a .file attribute in the response; now response is empty. Printing folder_path gives /var/www/html/cluster_app/data, which is the uploads folder.
I'm very green on flask/wsgi/apache. Would really appreciate some advice on how to access the file system in my code.
Hi, I suggest you check the Flask documentation about uploading files here and then re-check your code.
Instead of
f = send_from_directory(folder_path, filename)
I use
f = open(os.path.join(app.root_path, app.config['UPLOAD_FOLDER'], filename))
to open the file. I had just assumed send_from_directory would work as it does when I used Flask's app.run() on my localhost. I'd still like to know why send_from_directory does not work.
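For what it's worth, send_from_directory builds an HTTP response for sending a file to a client, not a file handle for server-side reading, which is why pandas cannot use it. A sketch of the more direct route, just passing the path to pandas (assuming, as in the question, that UPLOAD_FOLDER is relative to app.root_path):

import os
import pandas

def parse_all(filenames):
    folder_path = os.path.join(app.root_path, app.config['UPLOAD_FOLDER'])
    frames = []
    for name in filenames:
        # pandas accepts a filesystem path directly; no need to open()
        # the file first or go through a Flask response object.
        excel_file = pandas.ExcelFile(os.path.join(folder_path, name))
        frames.append(excel_file.parse(excel_file.sheet_names[0]))
    return frames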

Flask, Apache, mod_wsgi: unable to save file on server side

From the client side I am sending an image via POST from a form with enctype=multipart/form-data, and on the server side I am saving it to a directory. All of this works locally on my computer when running Flask directly with python app.py.
Here is my reference for setting up file saving:
http://flask.pocoo.org/docs/patterns/fileuploads/
On the actual production server, I am running it with Apache and mod_wsgi, which I set up according to this website:
http://flask.pocoo.org/docs/deploying/mod_wsgi/
For directory permissions I have tried chmod -R 777 and chown -R www-data:www-data, where the relevant Apache configuration for users looks like this: WSGIDaemonProcess app user=www-data group=www-data threads=5.
However, after all of this I am still not able to get the file to save. I just get a 500 HTTP error back at the point where it tries to save the file.
Here is the relevant Flask code:
UPLOAD_FOLDER = '/images/'

app = Flask(__name__)
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER

@app.route('/upload_ocr_images', methods=['GET', 'POST'])
def upload_images():
    if request.method == 'POST':
        files = request.files.getlist("images[]")
        for file in files:
            if allowed_file(file.filename):
                filename = secure_filename(file.filename)
                file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
        return redirect(url_for('home'))
At this point I am wondering if there is something I need to be setting on the Apache side of things.
You're using /images/ as your path.
That means you're trying to save into a directory named /images/ at the root level of your filesystem.
This is usually wrong, and it is normally the cause of the error.
If you keep the uploads folder under your Flask application's file structure, then you should build the path using app.root_path, which holds the absolute application path.
Something like
file.save(os.path.join(app.root_path, 'uploads', filename))
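Note the dropped leading slash: os.path.join treats any absolute component as a new root and discards everything before it, so os.path.join(app.root_path, '/uploads', filename) would silently become '/uploads/<filename>' again. A fuller sketch under the same assumption (an uploads folder inside the application directory):

import os

# Build an absolute path inside the application directory and make
# sure it exists before the first upload arrives.
UPLOAD_FOLDER = os.path.join(app.root_path, 'uploads')
os.makedirs(UPLOAD_FOLDER, exist_ok=True)
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER

@app.route('/upload_ocr_images', methods=['GET', 'POST'])
def upload_images():
    if request.method == 'POST':
        for file in request.files.getlist("images[]"):
            if allowed_file(file.filename):
                filename = secure_filename(file.filename)
                file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
        return redirect(url_for('home'))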
