Upload a file to an S3-compatible service using Python

I'm using an S3-compatible service, which means my dynamic storage is not hosted on AWS. I found a couple of Python scripts that upload files to AWS S3. I would like to do the same, but I need to be able to set my own host URL. How can that be done?

You can use the Boto3 library (https://boto3.readthedocs.io/en/latest/) for all your S3 needs in Python. To use a custom S3-compatible host instead of AWS, pass the endpoint_url argument when constructing an S3 resource object, e.g.:
import boto3

# Credentials and any other session arguments are elided here
session = boto3.session.Session(...)
# endpoint_url redirects all S3 calls to your provider's host
s3 = session.resource("s3", endpoint_url="http://...", ...)
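
For example, a complete sketch of an upload through a custom endpoint might look like this (the endpoint URL, credentials, and bucket name below are placeholder assumptions):

import boto3

# All values here are hypothetical placeholders for your provider
session = boto3.session.Session(
    aws_access_key_id="MY_ACCESS_KEY",
    aws_secret_access_key="MY_SECRET_KEY",
)
s3 = session.resource("s3", endpoint_url="https://objects.example.com")

# upload_file streams a local file to the given bucket and key
s3.Bucket("my-bucket").upload_file("report.json", "report.json")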

You can use Amazon Route 53.
Please refer to:
http://docs.aws.amazon.com/AmazonS3/latest/dev/website-hosting-custom-domain-walkthrough.html

Related

How to download a file from a private S3 bucket using its URL?

I have a file stored on AWS S3 at https://xyz.s3.amazonaws.com/foo/file.json and I want to download it to my local machine using Python. However, the URL cannot be accessed publicly. I have the Account ID, IAM user name, and password (but NO Access Key or Secret Access Key, and no permissions to view/change them either) for the resource that contains this file. How can I download this file programmatically instead of doing so from the console?
You could generate an Amazon S3 pre-signed URL, which allows a private object to be downloaded from Amazon S3 via a normal HTTPS call (e.g. curl). This can be done easily using the AWS SDK for Python, or you could code it yourself without using libraries. (Answer by John Rotenstein)
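
A minimal sketch with boto3, using the bucket and key from the question and assuming credentials are already configured:

import boto3

# Generate a time-limited HTTPS URL for the private object
s3 = boto3.client('s3')
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'xyz', 'Key': 'foo/file.json'},
    ExpiresIn=3600,  # seconds the URL remains valid
)
print(url)  # fetch it with curl, a browser, or requests.get(url)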

Uploading files to an Amazon S3 bucket using an IAM ARN in Python

I have been given a bucket name with an ARN number, as below:
arn:aws:iam::<>:user/user-name
I was also given an access key.
I know that this can be done using boto.
Connect to s3 bucket using IAM ARN in boto3
As in the above link, do I need to use 'sts'?
If so, why am I provided with an access key?
First, I recommend installing the AWS Command-Line Interface (CLI), which provides command-line access to AWS.
You can then store your credentials in a configuration file by running:
aws configure
It will prompt you for the Access Key and Secret Key, which will be stored in a config file.
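The resulting file typically lives at ~/.aws/credentials and looks like this (the key values below are placeholders):

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY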
Then, you will want to refer to the S3 section of the Boto 3 documentation to find out how to access Amazon S3 from Python.
Here's some sample code:
import boto3

client = boto3.client('s3', region_name='ap-southeast-2')  # Change as appropriate
client.upload_file('/tmp/hello.txt', 'mybucket', 'hello.txt')  # local path, bucket, key

Using Python boto to copy a JSON file from my local machine to Amazon S3

I have a JSON file named '203456_instancef9_code323.json' in my C:\temp\testfiles directory and want to copy it to my Amazon S3 bucket named 'input-derived-files' using Python and the boto library, but it throws exceptions every time saying the file does not exist. I have a valid access ID and secret key and can establish a connection to AWS. Could someone help me with the best code to script this, please? Many thanks for your contribution.
Here is the code you need, based on boto3 (the latest boto library, and the one that is maintained). Make sure you use forward slashes in the directory path. I have tested this code on Windows and it works.
import boto3

s3 = boto3.resource('s3')
s3.meta.client.upload_file('C:/temp/testfiles/203456_instancef9_code323.json',
                           'input-derived-files',
                           '203456_instancef9_code323.json')
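
The forward slashes matter because in a regular Python string a backslash starts an escape sequence, so in 'C:\temp' the '\t' silently becomes a tab character and the path no longer exists. If you prefer backslashes, a raw string avoids this:

# Raw string: backslashes are kept literally, so the Windows path survives
path = r'C:\temp\testfiles\203456_instancef9_code323.json'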

Local access to Amazon S3 Bucket from EC2 instance

I have an EC2 instance and an S3 bucket in the same region. The bucket contains reasonably large (5-20 MB) files that are used regularly by my EC2 instance.
I want to programmatically open a file on my EC2 instance (using Python), like so:
file_from_s3 = open('http://s3.amazonaws.com/my-bucket-name/my-file-name')
But using an HTTP URL to access the file remotely seems grossly inefficient; surely this would mean downloading the file to the server every time I want to use it.
What I want to know is, is there a way I can access S3 files locally from my EC2 instance, for example:
file_from_s3 = open('s3://my-bucket-name/my-file-name')
I can't find a solution myself, any help would be appreciated, thank you.
Whatever you do, the object will be downloaded from S3 into your EC2 instance behind the scenes. That cannot be avoided.
If you want to treat files in the bucket as local files, you need to install one of several S3 filesystem plugins for FUSE (for example, s3fs-fuse). Alternatively, you can use boto for easy access to S3 objects from Python code.
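
A minimal sketch of the boto route, shown here with boto3 (boto's successor), using the hypothetical bucket and key names from the question:

import boto3

# The object is still downloaded behind the scenes, but boto3 streams it
# straight into memory rather than going through a public URL
s3 = boto3.client('s3')
response = s3.get_object(Bucket='my-bucket-name', Key='my-file-name')
file_from_s3 = response['Body'].read()  # raw bytes of the object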

How to copy a file via the browser to Amazon S3 using Python (and boto)?

Creating a file (key) in Amazon S3 using Python (and boto) is not a problem.
With this code, I can connect to a bucket and create a key with specific content:
import boto

connection = boto.connect_s3()  # picks up credentials from environment/boto config
bucket_instance = connection.get_bucket('bucketname')
key = bucket_instance.new_key('testfile.txt')
key.set_contents_from_string('Content for File')
I want to upload a file via the browser (file dialogue) into Amazon S3.
How can I realize this with boto?
Thanks in advance
You can't do this with boto alone, because what you're asking for is purely client-side; there's no direct involvement from the server except to generate the form to post.
What you need is Amazon's browser-based upload with POST support, which is demonstrated in Amazon's documentation.
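
For reference, the newer boto3 library can generate those form fields server-side with generate_presigned_post; a minimal sketch, assuming configured credentials and the bucket/key names from the question:

import boto3

# The server generates the form fields; the browser then POSTs the file
# directly to S3 using them
s3 = boto3.client('s3')
post = s3.generate_presigned_post(
    Bucket='bucketname',
    Key='testfile.txt',
    ExpiresIn=3600,  # the form is valid for one hour
)
# post['url'] is the form action; post['fields'] become hidden form inputs
print(post['url'], post['fields'])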
Do you mean this one? Upload files in Google App Engine
