S3 image read/write from Beanstalk/EC2 Web application
I have a web application (Elastic Beanstalk managed PHP on a Linux EC2 instance) that allows the user to upload images.
My concern is that if the images stay on the EC2 instance, every time I deploy new code I would first have to download all the images and then re-upload them alongside the new code. So the obvious solution seems to be to move the image directory to S3.
The approaches I've found so far are a mounted drive or the SDK. What I'm finding is that for both approaches, Elastic Beanstalk seems to be running from a write-protected location that prevents me from using either one.
For instance, with the mounted drive, I can't recurse up enough directories to reach it. Beanstalk is running in /var/app/current/ and the mounted drive is located at /var/s3-{bucket mount name}. If I try to access it (../../s3-{bucket mount name}), it looks for it in the same folder as my app.
When I followed the instructions for the SDK, I installed it under ec2-user, so it's located at /home/ec2-user/vendor/autoload.php, and as before I can't access that folder either (even from a require statement)!
Where do I go from here? I've searched and read everything I can find that seems helpful and I'm not getting anywhere.
1 Answer
I'm not completely sure I understand what you mean by it looking in the same folder as your app, but here is my approach to file read/write to S3 from a PHP based web application:
1) Create a new bucket for your application. This bucket can be separate from the bucket that manages your Elastic Beanstalk project.
2) In your bucket, go to Permissions -> Bucket Policy and add the policy below to make your images publicly readable. Important: replace BUCKET_NAME with the actual name of your bucket.
{
    "Id": "Policy1397632521960",
    "Statement": [
        {
            "Sid": "Stmt1397633323327",
            "Action": [
                "s3:GetObject"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::BUCKET_NAME/*",
            "Principal": {
                "AWS": [
                    "*"
                ]
            }
        }
    ]
}
3) Create a composer.json file that requires the AWS SDK for PHP, swapping in whatever version you want, and place it in the root of your project folder. (When you upload your project code, Elastic Beanstalk does its magic for you and installs all composer packages.)
{
    "require": {
        "aws/aws-sdk-php": "3.27.1"
    }
}
4) Include this at the top of the PHP file that handles file upload.
require __DIR__ . '/vendor/autoload.php';
use Aws\S3\S3Client;
5) Create an S3Client object using the AWS PHP SDK. Replace YOUR_KEY, YOUR_SECRET, and YOUR_BUCKET_REGION with the correct values. Again, add this to the same file.
$client = S3Client::factory([
    'credentials' => [
        'key'    => 'YOUR_KEY',
        'secret' => 'YOUR_SECRET'
    ],
    'region'  => 'YOUR_BUCKET_REGION',
    'version' => 'latest'
]);
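As an aside: since the code runs on EC2 under Elastic Beanstalk, you don't strictly have to hard-code keys. If you omit the credentials, the SDK falls back to its default provider chain, which includes the instance-profile role attached to your environment. A minimal sketch, assuming that role has been granted s3:PutObject/s3:GetObject on your bucket:

// No 'credentials' entry: the SDK picks up the EC2 instance-profile role.
$client = new S3Client([
    'region'  => 'YOUR_BUCKET_REGION',
    'version' => 'latest'
]);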
6) Now finally, after all that fun setup, here is how you can put an image object into S3 when a user uploads a file. Replace BUCKET_NAME with the exact name of your bucket, FILE_NAME.JPG with the destination file name on S3, and PATH/TO/TEMP_FILE.JPG with the temporary path where the image is being stored on your Elastic Beanstalk server.
$client->putObject([
    'Bucket' => 'BUCKET_NAME',
    'Key'    => 'FILE_NAME.JPG',
    'Body'   => fopen('PATH/TO/TEMP_FILE.JPG', 'rb'),
    'ACL'    => 'public-read'
]);
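To make that concrete, here is a rough sketch of what the upload handler might look like, assuming the HTML form field is named image (the field name and key scheme are just placeholders):

// Handle a form upload field named "image" (assumed name) and push it to S3.
if (isset($_FILES['image']) && $_FILES['image']['error'] === UPLOAD_ERR_OK) {
    // Pick your own key/naming scheme; this is only an example.
    $key = 'uploads/' . uniqid('', true) . '.jpg';

    $result = $client->putObject([
        'Bucket' => 'BUCKET_NAME',
        'Key'    => $key,
        'Body'   => fopen($_FILES['image']['tmp_name'], 'rb'),
        'ACL'    => 'public-read'
    ]);

    // The SDK returns the URL of the uploaded object, handy for step 7.
    $imageUrl = $result['ObjectURL'];
}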
7) Okay, so that handles file writing. Now, when you want to show the user the uploaded file, you can do so with a regular <img> tag. Again, replace all capitalized fields with their correct values:
<img src="https://s3-YOUR_BUCKET_REGION.amazonaws.com/BUCKET_NAME/FILE_NAME.JPG" alt="My S3 image">
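If you decide you don't want the bucket to be publicly readable after all, a presigned URL is one alternative: skip the public-read ACL/policy and generate a time-limited link instead. A minimal sketch using the same $client:

// Generate a temporary URL for a private object instead of making it public.
$cmd = $client->getCommand('GetObject', [
    'Bucket' => 'BUCKET_NAME',
    'Key'    => 'FILE_NAME.JPG'
]);
$presigned = $client->createPresignedRequest($cmd, '+20 minutes');
$url = (string) $presigned->getUri();

echo '<img src="' . htmlspecialchars($url) . '" alt="My S3 image">';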
8) You should now have a web application with image read/write capabilities to Amazon S3.
Thank you! I'm not 100% sure it answers everything, but it helps a lot! Some of the things I think I was missing that you helped click into place: 1. I don't have to reference the composer location; I configure it with the JSON file within my code. 2. How to handle the permissions on the S3 side. I had just figured out the URL access last night and seen the permissions needed (I thought giving EC2 access would be enough to serve it), but then I didn't know how to apply it to all of them! Thanks again!
– Michelle Stewart
yesterday