How to cache S3-stored pictures to local disk and serve them from your web server

The problem

Recently I worked on upgrading a legacy Laravel v4.2 site to a newer version of Laravel. The site stored user-uploaded pictures locally in the app's public folder.

Now, I didn't like that those pictures were stored locally: if the server went down, the picture files would go down with it, and deploying to a new server would mean the hassle of manually restoring the pictures from a backup.

I was thinking of uploading all images to S3 and serving them from there, but on the other hand I didn't want to incur additional AWS S3 bandwidth charges for serving the pictures.

How could I get a local cache of those pictures? Here is the solution that worked for my case.

The solution

Here is how the problem was solved in four steps.

1. Put all pictures under a single folder using unique random names

By using UUID v4 file names, it was possible to put all pictures into a single folder without worrying about name conflicts when adding new files.
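In newer Laravel versions this can be done with the `Str::uuid()` helper when storing an upload. A minimal sketch (the controller method and the `picture` request field name are hypothetical, not from the original app):

```php
use Illuminate\Http\Request;
use Illuminate\Support\Str;

// Hypothetical upload handler: store the file under a UUID v4 name
// so it can never collide with an already uploaded picture.
public function storePicture(Request $request)
{
    $file = $request->file('picture');
    $filename = Str::uuid().'.'.$file->getClientOriginalExtension();

    // Saves to storage/app/public/pictures/<uuid>.<ext>
    $file->storeAs('pictures', $filename, 'public');

    return $filename;
}
```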

2. Upload pictures to S3

Then I uploaded all images to an S3 bucket. Even if the server was gone, S3 would still hold the primary copy of the data.
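This one-off migration can be scripted with Laravel's Storage facade; a sketch assuming the `s3-pictures` disk and `pictures-path` prefix used later in this article are already configured:

```php
use Illuminate\Support\Facades\Storage;

// One-off sync: copy every locally stored picture up to the S3 bucket.
foreach (Storage::disk('public')->files('pictures') as $path) {
    $filename = basename($path);
    Storage::disk('s3-pictures')->put(
        'pictures-path/'.$filename,
        Storage::disk('public')->get($path)
    );
}
```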

3. Link all images to your server

The URLs of all images pointed to a specific directory on the server, i.e. /pictures/{picture_file}.

4. Create the route

How did the server handle a request for, let's say, one of these picture URLs?

The first time the server receives a request for such a file, the file does not yet exist on the server, so the request hits Laravel. We handle it with the following route:

Route::get('/pictures/{filename}', function ($filename) {
    // Load the image from S3
    $f = Storage::disk('s3-pictures')->get('pictures-path/'.$filename);
    // If found, store it locally
    Storage::disk('public')->put('pictures/'.$filename, $f);
    // Tell the browser to try again
    return redirect(url("/pictures/{$filename}"));
});
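The `s3-pictures` disk used by the route is assumed to be defined in config/filesystems.php, roughly like this (the env variable names are placeholders, not from the original app):

```php
'disks' => [
    // ...
    's3-pictures' => [
        'driver' => 's3',
        'key'    => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_PICTURES_BUCKET'),
    ],
],
```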

The beauty of this is that every time we receive a request for an image that has already been downloaded locally, nginx serves the picture directly, without going through PHP.
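This static-first behavior relies on nginx checking the disk before handing the request to PHP, which is what a standard Laravel `try_files` setup already does; a minimal sketch of the relevant location block (paths are illustrative):

```nginx
# Serve /pictures/* straight from the public folder when the cached
# copy exists; otherwise fall back to the Laravel front controller,
# which triggers the S3 download route above.
location /pictures/ {
    try_files $uri /index.php?$query_string;
}
```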

On the other hand, if the picture is missing, we download it from S3 and redirect to the same URL.


Of course this is not a perfect solution, and it won't work for every scenario. For small apps, however, it offers the convenience of deploying the app locally or somewhere else without having to worry about dynamically uploaded pictures, which shouldn't be in your version control system.

Finally, after simply importing the database into a dev environment, running a local version of the app for development/debugging is far less painful, since the web server fetches and serves the images the same way.