Elm library for handling S3 direct uploads with Shrine

Or how I built a file uploader that immediately triggers the Digital Ocean Spaces rate limiter they don't mention very publicly

by Robert May on Afternoon Robot

I've been building my own blog article editor in Elm. I needed a way of performing direct uploads to S3-compatible services, and though I found a couple of other examples of doing this, I decided to build my own to ensure it was up to date and that it worked with the other part of my setup: Shrine.

Elm Package

robotmay/s3-direct-file-upload

Source Code

S3DirectFileUpload.elm on GitLab

I found out while publishing the package that Elm's package system has a hard dependency on GitHub, which is a bit odd, but I've worked around it for now by using GitLab's repository mirroring. I've chosen to host the main repo on GitLab mostly because I've recently accepted a job with them, and I wanted more opportunities to get to know the platform better.

To make use of the library, you pass it files grabbed through drag and drop or a file picker (there's a rough drag-and-drop sketch after the main example below), e.g. in your update function:

  -- Import the library
  import S3DirectFileUpload as FileUpload exposing (FileUpload)

  -- Other dependencies for the example code below
  import File exposing (File)
  import File.Select as Select
  import Html exposing (Html, a, text)
  import Html.Attributes exposing (class)
  import Html.Events exposing (onClick)
  import Task

  -- Example upload button in the view:
  uploadButton : Model -> Html Message
  uploadButton model =
    a [ class "btn", onClick Pick ]
      [ text "Upload images" ]

  -- Then in the update function you'll need a few case branches:
  update : Message -> Model -> (Model, Cmd Message)
  update message model =
    case message of
      -- File picker event, taken from the elm/file examples
      Pick ->
        ( model
        , Select.files ["image/*"] GotFiles
        )

      -- Select.files hands back the first file plus a List of the rest, so we
      -- combine them into a single list, then fire off a task attempt for each
      -- file, performing the uploads simultaneously
      GotFiles file files ->
        let
          allFiles = (file :: files)

          commands =
            List.map
              (\file_ ->
                -- "media/sign" here is the endpoint on the application which generates the
                -- params needed for this to work. Shrine defaults to "/s3/params" when
                -- mounted, but you need it without the leading "/" here, e.g. "s3/params"
                Task.attempt GotFileUpload (FileUpload.upload file_ "media/sign")
              )
              allFiles
        in
          ( model
          , Cmd.batch commands
          )

      -- Each file upload then triggers this message, and you can decide what to
      -- do with the info returned from S3
      GotFileUpload fileUpload ->
        case fileUpload of
          Err err ->
            -- Most likely an HTTP error
            -- Do something with the error, e.g.
            ( { model | state = Broken }, Cmd.none )

          Ok data ->
            -- Do something with the returned FileUpload type, e.g. hand it to a
            -- helper of your own that sends the upload metadata to your server
            ( model, saveToServer data )
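
For completeness, here's a rough sketch of the Model and Message types the example above assumes. The names (State, Broken, saveToServer) are placeholders from the example rather than anything the library provides, and I'm assuming Http.Error as the error type of the upload task, so check the package documentation for the exact signature:

  -- A minimal Model/Message sketch to accompany the example above.
  -- Http.Error is assumed as the upload task's error type.
  import Http

  type State
    = Editing
    | Broken

  type alias Model =
    { state : State }

  type Message
    = Pick
    | GotFiles File (List File)
    | GotFileUpload (Result Http.Error FileUpload)

And since I mentioned drag and drop: the file picker is the simplest route, but you can feed the same GotFiles message from a drop zone using the decoder approach from the elm/file examples. This is only a sketch; NoOp is an extra do-nothing constructor I'm assuming you add to Message (and handle in update as ( model, Cmd.none )) purely to cancel the default dragover behaviour:

  -- Drop zone sketch, adapted from the elm/file drag-and-drop example.
  -- NoOp is an assumed extra Message constructor that does nothing.
  import Html exposing (Html, div, text)
  import Html.Attributes exposing (class)
  import Html.Events exposing (preventDefaultOn)
  import Json.Decode as Decode

  dropZone : Html Message
  dropZone =
    div
      [ class "drop-zone"
      , preventDefaultOn "dragover" (Decode.succeed ( NoOp, True ))
      , preventDefaultOn "drop" (Decode.map (\msg -> ( msg, True )) dropDecoder)
      ]
      [ text "Drop images here" ]

  dropDecoder : Decode.Decoder Message
  dropDecoder =
    Decode.at [ "dataTransfer", "files" ]
      (Decode.oneOrMore GotFiles File.decoder)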

Interestingly, this uploads all of the files simultaneously (I should probably add a limit), which is a very quick way of discovering that Digital Ocean has a poorly designed rate limiter on their Spaces offering, one that doesn't exist on AWS S3 itself (which is why I now use that instead). I believe they're extrapolating the upload rate mathematically (shudder), so hitting them with 10 files at the exact same time apparently looks the same as uploading a million files at once, and you can expect 8 out of 10 of those uploads to fail. With S3, however, I expect my file uploader will brick your browser before it starts to drop requests.
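
If you do run into rate limiting, one way to take the edge off is to chain the upload tasks so they run one after another instead of all at once. This isn't something the library does for you; it's just a sketch of what the GotFiles branch could look like using Task.sequence, assuming an extra GotFileUploads message that carries the whole list of results:

  -- Sketch: run the uploads sequentially by chaining the tasks together.
  -- GotFileUploads is an assumed extra Message constructor carrying a
  -- Result with the full List of FileUpload values.
  GotFiles file files ->
    let
      sequencedUploads =
        (file :: files)
          |> List.map (\file_ -> FileUpload.upload file_ "media/sign")
          |> Task.sequence
    in
      ( model
      , Task.attempt GotFileUploads sequencedUploads
      )

The trade-off is that you only hear back once at the very end, and Task.sequence aborts on the first failure; a proper limiter would probably upload a handful at a time instead.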

This package is now in use on Senryu.pub, which is currently an MVP of my ideal blogging system. Work is ongoing, but here's where it's implemented at the moment:

The current version of the editor, written in Elm

Obviously I'm open to suggestions for improvements I can make to this library, as right now it solves my very specific problem and little else.