Rails Direct Upload to AWS S3 from React Native


I recently took on the task of allowing a user of a React Native app I’m helping build to upload a custom profile picture. It sounded like a relatively simple task when I was estimating it in our sprint planning. However, I still allowed myself some grace since I’d never done such a thing before and put 8 hours on it. Little did I know what was to come.

See, I knew our backend was running Ruby on Rails (RoR), and I knew that Active Storage is now the standard way to handle file attachments, but I didn’t realize the issues I would run into when I threw Amazon Web Services (AWS) S3 into the mix. I had heard good things about Active Storage, though I hadn’t worked with it, and I know RoR well enough to know that the things the team adds are intentional and typically well thought out. I also knew from experience that while S3 configuration can be somewhat complex when it comes to IAM roles and the like, once it’s running the way you want it should be pretty easy to use, especially for something that was going to be public.

Early on in my work on this task, the back end engineer informed me that Active Storage has a pretty neat way of letting the client application send files directly to S3 and just send a reference string to the Rails server. This is preferred because instead of the data going from the client to the Rails server and then on to Amazon, it goes directly from the client to Amazon. Bypassing one step speeds everything up and also saves some load on the server. I thought to myself this was pretty cool. We at Airship had done this before with a web app with solid results, so I had that code to reference and base my work on.

Where things start to go wrong…

This is where things start to splinter. I start to digest the code from the web app we created:

import axios from "axios";
import SparkMD5 from "spark-md5";

const getUploadInfo = async (file) => {
  const checksum = await createFileChecksum(file);
  // NOTE: endpoint path assumed here; the stock Rails route is
  // /rails/active_storage/direct_uploads
  return axios.post("/rails/active_storage/direct_uploads", {
    blob: {
      filename: file.name,
      content_type: file.type,
      byte_size: file.size,
      checksum: checksum
    }
  });
};

export const createFileChecksum = async (file) => {
  return new Promise((resolve, reject) => {
    const chunkSize = 2097152; // 2MB
    const chunkCount = Math.ceil(file.size / chunkSize);
    let chunkIndex = 0;
    const md5Buffer = new SparkMD5.ArrayBuffer();
    const fileReader = new FileReader();

    const readNextChunk = () => {
      if (chunkIndex < chunkCount || (chunkIndex === 0 && chunkCount === 0)) {
        const start = chunkIndex * chunkSize;
        const end = Math.min(start + chunkSize, file.size);
        const fileSlice =
          File.prototype.slice ||
          File.prototype.mozSlice ||
          File.prototype.webkitSlice;
        const bytes = fileSlice.call(file, start, end);
        fileReader.readAsArrayBuffer(bytes);
        chunkIndex++;
        return true;
      } else {
        return false;
      }
    };

    fileReader.addEventListener("load", event => {
      md5Buffer.append(event.target.result);
      if (!readNextChunk()) {
        const binaryDigest = md5Buffer.end(true);
        const base64digest = btoa(binaryDigest);
        resolve(base64digest);
      }
    });

    fileReader.addEventListener("error", () =>
      reject(`Error reading ${file.name}`)
    );

    readNextChunk();
  });
};

export const uploadFile = async (file) => {
  const uploadInfo = await getUploadInfo(file);
  await axios.put(uploadInfo.data.direct_upload.url, file, {
    headers: uploadInfo.data.direct_upload.headers
  });
  return uploadInfo.data.signed_id;
};

Real quick: getUploadInfo() sends the relevant info about the file to the Rails back end and returns what’s needed to direct upload to S3. createFileChecksum() is used by getUploadInfo() to calculate the base64-encoded MD5 checksum of the file being sent; while Amazon does not require this, Rails does. Lastly, uploadFile() uploads the file to AWS and then returns the signed_id, which is sent to Rails so it can associate that file with whatever record it belongs to in the back end.

I later realized most of this code came from somewhere else, maybe even the @rails/activestorage package. I found similar code living in a <a href="https://github.com/rails/rails/blob/master/activestorage/app/javascript/activestorage/file_checksum.js">file_checksum.js</a> file in the Rails repository on GitHub. No matter the source of the code, there was an issue: I don’t have access to the FileReader API on mobile, since I’m working in React Native and not a browser. So the search commenced for a way to do this exact same thing in React Native.

All the things that didn’t work

Actually, I’m not going to bore you with everything that didn’t work. I honestly don’t think you care. You probably Googled how to do this and it’s NOWHERE TO BE FOUND on the internet. Yet, this direct upload has been a feature in Rails for a while. You might have even landed on the Rails issue Make ActiveStorage work for API only apps and a comment there:

For those on react native, I was able to get direct uploads working using rn-fetch-blob for md5 hashing (which is output in hex), then converting its hex output into base64 using buffer for calculating the checksum. To lookup the content_type, I used react-native-mime-types, and last but not least, used rn-fetch-blob again for calculating the size. Then, just follow the communication guidelines pointed out by @cbothner, and if the files are big, use rn-fetch-blob for efficiently uploading the file.


So, I tried to follow the above thread and I couldn’t get it to work. Granted, that comment is almost 6 months old, and in JavaScript time that’s a lifetime ago. The main issue I ran into was that I could not for the life of me get the checksum to match up with what Amazon calculated on their side. I kept getting responses of “The Content-MD5 you specified was invalid”. I tried MANY ways of generating the MD5 checksum and they all ended up with the same Content-MD5 message being returned from AWS.

So here’s how I ended up getting it to work (why you’re really here):

import axios from "axios";
import Config from "react-native-config";
import RNFetchBlob from "rn-fetch-blob";
import AWS from "aws-sdk/dist/aws-sdk-react-native";
import { Platform } from "react-native";
import { Buffer } from "buffer";

const { fs } = RNFetchBlob;

AWS.config.update({
  accessKeyId: Config.AWS_ACCESS_KEY_ID,
  region: Config.AWS_REGION,
  secretAccessKey: Config.AWS_SECRET_ACCESS_KEY
});
const s3 = new AWS.S3({ apiVersion: "2006-03-01" });

const getUploadInfo = async (fileInfo, file) => {
  const params = {
    Bucket: Config.AWS_BUCKET,
    ContentType: fileInfo.type,
    Key: fileInfo.fileName,
    Body: file
  };
  // The pre-signed URL the SDK generates includes the Content-MD5
  // checksum as a query parameter, so pull it back out of the URL
  const psUrl = s3.getSignedUrl("putObject", params);
  const checksum = unescape(psUrl.split("&")[1].split("=")[1]);

  // NOTE: endpoint path assumed here; the stock Rails route is
  // /rails/active_storage/direct_uploads
  return axios.post("/rails/active_storage/direct_uploads", {
    blob: {
      filename: fileInfo.fileName,
      content_type: fileInfo.type,
      byte_size: fileInfo.fileSize,
      checksum: checksum
    }
  });
};

export const uploadFile = async (fileInfo) => {
  const uri =
    Platform.OS === "ios" ? fileInfo.uri.replace("file://", "") : fileInfo.uri;
  const file = await fs
    .readFile(uri, "base64")
    .then(data => new Buffer(data, "base64"));

  const uploadInfo = await getUploadInfo(fileInfo, file);
  const { headers, url } = uploadInfo.data.direct_upload;

  await axios.put(url, file, { headers: { ...headers } });

  return uploadInfo.data.signed_id;
};

This is definitely not the most elegant solution, and I haven’t refactored it at all yet. However, it works, and in the world of code that means something. So what in the world is going on here? I’ll walk through it, although I’ll jump around the file some. First, I set up the aws-sdk and a new S3 instance. I’m using react-native-config to manage environment variables here. I initially did this to see if I could get the signed_id I needed by bypassing Rails and uploading directly to AWS; that didn’t work. However, what I noticed when I generated a pre-signed URL for uploading via the aws-sdk was that the URL contained an MD5 checksum!

Back to the code

Okay, the code. Walk through it, here we go. I call uploadFile() in the response from react-native-image-picker on my screen component; that’s where the fileInfo argument comes from. I then get the proper URI based on the OS and read the file with rn-fetch-blob. I turn that data into a Buffer because the aws-sdk only accepts certain types when creating a pre-signed URL. I then pass the fileInfo and the file along to getUploadInfo(), which creates a pre-signed URL using the s3 instance we set up earlier and does some hacky string manipulation (needs a refactor) to acquire the checksum. Now I can use that checksum (which Amazon’s code created) to get the direct upload URL and headers from Rails. Lastly, I upload the file to AWS and return the signed_id, which I send along to Rails elsewhere in my code.
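When I get around to that refactor, a safer approach than indexing into split("&") is to look the parameter up by name, since the aws-sdk makes no promise that Content-MD5 is the second query parameter. A sketch using plain string parsing (the URL below is a made-up example of the general shape the SDK produces, not real output):

```javascript
// Pull the Content-MD5 value out of a pre-signed URL by name instead of
// by position. Query string values are percent-encoded, so decode the
// value after matching the key.
const checksumFromPresignedUrl = (psUrl) => {
  const query = psUrl.split("?")[1] || "";
  const pair = query
    .split("&")
    .map(p => p.split("="))
    .find(([key]) => key === "Content-MD5");
  return pair ? decodeURIComponent(pair[1]) : null;
};

// Hypothetical URL shape for illustration only:
const psUrl =
  "https://bucket.s3.amazonaws.com/key" +
  "?AWSAccessKeyId=AKIAEXAMPLE" +
  "&Content-MD5=XUFAKrxLKna5cZ2REBfFkg%3D%3D" +
  "&Expires=1569000000&Signature=abc123";
// checksumFromPresignedUrl(psUrl) yields the decoded base64 checksum
```

Returning null when the parameter is missing also gives the caller a chance to fail loudly instead of sending a garbage checksum to Rails.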

Ultimately, this was a pretty frustrating problem to fight against. However, it felt so good when I uploaded a file and saw the user profile image change. I actually got up and ran around my home office with my hands in the air rejoicing. I’m also stoked that I can share this solution and see how others might improve on what I did or figure out better ways to go about this. I’m not convinced this is the best solution to this problem, however, it’s a solution that works.

From my yarn.lock:
– react-native v0.60.5
– react-native-image-picker v1.1.0
– rn-fetch-blob v0.10.16
– aws-sdk v2.532.0

Supercluster with @react-native-mapbox-gl/maps

During a recent project in my work at Airship I had to stop using the built-in cluster functionality that <a href="https://github.com/react-native-mapbox-gl/maps">@react-native-mapbox-gl/maps</a> provides and utilize Supercluster instead. The reason is that we needed access to the points that make up the clusters: we had some items that never broke out of their clusters because they had the exact same longitude and latitude combination, and we also wanted to show a slide-up view of those locations in a list. What started me down this path was an issue on the deprecated <a href="https://github.com/nitaliano/react-native-mapbox-gl">react-native-mapbox-gl</a> library, which shares a lot of functionality with the new library. You can view that issue here. I’m honestly surprised that this functionality isn’t available in the library, since it is supported in the Mapbox JS SDK, as documented here with the getClusterLeaves() function. I noticed people asking how to do this, so when I nailed it down I knew a how-to was coming.

Continue reading “Supercluster with @react-native-mapbox-gl/maps”

Portfolio Site How-To For New Developers


I recently shared my portfolio site with the Free Code Camp Nashville group and got some inquiries into some of the technologies and features I used to build it. So I figured I’d share all aspects of the site and some steps to utilizing the same tools I did.

Should I build my site from scratch?

When I asked this question to the ever helpful NashDev community I received a resounding “No” from senior devs. This might seem counterintuitive; however, the overall thought process was that if you’re just starting out, unless you’re looking to be considered a designer, using something someone else has already done very well as the base of your portfolio is better than building it yourself. Concentrate on highlighting the things that you are going to be doing in a potential job, not on the overall layout and design of your portfolio site. I decided to go with a template from HTML5 UP for a few reasons. First, they’re FREE as long as you keep the attribution. Second, AJ, who creates these amazing templates, is based out of Nashville just like me and had in the past connected me with some solid people to have beer/coffee with and discuss my career. Third, it’s a static site. That is, it’s 100% HTML, CSS & JavaScript, so I would have many simple options for hosting when I got to that point. Lastly, all of the templates are responsive, so if a potential hiring manager clicks through from their phone they’re going to get a great experience.
Continue reading “Portfolio Site How-To For New Developers”

freeCodeCamp Nashville December Meetup Recap


After taking a hiatus from our meetup for a month due to family obligations, Dave and I were back at it again this month for the freeCodeCamp Nashville meetup. This month we pulled together a group of technical recruiters for some Q&A. The premise behind this time together was that recruiters can be a valuable asset to people in the tech scene through their relationships with companies as well as their knowledge of how to best get past some common barriers. However, recruiters seem to get a bad rap in general due to some bad apples in the industry: those people who are required by a company or boss to get X number of calls or contacts in per day per position. I know I’ve personally received emails about positions that I know I’m not qualified for, and anybody who took a few seconds to scan my LinkedIn profile would know it as well.
Continue reading “freeCodeCamp Nashville December Meetup Recap”

freeCodeCamp Nashville October Meetup Recap


This past Saturday we had our monthly freeCodeCamp Nashville meetup at Nashville Software School. As always, it was good times. We were supposed to have a guest speaker, but they couldn’t make it at the last minute, so our very own superstar and freeCodeCamp Nashville co-organizer Dave Harned stepped in and crushed it. He presented a Crash Course on NodeJS. You can find the repo here, and excuse the work-in-progress readme. Like most things, it’s not perfect. Feel free to open a Pull Request and shore up those docs! I’m going to walk through what Dave presented on Saturday so you can see what you missed out on and come to the next one ;-). Honestly, it’s so you can benefit from what I think is a well-put-together intro that’ll have you up, running, and playing around in no time.
Continue reading “freeCodeCamp Nashville October Meetup Recap”

The Full Time Job before The Full Time Job

I think this picture is pretty accurate as to how I feel right now: a pawn crowned king. I passed my final assessment the first time through and have officially been a graduate of Flatiron School for a week now. I’m also lucky enough to have a full-time job right out of school. A full-time job looking for a job, that is :-p. It’s amazing how in-depth the job search can be. However, when you are provided with a solid framework to follow, it’s nice to see the pieces just start falling into place. With my choice of the Full Stack Web Developer Program through Flatiron School, I receive a job guarantee if I follow some steps. Obviously there is a little more to it, like following my career coach’s advice, but in general I must make 8 git commits per week to GitHub, write one blog post per week, and perform 8 job search activities per week (i.e. apply for a job, meet someone at a user group, send follow-up thank you notes, networking outreach to meet for coffee, etc.).
Continue reading “The Full Time Job before The Full Time Job”