
File Hosting with Amazon S3


I see a lot of people requesting file hosting in Webflow, and even though it's a planned feature, I thought I'd share how we tackle this. We store all kinds of stuff in S3, but also use it for PDF and JavaScript files on our Webflow projects.

For example, the typing animation on our home page runs off a JavaScript file hosted on S3.

Getting started with S3

You’ll need an account. The AWS Free Tier includes 5GB of S3 storage for your first 12 months. Head to aws.amazon.com to get started.

Now that you’ve created your account you’ll need to set up a user. Navigate to “IAM” under services and find the Users tab. Once there, you should see a big blue “Create New Users” button. You’ll just need one to get started. Name it something simple, like “s3user”. AWS will generate an Access Key ID and a Secret Access Key. THIS IS IMPORTANT: download that file and keep it somewhere safe. Those are your credentials and will allow you to access your bucket.
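As an aside: if you ever use those keys outside a GUI client (say, with the AWS CLI or a script), the conventional place to keep them is a `~/.aws/credentials` file, which looks like this (the values below are placeholders):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```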

Next, you’ll need to attach a policy to that user. Select it from the list and click “Attach Policy.” Amazon supplies a lot of default policies, but you only need to find “AmazonS3FullAccess.” If you’re curious you can see the policy code below. Once found, simply attach the policy. This gives your user full access to read and write all of your S3 buckets, which we’ll create next.

  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": "s3:*",
        "Resource": "*"
      }
    ]
  }

Now to make your first bucket. Think of a bucket as a hard drive attached to your computer. It needs a globally unique name, much like a website does. Keep it simple, and avoid underscores or other special characters; use a dash if you need a space. Something like “my-bucket-name”.
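If you want to sanity-check a name before creating the bucket, a short script will do. This is just a sketch of the common rules (3–63 characters; lowercase letters, digits, and hyphens; must start and end with a letter or digit), not the full AWS spec, and `is_valid_bucket_name` is a hypothetical helper, not something AWS provides:

```python
import re

# Rough check of common S3 bucket naming rules: 3-63 chars,
# lowercase letters, digits, and hyphens, starting and ending
# with a letter or digit. Not the complete AWS spec.
BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name):
    return bool(BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name("my-bucket-name"))  # True
print(is_valid_bucket_name("My_Bucket"))       # False: uppercase and underscore
```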

Navigate to S3 under services and click “Create Bucket.” This will prompt you to name your bucket and select a region. For this example I’d suggest using “US Standard”. It’s the go-to region on AWS, located in Virginia. Once created, click the little magnifying glass icon next to its name, and find the permissions drop-down on the right.

Your bucket needs a policy just like your user did. This can get complicated, but we’ll use a simple one that gives everyone with a link permission to read the file, making sharing very easy. Click “Edit bucket policy” under permissions and copy/paste the following code into the prompt. THIS IS IMPORTANT: before saving, replace “YOUR-BUCKET-HERE” in the Resource line with your bucket’s exact name.

  {
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "AllowPublicRead",
        "Effect": "Allow",
        "Principal": {
          "AWS": "*"
        },
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::YOUR-BUCKET-HERE/*"
      }
    ]
  }
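If you’d rather not edit the JSON by hand, you can also generate it with your bucket name already filled in. A small sketch (the `make_public_read_policy` helper is just an illustration, not an AWS API):

```python
import json

def make_public_read_policy(bucket_name):
    """Build the public-read bucket policy shown above for a given bucket."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowPublicRead",
            "Effect": "Allow",
            "Principal": {"AWS": "*"},
            "Action": "s3:GetObject",
            # The wildcard suffix applies the rule to every object in the bucket.
            "Resource": "arn:aws:s3:::%s/*" % bucket_name,
        }],
    }, indent=2)

print(make_public_read_policy("my-bucket-name"))
```

Paste the printed output straight into the bucket policy editor.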

That’s it! Well, almost. Your S3 bucket is all set. Now you need to access that bucket. Those of you familiar with FTP clients will feel right at home. We use Panic’s Transmit, but Cyberduck is a good, free alternative for both Mac and Windows.

With Cyberduck running, click “Open Connection” and choose S3 from the drop-down menu. Remember those credentials you downloaded? Yeah, here’s where you need those. Copy and paste your Access Key ID and Secret Access Key into the appropriate fields and click connect. Boom! You’re in your own private little cloud.

Now for the fun part… Upload a file you’d like to share. Maybe a cute picture of your dog? Once uploaded you can easily share the link by right-clicking the file and choosing Copy URL > HTTP URL. Paste the result into your browser and you should see that cute dog picture ready to be shared with the whole internet.
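For reference, the link Cyberduck copies follows a predictable pattern, so you can also build it yourself. A sketch assuming the classic virtual-hosted style `https://<bucket>.s3.amazonaws.com/<key>` (newer buckets may use a regional endpoint instead):

```python
from urllib.parse import quote

def public_url(bucket, key):
    # Virtual-hosted-style URL; percent-encode the key but keep "/" separators.
    return "https://%s.s3.amazonaws.com/%s" % (bucket, quote(key))

print(public_url("my-bucket-name", "photos/cute dog.jpg"))
# https://my-bucket-name.s3.amazonaws.com/photos/cute%20dog.jpg
```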

Note: This is a repost of a Medium article I wrote a while back.

Dropbox is removing public folders = no more hosting files there

Thanks for sharing! This will be useful for large media assets and documents.


Thanks again man, this is dead simple and will work great until they launch a more robust file hosting feature set...

FYI for others using this method:
S3 apparently offers direct storage management as well (saw a little "opt-in" button to try it, not sure how new it is in relation to your tutorial)... But you can interface with a bucket directly through the console (add/delete files, etc)...

So you can even skip the FTP stuff if preferred...


Oh yeah! Forgot to mention that. There are so many things you can do with S3. It's really an awesome product.


Hi @md673, thanks for the great post! I'm considering using S3 to store digital files that my users can download. Right now I'm using Gumroad to do this, to enable the users to donate a small amount if they feel like it. But it's getting very expensive with Gumroad because I have so many downloads: about 400 a day on average.

Do you have any idea how expensive it is to host 1GB files in S3 with around 12000 downloads a month?

So basically 12000GB downloaded a month.

All the best.


There are several factors that go into AWS monthly charges, but my gut says that S3 isn't the right setup for you. That's a lot of monthly download bandwidth and that will add up. You can estimate the charges with the AWS calculator.
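As a rough back-of-envelope, assuming a data-transfer-out rate of about $0.09/GB (pricing is tiered and varies by region, so treat this as an order-of-magnitude estimate and use the AWS calculator for real numbers):

```python
# Hypothetical back-of-envelope estimate; $0.09/GB is an assumed
# data-transfer-out rate, not current AWS pricing.
file_size_gb = 1
downloads_per_month = 12000
transfer_rate_usd_per_gb = 0.09

monthly_transfer_gb = file_size_gb * downloads_per_month  # 12000 GB
monthly_cost = monthly_transfer_gb * transfer_rate_usd_per_gb
print("~$%.0f/month in transfer alone" % monthly_cost)  # ~$1080/month
```

That's transfer alone, before storage and request charges, which is why a flat-rate host may be a better fit.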

I'd recommend looking into something like DigitalOcean or maybe even Backblaze B2.


Thank you! Great tutorial!