When I discovered this serverless repo years ago, it got me pretty pumped about the potential of serverless with CloudFormation/SAM. I used the repo for years as a microservice on various projects, but when I had some trouble adding an endpoint to the YAML recently, I decided to see how my new best friend Pipedream would handle the task. It turns out, it’s super easy to fetch an image from a URL and write it to S3 with Pipedream.
Fetch Image and Write to S3 with Pipedream
You’ve been there: there’s a bunch of images somewhere on the web, and you want to download them all. You’ve got the URLs, so you could write a local script to do it, but it would be better to have a serverless function you could call from anywhere.
Send a POST request to the function with a URL, and it will fetch the image and write it to S3. Simple, right?
The problem with the popular serverless repo is that it doesn’t come with a pre-defined endpoint. You have to deploy it to AWS, and then you have to figure out how to call it. Or you can edit the .yml file before you deploy and cross your fingers that it works. It’s a bit of a pain.
Luckily Pipedream provides a built-in integration for this very purpose. It beats the serverless repo in a few ways:
- It’s easier to configure and deploy
- It’s easier to call the function out-of-the-box since it comes with a pre-defined endpoint
- It’s easier to use because it’s a web app with built-in observability (you can see the event log right there in the portal)
So let’s do it. I’ll be sending the following JSON to the endpoint:
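Something like this, with the source image URL and the new name for the S3 key (the field names here are just illustrative — use whatever your workflow expects):

```json
{
  "url": "https://example.com/images/photo-original.jpg",
  "key": "photo-renamed.jpg"
}
```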
Steps to Stream image (or file) to S3 from URL
1. Navigate to https://pipedream.com and log in.
2. Click “New” to start a new Workflow in Pipedream.
3. Select New HTTP request. This will instantly create a custom endpoint for you to use where you can send data.
4. Click the copy button to copy your endpoint URL so you can use it.
5. Click “Select event…” to select the test event you just sent in to your endpoint.
6. Click “GET /” to load the test event.
7. Behold, the reason I use Pipedream every day. Nobody makes it quite this easy to work with API request payloads.
8. Note how I have a URL for the image I want to fetch and write to S3, and the new name I sent in for the S3 key (aka the filename).
9. Click “AWS”
10. Click “S3 – Stream file to S3 from URL.”
11. Click this icon to go back up to the event and copy the image URL to your clipboard.
12. Click “Copy Path.” Whisper “Thank You” to Pipedream for this button.
13. Go back to your S3 Stream File step and paste the path into the “File URL” field of the AWS step.
14. Go back up to your Event step and click “Copy Path” on your new key name that you sent in.
15. Now back to the AWS step to drop in the new key name.
16. Your AWS step will now look like this.
17. That’s it! Just connect your AWS account and the S3 bucket where you want to stream the images. Remember that the AWS user you put in the Account field needs to have permission to write to the bucket you choose in the Bucket field.
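Once the workflow is live, any HTTP client can trigger it. As a sketch (the endpoint URL and JSON field names below are placeholders — substitute your own), here’s how you might build and send the request with Python’s standard library:

```python
import json
import urllib.request

def build_request(endpoint: str, image_url: str, key: str) -> urllib.request.Request:
    """Build a POST request carrying the image URL and the new S3 key."""
    payload = json.dumps({"url": image_url, "key": key}).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Placeholder endpoint -- use the URL you copied in step 4:
req = build_request(
    "https://myendpoint.m.pipedream.net",
    "https://example.com/images/photo-original.jpg",
    "photo-renamed.jpg",
)
# urllib.request.urlopen(req)  # uncomment to actually send it
```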
Pipedream is a fast and effective supplement to, or even a replacement for, a hand-rolled AWS serverless stack. In this case, we fetched an image from a URL and wrote it to S3 without using a Lambda function or API Gateway at all. This is awesome because we get the benefits of S3 storage and a serverless architecture without the complexity of managing a Lambda function and API Gateway.
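For a sense of what Pipedream’s step is doing for you, here’s a minimal sketch of the fetch-and-upload logic you’d otherwise write and maintain yourself (assuming a boto3-style S3 client is passed in; the function name and parameters are illustrative):

```python
import urllib.request

def stream_to_s3(file_url: str, bucket: str, key: str, s3_client) -> None:
    """Fetch the file at file_url and stream its body to s3://bucket/key."""
    with urllib.request.urlopen(file_url) as response:
        # upload_fileobj reads the response stream in chunks, so the whole
        # file never has to sit in memory at once.
        s3_client.upload_fileobj(response, bucket, key)

# Usage with boto3 (requires AWS credentials with s3:PutObject on the bucket):
#   import boto3
#   stream_to_s3("https://example.com/photo.jpg", "my-bucket", "photo.jpg",
#                boto3.client("s3"))
```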