Parallel CLI Curl

running requests faster

Nix
Bash

When running a list of requests that are embarrassingly parallel, I find it easiest to keep things in bash with some GNU Parallel.

To demonstrate, I'm going to use httpbingo from @mccutchen. To simulate slow responses, I'm going to use the /delay/:n endpoint, which delays the response by n seconds (from 0 to 10).
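A quick sanity check of the endpoint (timings will vary; the delay shows up as wall-clock time):

time curl -s https://httpbingo.org/delay/2

should take a little over 2 seconds and return an httpbin-style JSON body.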

I'd also recommend the usage docs from the GNU Parallel manual; they cover the multitude of ways to feed input to it and parallelize your scripts in great detail. I'm going to start by making a list of random endpoints with delays between 1 and 10 seconds and writing them to a file.

urls_gen.sh

#!/bin/bash
# Start fresh on each run, since the loop below appends
rm -f urls.txt
count=10
for i in $(seq "$count")
do
  # Pick a random delay between 1 and 10 seconds
  x=$(shuf -i 1-10 -n 1)
  printf "https://httpbingo.org/delay/%s\n" "$x" >> urls.txt
done

Back to the CLI to make the script executable and run it:

chmod +x urls_gen.sh && ./urls_gen.sh

which makes urls.txt:

https://httpbingo.org/delay/6
https://httpbingo.org/delay/3
...
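As an aside, a one-liner gets you the same file; this sketch assumes GNU coreutils' shuf, whose -r flag samples with replacement:

shuf -r -i 1-10 -n 10 | sed 's|^|https://httpbingo.org/delay/|' > urls.txt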

And now we have an input file to feed to GNU Parallel via ::::!

parallel 'curl -fsSL {}' :::: urls.txt
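For comparison, GNU Parallel can also take arguments inline with ::: instead of reading them from a file:

parallel 'curl -fsSL {}' ::: https://httpbingo.org/delay/1 https://httpbingo.org/delay/2

:::: reads one argument per line from the named file, while ::: takes them straight from the command line.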

So, it's time to single-file this. Our application does two things in sequence: generate a file of URLs to target, and then curl those URLs. Putting it into bash functions looks like:

#!/usr/bin/env nix-shell
#!nix-shell --pure -i bash -p bash parallel curl jq cacert
#!nix-shell -I nixpkgs=https://github.com/NixOS/nixpkgs/archive/ee084c02040e864eeeb4cf4f8538d92f7c675671.tar.gz

generate_files() {
  # Start fresh on each run, since the loop below appends
  rm -f ./urls.txt
  count=10
  for i in $(seq "$count")
  do
    # Pick a random delay between 1 and 10 seconds
    x=$(shuf -i 1-10 -n 1)
    printf "https://httpbingo.org/delay/%s\n" "$x" >> urls.txt
  done
}

send_requests() {
  # -f: fail on HTTP errors, -sS: silent but show errors, -L: follow redirects
  curl -fsSL "$1" | jq .data
}
# Export the function so the child shells that GNU Parallel spawns can call it
export -f send_requests

generate_files
# Fan the requests out across 10 workers (child processes)
parallel -j 10 'send_requests {}' :::: urls.txt
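To convince yourself of the speedup, time a sequential loop against the parallel run (exact numbers depend on the random delays you generated):

time bash -c 'while read -r url; do curl -fsSL "$url" > /dev/null; done < urls.txt'
time parallel -j 10 'curl -fsSL {} > /dev/null' :::: urls.txt

The sequential loop should take roughly the sum of all the delays, while the parallel run with 10 workers should take roughly as long as the single largest delay.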

In the next post, I'm going to show how to add tracing to this script, including using some observability tools with otel-cli. Stay tuned!