Parallel CLI Curl
Running requests faster
When running a list of requests that are embarrassingly parallel, I find it easiest to keep things in bash with some GNU Parallel.
To demonstrate, I'm going to use httpbingo from @mccutchen. To simulate some slow responses, I'm going to use the /delay/:n endpoint, which delays the response by n seconds, anywhere from 0 to 10.
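For example, assuming a local httpbingo instance on port 8080 (the host and port are assumptions on my part), a single request blocks for the requested number of seconds:

```bash
# Waits ~3 seconds before the server responds.
curl -s http://localhost:8080/delay/3
```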
I'd also recommend the usage docs from the GNU Parallel manual; they cover in detail the many ways to feed input to it and use it to parallelize your scripts.

I'm going to start by making a list of random endpoints with delays between 1 and 10 seconds and writing that list to a file.
urls_gen.sh
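A minimal sketch of what the script might contain, assuming a local httpbingo instance at localhost:8080 and a batch of 20 URLs (both are assumptions):

```bash
#!/usr/bin/env bash
# Write random httpbingo /delay/:n URLs (1-10 second delays) to urls.txt.
BASE="http://localhost:8080"   # assumed local httpbingo instance
COUNT=20                       # assumed number of URLs

for _ in $(seq "$COUNT"); do
  echo "${BASE}/delay/$(( (RANDOM % 10) + 1 ))"
done > urls.txt
```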
Back to the CLI to execute it:
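Running it through bash directly is one way to do it:

```bash
# Writes urls.txt to the current directory.
bash urls_gen.sh
```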
which produces urls.txt:
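The delays are random, so your file will differ, but it will look something like this (20 lines in total):

```
http://localhost:8080/delay/4
http://localhost:8080/delay/9
http://localhost:8080/delay/1
http://localhost:8080/delay/7
```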
And now we have an input file to feed to GNU Parallel via `::::`!
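A minimal invocation might look like this: `::::` tells GNU Parallel to read its arguments from the file, and `-j 10` caps concurrency at ten jobs (drop it to default to one job per CPU core):

```bash
# Each line of urls.txt is substituted for {}; response bodies are discarded.
time parallel -j 10 curl -s -o /dev/null {} :::: urls.txt
```

With ten slow requests in flight at once, the wall-clock time gets much closer to the slowest individual delay than to the sum of all of them.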
So, it's time to single-file this. Our application does two things in sequence: generate a file of URLs to target, then curl those URLs. Putting it into bash functions looks like:
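A sketch under the same assumptions as before (local httpbingo at localhost:8080, 20 URLs; the function names are mine):

```bash
#!/usr/bin/env bash
set -euo pipefail

BASE="http://localhost:8080"   # assumed local httpbingo instance
COUNT=20                       # assumed number of URLs

# Step 1: write COUNT random /delay/:n URLs to urls.txt.
generate_urls() {
  for _ in $(seq "$COUNT"); do
    echo "${BASE}/delay/$(( (RANDOM % 10) + 1 ))"
  done > urls.txt
}

# Step 2: curl every URL, up to 10 at a time.
fetch_urls() {
  parallel -j 10 curl -s -o /dev/null {} :::: urls.txt
}

generate_urls
fetch_urls
```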
In the next post, I'm going to show how to add some tracing to this script, including using some observability tools with otel-cli. Stay tuned!