These requests are not processed on your machine (unless connecting to localhost). This means that for each request to be fully processed, your machine will have to wait for a response.
Consider sending an invitation for your birthday party to friend #1, waiting for a response, then sending one to friend #2, and so on. It would be faster to send the invitations to all friends at once and then wait for all of them to respond, especially if friend #1 happens to be on holiday.
I don't know why the number of bytes per second is identical, perhaps some node in the network limits the speed, but the parallel approach can at least send out every request up front so the waiting periods overlap instead of adding up.
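To make the difference concrete, here is a minimal Python sketch (the URLs are placeholders, not endpoints from the question) that fetches the same set of addresses first one after another and then with overlapping requests; the parallel run's total time is roughly the slowest single round trip rather than the sum of all of them.

```python
# Minimal sketch: fetching several URLs sequentially vs. in parallel.
# The URLs below are placeholders; any slow remote endpoints show the effect.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URLS = [
    "https://example.com/a",
    "https://example.com/b",
    "https://example.com/c",
]

def fetch(url):
    # Each call blocks until the remote server has responded in full.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return len(resp.read())

def sequential():
    # Total time is roughly the sum of the individual round trips.
    return [fetch(u) for u in URLS]

def parallel():
    # All requests are in flight at once, so the waits overlap:
    # total time is roughly the slowest single round trip.
    with ThreadPoolExecutor(max_workers=len(URLS)) as pool:
        return list(pool.map(fetch, URLS))

if __name__ == "__main__":
    start = time.perf_counter()
    sequential()
    print(f"sequential: {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    parallel()
    print(f"parallel:   {time.perf_counter() - start:.2f}s")
```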
I don't understand how making the requests sequentially or in parallel could affect the total number of bytes received. You're talking about bytes per second, but not about the number of seconds spent at that transfer speed.