I'm working on an HTTP client and I would like to test it against requests that take some time to finish. I could certainly come up with a Python script to suit my needs, something roughly like:
import time

def slow_server(environ, start_response):
    start_response('200 OK', [('Content-Type', 'application/octet-stream')])
    # getSomeFile is a stand-in for whatever maps the request to a file on disk
    with getSomeFile(environ) as file_to_serve:
        block = file_to_serve.read(1024)
        while block:
            yield block
            time.sleep(1.0)  # crude throttle: roughly 1 KB per second
            block = file_to_serve.read(1024)
but this feels like a problem others have already encountered. Is there an easy way to serve static files with an absurdly low bandwidth cap, short of a full-scale server like Apache or nginx?
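(If I do end up rolling my own, I assume the generator above could be served with wsgiref from the standard library, along the lines of the sketch below, but that's exactly the wheel I'm hoping not to reinvent.)

from wsgiref.simple_server import make_server

# slow_server as sketched above; getSomeFile is still a hypothetical
# helper that maps the request to a file on disk.
make_server('', 8000, slow_server).serve_forever()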
I'm working on Linux, and the way I've been testing so far is with python -m SimpleHTTPServer 8000
in a directory full of files to serve. I'm equally interested in another simple command-line server, or in a way to cap bandwidth on TCP port 8000 (or whatever would work) with one or a few iptables commands.
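For reference, the closest I've gotten to rolling my own on top of SimpleHTTPServer is subclassing its handler and throttling the copy loop, a rough sketch along these lines (the chunk size and delay are arbitrary numbers, run from the directory of files to serve); I'd still much rather use something off the shelf:

import time
import SocketServer
import SimpleHTTPServer

class ThrottledHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    # arbitrary values for illustration: roughly 1 KB per second
    chunk_size = 1024
    delay = 1.0

    def copyfile(self, source, outputfile):
        # the stock handler uses shutil.copyfileobj here; instead,
        # copy in small chunks and sleep between them
        block = source.read(self.chunk_size)
        while block:
            outputfile.write(block)
            time.sleep(self.delay)
            block = source.read(self.chunk_size)

SocketServer.TCPServer(('', 8000), ThrottledHandler).serve_forever()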