Testing the asynchronous nature is a whole different beast.
I think a decent test would be to write a REST API so that each client can do something like this:
ids = []

# Create a batch of records, remembering the id each call returns
for i in range(HOW_MANY_TIMES):
    r = requests.post('http://localhost:8000/benchmark/create', topic=random_topic(), ...)
    assert r.status_code == 200
    ids.append(r.content)

# Edit every record once
for id in ids:
    r = requests.post('http://localhost:8000/benchmark/edit', id=id, topic=random_topic(), ...)
    assert r.status_code == 200

# Delete every record
for id in ids:
    r = requests.post('http://localhost:8000/benchmark/delete', id=id, topic=random_topic(), ...)
    assert r.status_code == 200
Then you run that concurrently, once for each database engine, and measure the total time it takes to complete everything.
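As a minimal sketch of that concurrent runner, assuming the create/edit/delete loops above are wrapped in a run_client() function (the function name, the client count, and the timing approach are illustrative, not something from the post):

import time
from concurrent.futures import ThreadPoolExecutor

NUM_CLIENTS = 10  # illustrative: how many concurrent clients to simulate

def benchmark(run_client, num_clients=NUM_CLIENTS):
    """Run run_client() in num_clients threads; return total wall-clock seconds."""
    start = time.time()
    with ThreadPoolExecutor(max_workers=num_clients) as pool:
        futures = [pool.submit(run_client) for _ in range(num_clients)]
        for f in futures:
            f.result()  # re-raises any failed assertion from a client
    return time.time() - start

# Run once per database engine and compare the totals, e.g.:
# print('engine X:', benchmark(run_client))

Plain threads are enough on the client side, since the point is just to keep the Tornado server busy with overlapping requests; the asynchronous behaviour being measured lives in the server.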