I devised a very simple benchmark, using Zope as the web server: two SQL SELECT statements that draw from a dummy database table filled with loads of dummy entries, each with a timestamp field.

The conclusion is that Python's DateTime module is considerably slower than PostgreSQL's built-in date formatting function.

In the first experiment, I select the timestamp as a string and use Python's DateTime module to convert it to a DateTime object, then apply the strftime() function to get the date printed in a formatted fashion (e.g. 04 July 02 20). The SQL looks like this:

SELECT name, time_stamp
FROM test_datetime_entries;

In the DTML output I apply the DateTime module with code like this:

<dtml-var "ZopeTime(time_stamp).strftime('%y %B %d %S')">

In the second experiment I used PostgreSQL's equally useful to_char() function to do the formatting in the SQL command, so that the formatted string is already prepared. The SQL looked like this:

SELECT name, time_stamp,
  TO_CHAR(time_stamp, 'YY Mon DD SS') AS time_stamp_formatted
FROM test_datetime_entries;

and its corresponding DTML is much simpler this time:

<dtml-var time_stamp_formatted>
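
Outside of Zope the same pattern looks like this: the formatted string comes straight out of the query and Python never touches the date. A minimal sketch, assuming psycopg2 and a database called "test" (neither is part of the original setup):

import psycopg2

conn = psycopg2.connect("dbname=test")   # connection details are assumptions
cur = conn.cursor()
cur.execute("""
    SELECT name, time_stamp,
           TO_CHAR(time_stamp, 'YY Mon DD SS') AS time_stamp_formatted
    FROM test_datetime_entries
""")
for name, time_stamp, time_stamp_formatted in cur.fetchall():
    # the string is ready to render; no Python-side date handling needed
    print(name, time_stamp_formatted)
cur.close()
conn.close()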

Then I had a script that randomly stress-tested one or the other experiment (a rough sketch of such a timing loop follows the results). After about 5000 iterations the following average fetch times were noted:

Using DateTime module:         3.556 seconds
Using to_char() SQL function:  1.201 seconds
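
The stress-test script itself isn't shown above; here is a rough sketch of the kind of timing loop it could be, assuming the two experiments are published as Zope pages at hypothetical URLs (the URLs and page names are made up for illustration):

import random
import time
from urllib.request import urlopen

URLS = {
    'Using DateTime module': 'http://localhost:8080/test_datetime_python',        # hypothetical
    'Using to_char() SQL function': 'http://localhost:8080/test_datetime_tochar', # hypothetical
}

totals = dict((key, [0.0, 0]) for key in URLS)

for i in range(5000):
    key = random.choice(list(URLS))
    start = time.time()
    urlopen(URLS[key]).read()            # fetch the page, discard the body
    totals[key][0] += time.time() - start
    totals[key][1] += 1

for key, (total, count) in totals.items():
    print('%s: %.3f seconds average fetch time' % (key, total / count))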

Why is this important?

Generally, using the power of SQL is better than using a programming language that has to work with the data, because it's faster. But as a pattern it's less good: the database is responsible for reading and writing data, not for fiddling with the presentation; that's what you have the programming language for. These simple results show that it's worth breaking that pattern and exploiting the power of SQL more.

jose - 11 October 2006
thanks!!! this example was of some use to me

