Postgres and Time Zones

From version 7.3 on, Postgres treats TIMESTAMP columns as TIMESTAMP WITHOUT TIME ZONE. A TIMESTAMP WITHOUT TIME ZONE stores exactly the value it's given, with no conversion.

TIMESTAMP WITH TIME ZONE values are stored as UTC in the database. Postgres assumes that timestamps are provided by the user in the local time zone, and converts them to UTC behind the scenes according to its time zone settings. To specify that a timestamp is in a particular time zone, it's necessary to use a TIMESTAMP WITH TIME ZONE literal. Generally, this means that your literal will end in a UTC offset:

'2010-05-12 20:22:32-04:00'

Per the SQL standard, the “+” or “-” before the UTC offset indicates that the literal contains a time zone.

Unfortunately, Postgres ignores this SQL requirement and silently drops the UTC offset, treating the resulting TIMESTAMP literal as local time. Forcing Postgres to interpret a literal as a TIMESTAMP WITH TIME ZONE requires an explicit cast:

'2010-05-12 20:22:32-04:00'::timestamp with time zone


Python has a parallel set of date and time types: naive and timezone-aware variants of the standard datetime.datetime and datetime.time types.

A Python datetime is non-naive (aware) if it has time zone information attached, and naive if not. Python does not ship with any time zone objects that can be used to make a datetime aware, but the pytz package offers such a database.
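The usual way to tell the two apart is utcoffset(), which returns None for a naive datetime. A minimal sketch (variable names are illustrative):

```python
import datetime

import pytz

naive = datetime.datetime(2010, 5, 13, 0, 22, 32)
aware = datetime.datetime(2010, 5, 13, 0, 22, 32, tzinfo=pytz.utc)

# utcoffset() is None for naive datetimes, a timedelta for aware ones
print(naive.utcoffset())  # None
print(aware.utcoffset())  # 0:00:00
```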

Creating Non-naive Time Objects

From the pytz documentation:

Retrieving a Timezone

import pytz
utc_tz = pytz.utc
eastern = pytz.timezone('US/Eastern') # DST aware

Using a Timezone

To create a datetime:

import datetime

utc_dt = datetime.datetime(2010, 5, 13, 0, 22, 32, 0, pytz.utc)
# N.B. eastern_dt is incorrect - passing the zone to the constructor
# attaches its default offset rather than the DST-correct EDT
eastern_dt = datetime.datetime(2010, 5, 12, 0, 22, 32, 0, eastern)

To convert a datetime:

utc_dt.astimezone(eastern) # localizes to EDT
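As noted above, constructing an Eastern datetime directly gets the wrong offset; the pytz documentation's recommended fix is localize(), which consults the zone's DST rules for the given date. A small sketch of the difference:

```python
import datetime

import pytz

eastern = pytz.timezone('US/Eastern')

# Wrong: the constructor attaches the zone's default offset,
# ignoring DST for the given date.
wrong = datetime.datetime(2010, 5, 12, 0, 22, 32, tzinfo=eastern)

# Right: localize() applies the DST rules, yielding EDT (-04:00) in May.
right = eastern.localize(datetime.datetime(2010, 5, 12, 0, 22, 32))
```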

SQLAlchemy

SQLAlchemy, a Python ORM and database toolkit, understands naive and aware time types and, if a database column is mapped as timezone-aware, can transmit the time zone data to the database.

Note that SQLAlchemy's reflection feature does not create timezone-aware column mappings by default, even for TIMESTAMP WITH TIME ZONE columns; such columns need to be overridden:

from sqlalchemy import Table, Column, TIMESTAMP

my_table = Table('my_table', metadata,
    Column('my_timestamp_with_tz', TIMESTAMP(timezone=True)),
    autoload=True)
SQLAlchemy will now generate a TIMESTAMP WITH TIME ZONE literal if my_timestamp_with_tz is passed a non-naive datetime. However, SQLAlchemy appears to generate only SQL-standard literals, not Postgres literals with an explicit "::timestamp with time zone" cast. Since SQLAlchemy uses prepared statements, it's unable to log the exact form of the queries it generates with literals filled in. The Postgres logs (with query logging configured) are thus the only place to see the timestamp literal:

UPDATE my_schema.my_table
    SET my_timestamp_with_tz =
        '2010-05-11T17:01:16.350195-04:00'
    WHERE id = 12345;

This makes using Postgres' time zone conversion with SQLAlchemy infeasible.

The solution is thus to ensure that all times are converted to Postgres' local time before being passed to an SQLAlchemy query. An alternative would be to store only UTC, or some other fixed time zone, in a TIMESTAMP WITHOUT TIME ZONE column, but that requires changing the way database fields are used, and manual management of time zones.

It's probably also worth exploring SQLAlchemy's field-wrapping hooks to see if the time zone conversion can be done there.
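One such hook is SQLAlchemy's TypeDecorator, which can intercept bound parameters before they reach the driver. A sketch under the assumption that the server runs in US/Eastern (the zone choice and class name are illustrative, not part of the original setup):

```python
import pytz
from sqlalchemy.types import TypeDecorator, TIMESTAMP

LOCAL_TZ = pytz.timezone('US/Eastern')  # assumption: the server's local zone

class LocalizedTimestamp(TypeDecorator):
    """Convert aware datetimes to the server's local zone on bind;
    pass naive values through unchanged."""
    impl = TIMESTAMP

    def process_bind_param(self, value, dialect):
        if value is not None and value.utcoffset() is not None:
            value = value.astimezone(LOCAL_TZ)
        return value
```

Columns mapped with LocalizedTimestamp would then receive pre-converted local times without callers having to remember the conversion.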


If you're using a different DB package, it may be possible to convince the DB interface to generate a proper Postgres-compatible TIMESTAMP WITH TIME ZONE literal. This would be best.

If not, determine the local time zone that matches the database's:

import pytz
local_tz = pytz.timezone('US/Eastern')

Parse time-zone–tagged times, such as "2010-05-13T00:22:32.0Z," into non-naive datetimes in the UTC time zone:

dt = datetime.datetime(2010, 5, 13, 0, 22, 32, 0, pytz.utc)
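The line above builds the datetime by hand; actually parsing a string like the one shown can be sketched with strptime (the helper name is illustrative, and the format string assumes exactly the ISO-8601-with-'Z' layout above):

```python
import datetime

import pytz

def parse_utc(stamp):
    # Assumes fractional seconds and a literal 'Z' suffix, as in
    # '2010-05-13T00:22:32.0Z'.
    dt = datetime.datetime.strptime(stamp, '%Y-%m-%dT%H:%M:%S.%fZ')
    # pytz.utc is a fixed offset, so replace() is safe here.
    return dt.replace(tzinfo=pytz.utc)

dt = parse_utc('2010-05-13T00:22:32.0Z')
```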

Once you're ready to commit this object to the database, convert it to local time:

dt_for_db = dt.astimezone(local_tz)

If your DB wrapper layer will deal with both naive and non-naive times, check that the timestamps are non-naive:

if dt.utcoffset() is not None:
    dt_for_db = dt.astimezone(local_tz)
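The check and the conversion can be wrapped into one helper (the function name is illustrative, and US/Eastern again stands in for the database's actual zone):

```python
import datetime

import pytz

local_tz = pytz.timezone('US/Eastern')  # assumption: the database's zone

def for_db(dt, tz=local_tz):
    """Return an aware datetime converted to the database's local zone;
    pass naive values through unchanged."""
    if dt.utcoffset() is not None:
        return dt.astimezone(tz)
    return dt
```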

More Links