Someone asked,

My client has an application in which there is a lot of date/time management. DB server, application server and the client PCs are in different timezones, so they want to ensure that time management is done in one timezone on the server and presented to the user in the client’s timezone. This seems a common problem, do we have pattern and design recommendations around this? Also, there doesn’t seem to be a way in .NET2.0 to transform between timezones except to transform from timezone to UTC and back.

I had written a little about this a long time ago, but this question is a little different, and of course the topic is still relevant and timely.


Here's the thing:

Dates and times are relative. 


In .NET, time values, specifically the DateTime type, are expressed with a unified frame of reference.  Time X is the same whether you are in UTC, PST, or some other timezone.  It can look different in each of those timezones if you format the value for display, but the value, despite appearances, represents the same time.  If you stay within .NET, the various application components running in the various geographically distributed tiers do not need to agree on the use of a single common timezone.  IF YOU STAY in .NET.  A computer in Shanghai can communicate with one in Los Angeles, another in London, and another in Sydney; they can all use different timezones, and it’s all good.  Time X is time X.  Grab the DateTime.Now value on a machine in London and send it to LA, and you get the right value in LA.  It may display as “11:43am” in London and “3:43am” in Los Angeles, but it is the same point in galactic time.  A meeting you schedule in Outlook with colleagues in Frankfurt and São Paulo shows different times in Outlook, but it is really the same time.  (I do not mean to imply that Outlook uses .NET to manage all of this; it doesn’t.  I use the meeting-in-Outlook as an example only.)
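Here's a small C# sketch of that idea.  The particular date is arbitrary, chosen only for illustration; the point is that one DateTime instant can be rendered differently in different timezones and still round-trip back to the same value.

```csharp
using System;

class SameInstantDemo
{
    static void Main()
    {
        // A single instant in time, pinned to the UTC frame of reference.
        // (The date itself is arbitrary; it is just for illustration.)
        DateTime instant = new DateTime(2009, 6, 15, 10, 43, 0, DateTimeKind.Utc);

        // The same instant, expressed in this machine's local timezone.
        DateTime localView = instant.ToLocalTime();

        // The two values can *display* differently...
        Console.WriteLine(instant.ToString("u"));   // 2009-06-15 10:43:00Z
        Console.WriteLine(localView.ToString("G")); // local rendering; varies by machine

        // ...but they denote the same point in time:
        // converting back recovers the original value exactly.
        Console.WriteLine(localView.ToUniversalTime() == instant); // True
    }
}
```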


Troubles can arise when you transmit the time value outside of .NET, for example when you store it into a database, or transmit it to a cooperating partner via web services.  In general, heterogeneous systems do not share the same "model" for representing time values.  As a practical example, if you serialize a DateTime value into a DATETIME column in a database, you can lose the timezone information, and thus you lose the frame of reference.  Imagine serializing a DateTime value of 11:43am from a machine in London, and then stuffing that serialized value into a database.  Then imagine that, on a machine in Los Angeles, you retrieve that DATETIME from the database and de-serialize it into a DateTime value.  The value in LA is now 11:43am, which is wrong: 11:43am in LA is not the same as 11:43am in London.  The transition out of .NET has caused a loss of information.
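You can simulate that loss without a real database.  In this hedged sketch, formatting the value as plain wall-clock digits stands in for storing it in a DATETIME column: the digits survive the round trip, but the Kind comes back as Unspecified, meaning .NET no longer knows which timezone those digits refer to.

```csharp
using System;
using System.Globalization;

class LostFrameOfReferenceDemo
{
    static void Main()
    {
        // 11:43am local time on a (hypothetical) machine in London.
        DateTime londonLocal = new DateTime(2009, 6, 15, 11, 43, 0, DateTimeKind.Local);

        // A DATETIME column stores only the wall-clock digits, no timezone.
        // Simulate that by formatting without any offset information.
        string stored = londonLocal.ToString("yyyy-MM-dd HH:mm:ss",
            CultureInfo.InvariantCulture);

        // A machine in Los Angeles reads the column back.
        DateTime retrieved = DateTime.ParseExact(stored, "yyyy-MM-dd HH:mm:ss",
            CultureInfo.InvariantCulture);

        // The digits survived, but the frame of reference did not.
        Console.WriteLine(stored);         // 2009-06-15 11:43:00
        Console.WriteLine(retrieved.Kind); // Unspecified
    }
}
```

On the LA machine, treating that 11:43am as local time silently shifts the instant by eight hours, which is exactly the information loss described above.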


You can get the same sort of “loss of fidelity” when you transition from .NET into Java or PHP or some other non-.NET platform, via a web service call for example.  The reason is the same in the database case and in the Java case: the two cooperating systems have different models for date/time.  Specifically, timezone is often NOT included in the date/time model of non-.NET systems, and this is true for databases and for Java.


Therefore, if you are transitioning into and out of .NET, the best practice is for every party to agree on a frame-of-reference timezone.  When we say “11:43am” in such a system, all parties agree that it refers to UTC (for example), regardless of where those parties are physically located.  And then, if an application component needs to display or serialize that time in a format that respects the local timezone, the application must convert: add or subtract the UTC offset.
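The agree-on-UTC pattern can be sketched like this in C# (the Serialize/Deserialize helpers and the storage format are my own illustrative choices, not part of any particular framework).  Everything used here, including DateTime.SpecifyKind, is available in .NET 2.0: convert to UTC on the way out, stamp the value as UTC on the way back in, and only then convert to local time for display.

```csharp
using System;
using System.Globalization;

class UtcFrameOfReference
{
    // Before the value leaves .NET (database, web service), convert to UTC.
    // All parties agree: the stored digits mean UTC.
    public static string Serialize(DateTime local)
    {
        return local.ToUniversalTime().ToString("yyyy-MM-dd HH:mm:ss",
            CultureInfo.InvariantCulture);
    }

    // On the way back in, mark the parsed value as UTC,
    // then convert it to the reader's local timezone for display.
    public static DateTime Deserialize(string stored)
    {
        DateTime parsed = DateTime.ParseExact(stored, "yyyy-MM-dd HH:mm:ss",
            CultureInfo.InvariantCulture);
        DateTime asUtc = DateTime.SpecifyKind(parsed, DateTimeKind.Utc);
        return asUtc.ToLocalTime();
    }

    static void Main()
    {
        DateTime meeting = new DateTime(2009, 6, 15, 11, 43, 0, DateTimeKind.Local);
        string wire = Serialize(meeting);      // digits in the agreed UTC frame
        DateTime display = Deserialize(wire);  // rendered in the reader's timezone
        Console.WriteLine(display == meeting); // True on the same machine
    }
}
```

Each machine does its own local conversion at the edges; the shared representation in the middle is always UTC.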


The key thing is, you need to be careful, especially about transitions into and out of .NET. 


More on this in a future post.