I ran into an issue with an application running in a remote location through a Citrix server. When the application was run in the remote location, it logged events to the database with the remote location's local time, even though the user was accessing the application through Citrix.
Did .NET's DateTime.Now somehow "escape" the Citrix container and find the time of the remote machine or is there some other explanation?
Here is the current setup of the application.
The application logs events to a database in the US
The remote user is located in Europe and accesses the application through Citrix, which simulates running the application in the US
The Citrix server machine time is set to the US time
P.S. Going forward, the application has been updated to query the time from the database when saving.
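For reference, a minimal sketch of that database-time approach, assuming SQL Server and ADO.NET (the DbClock class and the connection string are placeholders for illustration, not the actual application code):

using System;
using System.Data.SqlClient;

class DbClock
{
    // Ask the database for the current UTC time instead of trusting the app server's clock.
    public static DateTime GetUtcNow(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT GETUTCDATE();", connection))
        {
            connection.Open();
            return (DateTime)command.ExecuteScalar();
        }
    }
}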
A few things:
Citrix's "time zone redirection" feature is probably the explanation for what you've observed. The details are specific to what particular Citrix product you are using (which you didn't supply). You may be able to disable that feature. Google it, or ask on ServerFault.
Really, you should avoid using DateTime.Now in server-based applications. It should be irrelevant what the server's time zone setting is. Use DateTime.UtcNow, DateTimeOffset.UtcNow, or DateTimeOffset.Now, along with the TimeZoneInfo class, as appropriate. Alternatively, use Noda Time.
To answer the question in the title, you can see the source code for DateTime.Now here. It works as follows:
1. Call DateTime.UtcNow, which itself calls the GetSystemTimeAsFileTime Win32 API. This is the system clock. It is in terms of UTC - not any specific time zone. If you're running in a VM, typically this system clock is synchronized with time from the host.
2. Get data about the local machine's time zone. It uses a truncated version of the complete time zone data (for performance reasons), and it caches the data for future reuse. Note that if Citrix is redirecting the time zone, it will affect this step.
3. Using the data retrieved, calculate the offset from UTC - for the specific UTC point retrieved in step 1.
4. Apply the offset to get local time, truncating if MinTicks or MaxTicks are exceeded in the conversion.
5. Mark the value calculated in step 4 with DateTimeKind.Local, and return it in a DateTime struct.
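Conceptually, those steps boil down to something like the following sketch. It is a simplification: the real implementation caches the time zone data and does the arithmetic on raw ticks, with clamping at MinTicks/MaxTicks.

using System;

class LocalNowSketch
{
    public static DateTime Now()
    {
        // Step 1: read the system clock in UTC (GetSystemTimeAsFileTime under the hood).
        DateTime utcNow = DateTime.UtcNow;

        // Steps 2-3: look up the local time zone (this is the part Citrix time zone
        // redirection changes) and compute the offset for this particular instant.
        TimeSpan offset = TimeZoneInfo.Local.GetUtcOffset(utcNow);

        // Steps 4-5: apply the offset and mark the result as Kind = Local.
        return DateTime.SpecifyKind(utcNow + offset, DateTimeKind.Local);
    }
}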
Related
I am from west India. DateTime.Now returns the current time when I debug from Visual Studio, but when I host the site on an Azure App Service it returns a time 6 hours earlier. The App Service is hosted in West India. I added the WEBSITE_TIME_ZONE App Setting with the value Indian Standard Time and restarted the App Service, but the issue remains.
Indian Standard Time is not supported as App Service's time zone name; the Windows ID is India Standard Time. Check the supported time zone names here (note the exact spelling of the name in the left-hand column).
Best practice is to use UTC time in the server context and convert it to local time in the UI.
Azure servers run on UTC by default, so you can calculate the difference from the user's time from there. There is also the option to use DateTime.UtcNow.
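As a sketch of that pattern (the ToIndiaLocal helper is illustrative; "India Standard Time" is the Windows time zone ID): keep UTC on the server and convert only when rendering.

using System;

class DisplayTime
{
    public static string ToIndiaLocal(DateTime utcValue)
    {
        // Convert the UTC timestamp to the user's zone only at the presentation layer.
        var indiaZone = TimeZoneInfo.FindSystemTimeZoneById("India Standard Time");
        DateTime local = TimeZoneInfo.ConvertTimeFromUtc(utcValue, indiaZone);
        return local.ToString("yyyy-MM-dd HH:mm");
    }
}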
I've been doing some research on how to properly store and save the date and time in a web application, but I couldn't find a post with an accepted answer.
I have an ASP.NET MVC website, with C# for the backend code and MS SQL for the database. Say you have clients in different time zones; currently I'm storing the value with DateTime.UtcNow and I just display it back in "YYYY/MM/DD" format.
All you really need to do is make sure the application server and the database server share the same time settings and are in sync.
IIS, .NET, and SQL Server all use the system time by default.
Clients in different time zones will still use the system time of the IIS server they are connected to. Most sites just format the date on the client side, to the client's zone.
If you want to get fancy, you can write database triggers that handle transaction time stamps. This comes in handy if you do a lot of back-end processing.
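One common way to do the client-side formatting mentioned above, sketched here on the assumption that the stored value is UTC: send the instant down in a round-trip format and let the browser render it in the visitor's own zone.

using System;

class ClientSideFormatting
{
    // Emit the stored UTC instant in ISO 8601 ("o") so the browser can format it
    // locally, e.g. with new Date(iso).toLocaleDateString() in JavaScript.
    public static string ToIso8601(DateTime storedUtc)
    {
        return DateTime.SpecifyKind(storedUtc, DateTimeKind.Utc).ToString("o");
    }
}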
I have an existing ASP.NET application, built on .NET Framework 2.0.
We used DateTime.Now in many places to store the current date and time, for different purposes including logging.
Currently the application is hosted on a server in the USA. When I access the application from my machine, it returns the server's current date and time. Is there any way I can get my local machine's time instead?
I researched and found that I need to use the TimeZone class to get the current date and time of my local machine. But in that case, I need to change every place where I used DateTime.Now to get the time from the time zone.
Is there any way I can set this globally, so that when I use DateTime.Now it returns the local machine's time?
Please help.
No, you can't. There is no way to set TimeZone globally.
Best options:
Convert DateTime.Now to DateTime.UtcNow and do the appropriate calculations (what if you have to move the server to another US time zone?)
Use DateTimeOffset instead of DateTime, which stores the offset from UTC along with the value (see the sketch after this answer).
In any case, you have to handle the conversion to the local time zone on the client side of your app, not on the server side.
You could also possibly set the timezone on the server itself to the one you need.
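A minimal sketch of the DateTimeOffset option (the LogEvent helper is illustrative, not part of the existing application):

using System;

class OffsetLogging
{
    public static void LogEvent(string message)
    {
        // DateTimeOffset.UtcNow records the instant unambiguously; it can be
        // converted to any zone later with TimeZoneInfo when displaying.
        DateTimeOffset when = DateTimeOffset.UtcNow;
        Console.WriteLine(when.ToString("o") + " " + message);
    }
}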
I am creating a .NET service that gets timestamp information (in the local server time zone) from another Windows 2008 server. I don't have control over the other server, and I need a way to programmatically get the time zone offset from that server so that I can interpret the timestamp information as a time in UTC. Is there a built-in service to get the time (and time zone) from a Windows 2008 server? Thanks.
Check out this post: http://blogs.technet.com/b/heyscriptingguy/archive/2012/03/28/use-powershell-to-see-time-zone-information-on-remote-computers.aspx
It discusses using PowerShell. You may need access to the server, but perhaps it will help if you're looking for a way to do this programmatically.
If you are able to map to the server, you could also try:
net time \\SERVER_NAME
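If you need it from .NET code rather than the command line, a rough sketch using WMI (System.Management) follows. SERVER_NAME is a placeholder, the remote machine must allow WMI queries, and Win32_OperatingSystem.CurrentTimeZone reports the offset from GMT in minutes.

using System;
using System.Management; // requires a reference to System.Management.dll

class RemoteTimeZone
{
    public static TimeSpan GetUtcOffset(string serverName)
    {
        var scope = new ManagementScope(@"\\" + serverName + @"\root\cimv2");
        scope.Connect();

        var query = new ObjectQuery("SELECT CurrentTimeZone FROM Win32_OperatingSystem");
        using (var searcher = new ManagementObjectSearcher(scope, query))
        {
            foreach (ManagementObject os in searcher.Get())
            {
                // CurrentTimeZone is the offset from GMT in minutes (e.g. -300 for EST).
                return TimeSpan.FromMinutes(Convert.ToInt32(os["CurrentTimeZone"]));
            }
        }
        throw new InvalidOperationException("No operating system information returned.");
    }
}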
When I compile the application on my laptop it runs fine, but when I run the same application on the server something is wrong with the date format. However, when I checked the system date and time format on my laptop and the server, they are the same. Can anyone tell me what is wrong?
You also have to check the default culture set on the laptop and on the server. Are they running the same culture? Different culture settings have different default date and time formats.
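A quick sketch to see the effect (the sample outputs are only examples): the same DateTime renders differently under different cultures' default patterns.

using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        DateTime now = DateTime.Now;

        // The same instant, formatted with two different cultures' default patterns.
        Console.WriteLine(now.ToString(new CultureInfo("en-US"))); // e.g. 6/12/2024 3:45:07 PM
        Console.WriteLine(now.ToString(new CultureInfo("de-DE"))); // e.g. 12.06.2024 15:45:07
    }
}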
Is your application running on the server under a different user account? Regional settings are per-user so date and time formats will depend on the user that the application is running under. You can log in as the application user and check the Regional settings to determine if that's the issue.
This is really common in ASP.NET where the developer builds the app on their workstation and upon deployment to QA or production they find that Regional settings differ because the App Pool is using a service identity.
Is the system clock running UTC, with the OS converting it to local time on the server? This is a pretty common configuration, but it will break programs that bypass the OS's functions to retrieve the date/time.