Should a server-backed app show time according to the device’s clock or the server’s clock?

The current time can differ across computers: a user may adjust the clock on their mobile device or personal computer, whereas server clocks are generally kept in sync with an accurate time service (e.g. via NTP).

Some reasons I can think of why a user might adjust the clock:

  • Set the clock ahead to compensate for the user’s tendency to run late.
  • Work around time-based limitations in apps (e.g. games).

Which clock should a server-backed app use when displaying times?

As a concrete example, consider a hypothetical app with a Do Not Disturb feature that is synced across multiple clients via a server. The server enforces Do Not Disturb by withholding notifications from clients until the period ends. In the app, the user configures the end time of the Do Not Disturb period using a date picker control.
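For illustration, the request the client sends might look something like this sketch; the endpoint and field names are hypothetical:

```typescript
// Hypothetical request shape for setting the Do Not Disturb end time.
// The endpoint and field names are made up for illustration.
interface DoNotDisturbRequest {
  // Unix timestamp (ms) after which the server resumes delivering
  // notifications; the open question is whose clock this is based on.
  dndEndsAtMs: number;
}

async function setDoNotDisturb(endsAtMs: number): Promise<void> {
  const body: DoNotDisturbRequest = { dndEndsAtMs: endsAtMs };
  await fetch("/do-not-disturb", {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
}
```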

Should the time that the user specifies be interpreted against the device clock or the server clock? If it’s based on the server clock, the value can be sent to the server as-is. If it’s based on the device clock, the value would first have to be converted to server time using a clock synchronization algorithm before being sent, as in the sketch below.
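For example, a minimal sketch of that conversion, in the style of Cristian’s algorithm, might look like this. The GET /time endpoint is an assumption: it’s taken to return the server’s current Unix time in milliseconds.

```typescript
// Minimal sketch: estimate the offset between the device clock and the
// server clock. Assumes a hypothetical GET /time endpoint that returns
// the server's current Unix time in milliseconds as a JSON number.
async function estimateClockOffsetMs(): Promise<number> {
  const t0 = Date.now();                  // device time when the request is sent
  const response = await fetch("/time");
  const serverTime: number = await response.json();
  const t1 = Date.now();                  // device time when the response arrives
  // Assume the server read its clock halfway through the round trip.
  const estimatedServerNow = serverTime + (t1 - t0) / 2;
  return estimatedServerNow - t1;         // positive if the server clock is ahead
}

// Convert a device-clock timestamp (e.g. from the date picker) into an
// equivalent server-clock timestamp before sending it to the server.
async function toServerTime(deviceTimestampMs: number): Promise<number> {
  return deviceTimestampMs + (await estimateClockOffsetMs());
}
```

A more careful version would sample the offset several times and keep the estimate from the round trip with the lowest latency, which is roughly what NTP does.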


My opinion is that all times displayed by a device should be based on the device clock; otherwise, the ability to adjust the device clock would be sort of pointless. What do y’all think?