I am working on a patch that adds datetime parsing support for RFC 3339, as requested in MDEV-11829.

While examining ways to use the parsed offset to convert the datetime into MariaDB's time zone, I ran into some questions about the my_system_gmt_sec function in sql-common/my_time.c.
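
For context, the conversion the patch ultimately needs boils down to something like the following. This is only an illustration: rfc3339_to_epoch is a hypothetical helper of my own, not existing server code, and timegm() is a glibc/BSD extension rather than standard C.

```c
#include <time.h>

/*
 * Hypothetical helper (illustration only): given broken-down fields
 * parsed from e.g. "2017-01-15T12:00:00+05:00" and the numeric offset
 * in seconds (+18000 here), return seconds since the epoch. timegm()
 * interprets the fields as UTC, so the parsed offset is subtracted
 * back out; the result can then be rendered in the server's time zone.
 */
time_t rfc3339_to_epoch(struct tm *fields, long tz_offset_seconds)
{
  return timegm(fields) - tz_offset_seconds;
}
```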

From testing on Linux (RHEL 7 and Gentoo), it seems the timezone computed by my_system_gmt_sec is off by 7200 seconds and has an inverted sign. I am trying to determine why this is and whether it is designed to be this way. From the comments in the file, my_system_gmt_sec appears to have been written specifically to work around cross-platform issues, but there is no mention that the timezone it computes is supposed to be offset or sign-inverted.

I could not find any usage of the my_time_zone global variable, or any other function that consumes the timezone computed by my_system_gmt_sec, so this may not affect anything, but I wanted some clarification.

Here are some test results (a small reproduction sketch follows the table):
| Server time zone (seconds) | my_system_gmt_sec timezone (seconds) | Difference (seconds) |
| -------------------------- | ------------------------------------ | -------------------- |
| -18000 (-05:00 EST)        | 25200                                | 43200                |
| 3600 (+01:00 CET)          | -10800                               | 14400                |
| 0 (+00:00 GMT)             | 7200                                 | 7200                 |
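
For anyone who wants to reproduce this, a minimal standalone probe along the following lines should show the discrepancy. This is a sketch only: it assumes the 10.1 declarations from my_time.h (my_init_time(void) and my_system_gmt_sec(const MYSQL_TIME *, long *, uint *)) and uses the glibc tm_gmtoff extension for the reference offset.

```c
/* Sketch of a standalone probe; assumes it is compiled inside the
   server tree so that my_global.h / my_time.h and my_time.c are
   available. */
#include <stdio.h>
#include <time.h>
#include "my_global.h"
#include "my_time.h"

int main(void)
{
  MYSQL_TIME t;
  long computed_tz;
  uint error_code;
  time_t now= time(NULL);
  struct tm local;

  my_init_time();               /* seeds my_time_zone, starting from 3600 */
  localtime_r(&now, &local);

  t.year=   local.tm_year + 1900;
  t.month=  local.tm_mon + 1;
  t.day=    local.tm_mday;
  t.hour=   local.tm_hour;
  t.minute= local.tm_min;
  t.second= local.tm_sec;
  t.second_part= 0;
  t.neg= 0;
  t.time_type= MYSQL_TIMESTAMP_DATETIME;

  (void) my_system_gmt_sec(&t, &computed_tz, &error_code);
  printf("my_system_gmt_sec timezone: %ld\n", computed_tz);
  printf("system offset (tm_gmtoff):  %ld\n", (long) local.tm_gmtoff);
  return 0;
}
```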

In testing I have been able to determine that the 7200-second deviation is caused by compensating for the "-3600" seconds twice, first when setting an initial timezone in my_init_time:
https://github.com/MariaDB/server/blob/10.1/sql-common/my_time.c#L709
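
For reference, the initialization there goes roughly like this (paraphrased from the linked file, not a verbatim quote):

```c
/* Paraphrase of my_init_time(): fill a MYSQL_TIME from localtime(now),
   seed my_time_zone with 3600, and let my_system_gmt_sec() refine it. */
my_time_zone= 3600;            /* "Comp. for -3600 in my_gmt_sec" */
/* ... copy the current broken-down local time into my_time ... */
my_system_gmt_sec(&my_time, &my_time_zone, &not_used); /* Init my_time_zone */
```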

Inside my_system_gmt_sec, current_timezone is initialized from my_time_zone, which starts out at 3600. Then, further down, another 3600 is added in the timezone computation:
https://github.com/MariaDB/server/blob/10.1/sql-common/my_time.c#L946
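
The relevant correction loop looks roughly like this (again paraphrased, not verbatim):

```c
/* Paraphrase of the loop in my_system_gmt_sec(): diff is the gap in
   seconds between the input fields t and localtime(tmp). */
diff= (3600L * (long) (days*24 + ((int) t->hour - (int) l_time->tm_hour)) +
       (long) (60 * ((int) t->minute - (int) l_time->tm_min)) +
       (long) ((int) t->second - (int) l_time->tm_sec));
current_timezone+= diff + 3600;        /* "Compensate for -3600 above" */
```

As I read it, the 3600 seeded in my_init_time and the extra 3600 added here are what add up to the 7200-second deviation in the table above.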

The diff variable that is computed has the correct offset, but with an inverted sign. If current_timezone were set to diff, it would have the correct timezone magnitude, again with an inverted sign. The inversion comes from computing t - l_time instead of l_time - t.
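
The sign inversion can be shown in isolation with plain libc (a sketch with hypothetical values; it ignores the day-difference term and the my_time_zone adjustments, and timegm() is a glibc/BSD extension):

```c
#define _GNU_SOURCE             /* for timegm() and localtime_r() */
#include <stdio.h>
#include <time.h>

int main(void)
{
  /* Pretend "2017-01-15 12:00:00" was parsed into the input fields t. */
  struct tm t= {0};
  t.tm_year= 117; t.tm_mon= 0; t.tm_mday= 15; t.tm_hour= 12;

  /* First estimate, as in my_system_gmt_sec(): treat t as if it were UTC. */
  time_t tmp= timegm(&t);

  /* On a UTC+01:00 host (e.g. TZ=Europe/Paris in winter), localtime(tmp)
     is one hour ahead of t. */
  struct tm l_time;
  localtime_r(&tmp, &l_time);

  /* diff = t - l_time, matching the loop: prints -3600, i.e. the zone
     offset (+3600) with an inverted sign. */
  long diff= 3600L * (t.tm_hour - l_time.tm_hour) +
               60L * (t.tm_min  - l_time.tm_min) +
                     (t.tm_sec  - l_time.tm_sec);
  printf("diff = %ld\n", diff);
  return 0;
}
```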

Should the computed timezone have an inverted sign? Is there a reason it is offset by 7200 seconds? Perhaps the behavior is different on other platforms that I have not tested.

Thank you,

Seth Shelnutt