2-digit year dates not validated properly during client validation

Guest

I have noticed that CompareValidator and RangeValidator do not handle 2-digit
year dates correctly when client-script validation is used. Specifically, the
problem lies in the way the string is converted into a date before being
compared to the preset value(s). Normally (with the default settings
CutOffYear=2029 and Century=2000) a 2-digit year YY should be converted to
20YY if YY <= 29 and to 19YY if YY > 29. That is what happens when the
validation runs on the server, but when it runs on the client the year is
always converted to 20YY, regardless of the value of YY.
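
To make the expected behaviour concrete, here is a minimal sketch of the
conversion the server performs with the default settings. The function name
and variable names are mine, not Framework code:

// Sketch of the expected 2-digit year conversion, assuming the default
// settings Century=2000 and CutOffYear=2029 (names below are illustrative).
function expectedFullYear(twoDigitYear) {
    var century = 2000;
    var cutoffYear = 2029;
    var candidate = twoDigitYear + century;   // e.g. 45 -> 2045
    // Years at or below the cutoff stay in the current century;
    // anything above it falls back to the previous century.
    return (candidate <= cutoffYear) ? candidate : candidate - 100;
}
// expectedFullYear(29) == 2029, expectedFullYear(45) == 1945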

I have tracked down the bug to WebUIValidation.js, in the function
GetFullYear(...). At line 183, the original code reads:

return (year + parseInt(val.century)) - ((year < val.cutoffyear) ? 0 : 100);

Since year is a 2-digit value it is always less than 100, and therefore always
less than val.cutoffyear (2029), so the 100-year correction is never applied.
The corrected line should read:

return (year + parseInt(val.century)) - ((year + parseInt(val.century) <= val.cutoffyear) ? 0 : 100);
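
For illustration, here is a quick standalone comparison of the two
expressions. The val object below is just a stand-in with the default
settings, not the real validator instance:

var val = { century: "2000", cutoffyear: 2029 };   // stand-in for the validator's fields
var year = 45;                                     // 2-digit year entered by the user

// Original expression: 45 < 2029 is always true, so 100 is never subtracted.
var buggy = (year + parseInt(val.century)) - ((year < val.cutoffyear) ? 0 : 100);                          // 2045

// Corrected expression: compares the full year against the cutoff.
var fixed = (year + parseInt(val.century)) - ((year + parseInt(val.century) <= val.cutoffyear) ? 0 : 100); // 1945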

I can make this change myself, but I would have to do it on every client's
server where I install my applications, and I would also have to explain to
each client why I am tampering with the Framework code (not to mention that
another application on the same server might be relying on this incorrect
behaviour).
 
