The UDEC function returns a character-string value that is the
unsigned decimal equivalent of the specified parameter. The
return value is compatible with all other string types.
Syntax:
UDEC(x [[,length[[,digits]]]])
The parameter 'x' is the expression to be converted. The UDEC
function accepts a parameter of any type except VARYING OF CHAR,
conformant parameters, and schema types. If your system supports
INTEGER64, the size of 'x' must be less than or equal to the size
of INTEGER64; otherwise, the size of 'x' must be less than or
equal to the size of INTEGER32.
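For example, the following sketch (assuming a Pascal dialect that
provides UDEC; the program and variable names are illustrative)
converts an INTEGER value and writes the resulting string:

   PROGRAM Show_Udec (OUTPUT);
   {Illustrative only: converts an INTEGER value to its unsigned
    decimal string form and writes it.}
   VAR
      Value : INTEGER;
   BEGIN
      Value := 255;
      WRITELN (UDEC (Value));   {Writes the unsigned decimal string for Value}
   END.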
Two optional integer parameters specify the length of the
resulting string and the minimum number of significant digits to
be returned. If you specify a length that is too short to hold
the converted value, the resulting string is truncated on the
left.
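In the following sketch (illustrative names; a Pascal dialect that
provides UDEC is assumed), the second call requests a length that
is too short for the converted value, so the leftmost characters
of the result are lost:

   PROGRAM Udec_Truncation (OUTPUT);
   VAR
      Value : INTEGER;
   BEGIN
      Value := 123456;
      WRITELN (UDEC (Value, 10, 10));  {Length 10, at least 10 digits}
      WRITELN (UDEC (Value, 4, 4));    {Length 4 is too short; the
                                         string is truncated on the left}
   END.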
If you do not specify values for the optional parameters, a
default length and a default minimum number of significant digits
are used. If the size of 'x' is greater than 32 bits, the defaults
are a length of 21 characters and a minimum of 20 digits.
Otherwise, the defaults are a length of 11 characters and a
minimum of 10 digits.
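The following sketch contrasts the two sets of defaults (assuming
a dialect in which INTEGER is 32 bits and INTEGER64 is supported;
the names are illustrative):

   PROGRAM Udec_Defaults (OUTPUT);
   VAR
      Small : INTEGER;     {32 bits: defaults are length 11, 10 digits}
      Large : INTEGER64;   {64 bits: defaults are length 21, 20 digits}
   BEGIN
      Small := 42;
      Large := 42;
      WRITELN (UDEC (Small));
      WRITELN (UDEC (Large));
   END.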