If the decimal type of a variable is left to the compiler (via an inline declaration), what is the reason for the loss of precision? Is this behavior documented anywhere?
DATA: gv_1 TYPE p LENGTH 15 DECIMALS 2 VALUE '56555.31'.
DATA: gv_2 TYPE p LENGTH 15 DECIMALS 2 VALUE '56555.31'.
DATA: gv_3 TYPE p LENGTH 15 DECIMALS 2 VALUE '56555.34'.
DATA(gv_sum) = gv_1 + gv_2 + gv_3. "data type left to be resolved by the compiler
WRITE / gv_sum.
DATA: gv_sum_exp TYPE p LENGTH 15 DECIMALS 2. "explicit type declaration
gv_sum_exp = gv_1 + gv_2 + gv_3.
WRITE / gv_sum_exp.
The first sum (inline declaration) prints
169666
while the second (explicit type) prints
169665.96
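The observed difference is consistent with the inline-declared expression being evaluated with 0 decimal places, so the exact sum 169665.96 gets rounded to 169666. The following Python sketch is only an analogy (not ABAP semantics), assuming commercial half-up rounding to zero decimals:

```python
from decimal import Decimal, ROUND_HALF_UP

# Same literals as the ABAP variables gv_1, gv_2, gv_3
gv_1 = Decimal('56555.31')
gv_2 = Decimal('56555.31')
gv_3 = Decimal('56555.34')

# Exact sum with 2 decimals -- matches the explicit "TYPE p DECIMALS 2" result
exact = gv_1 + gv_2 + gv_3

# Rounding the result to 0 decimals reproduces the inline-declaration output
rounded = exact.quantize(Decimal('1'), rounding=ROUND_HALF_UP)

print(exact)    # 169665.96
print(rounded)  # 169666
```

This only reproduces the arithmetic; whether the compiler actually derives a 0-decimal type for the inline declaration is the documented behavior the question asks about.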