How do I detect the timescale precision used in a simulation from the source code? Suppose I have a configuration parameter (cfg_delay_i) whose delay value is given by the user in femtoseconds. If the user gives 1000, my code has to wait 1000 fs (1 ps) before executing further.
#(cfg_delay_i * 1fs); // waits correctly only if the simulation precision is 1fs
do_something();
If the timescale precision is 1fs, there won't be any problem, but if the precision is coarser, the delay is rounded to that precision, so a value finer than the precision collapses to a 0 delay and the statement does not wait at all. So I want to write code that determines the timescale in effect and scales the delay accordingly. My expected pseudo-code is:
if      (timeprecision == 1fs) #(cfg_delay_i * 1fs);
else if (timeprecision == 1ps) #(cfg_delay_i / 1000.0 * 1ps); // real division, not truncating integer division
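(For reference, SystemVerilog also lets a module pin its own timeunit and timeprecision, which would sidestep the per-precision branching above entirely. A minimal sketch, assuming the module name and task are placeholders and the simulator accepts a 1fs precision — the global precision becomes the finest one declared anywhere in the design:)

```systemverilog
// Sketch: pin the local timescale so fs-valued delays are never rounded away.
module delay_model #(parameter longint cfg_delay_i = 1000);
  timeunit 1fs;        // local time unit: time literals here scale to fs
  timeprecision 1fs;   // local precision: no rounding below 1fs

  task automatic wait_cfg_delay();
    #(cfg_delay_i * 1fs);   // waits cfg_delay_i femtoseconds regardless of
                            // the `timescale used elsewhere in the design
  endtask
endmodule
```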
Please help me with the logic to determine the simulation's time unit and precision from within the code.
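The kind of detection I am after could presumably be sketched like this (untested; it relies on $realtime returning time scaled to the local time unit, and on delays being rounded to the simulation precision — the task and variable names are made up for illustration):

```systemverilog
// Sketch: measure the effective precision, expressed as a fraction of the
// local time unit, by shrinking a delay until rounding collapses it to zero.
task automatic get_precision(output realtime prec_in_units);
  realtime t0, elapsed, step;
  step          = 1.0;   // start with a delay of one full time unit
  prec_in_units = 1.0;
  forever begin
    t0 = $realtime;
    #(step);                        // delay is rounded to the precision
    elapsed = $realtime - t0;
    if (elapsed == 0.0) break;      // rounded away: step is below the precision
    prec_in_units = elapsed;        // remember the last delay that survived
    step /= 10.0;                   // precisions are powers of ten
  end
endtask
```

With, say, `timescale 1ns/1ps in effect, prec_in_units should come back as 0.001. Note this only gives the precision relative to the enclosing time unit; combined with a known or declared timeunit, it yields the absolute precision needed to scale cfg_delay_i.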