I have the following ocamllex code:
let flt = ['-' '+']?['0'-'9']+ ['.'] ['0'-'9']+
rule token = parse
[' ' '\t' '\r' '\n'] { token lexbuf } (* Whitespace *)
| ['0'-'9']+ as lxm { INTEGER(int_of_string lxm) }
| flt as lxm { FLOAT(float_of_string lxm) }
This works!
However, when I try to also allow a + or - sign on INTEGER, I get an error.
let flt = ['-' '+']?['0'-'9']+ ['.'] ['0'-'9']+
rule token = parse
[' ' '\t' '\r' '\n'] { token lexbuf } (* Whitespace *)
| ['+' '-']['0'-'9']+ as lxm { INTEGER(int_of_string lxm) }
| flt as lxm { FLOAT(float_of_string lxm) }
The error is:
Fatal error: exception Failure("int_of_string")
Undefined symbols for architecture x86_64:
  "_main", referenced from:
     implicit entry/start for main executable
ld: symbol(s) not found for architecture x86_64
Interestingly, in my .ml file I use "float_of_string", but I never call "int_of_string" anywhere myself.
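As a minimal check, here is a self-contained OCaml sketch of the conversion that the INTEGER action performs on the signed lexemes the new rule can produce (e.g. "+12", "-12"). The `to_int` helper that strips a leading '+' is my own workaround guess, not part of the lexer above; it only matters if `int_of_string` on a given compiler rejects a leading '+':

```ocaml
(* Minimal check of int_of_string on the lexemes that the signed
   INTEGER rule can match.  If int_of_string were to reject a leading
   '+' (an assumption, not verified against every compiler version),
   stripping it first would sidestep Failure "int_of_string". *)
let to_int (lxm : string) : int =
  let s =
    if String.length lxm > 0 && lxm.[0] = '+' then
      (* Drop the '+' and convert the remaining digits. *)
      String.sub lxm 1 (String.length lxm - 1)
    else lxm
  in
  int_of_string s

let () =
  assert (to_int "-12" = -12);
  assert (to_int "+12" = 12);
  assert (to_int "12" = 12);
  print_endline "ok"
```

Running this prints "ok" when all three conversions succeed, which isolates whether the exception comes from `int_of_string` itself or from some other token falling into the INTEGER rule.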