
My fsyacc code gives a compiler error saying it cannot find a variable, and I don't know why. I'm hoping someone can point out the problem.

%{
open Ast
%}


// The start token becomes a parser function in the compiled code:
%start start

// These are the terminal tokens of the grammar along with the types of
// the data carried by each token:
%token NAME
%token ARROW TICK VOID
%token LPAREN RPAREN
%token EOF

// This is the type of the data produced by a successful reduction of the 'start'
// symbol:
%type < Query > start

%%

// These are the rules of the grammar along with the F# code of the 
// actions executed as rules are reduced.  In this case the actions 
// produce data using F# data construction terms.
start: Query { Terms($1) }

Query:
    | Term EOF                  { $1 }

Term: 
    | VOID                      { Void }
    | NAME                      { Conc($1) }
    | TICK NAME                 { Abst($2) }
    | LPAREN Term RPAREN        { Lmda($2) }
    | Term ARROW Term           { TermList($1, $3) }

The line | NAME { Conc($1) } and the lines after it all give this error:

  error FS0039: The value or constructor '_1' is not defined

I understand the grammar itself; what is wrong with the yacc input?

If it helps, here is the Ast definition:

namespace Ast
open System

type Query =
    | Terms   of Term

and Term =
    | Void
    | Conc of String
    | Abst of String
    | Lmda of Term
    | TermList of Term * Term

and the fslex input:

{
module Lexer
open System
open Parser
open Microsoft.FSharp.Text.Lexing

let lexeme lexbuf =
    LexBuffer<char>.LexemeString lexbuf
}

// These are some regular expression definitions
let name = ['a'-'z' 'A'-'Z' '0'-'9']
let whitespace = [' ' '\t' ]
let newline = ('\n' | '\r' '\n')

rule tokenize = parse
| whitespace    { tokenize lexbuf }
| newline       { tokenize lexbuf }
// Operators
| "->"          { ARROW }
| "'"           { TICK }
| "void"        { VOID }
// Misc
| "("           { LPAREN }
| ")"           { RPAREN }
// Names
| name+                                 { NAME }
// EOF
| eof   { EOF }

1 Answer


This is not FsYacc's fault: NAME is declared as a token that carries no value, so fsyacc generates no _1 binding for it, and the $1 in your action has nothing to refer to.

You want to make these two fixes:

In the parser, change

%token NAME

to

%token <string> NAME

and in the lexer, change

| name+    { NAME }

to

| name+    { NAME (lexeme lexbuf) }

With the typed token, $1 in Conc($1) and $2 in Abst($2) are bound to the string carried by NAME, and everything should compile.
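
If you then want to drive the parser, a minimal sketch could look like the following (an assumption on my part: the fsyacc output is compiled as module Parser, which the open Parser in your lexer implies, and the fix above is applied):

open Microsoft.FSharp.Text.Lexing

// Build a lex buffer from a string and run the generated 'start' entry point.
let parse (input: string) : Ast.Query =
    let lexbuf = LexBuffer<char>.FromString input
    Parser.start Lexer.tokenize lexbuf

// For example, parse "'x -> void" should produce
// Terms (TermList (Abst "x", Void)).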

answered 2011-11-03 at 17:24