package token

import (
	"strings"
)

type operation func(Token) Token

/*
 * LIST: a list of elements
 * OPERAND: a string or number
 * OPERATOR: an entry in a symtable
 * OPERATION: a list starting with an operator
 */
type parse_tag int

const (
	LIST_T parse_tag = iota
	OPERAND_T
	OPERATOR_T
	OPERATION_T
)

type Token struct {
	next   *Token
	tag    parse_tag
	_inner interface{}
}

// lex splits the input into a linked list of raw tokens. Quotes and an
// opening parenthesis switch the active delimiter; everything else
// accumulates into the current token until that delimiter is seen.
func lex(input string) *Token {
	ret := new(Token)
	iter := &ret // points at the slot where the next Token will be linked
	delim := ' '
	var tok strings.Builder
	iter_alloced := false

	for _, char := range input {
		switch char {
		// Check the active delimiter first so a closing quote or ')'
		// terminates the current token instead of re-opening it.
		case delim:
			if tok.Len() > 0 {
				*iter = new(Token)
				(*iter)._inner = tok.String()
				iter_alloced = true
			}
			delim = ' '
		case '\'', '"', '`':
			delim = char
		case '(':
			delim = ')'
		default:
			tok.WriteRune(char)
		}
		if iter_alloced {
			iter = &(*iter).next
			iter_alloced = false
			tok.Reset()
		}
	}

	// Flush a trailing token that was not followed by a delimiter.
	if tok.Len() > 0 {
		*iter = new(Token)
		(*iter)._inner = tok.String()
	}
	return ret
}

func parse(arg *Token) {
	// If operand, determine whether it is an operator:
	// check the operator's presence in the symbol table.
	// Determine whether a list is an operation or a plain list.
}

func eval(tree *Token) *Token {
	// Find operations.
	// Simplify operations deepest first.
	// Return the tree of final Tokens.
	return tree // placeholder until evaluation is implemented
}
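
// The helper below is a usage sketch, not part of the original file: the name
// tokensOf and its []string return type are assumptions. It shows how a caller
// (for example, parse) might walk the linked list that lex produces and
// recover the raw token strings.
func tokensOf(head *Token) []string {
	var out []string
	for node := head; node != nil; node = node.next {
		// Each node's _inner holds the raw token text written by lex.
		if s, ok := node._inner.(string); ok {
			out = append(out, s)
		}
	}
	return out
}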