JS
This package is a JS lexer (ECMA-262, edition 6.0) written in Go. It follows the ECMAScript Language Specification. The lexer takes an io.Reader and converts it into tokens until EOF.
Installation
Run the following command:
go get github.com/tdewolff/parse/js
or add the following import and run the project with go get:
import "github.com/tdewolff/parse/js"
Lexer
Usage
The following initializes a new Lexer with io.Reader r:
l := js.NewLexer(r)
To tokenize until EOF or an error, use:
for {
	tt, text := l.Next()
	switch tt {
	case js.ErrorToken:
		// error or EOF set in l.Err()
		return
	// ...
	}
}
All tokens (see ECMAScript Language Specification):
ErrorToken TokenType = iota // extra token when errors occur
UnknownToken // extra token when no token can be matched
WhitespaceToken // space \t \v \f
LineTerminatorToken // \r \n \r\n
CommentToken
IdentifierToken // also: null true false
PunctuatorToken /* { } ( ) [ ] . ; , < > <= >= == != === !== + - * % ++ -- << >>
>>> & | ^ ! ~ && || ? : = += -= *= %= <<= >>= >>>= &= |= ^= / /= => */
NumericToken
StringToken
RegexpToken
TemplateToken
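To make the whitespace and line-terminator sets above concrete, here is a minimal illustrative sketch that classifies single characters into those two token classes. The classify function is hypothetical and not part of the package's API; the lexer itself operates on byte streams, not single runes.

```go
package main

import "fmt"

// classify reports which token class a single rune would fall into,
// mirroring the whitespace and line-terminator sets listed above.
// Illustrative only; not the library's actual implementation.
func classify(r rune) string {
	switch r {
	case ' ', '\t', '\v', '\f':
		return "WhitespaceToken"
	case '\r', '\n':
		return "LineTerminatorToken"
	default:
		return "other"
	}
}

func main() {
	for _, r := range []rune{' ', '\n', 'a'} {
		fmt.Printf("%q -> %s\n", r, classify(r))
	}
}
```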
Quirks
The ECMAScript specification requires parser state to differentiate between PunctuatorToken (which includes the / and /= symbols) and RegexpToken, so the lexer (to remain modular) uses a different rule: whenever / is encountered and the previous token is one of (,=:[!&|?{};, it returns a RegexpToken; otherwise it returns a PunctuatorToken. This appears to be the same rule JSLint uses.
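The heuristic above can be sketched in a few lines of plain Go. The isRegexpStart function below is hypothetical (the package does not expose it); it only demonstrates the previous-token rule, assuming the caller tracks the text of the last significant token.

```go
package main

import (
	"fmt"
	"strings"
)

// isRegexpStart applies the heuristic described above: a '/' starts a
// regular expression when the previous token's last character is one of
// "(,=:[!&|?{};", or when there is no previous token at all.
// This is a sketch of the rule, not the package's actual code.
func isRegexpStart(prevToken string) bool {
	if prevToken == "" {
		return true
	}
	last := prevToken[len(prevToken)-1]
	return strings.IndexByte("(,=:[!&|?{};", last) >= 0
}

func main() {
	fmt.Println(isRegexpStart("=")) // a = /re/ -> regexp
	fmt.Println(isRegexpStart("b")) // a / b    -> division punctuator
}
```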
Examples
package main

import (
	"fmt"
	"io"
	"os"

	"github.com/tdewolff/parse/js"
)

// Tokenize JS from stdin.
func main() {
	l := js.NewLexer(os.Stdin)
	for {
		tt, text := l.Next()
		switch tt {
		case js.ErrorToken:
			if l.Err() != io.EOF {
				fmt.Println("Error on line", l.Line(), ":", l.Err())
			}
			return
		case js.IdentifierToken:
			fmt.Println("Identifier", string(text))
		case js.NumericToken:
			fmt.Println("Numeric", string(text))
		// ...
		}
	}
}
License
Released under the MIT license.