gota

package module
v0.8.3
Published: Dec 19, 2025 License: MIT Imports: 11 Imported by: 2

README

Gota

Overview

gota (go-tee-a) is a Go Template Analyzer. It parses Go template files and produces a parse tree on which operations can be executed. It is primarily developed as the main component of the Go template LSP from the same author.

This parser performs both syntactic and semantic analysis. On one hand, the syntactic parsing is done with a standard lexer and parser, at the end of which a parse tree is obtained. On the other hand, the semantic analysis uses the parse tree to validate the meaning of the language (Go template). To achieve this, the semantic analyzer consists of two components:

  • Definition analyzer: traverses the parse tree to check that all variables, functions, and templates used in the project are defined
  • Type analyzer: traverses the parse tree to validate that there is no type mismatch between functions and variables, or between templates and variables

Thus the processing order can be summed up as:

flowchart LR
    text(Text) --> lexer(Lexer)
    lexer --> parser(Parser)
    parser --> definition(Definition analyzer)
    definition --> type(Type analyzer)

This is especially meaningful since any statement with an error encountered at an earlier stage will not be passed to the next stage.

Put another way, a statement error found during lexing won't be passed to the parser. For instance, the Go template statement {{ if -true }} won't survive lexing because (-) is an unrecognized symbol. The lexer will output an error, as well as return the list of valid tokens that didn't encounter any error. Only that valid list will be handed to the parser.
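Below is a minimal sketch of how this staged behavior surfaces through the public API: gota.ParseSingleFile runs the lexer and parser and returns both a parse tree and the errors collected along the way (the exact error messages and their formatting depend on the lexer, so none are shown here).

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	// A statement that fails during lexing: '-' is not a recognized symbol.
	source := []byte("{{ if -true }} hello {{ end }}")

	// Statements that fail at an earlier stage are reported as errors
	// and are not forwarded to the next stage.
	root, errs := gota.ParseSingleFile(source)

	for _, err := range errs {
		fmt.Println(err) // lexing/parsing errors and suggestions
	}

	// The returned parse tree is never nil, even when the source has errors.
	_ = root
}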

NB: In this text, parsing and analysis are used interchangeably

Install

go get github.com/yayolande/gota

Usage

Take a look at the ./_examples directory to get a feel for it. For debugging purposes, there is a gota.Print() function that converts the tree into a JSON string and prints it to the screen. Use the jq program for pretty formatting.
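A minimal sketch of that debugging workflow, assuming the tree returned by ParseSingleFile satisfies the parser.AstNode interface that Print expects:

package main

import "github.com/yayolande/gota"

func main() {
	source := []byte(`{{ define "greet" }} Hello {{ .Name }} {{ end }}`)

	root, errs := gota.ParseSingleFile(source)
	_ = errs // inspect parsing errors and suggestions as needed

	// Dump the parse tree as JSON; pipe the program's output through 'jq'
	// for pretty formatting, e.g. `go run . | jq`.
	gota.Print(root)
}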

Docs

Quick-start

Roadmap

  • Lexer
  • Syntax parser
  • Definition Analysis (operation)
  • Improve Go-To-Definition
  • Go-To-Definition: make it work with property ('$person.contact' or '.person.age') and method ('$city.getMayorName')
  • Add Hover feature
  • Hover: make it work with property ('$person.contact' or '.person.age') and method ('$city.getMayorName')
  • Replace 'types.Identical()' with a custom version, since the former handles mixing of the 'any' type with 'Basic' and 'Complex' types poorly
  • Type inference for individual template global variables ('.' and '$')
  • Fix broken Tests
  • Don't forget to handle 'method' and 'property' for LSP features (hover, etc.)
  • Type Analysis (operation on parse tree)
  • Find the declaration of variables and functions
  • Get all symbols starting with a specific string
  • Type hint analysis on comments (helpful to enforce types in the LSP)
  • Add more examples and more docs to 'readme'

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func FoldingRange added in v0.8.1

func FoldingRange(rootNode *parser.GroupStatementNode) ([]*parser.GroupStatementNode, []*parser.CommentNode)

func GoToDefinition

func GoToDefinition(file *checker.FileDefinition, position lexer.Position) (fileNames []string, reachs []lexer.Range, err error)

func HasFileExtension added in v0.8.3

func HasFileExtension(fileName string, extensions []string) bool

Reports whether the extension of 'fileName' is found within 'extensions'.
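For illustration, a small hedged example; the file name is hypothetical, and it assumes extensions are matched with their leading dot, as in the extension lists used elsewhere in this package:

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	// Hypothetical file name, just to illustrate the call.
	ok := gota.HasFileExtension("views/home.tmpl", []string{".tmpl", ".html"})
	fmt.Println(ok) // expected: true, assuming ".tmpl" matches with its leading dot
}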

func Hover added in v0.6.0

func Hover(file *checker.FileDefinition, position lexer.Position) (string, lexer.Range)

func OpenProjectFiles

func OpenProjectFiles(rootDir string, withFileExtensions []string) map[string][]byte

Recursively opens files from 'rootDir'. However, there is a depth limit for the recursion (currently MAX_DEPTH = 5). TODO: expand the authorized file extensions to '.gohtml', '.gohtmpl', '.tmpl', '.tpl', etc.
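A short sketch of scanning a hypothetical template directory:

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	// Hypothetical project root; recursion stops at the depth limit.
	files := gota.OpenProjectFiles("./templates", []string{".tmpl", ".html"})

	for name, content := range files {
		fmt.Printf("%s: %d bytes\n", name, len(content))
	}
}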

func Print

func Print(node ...parser.AstNode)

Prints the AST nodes to the screen in JSON format. Use a program like 'jq' for pretty formatting.

Types

type Error

type Error = lexer.Error

func DefinitionAnalysisSingleFile

func DefinitionAnalysisSingleFile(fileName string, parsedFilesInWorkspace map[string]*parser.GroupStatementNode) (*checker.FileDefinition, []Error)

This version is inefficient but simpler to use. Use 'DefinitionAnalysisChainTrigerredBysingleFileChange()' instead, since it is more performant and more accurate.
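A sketch of this single-file variant, assuming the workspace has already been parsed; the file name is hypothetical and is presumed to match a key of the parsed map:

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	// Parse the whole workspace first; definition analysis needs the parse
	// trees of every file so that cross-file template references resolve.
	workspaceFiles := gota.OpenProjectFiles(".", []string{".tmpl", ".html"})
	parsed, _ := gota.ParseFilesInWorkspace(workspaceFiles)

	// Analyze one file against the parsed workspace (hypothetical name).
	file, errs := gota.DefinitionAnalysisSingleFile("views/home.tmpl", parsed)
	fmt.Println(len(errs), "definition error(s)")
	_ = file // *checker.FileDefinition, usable by GoToDefinition/Hover
}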

func ParseFilesInWorkspace

func ParseFilesInWorkspace(workspaceFiles map[string][]byte) (map[string]*parser.GroupStatementNode, []Error)

Parses all files within a workspace. The output is an AST node per file and an error list containing parsing errors and suggestions. Never returns nil; an empty 'map' is returned if nothing is found.
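A minimal sketch with an in-memory workspace; the file names and template contents are made up, and in practice the map typically comes from OpenProjectFiles:

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	workspaceFiles := map[string][]byte{
		"header.tmpl": []byte(`{{ define "header" }} <h1>{{ .Title }}</h1> {{ end }}`),
		"home.tmpl":   []byte(`{{ template "header" . }}`),
	}

	parsed, errs := gota.ParseFilesInWorkspace(workspaceFiles)

	fmt.Println(len(parsed), "file(s) parsed,", len(errs), "error(s)")
}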

func ParseSingleFile

func ParseSingleFile(source []byte) (*parser.GroupStatementNode, []Error)

Parses a file's content (buffer). The output is an AST node and an error list containing parsing errors and suggestions. The returned parse tree is never 'nil', even when empty.

type FileAnalysisAndError added in v0.6.0

type FileAnalysisAndError struct {
	FileName string
	File     *checker.FileDefinition
	Errs     []lexer.Error
}

func DefinitionAnalisisWithinWorkspace

func DefinitionAnalisisWithinWorkspace(parsedFilesInWorkspace map[string]*parser.GroupStatementNode) []FileAnalysisAndError

Definition analysis for all files within a workspace. It should only be done after 'ParseFilesInWorkspace()' or similar.
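A sketch of the full workspace flow, opening and parsing the files before analyzing them:

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	workspaceFiles := gota.OpenProjectFiles(".", []string{".tmpl", ".html"})
	parsed, _ := gota.ParseFilesInWorkspace(workspaceFiles)

	// Definition analysis runs over every parsed file in the workspace.
	results := gota.DefinitionAnalisisWithinWorkspace(parsed)

	for _, result := range results {
		fmt.Printf("%s: %d error(s)\n", result.FileName, len(result.Errs))
	}
}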

func DefinitionAnalysisChainTrigerredByBatchFileChange added in v0.6.0

func DefinitionAnalysisChainTrigerredByBatchFileChange(parsedFilesInWorkspace map[string]*parser.GroupStatementNode, fileNames ...string) []FileAnalysisAndError

Preferred function for computing the semantic analysis of multiple file changes in a workspace.

func DefinitionAnalysisChainTrigerredBysingleFileChange added in v0.6.0

func DefinitionAnalysisChainTrigerredBysingleFileChange(parsedFilesInWorkspace map[string]*parser.GroupStatementNode, fileName string) []FileAnalysisAndError

Preferred function for computing the semantic analysis of a single file change in a workspace. It also computes the semantic analysis of any other files affected by the change initiated by 'fileName'.
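A sketch of the incremental flow an editor or LSP might follow when a single file changes; the file name, the new content, and the re-parse-and-swap step are assumptions about how the parsed map is kept up to date:

package main

import (
	"fmt"

	"github.com/yayolande/gota"
)

func main() {
	// Initial state: parse the whole workspace once.
	workspaceFiles := gota.OpenProjectFiles(".", []string{".tmpl", ".html"})
	parsed, _ := gota.ParseFilesInWorkspace(workspaceFiles)

	// The editor reports a change to one file: re-parse only that file and
	// swap its tree into the workspace map (file name and content are made up).
	newContent := []byte(`{{ template "header" . }} updated`)
	tree, _ := gota.ParseSingleFile(newContent)
	parsed["home.tmpl"] = tree

	// Re-run definition analysis for that file and any file affected by it.
	results := gota.DefinitionAnalysisChainTrigerredBysingleFileChange(parsed, "home.tmpl")
	for _, result := range results {
		fmt.Printf("%s: %d error(s)\n", result.FileName, len(result.Errs))
	}
}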

Directories

Path Synopsis
_examples command
