ScalaImpl

Scala for NetBeans Implementation

(This is a working document and will be updated incrementally and continually.)

I. Lexer

The Scala lexer is based on NetBeans' incremental lexer engine, which requires you to provide a language lexer that implements the state() and nextToken() interfaces. Please see ScalaLexer.java.
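
Below is a minimal sketch of the state()/nextToken() contract only. ToyScalaLexer, ToyTokenId and the trivial whitespace/word split are placeholders and assumptions, not the actual ScalaLexer.java, which delegates to the Rats!-generated scanner described next.

 import org.netbeans.api.lexer.{Token, TokenId}
 import org.netbeans.spi.lexer.{Lexer, LexerInput, LexerRestartInfo, TokenFactory}
 
 // Placeholder token ids; the real plugin defines its own ScalaTokenId set.
 object ToyTokenId {
   case class Id(name: String, ordinal: Int, primaryCategory: String) extends TokenId
   val Ws    = Id("WHITESPACE", 0, "whitespace")
   val Other = Id("OTHER", 1, "other")
 }
 
 class ToyScalaLexer(info: LexerRestartInfo[TokenId]) extends Lexer[TokenId] {
   private val input: LexerInput = info.input()
   private val factory: TokenFactory[TokenId] = info.tokenFactory()
 
   // This toy lexer keeps no state between tokens; a real incremental lexer
   // returns whatever it needs to restart lexing in the middle of a document.
   override def state(): Object = null
 
   override def nextToken(): Token[TokenId] = {
     val first = input.read()
     if (first == LexerInput.EOF) return null      // no more input
     val ws = Character.isWhitespace(first)
     var c = input.read()
     while (c != LexerInput.EOF && Character.isWhitespace(c) == ws) c = input.read()
     if (c != LexerInput.EOF) input.backup(1)      // give back the lookahead char
     factory.createToken(if (ws) ToyTokenId.Ws else ToyTokenId.Other)
   }
 
   override def release(): Unit = ()
 }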

I implemented the Scala language lexer via the Rats! parser generator. There are some standard modules that can be reused, such as Characters.rats, Xml.rats, etc. All these Rats! definitions can be found under org.netbeans.modules.scala.editing.rats.

The lexer is incremental.

II. Syntax parser and semantic analyzer

At the beginning, I implemented a full-featured syntax parser via Rats! too (org.netbeans.modules.scala.editing.rats.ParserScala.rats), which naturally expresses Scala's syntax as defined in its specification, along with a Scala semantic analyzer under org.netbeans.modules.scala.editing.nodes (now deprecated).

But after a while, I replaced this parser and analyzer with Scala's native compiler. The latter is still buggy for editor purposes (it throws a lot of unprocessed "AssertError"s), but it provides error recovery and full type inference, features that would otherwise require a lot of manual work.

Scala's native compiler is integrated into NetBeans via org.netbeans.modules.scala.editing.ScalaGlobal.java, which sets the classpath, source directory, and output directory properties according to NetBeans' Project information and ClassPathProvider. There is one ScalaGlobal instance per project.
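
Conceptually, this wiring boils down to building an nsc Settings object from the project's paths and instantiating a Global from it. The sketch below is a rough illustration under the assumption that the paths are already available as strings; the method name and the StoreReporter choice are assumptions, not the actual ScalaGlobal code.

 import scala.tools.nsc.{Global, Settings}
 import scala.tools.nsc.reporters.StoreReporter
 
 object ProjectGlobalFactory {
   // Illustrative only: in the plugin these values come from NetBeans' Project
   // metadata and ClassPathProvider, and one Global instance is kept per project.
   def newProjectGlobal(classpath: String, srcDir: String, outDir: String): Global = {
     val settings = new Settings(msg => Console.err.println(msg))
     settings.classpath.value  = classpath
     settings.sourcepath.value = srcDir
     settings.outdir.value     = outDir
     // A StoreReporter keeps errors in memory instead of printing them,
     // which is what an editor integration wants.
     new Global(settings, new StoreReporter)
   }
 }
 
 // A batch compile run over explicit file names:
 //   val global = ProjectGlobalFactory.newProjectGlobal(cp, src, out)
 //   new global.Run().compile(List("/path/to/Foo.scala"))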

Error reporting is integrated into the GSF framework via org.netbeans.modules.scala.editing.ScalaParser.java, which is the GSF parser implementation and acts as the underlying parser and error reporter for all documents being edited.

III. Hook AST tree into GSF Framework

GSF's ElementHandle is used for outline display, semantic highlighting, code completion proposals, declaration finding, and occurrence marking.

The classes under the org.netbeans.modules.scala.editing.ast package are responsible for wrapping Scala's AST elements as GSF ElementHandles. AstDef.java represents an AST definition, and AstRef.java represents an AST usage. AstScope.java represents an AST scope; all instances of AstDef and AstRef are stored in an AstScope instance, so we can easily process the visibility of all these definitions and usages.
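
In a much-simplified form, the idea looks like the sketch below. The names mirror the real Java classes, but the fields and the flat name lookup are only illustrative.

 import scala.collection.mutable.ListBuffer
 
 // Simplified model of the scope tree: every definition and usage belongs to
 // exactly one scope, and scopes nest to mirror the source structure.
 class AstScope {
   var parent: Option[AstScope] = None
   val subScopes = ListBuffer[AstScope]()
   val defs      = ListBuffer[AstDef]()
   val refs      = ListBuffer[AstRef]()
 
   def addSubScope(child: AstScope): Unit = {
     child.parent = Some(this)
     subScopes += child
   }
 
   // Walk from this scope outwards to find the definition a usage refers to.
   def findDef(ref: AstRef): Option[AstDef] =
     defs.find(_.name == ref.name) orElse parent.flatMap(_.findDef(ref))
 }
 
 class AstDef(val name: String, val enclosingScope: AstScope)
 class AstRef(val name: String, val enclosingScope: AstScope)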

The offset range and source file properties are stored in AstDef and AstRef. To support incremental parsing, I store the boundary tokens instead of absolute offsets, so I do not need to track offset changes as the document is edited: since all these tokens are generated by NetBeans' incremental lexer engine, the offset of each token can be obtained via Token#offset(TokenHierarchy).
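
For illustration, an offset range can be recomputed on demand from the stored boundary tokens as in the sketch below; the object and parameter names are assumptions, not the actual AstDef code.

 import javax.swing.text.Document
 import org.netbeans.api.lexer.{Token, TokenHierarchy, TokenId}
 
 object TokenOffsets {
   // Recompute an item's absolute offsets from its stored boundary tokens.
   // The tokens are maintained by the incremental lexer engine, so their
   // offsets stay correct as the document is edited.
   def offsetRange(doc: Document,
                   startToken: Token[_ <: TokenId],
                   endToken: Token[_ <: TokenId]): (Int, Int) = {
     val th    = TokenHierarchy.get(doc)
     val begin = startToken.offset(th)
     val end   = endToken.offset(th) + endToken.length()
     (begin, end)
   }
 }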

Incremental parsing has yet to be implemented.

AstTreeVisitor.java traverses the AST, creates all necessary AstDefs and AstRefs, and puts them into the proper AstScope.
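
A rough sketch of what such a visitor does, expressed against nsc's Traverser and the simplified AstScope model above; unlike the real AstTreeVisitor, it drops everything into a single scope, and the class name is only illustrative.

 import scala.tools.nsc.Global
 
 // Sketch only: collects definitions and usages into the simplified AstScope
 // model above. `global` would be the per-project compiler instance.
 class DefRefCollector(val global: Global, rootScope: AstScope) {
   import global._
 
   private object visitor extends Traverser {
     override def traverse(tree: Tree): Unit = tree match {
       case dd: DefDef =>                 // a method definition
         rootScope.defs += new AstDef(dd.name.toString, rootScope)
         super.traverse(tree)
       case vd: ValDef =>                 // a val/var/parameter definition
         rootScope.defs += new AstDef(vd.name.toString, rootScope)
         super.traverse(tree)
       case id: Ident =>                  // a usage of a name
         rootScope.refs += new AstRef(id.name.toString, rootScope)
       case _ =>
         super.traverse(tree)
     }
   }
 
   // Typically called with the typed tree of a compilation unit.
   def collect(unitBody: Tree): Unit = visitor.traverse(unitBody)
 }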

The classes under this package can be reused to support other languages (with a few modifications); I have used them for Erlang, Fortress, and Scala support.

IV. Find completion proposals

Since Scala's native compiler (an instance of Global.scala) maintains a global symbol table, we do not need another indexer to find completion proposals. When the user presses a key that invokes a completion action, I try to find the nearest symbol around the caret offset, then get all members of that symbol and filter/convert them to GSF ElementHandles.
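
In outline, and assuming we already hold the resolved symbol at the caret, the member lookup could look like the sketch below; the real filtering and the conversion to ElementHandles are more involved.

 import scala.tools.nsc.Global
 
 class CompletionHelper(val global: Global) {
   import global._
 
   // Given the symbol nearest to the caret, propose its type members whose
   // names match the prefix already typed. Converting each candidate to a
   // GSF ElementHandle is omitted here.
   def candidates(sym: Symbol, prefix: String): List[Symbol] =
     sym.tpe.members.toList filter { m =>
       m.isPublic && !m.isConstructor && m.nameString.startsWith(prefix)
     }
 }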

Scala's native compiler is buggy here, since it will destroy the global symbol table when an AssertError occurs. I have tried my best to catch these unhappy AssertErrors and reset the Global compiler when necessary, but there are still some strange behaviors of Scala's native compiler that I will have to deal with in the future.
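
The guard is essentially the following sketch; askCompiler, computeProposals and resetGlobalFor are illustrative names, not the plugin's actual API.

 object CompilerGuard {
   // Wrap every request to the compiler so that an assertion failure inside
   // scalac cannot leave a corrupted symbol table silently in use.
   def askCompiler[A](compute: => A, fallback: => A)(resetGlobal: () => Unit): A =
     try compute catch {
       case e: AssertionError =>
         resetGlobal()   // the symbol table may be broken; rebuild the Global
         fallback
     }
 }
 
 // e.g. CompilerGuard.askCompiler(computeProposals(), Nil)(() => resetGlobalFor(project))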

These AssertErrors should really be handled better inside Scala's native compiler; after all, it is named "assert".

V. Go to Declaration

Scala's native compiler does not fill in offset and source file information for symbols that come from binary class files, so we have to find the source file first and then recompile that source to get the offset. A Scala source file can contain more than one class/trait/object, and its "package" statement can differ from the file's directory path, i.e. "scala.file.AClass" does not need to live on the file system as "scala\file\AClass.scala", so we cannot rely on the package's path elements to get the correct source path. To resolve this, I query the binary classpath first, get the binary class file, read the source file information from the class file's attributes, and then query the sources for those binary classpath roots to find the correct source file.
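
A condensed sketch of that lookup chain follows. readSourceFileAttribute is a hypothetical helper standing in for reading the class file's SourceFile attribute, the other names are illustrative, and error handling is omitted.

 import org.netbeans.api.java.classpath.ClassPath
 import org.netbeans.api.java.queries.SourceForBinaryQuery
 import org.openide.filesystems.{FileObject, URLMapper}
 
 object SourceLocator {
   // Locate the .scala source behind a binary class by following the chain:
   // binary classpath -> .class file -> SourceFile attribute -> source roots.
   def findSourceFile(qualifiedName: String, binaryCp: ClassPath): Option[FileObject] = {
     val resource = qualifiedName.replace('.', '/') + ".class"
     for {
       classFo <- Option(binaryCp.findResource(resource))
       srcName <- readSourceFileAttribute(classFo)           // hypothetical helper
       root    <- Option(binaryCp.findOwnerRoot(classFo))
       rootUrl <- Option(URLMapper.findURL(root, URLMapper.INTERNAL))
       srcRoots = SourceForBinaryQuery.findSourceRoots(rootUrl).getRoots()
       srcFo   <- srcRoots.view.flatMap(findFileNamed(_, srcName)).headOption
     } yield srcFo
   }
 
   // Search a source root for a file with the given name, e.g. "AClass.scala".
   def findFileNamed(root: FileObject, name: String): Option[FileObject] =
     if (root.isFolder) root.getChildren.view.flatMap(findFileNamed(_, name)).headOption
     else if (root.getNameExt == name) Some(root)
     else None
 
   // Hypothetical: parse the class file and return its SourceFile attribute.
   def readSourceFileAttribute(classFo: FileObject): Option[String] = ???
 }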

And for symbols from Java, I need to load the corresponding JavaElement according to a Scala-to-Java mapping first.

VI. Scala Project

The Scala project support is almost the same as java.j2seproject, which is an Ant-based project type.

By adding an Ant task for scalac into project.xsd, a newly created project will get a proper build_impl.xml.

I implemented both GSF's ClassPathProvider and java.source's ClassPathProvider, so in either case I can get the classpath needed for GSF's ElementHandle or Javac's ElementHandle.
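
On the java.source side this amounts to something like the sketch below (simplified; the class name and constructor parameters are assumptions, and the real provider reads its roots from the project's metadata).

 import java.net.URL
 import org.netbeans.api.java.classpath.ClassPath
 import org.netbeans.api.java.platform.JavaPlatform
 import org.netbeans.spi.java.classpath.ClassPathProvider
 import org.netbeans.spi.java.classpath.support.ClassPathSupport
 import org.openide.filesystems.FileObject
 
 // Simplified: answers classpath queries for files belonging to this project.
 class ScalaProjectClassPathProvider(srcRoot: FileObject, compileRoots: Array[URL])
     extends ClassPathProvider {
 
   private val source  = ClassPathSupport.createClassPath(srcRoot)
   private val compile = ClassPathSupport.createClassPath(compileRoots: _*)
 
   override def findClassPath(file: FileObject, tpe: String): ClassPath = tpe match {
     case ClassPath.SOURCE  => source
     case ClassPath.COMPILE => compile
     case ClassPath.BOOT    => JavaPlatform.getDefault.getBootstrapLibraries
     case _                 => null     // unknown classpath type: let other providers answer
   }
 }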

VII. Scala Debugger

The Scala debugger is based on NetBeans' current JPDA modules.

To hook mouse clicks up to adding/removing breakpoints, I implemented and registered GlyphGutterActions for Scala's MIME type in layer.xml, so that org.netbeans.modules.editor.impl.GlyphGutterActionsProvider#getGlyphGutterActions(String mimeType) can look up this action. GlyphGutterActionsProvider#getGlyphGutterActions is invoked by org.netbeans.editor.GlyphGutter#GutterMouseListener#mouseClicked.

To let the debugger recognize the source file's structure and context, I implemented org.netbeans.spi.debugger.jpda.EditorContext.
