
Different aspects of language definitions are tested with different means:

  • Intentions
  • Actions
  • Side-transforms
  • Editor ActionMaps
  • KeyMaps

Use the jetbrains.mps.lang.test language to create EditorTestCases. You set the stage by providing an initial piece of code, define a set of editing actions to perform against it, and provide the expected outcome as another piece of code. Any differences between the expected and actual output of the test are reported as errors.
See the Editor Tests section for details.
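The shape of such a test can be illustrated with a plain-text analogy. Note that this is only a toy sketch with hypothetical helper names; real EditorTestCases are authored in the MPS projectional editor, not as text:

```python
# Toy analogy of an EditorTestCase: an initial piece of code, a list of
# editing actions, and the expected result. All names are hypothetical.

def type_text(text):
    """Return an editing action that appends text to the code."""
    return lambda code: code + text

def delete_last(n):
    """Return an editing action that deletes the last n characters."""
    return lambda code: code[:-n]

def run_editor_test(initial, actions, expected):
    """Apply each action in order; report any difference as an error."""
    result = initial
    for action in actions:
        result = action(result)
    if result != expected:
        raise AssertionError(f"expected {expected!r}, got {result!r}")
    return result

run_editor_test(
    initial="int x = 1",
    actions=[type_text(";"), delete_last(1), type_text(";")],
    expected="int x = 1;",
)
```

The real test cases follow the same three-part structure: stage, actions, expected outcome.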

  • Constraints
  • Scopes
  • Type-system
  • Dataflow

Use the jetbrains.mps.lang.test language to create NodesTestCases. In these test cases, write snippets of "correct" code and ensure that no error or warning is reported on them. Similarly, write "invalid" pieces of code and assert that an error or a warning is reported on the correct node.
See the Nodes Tests section for details.
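The positive/negative pattern behind these tests can be sketched with a toy checker. The checker and its rule are entirely hypothetical; real NodesTestCases annotate nodes inside MPS:

```python
# Toy analogy of a NodesTestCase: a checker reports errors on nodes;
# "correct" snippets must produce none, while "invalid" snippets must
# produce an error on the expected node. The rule below is hypothetical.

def check(nodes):
    """Return {node: message} for every node violating a toy rule:
    a variable name must not start with a digit."""
    errors = {}
    for node in nodes:
        if node and node[0].isdigit():
            errors[node] = "name must not start with a digit"
    return errors

# Positive case: correct code yields no errors or warnings.
assert check(["x", "count"]) == {}

# Negative case: the error is reported, and on the right node.
errors = check(["x", "1st"])
assert "1st" in errors and "x" not in errors
```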

  • Generator
  • TextGen

There is currently no built-in testing facility for these aspects. There are a few practices that have worked for us over time:

  • Perhaps the most reasonable way to check the generation process is by generating models, for which we already know the correct generation result, and then comparing the generated output with the expected one. For example, if your generated code is stored in a VCS, you could check for differences after each run of the tests.
  • You may also collect code snippets that represent corner cases for the generator and check whether it generates output from them successfully or fails.
  • Compiling and running the generated code may also increase your confidence about the correctness of your generator.
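The first practice, comparing freshly generated output against a known-good baseline, amounts to a directory diff. A minimal sketch using the standard library (directory names and the helper are illustrative, not part of MPS):

```python
import filecmp

def diff_against_baseline(generated_dir, baseline_dir):
    """Return the files that differ between the freshly generated output
    and the committed, known-good baseline, including files that exist
    on only one side. An empty list means the generator output matches."""
    problems = []

    def walk(cmp, prefix=""):
        problems.extend(prefix + f for f in cmp.diff_files)   # changed
        problems.extend(prefix + f for f in cmp.left_only)    # newly generated
        problems.extend(prefix + f for f in cmp.right_only)   # missing from output
        for name, sub in cmp.subdirs.items():
            walk(sub, prefix + name + "/")

    walk(filecmp.dircmp(generated_dir, baseline_dir))
    return sorted(problems)
```

In a VCS-based setup the same effect is achieved by regenerating in place and checking that the working tree is clean after the test run.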

  • Migrations

Use the jetbrains.mps.lang.test language to create MigrationTestCases. In these test cases, write pieces of code for the migrations to be run on.
See the Migration Tests section for details.
