Description
What is an essential part of the API?
I.e., a part that provides a basis for defining all the other parts in a portable manner.
This part can consist of the following items.
1. A getter and a setter for the perceptor.
2. A method to define new descriptors.
3. The basic recognizers that cannot be defined in a portable manner (e.g. a recognizer for locals).
4. The basic descriptors that cannot be defined in a portable manner (e.g. for locals).
5. Maybe something like `perceive-lexeme` and `translate-token` for a user-defined text interpreter loop.
6. The default perceptor (see #15).
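To illustrate item 5, a user-defined text interpreter loop might be sketched over these two words. The stack effects assumed here are `perceive-lexeme ( -- token{k*x} td | 0 )`, returning 0 when the input is exhausted, and `translate-token ( i*x token{k*x} td -- j*x )`; both effects are my assumptions, not part of the proposal:

```forth
\ A sketch of a user-defined text interpreter loop, assuming:
\   perceive-lexeme ( -- token{k*x} td | 0 )   \ 0 means end of input
\   translate-token ( i*x token{k*x} td -- j*x )
: interpret-lexemes ( i*x -- j*x )
  begin
    perceive-lexeme dup  \ a td is assumed to be nonzero
  while
    translate-token      \ perform the semantics of the lexeme
  repeat
  drop
;
```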
If we have examples of (or demand for) using tokens and descriptors beyond the Forth text interpreter, more detailed methods to work with them may be worthwhile.
One such method is `token-xt? ( token{k*x} td -- token{k*x} td 0 | xt td-xt )`, which returns an xt if the Forth system has default interpretation and default compilation semantics for the corresponding lexeme. This method is a pure postfix basis for the tick words (`'` and `[']`).
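For example, a `'`-like word might be sketched over this basis, reusing `parse-lexeme`, `recognize`, and `?nf` from the `[compile]` definition in this issue; the throw code -13 ("undefined word") is my choice, not part of the proposal:

```forth
: my-tick ( "name" -- xt )
  parse-lexeme recognize dup ?nf \ throw an exception if not recognized
  token-xt? 0= -13 and throw     \ "undefined word" if no default semantics
;
```

On success, `token-xt?` leaves `xt td-xt`; `0=` consumes the nonzero `td-xt` and `throw` gets 0, so only the xt remains. On failure, the top is 0, so `-13 throw` is performed.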
Another method is `postpone-token ( token{k*x} td -- )`, which is a pure postfix basis for the word `postpone`.
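Analogously, `postpone` itself might be sketched over this basis (same auxiliary words as in the `[compile]` definition in this issue):

```forth
: postpone ( "name" -- )
  parse-lexeme recognize dup ?nf \ throw an exception if not recognized
  postpone-token                 \ compile the compilation semantics
; immediate
```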
Notably, `[compile]` can be implemented on the basis of these two words (the `immediate` marker is required, since `[compile]` acts at compile time):

```forth
: [compile] ( "name" -- )
  parse-lexeme recognize dup ?nf \ throw an exception if not recognized
  token-xt? if compile, exit then postpone-token
; immediate
```
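A classic usage example of `[compile]` is applying the compilation semantics of an immediate word, e.g. defining `endif` as an alias for `then`:

```forth
: endif ( C: orig -- ) [compile] then ; immediate
```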
I.e., these two words also constitute a pure postfix basis for the word `[compile]`.
NB: this `[compile]` is applicable to Forth words only; the behavior of `[compile]` for other kinds of lexemes is unclear. See also #14.
Should we provide a basis that allows one to define even the `postpone-token` or `translate-token` words themselves? It is not obviously needed without clear use cases (beyond, of course, the definitions of these words).