* Add cursorless-talon-dev
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Tweaks
* More tweaks
* Woops
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Andreas Arvidsson <andreas.arvidsson87@gmail.com>
This directory contains voice commands to use while developing Cursorless. See https://www.cursorless.org/docs/contributing/ for more about how to contribute to Cursorless.
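As a rough illustration of what lives here, a Talon dev command file maps a spoken phrase to an editor command. The sketch below is hypothetical (the spoken form, action name, and command ID are assumptions; see the actual files in this directory for the real commands):

```talon
# Hypothetical sketch of a dev voice command -- the action and command
# names here are illustrative, not the real contents of this directory.
cursorless record: user.vscode("cursorless.recordTestCase")
```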
Changes to `docs/contributing/CONTRIBUTING.md` (2 additions, 0 deletions):
The `yarn init-launch-sandbox` command creates a local sandbox containing a specific set of VSCode extensions that will be run alongside Cursorless when you launch Cursorless in debug or test mode. Please file an issue if you'd like to use additional extensions when debugging locally.
4. Copy / symlink `cursorless-talon-dev` into your Talon user directory for some useful voice commands for developing Cursorless.
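For step 4, a minimal sketch on macOS/Linux (the repo path is an assumption; adjust it to your checkout, and Windows users would copy the directory instead):

```shell
# Sketch: link cursorless-talon-dev into the Talon user directory.
# REPO is an assumption -- point it at your Cursorless checkout.
REPO="$HOME/cursorless"
mkdir -p "$HOME/.talon/user"
ln -sfn "$REPO/cursorless-talon-dev" "$HOME/.talon/user/cursorless-talon-dev"
```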
## Running / testing extension locally
In order to test out your local version of the extension or to run unit tests
## Recording new tests
1. Start debugging (F5)
1. Create a minimal file to use for recording tests, and position your cursor
   where you'd like. Check out the `initialState.documentContents` field of
   [existing test cases](../../src/test/suite/fixtures/recorded) for examples.
1. Issue the `"cursorless record"` command. Alternatively, issue one of the special recording commands listed in the "Test case recorder options" section below.
   - A list of target directories is shown. All test cases will be put into the
     given subdirectory of `src/test/suite/fixtures/recorded`.
1. Select an existing directory or create a new one.
   - `Stopped recording test cases` is shown.
   - You can also just stop the debugger or close the debug window.
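For reference, a recorded test fixture is a YAML file. The sketch below is illustrative only: `initialState.documentContents` is the field named in the steps above, while the other field names are assumptions; check the existing fixtures under `src/test/suite/fixtures/recorded` for the real format.

```yaml
# Illustrative sketch of a recorded fixture; field names other than
# initialState.documentContents are assumptions -- see existing fixtures.
languageId: plaintext
command:
  spokenForm: take air
initialState:
  documentContents: hello world
finalState:
  documentContents: hello world
```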
## Test case recorder options
The test case recorder has several additional configuration options. The default configuration works for most tests, but you may find the following useful. For a full list of supported configuration options, see [the api docs](../api/interfaces/testutil_testcaserecorder.internal.recordtestcasecommandarg/).
### The options
#### Capturing errors
We support recording tests where the expected result is an error. This can be done using the command `"cursorless record error"`.
#### Testing decoration highlights
We support testing our decoration highlights, e.g. the flash of red when something is deleted. If you record tests into the `decorations/` directory, these will automatically be captured.
If you'd like to record decorations when recording into a different directory, you can say `"cursorless record highlights"`.
#### Testing the returned `that` mark
By default, we don't capture the `that` mark returned by a command, unless the test is being recorded in the `actions/` directory of the recorded tests. If you'd like to capture the returned `that` mark when recording a test somewhere else, you can say `"cursorless record that mark"`.
#### Testing the hat map
We have a way to test that the hats in the hat map update correctly during the course of a single phrase. These tests are also how we usually test our [range updating code](../api/modules/core_updateSelections_updateSelections).
Any tests recorded in the `hatTokenMap` directory will automatically be treated as hat token map tests. To initiate a series of hat token map tests in another directory, say `"cursorless record navigation"`.
Then each time you record a test, you need to issue two commands. The second command should be of the form `"take air"` (or another decorated mark), and will tell the test case recorder which decorated mark you're checking.
### Default config per test case directory
Any test case directory that contains a `config.json` will set default configuration for all tests recorded in any descendant directory. For example, the file [`actions/config.json`](../../src/test/suite/fixtures/recorded/actions/config.json) makes it so that all our action tests will capture the final `that` mark. For a full list of keys supported in this json, see [the api docs](../api/interfaces/testutil_testcaserecorder.internal.recordtestcasecommandarg/).
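For example, a directory-level default might look like the following. The key name here is an assumption based on the `that`-mark option above; consult the api docs linked above for the exact field names.

```json
{
  "captureFinalThatMark": true
}
```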
### Navigation map tests
If you want to check how the navigation map gets updated in response to changes, you can instead say `"cursorless record navigation"`, and then you need to issue two commands in one phrase each time. The second command should be of the form `"take air"` (or another decorated mark), and will tell the test case recorder which decorated mark you're checking.