## Description
Some semi-related questions:
Are there any examples or documentation of wrapping/extending pygls to:
- ... serve LSP over websockets?
- ... wrap existing language server implementations?
- ... use non-traditional filesystems and file formats?
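For context on what a websocket transport would have to bridge: stdio language servers speak the LSP base protocol, which frames each JSON-RPC message with a `Content-Length` header, while a websocket transport typically carries one JSON message per frame. A minimal sketch of that framing (function names are mine, not pygls API):

```python
import json

def frame(message: dict) -> bytes:
    """Wrap a JSON-RPC message in LSP base-protocol framing (stdio side)."""
    body = json.dumps(message).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n%b" % (len(body), body)

def split_frames(buffer: bytes):
    """Split a byte buffer into complete JSON-RPC messages plus any remainder."""
    messages = []
    while True:
        header_end = buffer.find(b"\r\n\r\n")
        if header_end == -1:
            break  # headers not complete yet
        length = None
        for line in buffer[:header_end].decode("ascii").split("\r\n"):
            name, _, value = line.partition(":")
            if name.strip().lower() == "content-length":
                length = int(value.strip())
        body_start = header_end + 4
        if length is None or len(buffer) < body_start + length:
            break  # incomplete frame; wait for more bytes
        messages.append(json.loads(buffer[body_start:body_start + length]))
        buffer = buffer[body_start + length:]
    return messages, buffer
```

A websocket bridge is then mostly a loop that re-frames in each direction.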
## Background
We've been looking at bringing LSP to the Jupyter ecosystem for a while now. While the execution message format serves well for executing code, completion, inspection, shared frontend/backend state, and rudimentary code modification, LSP offers many additional features, and many existing server implementations, that would be valuable additions to our kernels.
One of the current attempts at [1,2] is jupyterlab-lsp, which presently has a hard dependency on nodejs by way of jsonrpc-ws-proxy, which allows a single websocket port to serve many stdio language servers. I've been making the configuration more jupyter-forward with this wip PR, which in turn wraps jsonrpc-ws-proxy with jupyter-server-proxy.
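To make "wrapping a stdio language server" concrete, here is a minimal asyncio round trip against a child process. `cat` stands in for a real language server binary (since it echoes a request frame straight back); everything else is the actual LSP stdio mechanics:

```python
import asyncio
import json

async def roundtrip(command: list, message: dict) -> dict:
    """Send one LSP-framed message to a stdio subprocess and read one reply."""
    proc = await asyncio.create_subprocess_exec(
        *command,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
    )
    body = json.dumps(message).encode("utf-8")
    proc.stdin.write(b"Content-Length: %d\r\n\r\n%b" % (len(body), body))
    await proc.stdin.drain()
    proc.stdin.close()  # one-shot demo: signal EOF so the child can exit

    # Read headers until the blank line, then exactly Content-Length bytes.
    length = 0
    while True:
        line = await proc.stdout.readline()
        if line in (b"\r\n", b"\n", b""):
            break
        name, _, value = line.decode("ascii").partition(":")
        if name.strip().lower() == "content-length":
            length = int(value.strip())
    reply = json.loads(await proc.stdout.readexactly(length))
    await proc.wait()
    return reply

# `cat` echoes stdin, so the "reply" is the request itself.
echoed = asyncio.run(
    roundtrip(["cat"], {"jsonrpc": "2.0", "id": 1, "method": "initialize"})
)
```

A real wrapper would keep the process alive and multiplex many requests, but the framing and subprocess plumbing are the same.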
I've run into some issues making it work simply for the general case, especially around absolute file paths, which aren't necessarily available due to [3] the Jupyter Contents API, which allows serving "files" from... well, just about anything.
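To illustrate the path problem: language servers expect absolute `file:` URIs, but a Contents API path is an opaque, API-relative string. One sketch is to translate paths under a known serving root, and fall back to an invented scheme for everything else (the `contents:` scheme here is hypothetical, not anything Jupyter defines, and the mapping assumes POSIX paths):

```python
from pathlib import Path, PurePosixPath
from urllib.parse import quote, unquote, urlparse

def contents_path_to_uri(api_path: str, root_dir: str = None) -> str:
    """Map a Jupyter Contents API path to a URI a language server can use."""
    if root_dir is not None:
        # Common case: the contents manager really serves files under a root.
        return Path(root_dir).joinpath(api_path).as_uri()
    # Anything else (databases, S3, ...) has no real path; invent a scheme.
    return "contents:/" + quote(api_path)

def uri_to_contents_path(uri: str, root_dir: str = None):
    """Inverse mapping; returns None for URIs we don't know how to handle."""
    parsed = urlparse(uri)
    if parsed.scheme == "file" and root_dir is not None:
        return str(PurePosixPath(unquote(parsed.path)).relative_to(root_dir))
    if parsed.scheme == "contents":
        return unquote(parsed.path.lstrip("/"))
    return None
```

The awkward part is that only the `file:` branch gives real language servers something they can actually open, which is exactly why non-filesystem contents managers are hard.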
Another thing I want is for multiple, language-oriented URLs (or some message envelope format) to be answered by the same underlying process, e.g. javascript-typescript-langserver.
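One way to frame that routing: keep a registry keyed by the languages a server handles, so that two language-oriented endpoints (say, one for javascript and one for typescript) resolve to a single shared process entry. A sketch (the class and its shape are mine, not an existing API):

```python
class LanguageServerRouter:
    """Route language IDs to one shared server entry per underlying command."""

    def __init__(self):
        self._servers = {}      # command tuple -> shared entry (one per process)
        self._by_language = {}  # language id -> shared entry

    def register(self, languages, command):
        # Registering two language lists with the same command shares one entry,
        # which is where a single child process would hang off.
        entry = self._servers.setdefault(tuple(command), {"command": list(command)})
        for language in languages:
            self._by_language[language] = entry
        return entry

    def server_for(self, language):
        return self._by_language.get(language)

router = LanguageServerRouter()
router.register(
    ["javascript", "typescript"],
    ["javascript-typescript-stdio"],  # CLI shipped by javascript-typescript-langserver
)
```

Whether the frontend distinguishes servers by URL or by an envelope field, the lookup on the backend reduces to something like this.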
Happy to help explore this space in any way!