Refactoring of IncrementalBuilder #14903
Conversation
This seems to give an approx. 10% gain in ...
I'll continue with this, but first I need some more tests (see #14921), because I can completely break partial type checking and this is still green.
This also needs debouncing when used with live buffers, or it will spam lots of non-cancellable parse tasks when typing. I'll do it in another PR for VS.
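Not part of this PR, but as an illustration of the kind of debouncing meant here (the helper, its names, and the 200 ms delay are made up, not the planned VS implementation):

```fsharp
open System.Threading

// Sketch only: restart a delay on every keystroke and kick off the parse only
// after the input has been quiet for delayMs. Not thread-safe as written.
let debounce (delayMs: int) (action: unit -> unit) =
    let cts = ref (new CancellationTokenSource())
    fun () ->
        cts.Value.Cancel()
        cts.Value <- new CancellationTokenSource()
        Async.Start(
            async {
                do! Async.Sleep delayMs
                action ()
            },
            cancellationToken = cts.Value.Token)

// Usage sketch: call triggerParse () on every buffer change; the parse task
// starts only once typing pauses for 200 ms.
let triggerParse = debounce 200 (fun () -> printfn "parse live buffer")
```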
So it looks like it was a glitch of Visual Studio loading projects after all.
@vzarytovskii, I think I finally managed to capture some more representative GC data. While there is less activity overall, it seems a lot more survives to gen2.
@majocha hm, interesting, it seems that pauses are also longer in general.
Exactly, I'm seeing about 3x more pauses > 200 ms in the stats.
@vzarytovskii so this is funny, actually forcing tcInfoExtras to compute, like this:

```fsharp
let createBoundModelGraphNode (prevBoundModel: GraphNode<BoundModel>) syntaxTree =
    GraphNode(node {
        let! prevBoundModel = prevBoundModel.GetOrComputeValue()
        // Fire off the (otherwise lazy) TcInfoExtras computation in the background.
        prevBoundModel.GetOrComputeTcInfoExtras() |> Async.AwaitNodeCode |> Async.Ignore |> Async.Start
        return! prevBoundModel.Next(syntaxTree)
    })
```
Hm, well, this is interesting.
Looks like tcInfoExtras is much more compact than the actual type-check results, so it's better to compute it as soon as we have the type check, to reclaim memory. I updated the code and it's now as performant as or better than main in VS. I'll have to redo the benchmarks. Let's see if it's still green.
I redid some benchmarks and the gains with .fsi files and fast find references are still good. I installed this in VS and it feels fine.
TIHan left a comment
This is great work @majocha.
Glad to see there is progress being made in IncrementalBuilder. It used to be one of the more difficult areas to make changes in.
/azp run
Azure Pipelines successfully started running 2 pipeline(s).
```fsharp
let checkFileTimeStampsSynchronously cache =
    checkFileTimeStamps cache
    |> Async.AwaitNodeCode
    |> Async.RunSynchronously
```
@majocha I started seeing some UI hangs recently in VS, specifically when finding references. A dump of the process shows that it's waiting on a task that involves this particular RunSynchronously, in this case checking the project stamp through IProjectReference.
Since this is on paths that can be hit quite frequently from different jobs in parallel (when I was finding references, the semantic classification, document highlights, and quick info services were probably also running), we might need to do something about it.
I saw that IProjectReference.TryGetLogicalTimeStamp is not async, but maybe we could just make it async? Are there any other reasons we need this to be synchronous?
This is synchronous only to keep the type signatures of the call sites, because it was all synchronous before. IIRC a temporary state was computed instead of currentState, sometimes duplicating work.
I'll take a look. If this can't be made async, we can try reverting it back to computing a temp state.
@0101 I think the problem is that currentState is guarded by a semaphore while the tempState was not.
Low cost attempt to fix this: #15146
Thanks for looking into it!
But also, it should now be quite easy to make it async. I'm going to try that.
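For reference, a minimal sketch of the direction, based on the helper quoted above (assuming the call sites, including IProjectReference.TryGetLogicalTimeStamp, can be changed to accept an async result; the name is hypothetical):

```fsharp
// Hypothetical async counterpart of the helper quoted above: the caller awaits the
// NodeCode computation instead of blocking a thread on Async.RunSynchronously.
let checkFileTimeStampsAsync cache =
    checkFileTimeStamps cache
    |> Async.AwaitNodeCode
```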
I have a very faint suspicion, because I experienced weird behaviour recently, but in CI. I'm probably wrong, but maybe it's the same thing? Find references is very parallel from the get-go with all the Task.WhenAll, and it so happens that this GetHashCode() here:

fsharp/vsintegration/src/FSharp.Editor/LanguageService/FSharpProjectOptionsManager.fs, line 331 in 9627d33:

```fsharp
Stamp = Some(int64 (ver.GetHashCode()))
```

is very susceptible to collisions when you create many at the same time, and we use it for identity. This is grasping at straws, but I'd put None there just to see if it helps.
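For illustration only (not from this PR), a quick way to see why 32-bit hash codes make a shaky identity when many values are produced at once:

```fsharp
// Birthday-problem check: among 200,000 random 32-bit hash codes there are usually
// a few collisions, which is why using GetHashCode() as a Stamp can misbehave.
let hashes = List.init 200_000 (fun _ -> System.Guid.NewGuid().GetHashCode())
let collisions = hashes.Length - (hashes |> List.distinct |> List.length)
printfn "collisions among %d hashes: %d" hashes.Length collisions
```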
Hmm, I'm not even sure what the point of that is. If we have 2 project options where everything is the same except the stamp, why do we care which one we use?
The situation I had in mind is different project options getting the same stamp, corrupting the service caches. But this is very far-fetched.
Is there a way to try to trigger this bug? Just spam Shift+F12 in VisualFSharp.sln?
Oh I see, I read the description of Stamp wrong. But then I would expect flat-out errors or crashes if that happened.
I think the best chance to reproduce this is Shift+F12 right before or after changing some related files, or maybe when newly opening them.
@majocha Does that handle cases where a project is reloaded without any changes that affect parsing results? Does it handle all types of parsing requests made by the client, either direct or indirect, or only the ones done by the incremental builder? My suspicion is that a fully working cache would still be beneficial in several cases.
Exactly my thoughts, after looking into it some more. The cache in question was handling only incremental builder requests and its lifetime was tied to incremental builder instances (it pretty much works the same now), but incremental builder instances are ephemeral by design, if I understand it correctly.
Especially when considering dependency-graph-based type checking, it would be crucial not to reparse the whole project each time the graph needs to be obtained.
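For illustration, a minimal sketch of what such a cache could look like, keyed by file name and content stamp so parse results can outlive a single incremental builder instance (the name, key choice, and shape are assumptions, not the actual design):

```fsharp
open System
open System.Collections.Concurrent

// Sketch only: memoize parse results across builder instances. A real cache would
// also need eviction and a better notion of source identity than a timestamp.
let getOrParse
    (cache: ConcurrentDictionary<string * DateTime, 'ParseResult>)
    (fileName: string)
    (stamp: DateTime)
    (parse: string -> 'ParseResult) =
    cache.GetOrAdd((fileName, stamp), (fun _ -> parse fileName))
```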
I refactored `BoundModel` by removing `TcInfoState`. Now there is no `PartialState`/`FullState`; instead we create separate `GraphNode`s computing `TcInfo` and `TcInfoExtras`. Computing `TcInfo` always skips implementations whenever possible. Computing `TcInfoExtras`, on the other hand, always does a full type check of the file, so previously skipped implementations can be checked in parallel.

I made it more explicit when the skipping of implementations during parsing and type checking happens. Separate functions return a parsing-result placeholder and partial type-check info for skipped implementations.
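Roughly, the resulting shape is something like the following (a sketch only, assuming `GraphNode`, `TcInfo`, and `TcInfoExtras` from the compiler internals are in scope; the real `BoundModel` carries more state and the exact member signatures differ):

```fsharp
// Sketch only: TcInfo and TcInfoExtras are memoized in separate GraphNodes, so
// forcing the extras (a full check of a skipped implementation) is independent of
// obtaining the cheap TcInfo.
type BoundModelSketch(tcInfoNode: GraphNode<TcInfo>, tcInfoExtrasNode: GraphNode<TcInfoExtras>) =
    member _.GetOrComputeTcInfo() = tcInfoNode.GetOrComputeValue()
    member _.GetOrComputeTcInfoExtras() = tcInfoExtrasNode.GetOrComputeValue()
```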
The build graph is now updated in a single `mapFold` pass in `computeStampedFileNames`. I find this easier to comprehend, but that's probably an individual matter.
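As an illustration of the single-pass pattern (not the actual `computeStampedFileNames` code; the slot shape and names here are made up):

```fsharp
// Sketch only: walk the file slots once, updating stamps while threading a flag so
// later slots know an earlier file changed (and therefore needs re-typechecking).
let updateStamps (stampOf: string -> System.DateTime) (slots: (string * System.DateTime) list) =
    (false, slots)
    ||> List.mapFold (fun earlierChanged (fileName, oldStamp) ->
        let newStamp = stampOf fileName
        let changed = earlierChanged || newStamp <> oldStamp
        (fileName, newStamp), changed)
```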
A `Slot` record is introduced to hold together the parsing and type-checking `GraphNode`s for a particular file slot (see the sketch below).

Removed the parsing cache because it's now irrelevant. The parsing step has its own `GraphNode`, which ensures we never reparse unless the client or the file system notifies us of a source change.
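A rough sketch of the `Slot` idea mentioned above (field names are illustrative, `GraphNode` from FSharp.Compiler.BuildGraph is assumed to be in scope, and this is not the actual definition):

```fsharp
// Sketch only: each file slot pairs the lazily computed parse result with the
// lazily computed bound model, both memoized as GraphNodes so repeated requests
// never redo the work.
type Slot<'ParseResult, 'BoundModel> =
    { FileName: string
      Stamp: System.DateTime
      SyntaxTree: GraphNode<'ParseResult>
      BoundModel: GraphNode<'BoundModel> }
```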
The `enablePartialTypeChecking` setting is also now mostly irrelevant, as `TcInfoExtras` are computed on demand. It would be easy to change this behavior to a parallelized, eager full type check when `enablePartialTypeChecking = false`; I'm not sure what would be best here.

This performs somewhat better in cases where there are backing signatures. There is still room for performance gains if we do the parsing eagerly instead of on demand, and even more if we could use the build graph from #14494.
Some benchmarks:
main:
this PR:
There is some win with FastFindReferences, it seems.
More benchmarks in the comments.