
Move to building with the 9.0 SDK and move our ToolCurrent TFM to 9.0 #95980

Merged
merged 54 commits into dotnet:main from jkoritzinsky:9.0-sdk on Jan 17, 2024

Conversation

jkoritzinsky
Member

@jkoritzinsky jkoritzinsky commented Dec 13, 2023

Unblocks #95904

@ghost

ghost commented Dec 13, 2023

Tagging subscribers to this area: @dotnet/runtime-infrastructure
See info in area-owners.md if you want to be subscribed.

Issue Details

null

Author: jkoritzinsky
Assignees: -
Labels:

area-Infrastructure

Milestone: 9.0.0

@ghost ghost assigned jkoritzinsky Dec 13, 2023
@am11
Member

am11 commented Dec 13, 2023

Looks like src/tools/illink/test/Trimming.Tests.Shared/PathUtilities.cs can be simplified to:

        public static class PathUtilities
        {
-#if NET8_0
-               public const string TFMDirectoryName = "net8.0";
-#elif NET7_0
-               public const string TFMDirectoryName = "net7.0";
-#elif NET6_0
-               public const string TFMDirectoryName = "net6.0";
-#elif NET5_0
-               public const string TFMDirectoryName = "net5.0";
-#elif NETCOREAPP3_0
-               public const string TFMDirectoryName = "netcoreapp3.0";
-#elif NET471
+#if NET471
                public const string TFMDirectoryName = "net471";
 #else
-#error "Unknown TFM"
+               public static readonly string TFMDirectoryName = $"net{Environment.Version.Major}.{Environment.Version.Minor}";
 #endif

(dropping netcoreapp3.0)
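
For context, the suggested fallback derives the directory name from the version of the running runtime, so on a .NET 9 runtime it produces "net9.0" without needing a new #if branch per release. A minimal standalone illustration (not the actual test code):

    using System;

    public static class TfmNameSample
    {
        public static void Main()
        {
            // On a .NET 9.0 runtime, Environment.Version is 9.0.x, so this prints "net9.0".
            string tfmDirectoryName = $"net{Environment.Version.Major}.{Environment.Version.Minor}";
            Console.WriteLine(tfmDirectoryName);
        }
    }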

@ViktorHofer
Member

ViktorHofer commented Dec 15, 2023

@sbomer @vitek-karas can you please help with the linker test failures?

@ViktorHofer
Member

ViktorHofer commented Dec 15, 2023

@mikem8361 can you please take a look at the failing CustomAttributeDelete test in System.Runtime.Loader? It's this line: https://github.com/jkoritzinsky/runtime/blob/9461f4b5bd036005f37e78516c48b13396f76750/src/libraries/System.Runtime.Loader/tests/ApplyUpdateTest.cs#L241

@mikem8361
Member

I haven't worked on ENC/hot reload/loader in 2 years. I assume some .NET 9 loader change broke this test. @mikelle-rogers might have an idea of what broke.

@ericstj
Member

ericstj commented Dec 19, 2023

I see @akoeplinger is updating dotnet/hotreload-utils#348. Maybe he's already onto what fix is needed here?

My best guess was that the hot reload diff somehow broke custom attribute type lookup. I tried to debug, but these tests don't make that easy:
[image]
I'm not sure how to inspect the contents of the hot-reload delta, or how to apply it and inspect the final assembly; my guess is the problem is somewhere in there. It's odd that only this one test is failing.

If @akoeplinger's fix to the hot reload tools doesn't resolve this, I recommend we just file an issue for the hot-reload owners to fix.

Edit: after closer inspection, that change is probably unrelated. I'm still not sure what would cause this hot reload test to start failing.
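
For anyone trying to reproduce this outside the test harness, here is a rough sketch of how a delta could be applied and the resulting attributes inspected, assuming the .dmeta/.dil/.dpdb files generated by hotreload-utils sit next to the baseline assembly (the type name below is illustrative; this is not the actual test code):

    using System;
    using System.IO;
    using System.Reflection;
    using System.Reflection.Metadata;

    public static class ApplyDeltaSketch
    {
        public static void Main()
        {
            // Requires a Debug build of the target assembly and DOTNET_MODIFIABLE_ASSEMBLIES=debug.
            Assembly assembly = Assembly.LoadFrom("CustomAttributeDelete.dll");

            // The EnC delta blobs emitted by the hot-reload tooling.
            byte[] dmeta = File.ReadAllBytes("CustomAttributeDelete.dll.1.dmeta");
            byte[] dil = File.ReadAllBytes("CustomAttributeDelete.dll.1.dil");
            byte[] dpdb = File.ReadAllBytes("CustomAttributeDelete.dll.1.dpdb");

            // Apply the update in-place to the loaded assembly.
            MetadataUpdater.ApplyUpdate(assembly, dmeta, dil, dpdb);

            // The failing case deletes two attributes and adds one, so dump what is left.
            Type type = assembly.GetType("SomeNamespace.ClassWithAttributes")!; // illustrative type name
            foreach (CustomAttributeData cad in type.GetCustomAttributesData())
                Console.WriteLine(cad.AttributeType);
        }
    }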

@trylek
Member

trylek commented Dec 19, 2023

I'm afraid I cannot add much more to what Eric has already discovered. Binary diffing of the dmeta files between the runtime repo without and with this change shows a "larger" diff for CustomAttributeDelete.dll.1.dmeta than for the other dmeta files like CustomAttributeUpdate.dll.1.dmeta. That leads me to suspect that the tool used to generate the metadata delta files has a new bug in encoding the custom attribute deletion / replacement (the offending case is supposed to delete two attributes and add a new one), but I have no idea where its source code sits or how to dump the contents of the dmeta file, which would let me investigate further.
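
For what it's worth, the dmeta blob should be readable as a standalone metadata image with System.Reflection.Metadata, which would at least show which tables and rows the delta touches. A rough sketch, assuming the delta format is accepted by MetadataReaderProvider (the table selection is just an example):

    using System;
    using System.IO;
    using System.Reflection.Metadata;

    public static class DumpDmetaSketch
    {
        public static void Main(string[] args)
        {
            // args[0] would be e.g. CustomAttributeDelete.dll.1.dmeta
            using FileStream stream = File.OpenRead(args[0]);
            using MetadataReaderProvider provider = MetadataReaderProvider.FromMetadataStream(stream);
            MetadataReader reader = provider.GetReader();

            // Row counts for tables relevant to a custom attribute delete/replace.
            foreach (TableIndex table in new[] { TableIndex.EncLog, TableIndex.EncMap, TableIndex.CustomAttribute })
                Console.WriteLine($"{table}: {reader.GetTableRowCount(table)} rows");

            // The EnC log records which rows this delta adds or modifies.
            foreach (EditAndContinueLogEntry entry in reader.GetEditAndContinueLogEntries())
                Console.WriteLine($"{entry.Operation} {entry.Handle.Kind}");
        }
    }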

@akoeplinger
Member

Updating to the latest hotreload-utils and a newer SDK fixed this locally; let's see what CI says.

@ViktorHofer
Member

ViktorHofer commented Dec 19, 2023

I simplified the path-expectation logic in the linker tests, but had to push the commits individually (and manually) since my codespace doesn't have permission to push to this fork.

@ViktorHofer
Member

@radical please take a look at all the wasm/wasi failures

addresses review feedback from @ viktorhofer .
@radical
Member

radical commented Jan 10, 2024

/azp run runtime-wasm


Azure Pipelines successfully started running 1 pipeline(s).

RuntimePackLabels="Mono"
Condition="'$(UseLocalTargetingRuntimePack)' == 'true' and ('@(KnownRuntimePack)' == '' or @(KnownRuntimePack->WithMetadataValue('Identity', 'Microsoft.NETCore.App')->WithMetadataValue('RuntimePackLabels', 'Mono')->WithMetadataValue('TargetFramework', '$(NetCoreAppCurrent)')) == '')" />
<!-- always add wasi-wasm as it is never added by the sdk -->
<KnownRuntimePack Update="@(KnownRuntimePack->WithMetadataValue('Identity', '$(LocalFrameworkOverrideName)')->WithMetadataValue('RuntimePackLabels', 'Mono')->WithMetadataValue('TargetFramework', '$(NetCoreAppCurrent)'))"
RuntimePackRuntimeIdentifiers="%(RuntimePackRuntimeIdentifiers);wasi-wasm"
Member

Should this RID addition be up-streamed into dotnet/sdk?

Member

wasi-wasm is only supported with the workload right now, so we add it as part of the workload, and we add it here for in-tree builds. Upstreaming it is not needed right now.

@radical
Member

radical commented Jan 12, 2024

I have been trying to debug the last remaining failures on WBT, and it seems to be down to dotnet new wasmbrowser - the process never seems to end.

...
      [5734] [] [2024-01-12 03:34:50.792] [Debug] [Microsoft.TemplateEngine.Edge.Template.TemplateCreator] => [Execute] => [Template content generation]: Template content generation finished, took 71 ms
      [5734] [] The template "WebAssembly Browser App" was created successfully.
      [5734] []
      [5734] [] [2024-01-12 03:34:50.795] [Debug] [Template Engine] => [Execute]: Execute finished, took 234 ms--------------------

... and we wait for 3 minutes, but the process never exits.

@radical
Member

radical commented Jan 12, 2024

I can reproduce this only on CI, and it does not happen for the first few runs/tests.

.. instead of an old custom implementation.
`Process.Kill` returns immediately, and `WaitForExit` needs to be used to wait for the actual exit.
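
For reference, a minimal sketch of that kill-and-wait pattern (the executable, arguments, and timeout below are illustrative, not the actual test code):

    using System.Diagnostics;

    public static class KillAndWaitSketch
    {
        public static void Main()
        {
            // Illustrative: spawn the command that hangs in the WBT runs.
            using Process process = Process.Start(new ProcessStartInfo("dotnet", "new wasmbrowser"))!;

            // Give it a bounded amount of time (3 minutes here, matching the log above).
            if (!process.WaitForExit(3 * 60 * 1000))
            {
                // Kill only issues the termination request and returns immediately ...
                process.Kill(entireProcessTree: true);

                // ... so wait again until the process has actually exited.
                process.WaitForExit();
            }
        }
    }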
@radical
Member

radical commented Jan 17, 2024

/azp run runtime-wasm


Azure Pipelines successfully started running 1 pipeline(s).

@ViktorHofer
Member

  • Two runtime-wasm legs timed out, even after retrying. This looks related to an AzDO machine pool issue (logs mention the machines got purged).
  • An issue already exists for the libraries test failure.

I don't think any of these issues block merging the PR. Given that we need to get this in ASAP to be able to ship .NET 9 Preview 1, I'm merging it now.

@ViktorHofer ViktorHofer merged commit a6c3f53 into dotnet:main Jan 17, 2024
204 of 211 checks passed
@jkoritzinsky jkoritzinsky deleted the 9.0-sdk branch January 17, 2024 15:48
@github-actions github-actions bot locked and limited conversation to collaborators Feb 17, 2024