
Discussion: netstandard2.0 packages don't compile for out-of-support runtimes #3448

Open
CodeBlanch opened this issue Jul 14, 2022 · 27 comments
Labels: dependencies

@CodeBlanch
Member

If you look at an out-of-band .NET 7 package like System.Diagnostics.DiagnosticSource, it lists .NET Standard 2.0 as a target. The .NET Standard 2.0 doc lists a wide range of runtimes, including ones that are no longer supported (or will be out of support by the time .NET 7 ships).

Let's say a user is on .NET 5 and upgrades OTel once it has a dependency on .NET 7 DS. There will be no explicit .NET 5 target in either OTel or DS, but due to the presence of .NET Standard 2.0, the tooling will allow the upgrade.

What will happen in this case is that an error will start showing up, surfaced through MSBuild:

System.Diagnostics.DiagnosticSource doesn't support net5.0. Consider updating your TargetFramework to net6.0 or later.

The reason for this is that the .NET 7 out-of-band packages include a sort of poison-pill .targets file:

(screenshot: the .targets file shipped under the package's buildTransitive folder)
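For context, that buildTransitive .targets file has roughly the following shape (a hand-written sketch based on the error above, not the file copied from the package; the target name is illustrative):

```xml
<!-- Sketch of the kind of buildTransitive .targets file described above.
     The target name is illustrative; the error text matches the message shown earlier. -->
<Project InitialTargets="_NETStandardCompatError">
  <Target Name="_NETStandardCompatError"
          Condition="'$(SuppressTfmSupportBuildWarnings)' != 'true'">
    <!-- Raised for TFMs that only match the netstandard2.0 asset (e.g. net5.0). -->
    <Error Text="System.Diagnostics.DiagnosticSource doesn't support $(TargetFramework). Consider updating your TargetFramework to net6.0 or later." />
  </Target>
</Project>
```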

This is not exclusive to System.Diagnostics.DiagnosticSource. @alanwest did some digging and we see it in System.Text.Encodings.Web (referenced through System.Text.Json) and Microsoft.Extensions.Logging.Abstractions as well.

The main concern here is the confusion. We anticipate issues being opened in our repo because the intention was to upgrade OTel, not these other transitive packages where the errors are being thrown.

Some options we have discussed...

  • Drop netstandard2.0 & netstandard2.1 and just explicitly list our supported runtimes (e.g. net462;net6.0;net7.0; see the sketch after this list). What this does is stop the upgrade in the first place. It will break many users consuming OTel from netstandard2.0 projects. But they would probably be broken anyway due to the poison pill? We would also lose mono/unity/xamarin, though we're not sure the SDK works on those runtimes currently. The main thing this does is make the break explicit and better convey the truly supported runtimes.
  • Add our own poison pill targets file(s) in our packages. This would at least make the error less confusing and reference OTel instead of something transitive.
  • Keep netstandard2.0 pinned to DS v6. This would be a big maintenance headache. It could also lead to different APIs being exposed, which leads to issues like [HttpClient instrumentation] Runtime failures with a mix of NetFramework and NetStandard #3434.
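To make the first option concrete, our project files would drop the .NET Standard TFMs and spell out the supported runtimes instead, along these lines (illustrative only; the exact TFM list is whatever we decide to support):

```xml
<!-- Illustrative only: explicit runtime TFMs instead of netstandard2.0/netstandard2.1. -->
<PropertyGroup>
  <TargetFrameworks>net462;net6.0;net7.0</TargetFrameworks>
</PropertyGroup>
```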

/cc @alanwest @cijothomas @utpilla @reyang @tarekgh @noahfalk

@alanwest
Member

Keep netstandard2.0 on DS v6.

Another big downside to this option:

One of the primary use cases of .NET Standard is for folks to put shared code into their own netstandard2.0-targeted libraries. This enables them to share code between .NET Framework and modern .NET applications as they're making the transition. We have end users of OpenTelemetry .NET doing this in order to ensure common practices and config are adopted uniformly across their teams.

Let's say I currently have an environment with a mix of net48, net5, and net6 applications that are all referencing my own netstandard2.0 library. Now DiagnosticSource v7 is released. I should be able to make use of new features like the UpDownCounter, for example, in my net48 and net6 applications. However, in order to use UpDownCounter in my shared library, I'm now burdened with multi-targeting it to netstandard2.0;net48;net6 and conditionally compiling any usage of UpDownCounter.
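To illustrate that burden, here is a sketch of what the shared library ends up looking like (assuming a netstandard2.0;net48;net6.0 multi-target where only the net48/net6.0 targets reference DiagnosticSource 7; the type and meter names are hypothetical):

```csharp
// Sketch of the conditional-compilation burden described above. Assumes
// <TargetFrameworks>netstandard2.0;net48;net6.0</TargetFrameworks> and that the
// DiagnosticSource 7 package (which adds UpDownCounter<T>) is only referenced for
// the net48 and net6.0 targets, while netstandard2.0 stays on DiagnosticSource 6.
using System.Diagnostics.Metrics;

public static class SharedTelemetry   // hypothetical shared-library type
{
    private static readonly Meter SharedMeter = new Meter("MyCompany.Shared");

#if NET48 || NET6_0_OR_GREATER
    // UpDownCounter<T> only exists where the DiagnosticSource 7 assets are in play.
    private static readonly UpDownCounter<int> ActiveRequests =
        SharedMeter.CreateUpDownCounter<int>("active_requests");
#endif

    public static void RequestStarted()
    {
#if NET48 || NET6_0_OR_GREATER
        ActiveRequests.Add(1);
#endif
    }
}
```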

I think this begins to defeat the sole purpose of .NET Standard. If the plan is to EOL .NET Standard, then I think the right thing to do is to stop releasing libraries that target it.

@ViktorHofer

@tarekgh asked me to join this discussion. I work on dotnet/runtime's infrastructure and drove many of the changes that you are observing and discussing here.

With .NET 6 we explicitly made the decision to remove out-of-support assets from our packages. This was communicated via dotnet/announcements#190. The motivation for this is well explained in the announcement but TL;DR: "For .NET 6, we plan to no longer perform any form of harvesting to ensure that all assets we ship can be serviced".

What the announcement didn't cover is how we guarantee that customers don't bind on a different asset - on an unsupported runtime - than what they previously relied on.

Imagine your library compiles against netcoreapp3.1 and references the 6.0.x version of the DiagnosticSource package. As netcoreapp3.1 isn't supported anymore in the 7.0.0 package, without an explicit msbuild error, the library would silently not bind against a .NETCoreApp asset anymore but against .NETStandard. The same applies if your library targets net461 (which is out-of-support) as the minimum supported .NET Framework asset in the 7.0.0 package is net462.

In some cases this might just work, but many of our libraries work fundamentally differently on .NETStandard and were never designed to be invoked on .NETFramework or .NETCoreApp. Even though that goes against the basic principle of .NET Standard (a single implementation that works for many different runtimes), that's the current situation in dotnet/runtime. As an example, [DiagnosticSource](https://github.com/dotnet/runtime/blob/main/src/libraries/System.Diagnostics.DiagnosticSource/src/System.Diagnostics.DiagnosticSource.csproj) has completely different compile inputs on .NETStandard, .NETFramework and .NETCoreApp.

At the time we were updating our packages and dropping out-of-support assets, we felt that silently binding against a fundamentally different implementation on an unsupported runtime would cause more issues for consumers than blocking consumption on out-of-support runtimes.

The feedback that we've received in the last two years on this was about the inconvenience around the incorrect support statement of netstandard2.0 in our packages. @andrewlock wrote this excellent blog post that provides even more insights into the change: https://andrewlock.net/stop-lying-about-netstandard-2-support/.

As you already identified, there are two escape hatches (sketched below) that I wouldn't recommend using, as nothing guarantees that the .NETStandard asset works or will work on all supported .NETStandard runtimes:

  1. Set <SuppressTfmSupportBuildWarnings>true</SuppressTfmSupportBuildWarnings> for consumers.
  2. Add ExcludeAssets=buildTransitive metadata to the PackageReference.
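In project-file terms, the two escape hatches look roughly like this (a sketch; the package version is just an example):

```xml
<!-- Sketch of the two escape hatches described above; the version is just an example. -->
<PropertyGroup>
  <!-- 1. Suppress the TFM support error/warning for the whole project. -->
  <SuppressTfmSupportBuildWarnings>true</SuppressTfmSupportBuildWarnings>
</PropertyGroup>

<ItemGroup>
  <!-- 2. Keep the reference but skip the package's buildTransitive targets. -->
  <PackageReference Include="System.Diagnostics.DiagnosticSource" Version="7.0.0"
                    ExcludeAssets="buildTransitive" />
</ItemGroup>
```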

Please share your opinion and help us decide how to improve this scenario. Examples of some options:

  1. Drop netstandard2.0 assets in our packages entirely (this would shut off app models like source generators and UWP which don't run on .NETFramework or .NETCoreApp). This would be way too disruptive, as the current ecosystem, especially class libraries, builds on and relies on .NETStandard.
  2. Drop netstandard2.0 implementations from packages entirely, by making them throw on initialization (a minimal sketch of this shape follows the list). This would allow class libraries to continue consuming our .NETStandard packages, but there wouldn't be an msbuild error anymore; instead there would be a runtime error on any runtime. This might make it clearer that our packages don't provide an implementation but only a contract that libraries can bind against. We already do this for some packages which don't work cross-platform: https://github.com/dotnet/runtime/blob/87aa786b3997a23501912abd4654e4fd9958230c/src/libraries/Microsoft.Win32.SystemEvents/src/Microsoft.Win32.SystemEvents.csproj#L18.
  3. Remove the msbuild error and allow out-of-support runtimes to rely on the .NETStandard asset. If we did this, how would we communicate the "real" minimum supported .NETCoreApp and .NETFramework versions? We can't provide any guarantee as to whether the .NETStandard implementation is runnable on a given runtime. (We don't test today on anything except supported .NETCoreApp and .NETFramework versions.)
  4. Downgrade the msbuild error to an msbuild warning so that customers can still use the package in an unsupported state on an out-of-support runtime, without any guarantee. This would still defeat the purpose of .NETStandard as many packages will most likely not work on older runtimes.
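For option 2, the resulting .NETStandard asset is essentially a contract-only assembly whose members all throw, roughly like this (hand-written sketch with a hypothetical type and message; the real packages produce an equivalent shape from their build infrastructure, as in the SystemEvents csproj linked above):

```csharp
// Hand-written sketch of a "throws on use" .NETStandard asset, as in option 2 above.
// The type, namespace, and message are hypothetical; real packages generate an
// equivalent shape where every public member throws PlatformNotSupportedException.
using System;

namespace Contoso.Example
{
    public sealed class SerialPortLikeApi
    {
        private const string Message =
            "This implementation is not supported on this platform. " +
            "Reference the package from a supported TargetFramework instead.";

        public SerialPortLikeApi() => throw new PlatformNotSupportedException(Message);

        public void Open() => throw new PlatformNotSupportedException(Message);
    }
}
```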

cc @ericstj @danmoseley @terrajobst

@andrewlock

Thanks for the clarification @ViktorHofer, this has been frustrating me more and more recently, especially as the original explanation didn't seem to justify the actual behaviour. The fact that you would silently switch from using netcoreapp assets to netstandard assets makes sense to me.

As for how to fix it... that's harder.

  1. Drop netstandard2.0 assets in our packages entirely: My gut reaction initially was that this was the correct option, because the error implies that you don't really support netstandard2.0. But based on your previous message, the reality is you do support netstandard2.0, and the error is about the fact that the new package would bind to the netstandard assets instead of the netcoreapp assets. So yeah, you can't, and shouldn't, do this.
  2. Drop netstandard2.0 implementations from packages entirely, by making them throw on initialization: This is the worst of all worlds IMO - it restores, builds, and throws at runtime? Please no.
  3. Remove the msbuild error and allow out-of-support runtimes to rely on the .NETStandard asset: I understand wanting to warn people that they're binding against different assets, but if you ship netstandard2.0 assets you support netstandard2.0. It's up to the runtime to support netstandard2.0, right? So if you test those netstandard2.0 assets, then your job is done. I guess the concerning part is "We don't test today on anything except supported .NETCoreApp and .NETFramework versions", which implies to me that the netstandard2.0 assets aren't actually tested 🤔 And RE communication - I think that can be done via the TFMs you target - if you target netstandard2.0 and netcoreapp3.1, then clearly you're using netstandard2.0 for .NET Core 2.1. I don't see why additional warnings are required, that's just how NuGet works? 🤷
  4. Downgrade the msbuild error to an msbuild warning: If you insist on a warning/error, this seems like the minimum.

This would still defeat the purpose of .NETStandard as many packages will most likely not work on older runtimes

I don't see why they wouldn't work? 🤔 If you have netstandard2.0 assets, and the runtime supports netstandard2.0, what's the problem? And how are you testing those .NET standard assets?

many of our libraries work fundamentally differently on .NETStandard and were never designed to be invoked on .NETFramework or .NETCoreApp

This bothers me. You're saying the .NET Standard libraries were never intended to work on platforms that support .NET Standard? I mean, that seems like the crux of this entire problem in that case, and it baffles me.

The whole point of .NET Standard as originally espoused was to allow targeting a common set of APIs. I get there was some nasty hacking around and shims etc to patch netstandard2.0 support into net461 (which caused no end of issues). And some "implementations" throw at runtime (a design which I similarly detest for obvious reasons). Is that the root cause?

@ViktorHofer

ViktorHofer commented Jul 15, 2022

Thanks a lot for taking the time to share your feedback @andrewlock. We very much appreciate that 👍

This bothers me. You're saying the .NET Standard libraries were never intended to work on platforms that support .NET Standard? I mean, that seems like the crux of this entire problem in that case, and it baffles me.

Exactly, and I agree that this is a problem. Some of our libraries assume that when they already target .NETCoreApp, the .NETStandard asset will be chosen for the non-.NETCoreApp cases. This assumption held at some point, but not anymore.

As a concrete example, here's the implementation of the System.Diagnostics.Activity.GetUtcNow method for net6.0, net7.0 and netstandard2.0: https://github.com/dotnet/runtime/blob/87aa786b3997a23501912abd4654e4fd9958230c/src/libraries/System.Diagnostics.DiagnosticSource/src/System/Diagnostics/Activity.DateTime.corefx.cs#L13-L14.

The implementation for net462 looks different as the runtime doesn't provide an accurate UtcNow value: https://github.com/dotnet/runtime/blob/main/src/libraries/System.Diagnostics.DiagnosticSource/src/System/Diagnostics/Activity.DateTime.netfx.cs.

These assumptions are a good example of why we decided to explicitly block the invocation of the .NETStandard asset on the .NETFramework and .NETCoreApp runtimes. If the netstandard2.0 assembly's GetUtcNow method were invoked on .NET Framework (which NuGet would only do for net461, as the package offers a net462 asset), you would notice a behavioral difference.
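For readers not following the links, the difference is roughly the following (a paraphrased sketch of the linked code, not a verbatim copy):

```csharp
// Paraphrased sketch of the linked Activity.GetUtcNow implementations (not verbatim).
using System;
using System.Diagnostics;

internal static class ActivityClockSketch
{
#if NETFRAMEWORK
    // net462 build: DateTime.UtcNow only has coarse (roughly 10-15 ms) resolution on
    // .NET Framework, so the real implementation derives a finer timestamp from a cached
    // UtcNow baseline plus Stopwatch ticks (periodically re-synced); heavily simplified here.
    private static readonly DateTime s_baseline = DateTime.UtcNow;
    private static readonly Stopwatch s_stopwatch = Stopwatch.StartNew();

    public static DateTime GetUtcNow() => s_baseline + s_stopwatch.Elapsed;
#else
    // net6.0 / net7.0 / netstandard2.0 builds: the runtime clock is already precise enough.
    public static DateTime GetUtcNow() => DateTime.UtcNow;
#endif
}
```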

We didn't have this problem before, as we relied on NuGet promoting the .NETCoreApp asset on modern .NET and the .NETFramework asset on .NET Framework (this worked well when the baseline was consistent: net461, netcoreapp2.0, netstandard2.0). Now that we don't support net461 and netcoreapp2.0 anymore, we have these holes, which we currently stitch over by blocking consumption.
EDIT: Admittedly, this specific assumption wouldn't hold for any other runtime like UWP or Unity anyway.

@CodeBlanch
Member Author

@ViktorHofer Thanks for jumping in on this! I totally understand the reasoning behind the change. I support the goals fully because, as a maintainer of OTel, I don't really want to maintain out-of-support runtimes either 😄

The only challenge for me is the execution. The case we see in OTel is users have some mix of hosting applications (let's say .NET Framework 4.6.1 & .NET 5) which consume a shared library (.NETStandard2.0) which consumes OTel SDK (has .NETStandard2.0 and other things).

When they upgrade that shared .NET Standard 2.0 library to OTel w/ DS 7, everything will appear fine. But then both of their hosts will no longer build, with errors from a library they don't necessarily even know exists.

Tough user experience, right? Kind of an ecosystem bomb for library authors out there.

I don't have a great solution. The only thing I have thought of is to expand the NuGet metadata(?) so you could say .NET Standard 2.0 (net462+, net6.0+, mono XYZ) or something like that, so there is at least some hint about what is going on. In theory the tooling could be smarter in that case and say "You can't compile [.NET 5 host] because dependency chain Library1 -> OTel -> DS does not support .NET 5" (more or less). But probably a huge effort to do that?

@alanwest
Member

alanwest commented Jul 15, 2022

I'm beginning to flip-flop on my position that we should not target DiagnosticSource 6 for older frameworks.

Taking off my .NET hat 🤠 and putting on just my OTel hat 🎩:

OTel is an emerging technology - it’s on us to promote its adoption first and foremost and not .NET.

Under this prioritization then, the right thing to do is to reference DS 6 for older unsupported frameworks because we want OTel to succeed and that requires our end users to be able to update frequently.

Our users on older frameworks won't get all the new features since many of them are in fact DiagnosticSource features (e.g., UpDownCounter), but they'll get some of the new features (e.g., min/max on histograms, exponential histograms). We'd have to communicate what features are available under which circumstances.


I anticipate that this idea will be super unpopular among maintainers (myself included) because I think in order to make this work, we'd have to drop our netstandard2.0 target and instead reintroduce our net5 target and maybe even a netcoreapp3.1 target, so as to avoid any degradation in OTel experience as runtime versions go out of support.

@tarekgh
Contributor

tarekgh commented Jul 15, 2022

@alanwest

OTel is an emerging technology - it’s on us to promote its adoption first and foremost and not .NET.

For my education, do you have more data about that? I mean, the volume of users who are using OTel, targeting an unsupported .NET version, wanting to use the new features, and not willing to upgrade, such that this would cause an OTel adoption issue? Thanks!

@noahfalk

Feedback/Questions for Viktor

[@ViktorHofer] These assumptions are a good example of why we decided to explicitly block the invocation of the .NETStandard asset on the .NETFramework and .NETCoreApp runtime

@ViktorHofer - Thanks! The example was very useful to understand where you are coming from. My perspective is a bit different though : ) I didn't find this example about GetUtcNow a compelling justification of the policy. Partly it is because I anticipate the behavior change would be sufficiently small that 99.99% of customers wouldn't notice it at all, and partly because I think most developers would begrudgingly accept that a major version update of a NuGet package running on an unsupported runtime might introduce behavior changes and bugs.

A couple suggestions given my thinking on the issue so far:

  1. Whatever we do, we should make sure developers understand what to expect and I don't think our previous announcement is sufficient to have people understand all the implications of the new policy. Perhaps we should chat with Rich or Immo about how to best make clarifications? I'd be happy to help. Reading through the announcement you linked to, nothing in that announcement says .NET developers will get build errors now. I noticed one sentence that maybe was intended to imply this, but I never would have guessed that conclusion myself without significant additional deliberation:

If you're currently referencing an impacted package from an earlier framework, you'll no longer be able to update the referenced package to a later version

The claim is that developers won't be able to update, but it doesn't mention what the enforcement mechanism is. Also, the sentence says "if you reference an impacted package", the announcement contains a list of these packages, and OpenTelemetry is not a library in that list. I think people will interpret "reference" to mean a direct reference that appears in their project file, whereas the actual policy seems to include any reference in the transitive closure of their NuGet package graph. In practice I think our current policy is asking developers to audit their entire transitive package graph any time they upgrade anything, which is a much greater scope than what they are likely to assume reading the announcement.

The announcement repeatedly says that .NETStandard 2.0 is supported, which is likely to lead most library developers who author .NETStandard 2.0 libraries to believe they are unaffected by this change. It might also lead app developers who target netcoreapp2.1 or net46 to believe that they too are OK by referencing packages with .NETStandard 2.0 binaries. I think we need to clarify that none of these assumptions is accurate under the current policy.

  2. How much feedback have we been hearing from customers around the presence of these poison pills, either good or bad? Now that these poison pills have been present in our released packages for 6 months we should be able to evaluate how that decision is working out. Are we seeing a variety of people unhappy with the result or is confusion/aggravation relatively sparse?

Fwiw I'd be in favor of less aggressive blocking:

  • Instead of making it an error to restore on unsupported runtimes we could make it a warning.
  • We could change the error message so that people better understand the issue and their options. For example it could say:
netcoreapp2.1 is no longer under support and has not been tested with System.Runtime.CompilerServices.Unsafe 7.0.
Consider upgrading your TargetFramework to netcoreapp3.1 or later. You may also set
<SuppressTfmSupportBuildWarnings>true</SuppressTfmSupportBuildWarnings> in the project file
to ignore these errors and attempt to run in this unsupported configuration at your own risk.
  • I think it would probably be fine not to warn or block at all and just let people on unsupported runtimes see behavior changes, but I'm willing to accept that it wasn't what we decided to do in .NET 6 and without some significant negative feedback there is probably no reason to risk changing our .NET 6 decision that drastically.

Feedback/Questions for the OTel crew

[@CodeBlanch] The main concern here is the confusion. We anticipate issues being opened in our repo because the intention was to upgrade OTel, not these other transitive packages where the errors are being thrown.

DiagnosticSource version 6.0 has had the poison pill in it for netcoreapp2.1 since it shipped, and OpenTelemetry 1.2.0 depends on it. Given that 1.2.0 has been downloaded 500K times and has been available for 3 months, if users were confused wouldn't we already be seeing that signal? Have we gotten some complaints so far? Is there reason to believe that what is coming will be significantly worse than what we've already seen?

[@CodeBlanch] The case we see in OTel is users have some mix of hosting applications (let's say .NET Framework 4.6.1 & .NET 5)

What is Open Telemetry's stance on supporting runtime versions that the .NET team itself no longer supports? I think the easy policy would be to say when .NET team stops support, so does OTel. In that case when a hypothetical customer complains:

I just updated to OTel 1.4 and it gives me an error when I try to use it on net5.0. What gives?

I'd imagine the OTel response would be:

Microsoft support for .net5.0 ended May 10th 2022 and OTel stopped support at the same time. OTel 1.4 shipped in 2023
and did not include any .net 5.0 support. If you need to stay on .net5.0 despite it no longer being supported then we'd
recommend you use OTel 1.3 which was the last shipped version of OTel designed and tested to work with .net5.0.

PS A little shout out to Andrew

Off topic but I just wanted to say I've seen a lot of your blog posts on different topics and you consistently do an awesome job describing how .NET stuff works. I've written a number of the official .NET docs on docs.microsoft.com but I don't think I achieve the same level of flow and clarity that your posts have. I'm both envious and thankful that your content is out there : )

@alanwest
Member

@tarekgh

do you have more data about that?

I can't tell you how much I want to be able to answer this question 😂, but the best I think I'll be able to do is come up with anecdotes. I intend to reach out to a few .NET customers to try to get a sense of the impact. Will update you with anything I learn.

Measuring OTel adoption faceted by interesting things like language and runtime version is a hard problem at the moment. Any attributes on telemetry identifying this kind of metadata are optional per the spec. Things like user agent might be used to at least identify the language of the library used to send telemetry. However, many customers proxy their data through the OpenTelemetry Collector, so from the perspective of user agent it looks like it came from a Go service.

@alanwest
Member

@tarekgh Certainly doesn't answer all the questions we have, but I found it interesting...

This is data representing modern .NET applications (i.e., not .NET Framework) reporting to New Relic. This data is sourced from metadata New Relic's .NET agent sends when an application it's monitoring starts up and connects.

These are all connections from the last 30 days, the majority of which are from LTS versions of .NET: 50% from .NET Core 3.1 and 30% from .NET 6.

(chart: distribution of .NET runtime versions reporting to New Relic over the last 30 days)

I can't say how representative this data is for users of OpenTelemetry .NET, but it does at least indicate .NET Core 3.1 is still going strong. It'll be interesting to see how this changes between now and December, when 3.1 reaches EOL.

@tarekgh
Contributor

tarekgh commented Jul 19, 2022

@alanwest this is helpful. In the next few months, if possible, we can monitor that to see how this chart changes. But the important data we need to get to (if this is possible) is how many of these users want to use the new features (e.g. UpDownCounter) and will not be willing to upgrade. I know it is tough to get such data at the current time, but it will be good if we keep our eyes on that. Thanks again for your follow-up!

@danmoseley

@alanwest your data there is interesting. Does New Relic post this kind of info regularly, or would it consider doing so?

ViktorHofer added a commit to dotnet/runtime that referenced this issue Jul 20, 2022
Based on the discussion in open-telemetry/opentelemetry-dotnet#3448, we are now emitting a warning instead of an error on unsupported target frameworks. We still explicitly indicate that this is an unsupported scenario but don't actively block consumers anymore, and also indicate how to suppress the warning.
@ViktorHofer

@noahfalk thanks for your feedback, very much appreciate that. Let me comment on a few items:

I didn't find this example about GetUtcNow a compelling justification of the policy. Partly it is because I anticipate the behavior change would be sufficiently small that 99.99% of customers wouldn't notice it at all, and partly because I think most developers would begrudgingly accept that a major version update of a NuGet package running on an unsupported runtime might introduce behavior changes and bugs.

I think it would probably be fine not to warn or block at all and just let people on unsupported runtimes see behavior changes, but I'm willing to accept that it wasn't what we decided to do in .NET 6 and without some significant negative feedback there is probably no reason to risk changing our .NET 6 decision that drastically.

DiagnosticSource was chosen as an example because it's the package mentioned in this thread. Another example is System.IO.Ports, which isn't runnable on .NETStandard at all. Any invocation of a public method throws PlatformNotSupportedException, and consumers won't notice this until they actually run the code on an unsupported framework like net461 or netcoreapp2.0. This effectively means that such packages don't work on any netstandard2.0-compatible runtime other than >= net462 and net6.0.
There are more of these packages which either have a different implementation for .NETStandard or don't work at all. For these reasons we believed that blocking consumers was the right action.

I don't think that we should make fundamental changes in that regard for .NET 7, but I would very much love to refine our current support strategy and see if we can come up with a design that better communicates (to customers) which platforms and frameworks are supported and which aren't.

How much feedback have we been hearing from customers around the presence of these poison pills either good or bad? Now that these poison pills have been present in our released packages for 6 months we should be able to evaluate how that decision is working out. Are we seeing a variety of people unhappy with the result or is confusion/aggravation relatively sparse?

Aside from @andrewlock's blog post and a few issues in the last year, we didn't receive too much feedback on this behavior. That doesn't necessarily mean the current behavior is well received but just that customers didn't reach out directly.

Note that parts of the changes that were mentioned in the announcement already happened earlier in the .NET 6 cycle, starting with P5, until RC1.

The announcement repeatedly says that .NETStandard 2.0 is supported, which is likely to lead most library developers who author .NETStandard 2.0 libraries to believe they are unaffected by this change. It might also lead app developers who target netcoreapp2.1 or net46 to believe that they too are OK by referencing packages with .NETStandard 2.0 binaries. I think we need to clarify that none of these assumptions is accurate under the current policy.

Agreed, though I don't yet know what the right action would be for all the packages that offer a .NETStandard contract but a fully throwing implementation. Would it be OK to drop .NET Standard support for these completely? What about app models that still heavily rely on .NET Standard contracts, like source generators which either run on modern .NET or on .NET Framework (as part of Roslyn)? Those wouldn't be able to reference such packages anymore without multi-targeting, and I don't know if that's even possible today for source generators. More importantly, class libraries that target .NET Standard would be cut off as well. How would customers perceive us dropping .NET Standard support from some packages?

Instead of making it an error to restore on unsupported runtimes we could make it a warning.
We could change the error message so that people better understand the issue and their options. For example it could say:

That's what I and others like Andrew were suggesting above as well, and I believe this is the best short-term fix that we can apply to the 7.0.x packages (which already ship as part of the .NET 7 previews). I submitted dotnet/runtime#72518.

@andrewlock

Thanks @ViktorHofer. With regard to dropping .NET Standard support - if the package throws at runtime (for significant parts of the API), then in my opinion it doesn't support netstandard already. But it doesn't warn you at build time, so it gives you the worst of both worlds!

Source generators and analyzers are clearly in a strange place: they require netstandard but then presumably run on netfx or netcore in practice. It feels like the only longer-term "solution" here is for analyzers etc. to target a specific runtime, e.g. net462 and netcoreapp3.1. That would solve the issue completely, though this is such a big breaking change (which would require changes in the compiler) that presumably this will never happen?

(Apologies for poor formatting, AFK ATM)

@alanwest
Member

@danmoseley

Does New Relic post this kind of info regularly, or would consider doing so?

This is not something New Relic posts regularly today, though it does produce a regular update on the Java ecosystem based on similar data from NR's Java agent. I can certainly reach out to see if this is something that could also be considered for the .NET ecosystem.

@danmoseley

@alanwest personally I think any data of this sort you're comfortable sharing regularly could be valuable for the .NET ecosystem, for example it could help inform .NET platform decisions.
cc @DamianEdwards

@noahfalk

Another example is System.IO.Ports which isn't runnable on .NETStandard at all. Any invocations of public methods throw PlatformNotSupportedExceptions...

Yikes! That sounds a lot more problematic. Just brainstorming...

  1. Maybe we should let .NET library owners decide whether to give a build-time error or warning based on the likelihood of failure? I feel like we could communicate different expectations for libraries where "it probably works but we've stopped testing it on the old runtime you are using" vs. "we know this will fail immediately if you run it".
  2. Are these libraries with no viable NS20 implementation the outliers or are there a bunch of them?
  3. In the future perhaps there is a way we could mark libraries with bad NS20 implementations so that the libraries which reference them will get an error when they build, but also those libraries would have a way to suppress the error and preserve the reference if they choose to? Maybe that marking would be removing the NS20 TFM or maybe it is something else... I'm not at all an expert in NuGet and I fully admit I am in handwavy territory : )

I submitted dotnet/runtime#72518.

Thanks! Aside from the product side changes, how do you feel about chatting with Immo or Rich about potentially clarifying our past announcement or maybe messaging it somewhere else (blog post, doc on msdn)? I'm glad to help but I don't want to push the messaging alone if nobody else thinks it is worthwhile : )

mythz added a commit to ServiceStack/ServiceStack that referenced this issue Jul 26, 2022
@andrewlock

Just thought I'd flag it here @noahfalk and @ViktorHofer - ServiceStack recently added a dependency on the DiagnosticSource v6.0 NuGet package on the basis that it "supports" netstandard2.0. As we've already discussed, that's not really the case, so in reality ServiceStack accidentally dropped support for .NET Core 2.1 and .NET Core 3.0 (as well as any other .NET Standard platforms) in a minor release.

To me, this really highlights the danger of keeping the netstandard2.0 targets - @mythz is an experienced .NET developer, and still got caught by this.

From Twitter:
(screenshot of the tweet)

@mythz

mythz commented Jul 26, 2022

Want to thank @andrewlock for bringing this to our attention, and reiterate that the behavior of not supporting something you say you do is a minefield that defies all expectations.

It'd be OK if this were an inert package with stub classes that did nothing on unsupported platforms, but to throw runtime exceptions and unintuitively break platforms it says it supports is a footgun that should never have existed. IMO its netstandard2.0 implementation should be changed so it has inert behavior that's silently ignored, or netstandard2.0 should be removed as a supported platform.

ViktorHofer added a commit to dotnet/runtime that referenced this issue Jul 26, 2022
Based on the discussion in open-telemetry/opentelemetry-dotnet#3448, we are now emitting a warning instead of an error on unsupported target frameworks. We still explicitly indicate that this is an unsupported scenario but don't actively block consumers anymore, and also indicate how to suppress the warning.
@ViktorHofer

ViktorHofer commented Jul 26, 2022

Thanks for sharing your feedback @mythz and @andrewlock. With dotnet/runtime#72518, the currently emitted error is downgraded to a warning. That means that we don't actively block consumers anymore but still indicate that consuming the packages on an unsupported runtime is an unsupported scenario.

Please see packages.xlsx, which highlights all packages produced by dotnet/runtime which throw at runtime.

Please stop lying about .NET Standard 2.0 support!

To me, this really highlights the danger of keeping the netstandard2.0 targets

As we now emit a warning and therefore no longer actively block consumers on unsupported runtimes, the remaining feedback item is around not lying about supporting .NETStandard. This would require removing the .NETStandard asset from the following 18 packages (from the file shared above):

  • Microsoft.Win32.Registry.AccessControl
  • Microsoft.Win32.SystemEvents
  • System.ComponentModel.Composition
  • System.Data.Odbc
  • System.Data.OleDb
  • System.Diagnostics.EventLog
  • System.Diagnostics.PerformanceCounter
  • System.DirectoryServices
  • System.DirectoryServices.AccountManagement
  • System.DirectoryServices.Protocols
  • System.Drawing.Common
  • System.IO.Ports
  • System.Management
  • System.Reflection.Context
  • System.Security.Cryptography.ProtectedData
  • System.ServiceProcess.ServiceController
  • System.Speech
  • System.Threading.AccessControl

Thoughts on that? Those netstandard2.0 assets are currently exposed as refs but throw on an unsupported runtime (e.g. netcoreapp2.0, net461, ...) or on a runtime other than .NETFramework and .NETCoreApp.

Note that the DiagnosticSource package isn't on that list as its .NET Standard implementation does work as expected. This doesn't imply that the .NET Standard asset is supported on unsupported runtimes though.

@mythz

mythz commented Jul 26, 2022

Note that the DiagnosticSource package isn't on that list as its .NET Standard implementation does work as expected.

It's not clear to me if this applies to .NET Core 2.1/3.x/5, i.e. could I safely use System.Diagnostics.* classes like DiagnosticListener and Activity in netstandard2.0 TFM builds without them throwing runtime exceptions on these platforms?

@ViktorHofer

ViktorHofer commented Jul 26, 2022

Not clear if this applies to .NET Core 2.1/3.x/5, i.e. could I safely use System.Diagnostics.* classes like DiagnosticListener and Activity in netstandard2.0 TFM builds without it throwing runtime exceptions in these platforms?

You can today, as the .NET Standard implementation does work as expected, but it's an unsupported scenario on out-of-support runtimes (i.e. net461, netcoreapp2.1, netcoreapp3.1 and net5.0 for the .NET 7 package).

That is what we try to communicate via the warning that we emit (e.g. when targeting netcoreapp2.1):

System.Diagnostics.DiagnosticSource 7.0.0 doesn't support netcoreapp2.1 and has not been tested with it. Consider upgrading your TargetFramework to net6.0 or later. You may also set <SuppressTfmSupportBuildWarnings>true</SuppressTfmSupportBuildWarnings> in the project file to ignore this warning and attempt to run in this unsupported configuration at your own risk.

@mythz

mythz commented Jul 26, 2022

Ok great, thanks for the clarification. I don't mind that it's unsupported, as long as it doesn't throw runtime exceptions that break older runtimes when System.Diagnostics classes are accessed.

@mythz

mythz commented Jul 27, 2022

@ViktorHofer Happy to confirm that after reverting our change and re-including System.Diagnostics.DiagnosticSource v6.0.0, our netstandard2.0 TFM build works on .NET Core 3.1/5, but it can't build on <= .NET Core 3.0 because its System.Runtime.CompilerServices.Unsafe v6.0.0 transitive dependency requires netcoreapp3.1+.

@Romfos

Romfos commented Jan 4, 2024

If a library with netstandard2.0 does not work on a runtime that supports netstandard2.0, what is the sense in having it in the TFM list?
I think now is the time to start the EOL of .NET Standard and just use multi-targeting for the minimal supported runtimes, like net462 + net6.0.

@RoderickIveans

@danmoseley @ViktorHofer @tarekgh Should the runtime library consider dropping netstandard support?
dotnet/runtime#107081

@MSDN-WhiteKnight

What are the concrete benefits for OpenTelemetry if .NET Standard is dropped in dotnet/runtime? Just reduced user confusion when they get a warning on an unsupported runtime version? If .NET Standard support is dropped, OTel would still need to do something, like dropping the .NET Standard TFM in its libraries so they continue to build, right? So it does not seem like this issue justifies such a big breaking change to the runtime. It's simpler if it is handled on the OTel side.

adrianiftode added a commit to adrianiftode/FluentAssertions.Web that referenced this issue Nov 11, 2024