
CompatHelper: bump compat for "ProtoBuf" to "0.10" #35

Conversation

github-actions[bot] (Contributor) commented Dec 2, 2020

This pull request changes the compat entry for the ProtoBuf package from 0.7, 0.8 to 0.7, 0.8, 0.10.

This keeps the compat entries for earlier versions.

Note: I have not tested your package with this new compat entry. It is your responsibility to make sure that your package tests pass before you merge this pull request.
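The compat change described above corresponds to a one-line edit in the package's Project.toml. A sketch of the relevant entry (the surrounding file contents are assumptions, not taken from this repository):

```toml
[compat]
# Before this PR: ProtoBuf = "0.7, 0.8"
ProtoBuf = "0.7, 0.8, 0.10"
```

Under Pkg's compat rules, each 0.x series is treated as potentially breaking, so "0.7, 0.8, 0.10" allows any 0.7.*, 0.8.*, or 0.10.* release of ProtoBuf but excludes 0.9.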

NHDaly (Member) commented Dec 22, 2021

It looks like the ProtoBuf API changed between 0.8 and 0.10, so we probably need to do some manual work here if we want this to succeed:

  UndefVarError: fillunset not defined
  Stacktrace:
   [1] PProf.perftools.profiles.ValueType(; kwargs::Base.Iterators.Pairs{Symbol, Int64, Tuple{Symbol, Symbol}, NamedTuple{(:_type, :unit), Tuple{Int64, Int64}}})
     @ PProf.perftools.profiles ~/build/JuliaPerf/PProf.jl/lib/profile_pb.jl:9
   [2] (::PProf.var"#ValueType!#4")(_type::String, unit::String)
     @ PProf ~/build/JuliaPerf/PProf.jl/src/PProf.jl:124
   [3] pprof(data::Nothing, lidict::Nothing; sampling_delay::Nothing, web::Bool, webhost::String, webport::Int64, out::String, from_c::Bool, full_signatures::Bool, drop_frames::Nothing, keep_frames::Nothing, ui_relative_percentages::Bool)
     @ PProf ~/build/JuliaPerf/PProf.jl/src/PProf.jl:136

@vchuravy vchuravy closed this Oct 3, 2022
@vchuravy vchuravy deleted the compathelper/new_version/2020-12-02-00-13-25-001-3436503229 branch October 3, 2022 14:34