Consider this curious case:
```rust
#![allow(dead_code)]
#![feature(conservative_impl_trait)]
#![feature(in_band_lifetimes)]

use std::cell::Cell;

trait Trait<'a> { }

impl Trait<'b> for Cell<&'a u32> { }

fn foo(x: Cell<&'x u32>) -> impl Trait<'y>
    where 'x: 'y
{
    x
}

fn main() { }
```
Here, the value we are returning is of type `Cell<&'x u32>`, but the return type ought only to be able to mention `'y`. In other words, we are inferring something like

```rust
abstract type Foo<'z>: Trait<'z> = Cell<&'x u32>
```

which is clearly malformed, since the hidden type mentions `'x`, which is not a parameter of `Foo`. We allow this because the current code has regionck bound all lifetimes with `'y` and considers that sufficient to ensure that nothing is "leaking" that shouldn't -- but, as can be seen here, that's not always true.
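For contrast, a well-formed definition would have to mention only the abstract type's own parameters on the right-hand side; something like this (a sketch in the same informal `abstract type` notation, with `Foo` chosen just for illustration):

```rust
// Well-formed: the hidden type mentions only 'z, which is a
// parameter of Foo, so nothing from the defining scope leaks out.
abstract type Foo<'z>: Trait<'z> = Cell<&'z u32>
```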
I don't think we should accept this -- at least for now. However, I also could not find an obvious way to weaponize it. There may not be one. Consider that this variant, using `Box<dyn Trait<'y> + 'y>` in place of `impl Trait<'y>`, type-checks and ought to be sound (afaik). (In part, though, this relies on the fact that we expand `impl Trait<'y>` to `impl Trait<'y> + 'y` internally, but it seems like `impl Trait<'y>` must outlive `'y` just by construction, since it could not name any other lifetime.)
One could imagine permitting the example via some sort of subtyping relation on abstract types -- that is, perhaps we might determine a variance for `Foo` with respect to its type parameters based on the traits that it implements. Not sure if that would be sound, but you could imagine it. Until we have such a justification, though, I think we should not accept the example.
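To illustrate the kind of rule I mean: ordinary references are covariant in their lifetime, so today we accept conversions like the one below. A hypothetical variance on `Foo<'z>` would let `Foo<'x>` convert to `Foo<'y>` when `'x: 'y` in the same way (the `shorten` helper here is just for illustration):

```rust
// &'a u32 is covariant in 'a, so a longer-lived reference coerces
// to a shorter-lived one. A variance on the abstract type would
// justify the original example analogously.
fn shorten<'x: 'y, 'y>(r: &'x u32) -> &'y u32 {
    r
}
```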
I'll probably fix this en passant while doing some refactoring to support `impl Trait` in the NLL code.
cc #34511