rustc_mir_build/builder/
scope.rs

/*!
Managing the scope stack. The scopes are tied to lexical scopes, so as
we descend the THIR, we push a scope on the stack, build its
contents, and then pop it off. Every scope is named by a
`region::Scope`.

### SEME Regions

When pushing a new [Scope], we record the current point in the graph (a
basic block); this marks the entry to the scope. We then generate more
stuff in the control-flow graph. Whenever the scope is exited, either
via a `break` or `return` or just by fallthrough, that marks an exit
from the scope. Each lexical scope thus corresponds to a single-entry,
multiple-exit (SEME) region in the control-flow graph.

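For example (an illustrative sketch, not taken from the compiler itself), the
block below is entered once but can be left either through the `return` or by
falling through to its end, so its scope corresponds to a SEME region with two
exit edges:

```
# fn f(cond: bool) {
{
    let _s = String::new();
    if cond {
        return; // one exit from the scope
    }
    // fallthrough to the end of the block: another exit from the same scope
}
# }
```
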
For now, we record the `region::Scope` for each SEME region for later reference
(see caveat in next paragraph). This is because destruction scopes are tied to
them. This may change in the future so that MIR lowering determines its own
destruction scopes.

### Not so SEME Regions

In the course of building matches, it sometimes happens that certain code
(namely guards) gets executed multiple times. This means that the lexical
scope may in fact correspond to multiple, disjoint SEME regions. So in fact our
mapping is from one scope to a vector of SEME regions. Since the SEME regions
are disjoint, the mapping is still one-to-one for the set of SEME regions that
we're currently in.

Also in matches, the scopes assigned to arms are not always even SEME regions!
Each arm has a single region with one entry for each pattern. We manually
manipulate the scheduled drops in this scope to avoid dropping things multiple
times.

### Drops

The primary purpose for scopes is to insert drops: while building
the contents, we also accumulate places that need to be dropped upon
exit from each scope. This is done by calling `schedule_drop`. Once a
drop is scheduled, whenever we branch out we will insert drops of all
those places onto the outgoing edge. Note that we don't know the full
set of scheduled drops up front, and so whenever we exit from the
scope we only drop the values scheduled thus far. For example, consider
the scope S corresponding to this loop:

```
# let cond = true;
loop {
    let x = ..;
    if cond { break; }
    let y = ..;
}
```

When processing the `let x`, we will add one drop to the scope for
`x`. The break will then insert a drop for `x`. When we process `let
y`, we will add another drop (in fact, to a subscope, but let's ignore
that for now); any later drops would also drop `y`.

### Early exit

There are numerous "normal" ways to early exit a scope: `break`,
`continue`, `return` (panics are handled separately). Whenever an
early exit occurs, the method `break_scope` is called. It is given the
current point in execution where the early exit occurs, as well as the
scope you want to branch to (note that all early exits go to some
other enclosing scope). `break_scope` will record the set of drops currently
scheduled in a [DropTree]. Later, before `in_breakable_scope` exits, the drops
will be added to the CFG.

Panics are handled in a similar fashion, except that the drops are added to the
MIR once the rest of the function has finished being lowered. If a terminator
can panic, call `diverge_from(block)`, where `block` is the block containing
that terminator.

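For illustration (a sketch, not taken from the compiler itself), the `break`
below is an early exit from the loop body's scope. At that point only `s` has
been scheduled, so `break_scope` records a drop of `s` (but not of the
not-yet-declared `t`) on the edge leaving the loop:

```
# fn cond() -> bool { true }
loop {
    let s = String::from("x");
    if cond() {
        break; // early exit: only the drop of `s` is emitted on this edge
    }
    let t = String::from("y");
    // fallthrough: both `s` and `t` are dropped at the end of the loop body
}
```
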
### Breakable scopes

In addition to the normal scope stack, we track a loop scope stack
that contains only loops and breakable blocks. It tracks where a `break`,
`continue` or `return` should go to.

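As a sketch (again not taken from the compiler itself), in the nested loops
below the unlabeled `continue` targets the innermost entry of that stack, while
`break 'outer` targets the breakable scope pushed for the outer loop:

```
'outer: for x in 0..3 {
    for y in 0..3 {
        if y == 1 {
            continue; // targets the innermost breakable scope (the inner loop)
        }
        if x == 2 {
            break 'outer; // targets the outer loop's breakable scope
        }
    }
}
```
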
*/

use std::mem;

use interpret::ErrorHandled;
use rustc_data_structures::fx::FxHashMap;
use rustc_hir::HirId;
use rustc_index::{IndexSlice, IndexVec};
use rustc_middle::middle::region;
use rustc_middle::mir::{self, *};
use rustc_middle::thir::{AdtExpr, AdtExprBase, ArmId, ExprId, ExprKind};
use rustc_middle::ty::{self, Ty, TyCtxt, TypeVisitableExt, ValTree};
use rustc_middle::{bug, span_bug};
use rustc_pattern_analysis::rustc::RustcPatCtxt;
use rustc_session::lint::Level;
use rustc_span::source_map::Spanned;
use rustc_span::{DUMMY_SP, Span};
use tracing::{debug, instrument};

use super::matches::BuiltMatchTree;
use crate::builder::{BlockAnd, BlockAndExtension, BlockFrame, Builder, CFG};
use crate::errors::{
    ConstContinueBadConst, ConstContinueNotMonomorphicConst, ConstContinueUnknownJumpTarget,
};

#[derive(Debug)]
pub(crate) struct Scopes<'tcx> {
    scopes: Vec<Scope>,

    /// The current set of breakable scopes. See module comment for more details.
    breakable_scopes: Vec<BreakableScope<'tcx>>,

    const_continuable_scopes: Vec<ConstContinuableScope<'tcx>>,

    /// The scope of the innermost if-then currently being lowered.
    if_then_scope: Option<IfThenScope>,

    /// Drops that need to be done on unwind paths. See the comment on
    /// [DropTree] for more details.
    unwind_drops: DropTree,

    /// Drops that need to be done on paths to the `CoroutineDrop` terminator.
    coroutine_drops: DropTree,
}

#[derive(Debug)]
struct Scope {
    /// The source scope this scope was created in.
    source_scope: SourceScope,

    /// the region span of this scope within source code.
    region_scope: region::Scope,

    /// set of places to drop when exiting this scope. This starts
    /// out empty but grows as variables are declared during the
    /// building process. This is a stack, so we always drop from the
    /// end of the vector (top of the stack) first.
    drops: Vec<DropData>,

    moved_locals: Vec<Local>,

    /// The drop index that will drop everything in and below this scope on an
    /// unwind path.
    cached_unwind_block: Option<DropIdx>,

    /// The drop index that will drop everything in and below this scope on a
    /// coroutine drop path.
    cached_coroutine_drop_block: Option<DropIdx>,
}

#[derive(Clone, Copy, Debug)]
struct DropData {
    /// The `Span` where drop obligation was incurred (typically where place was
    /// declared)
    source_info: SourceInfo,

    /// local to drop
    local: Local,

    /// Whether this is a value Drop or a StorageDead.
    kind: DropKind,
}

#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub(crate) enum DropKind {
    Value,
    Storage,
    ForLint,
}

#[derive(Debug)]
struct BreakableScope<'tcx> {
    /// Region scope of the loop
    region_scope: region::Scope,
    /// The destination of the loop/block expression itself (i.e., where to put
    /// the result of a `break` or `return` expression)
    break_destination: Place<'tcx>,
    /// Drops that happen on the `break`/`return` path.
    break_drops: DropTree,
    /// Drops that happen on the `continue` path.
    continue_drops: Option<DropTree>,
}

#[derive(Debug)]
struct ConstContinuableScope<'tcx> {
    /// The scope for the `#[loop_match]` which its `#[const_continue]`s will jump to.
    region_scope: region::Scope,
    /// The place of the state of a `#[loop_match]`, which a `#[const_continue]` must update.
    state_place: Place<'tcx>,

    arms: Box<[ArmId]>,
    built_match_tree: BuiltMatchTree<'tcx>,

    /// Drops that happen on a `#[const_continue]`
    const_continue_drops: DropTree,
}

#[derive(Debug)]
struct IfThenScope {
    /// The if-then scope or arm scope
    region_scope: region::Scope,
    /// Drops that happen on the `else` path.
    else_drops: DropTree,
}

/// The target of an expression that breaks out of a scope
#[derive(Clone, Copy, Debug)]
pub(crate) enum BreakableTarget {
    Continue(region::Scope),
    Break(region::Scope),
    Return,
}

rustc_index::newtype_index! {
    #[orderable]
    struct DropIdx {}
}

const ROOT_NODE: DropIdx = DropIdx::ZERO;

/// A tree of drops that we have deferred lowering. It's used for:
///
/// * Drops on unwind paths
/// * Drops on coroutine drop paths (when a suspended coroutine is dropped)
/// * Drops on return and loop exit paths
/// * Drops on the else path in an `if let` chain
///
/// Once no more nodes could be added to the tree, we lower it to MIR in one go
/// in `build_mir`.
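///
/// For example (an illustrative sketch, not taken from the compiler itself), the two
/// `break`s below both have to drop `s` before leaving the loop, so each break block
/// becomes an entry point into the loop's exit drop tree and both share its single
/// node for the drop of `s`:
///
/// ```
/// # fn cond(_: u32) -> bool { true }
/// loop {
///     let s = String::from("x");
///     if cond(1) { break; } // entry point 1 into the exit drop tree
///     if cond(2) { break; } // entry point 2, sharing the drop of `s`
/// }
/// ```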
#[derive(Debug)]
struct DropTree {
    /// Nodes in the drop tree, containing drop data and a link to the next node.
    drop_nodes: IndexVec<DropIdx, DropNode>,
    /// Map for finding the index of an existing node, given its contents.
    existing_drops_map: FxHashMap<DropNodeKey, DropIdx>,
    /// Edges into the `DropTree` that need to be added once it's lowered.
    entry_points: Vec<(DropIdx, BasicBlock)>,
}

/// A single node in the drop tree.
#[derive(Debug)]
struct DropNode {
    /// Info about the drop to be performed at this node in the drop tree.
    data: DropData,
    /// Index of the "next" drop to perform (in drop order, not declaration order).
    next: DropIdx,
}

/// Subset of [`DropNode`] used for reverse lookup in a hash table.
#[derive(Debug, PartialEq, Eq, Hash)]
struct DropNodeKey {
    next: DropIdx,
    local: Local,
}

impl Scope {
    /// Whether there's anything to do for the cleanup path, that is,
    /// when unwinding through this scope. This includes destructors,
    /// but not StorageDead statements, which don't get emitted at all
    /// for unwinding, for several reasons:
    ///  * clang doesn't emit llvm.lifetime.end for C++ unwinding
    ///  * LLVM's memory dependency analysis can't handle it atm
    ///  * polluting the cleanup MIR with StorageDead creates
    ///    landing pads even though there are no actual destructors
    ///  * freeing up stack space has no effect during unwinding
    /// Note that for coroutines we do emit StorageDeads, for the
    /// use of optimizations in the MIR coroutine transform.
    fn needs_cleanup(&self) -> bool {
        self.drops.iter().any(|drop| match drop.kind {
            DropKind::Value | DropKind::ForLint => true,
            DropKind::Storage => false,
        })
    }

    fn invalidate_cache(&mut self) {
        self.cached_unwind_block = None;
        self.cached_coroutine_drop_block = None;
    }
}

/// A trait that determines how [DropTree] creates its blocks and
/// links to any entry nodes.
trait DropTreeBuilder<'tcx> {
    /// Create a new block for the tree. This should call either
    /// `cfg.start_new_block()` or `cfg.start_new_cleanup_block()`.
    fn make_block(cfg: &mut CFG<'tcx>) -> BasicBlock;

    /// Links a block outside the drop tree, `from`, to the block `to` inside
    /// the drop tree.
    fn link_entry_point(cfg: &mut CFG<'tcx>, from: BasicBlock, to: BasicBlock);
}

impl DropTree {
    fn new() -> Self {
        // The root node of the tree doesn't represent a drop, but instead
        // represents the block in the tree that should be jumped to once all
        // of the required drops have been performed.
        let fake_source_info = SourceInfo::outermost(DUMMY_SP);
        let fake_data =
            DropData { source_info: fake_source_info, local: Local::MAX, kind: DropKind::Storage };
        let drop_nodes = IndexVec::from_raw(vec![DropNode { data: fake_data, next: DropIdx::MAX }]);
        Self { drop_nodes, entry_points: Vec::new(), existing_drops_map: FxHashMap::default() }
    }

    /// Adds a node to the drop tree, consisting of drop data and the index of
    /// the "next" drop (in drop order), which could be the sentinel [`ROOT_NODE`].
    ///
    /// If there is already an equivalent node in the tree, nothing is added, and
    /// that node's index is returned. Otherwise, the new node's index is returned.
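    ///
    /// For example, registering the same local with the same `next` twice yields
    /// the same [`DropIdx`] both times, so different exit paths that end with the
    /// same sequence of drops share nodes in the tree.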
    fn add_drop(&mut self, data: DropData, next: DropIdx) -> DropIdx {
        let drop_nodes = &mut self.drop_nodes;
        *self
            .existing_drops_map
            .entry(DropNodeKey { next, local: data.local })
            // Create a new node, and also add its index to the map.
            .or_insert_with(|| drop_nodes.push(DropNode { data, next }))
    }

    /// Registers `from` as an entry point to this drop tree, at `to`.
    ///
    /// During [`Self::build_mir`], `from` will be linked to the corresponding
    /// block within the drop tree.
    fn add_entry_point(&mut self, from: BasicBlock, to: DropIdx) {
        debug_assert!(to < self.drop_nodes.next_index());
        self.entry_points.push((to, from));
    }

    /// Builds the MIR for a given drop tree.
    fn build_mir<'tcx, T: DropTreeBuilder<'tcx>>(
        &mut self,
        cfg: &mut CFG<'tcx>,
        root_node: Option<BasicBlock>,
    ) -> IndexVec<DropIdx, Option<BasicBlock>> {
        debug!("DropTree::build_mir(drops = {:#?})", self);

        let mut blocks = self.assign_blocks::<T>(cfg, root_node);
        self.link_blocks(cfg, &mut blocks);

        blocks
    }

    /// Assign blocks for all of the drops in the drop tree that need them.
    fn assign_blocks<'tcx, T: DropTreeBuilder<'tcx>>(
        &mut self,
        cfg: &mut CFG<'tcx>,
        root_node: Option<BasicBlock>,
    ) -> IndexVec<DropIdx, Option<BasicBlock>> {
        // StorageDead statements can share blocks with each other and also with
        // a Drop terminator. We iterate through the drops to find which drops
        // need their own block.
        #[derive(Clone, Copy)]
        enum Block {
            // This drop is unreachable
            None,
            // This drop is only reachable through the `StorageDead` with the
            // specified index.
            Shares(DropIdx),
            // This drop has more than one way of being reached, or it is
            // branched to from outside the tree, or its predecessor is a
            // `Value` drop.
            Own,
        }

        let mut blocks = IndexVec::from_elem(None, &self.drop_nodes);
        blocks[ROOT_NODE] = root_node;

        let mut needs_block = IndexVec::from_elem(Block::None, &self.drop_nodes);
        if root_node.is_some() {
            // In some cases (such as drops for `continue`) the root node
            // already has a block. In this case, make sure that we don't
            // override it.
            needs_block[ROOT_NODE] = Block::Own;
        }

        // Sort so that we only need to check the last value.
        let entry_points = &mut self.entry_points;
        entry_points.sort();

        for (drop_idx, drop_node) in self.drop_nodes.iter_enumerated().rev() {
            if entry_points.last().is_some_and(|entry_point| entry_point.0 == drop_idx) {
                let block = *blocks[drop_idx].get_or_insert_with(|| T::make_block(cfg));
                needs_block[drop_idx] = Block::Own;
                while entry_points.last().is_some_and(|entry_point| entry_point.0 == drop_idx) {
                    let entry_block = entry_points.pop().unwrap().1;
                    T::link_entry_point(cfg, entry_block, block);
                }
            }
            match needs_block[drop_idx] {
                Block::None => continue,
                Block::Own => {
                    blocks[drop_idx].get_or_insert_with(|| T::make_block(cfg));
                }
                Block::Shares(pred) => {
                    blocks[drop_idx] = blocks[pred];
                }
            }
            if let DropKind::Value = drop_node.data.kind {
                needs_block[drop_node.next] = Block::Own;
            } else if drop_idx != ROOT_NODE {
                match &mut needs_block[drop_node.next] {
                    pred @ Block::None => *pred = Block::Shares(drop_idx),
                    pred @ Block::Shares(_) => *pred = Block::Own,
                    Block::Own => (),
                }
            }
        }

        debug!("assign_blocks: blocks = {:#?}", blocks);
        assert!(entry_points.is_empty());

        blocks
    }

    fn link_blocks<'tcx>(
        &self,
        cfg: &mut CFG<'tcx>,
        blocks: &IndexSlice<DropIdx, Option<BasicBlock>>,
    ) {
        for (drop_idx, drop_node) in self.drop_nodes.iter_enumerated().rev() {
            let Some(block) = blocks[drop_idx] else { continue };
            match drop_node.data.kind {
                DropKind::Value => {
                    let terminator = TerminatorKind::Drop {
                        target: blocks[drop_node.next].unwrap(),
                        // The caller will handle this if needed.
                        unwind: UnwindAction::Terminate(UnwindTerminateReason::InCleanup),
                        place: drop_node.data.local.into(),
                        replace: false,
                        drop: None,
                        async_fut: None,
                    };
                    cfg.terminate(block, drop_node.data.source_info, terminator);
                }
                DropKind::ForLint => {
                    let stmt = Statement::new(
                        drop_node.data.source_info,
                        StatementKind::BackwardIncompatibleDropHint {
                            place: Box::new(drop_node.data.local.into()),
                            reason: BackwardIncompatibleDropReason::Edition2024,
                        },
                    );
                    cfg.push(block, stmt);
                    let target = blocks[drop_node.next].unwrap();
                    if target != block {
                        // Diagnostics don't use this `Span` but debuginfo
                        // might. Since we don't want breakpoints to be placed
                        // here, especially when this is on an unwind path, we
                        // use `DUMMY_SP`.
                        let source_info =
                            SourceInfo { span: DUMMY_SP, ..drop_node.data.source_info };
                        let terminator = TerminatorKind::Goto { target };
                        cfg.terminate(block, source_info, terminator);
                    }
                }
                // Root nodes don't correspond to a drop.
                DropKind::Storage if drop_idx == ROOT_NODE => {}
                DropKind::Storage => {
                    let stmt = Statement::new(
                        drop_node.data.source_info,
                        StatementKind::StorageDead(drop_node.data.local),
                    );
                    cfg.push(block, stmt);
                    let target = blocks[drop_node.next].unwrap();
                    if target != block {
                        // Diagnostics don't use this `Span` but debuginfo
                        // might. Since we don't want breakpoints to be placed
                        // here, especially when this is on an unwind path, we
                        // use `DUMMY_SP`.
                        let source_info =
                            SourceInfo { span: DUMMY_SP, ..drop_node.data.source_info };
                        let terminator = TerminatorKind::Goto { target };
                        cfg.terminate(block, source_info, terminator);
                    }
                }
            }
        }
    }
}

impl<'tcx> Scopes<'tcx> {
    pub(crate) fn new() -> Self {
        Self {
            scopes: Vec::new(),
            breakable_scopes: Vec::new(),
            const_continuable_scopes: Vec::new(),
            if_then_scope: None,
            unwind_drops: DropTree::new(),
            coroutine_drops: DropTree::new(),
        }
    }

    fn push_scope(&mut self, region_scope: (region::Scope, SourceInfo), vis_scope: SourceScope) {
        debug!("push_scope({:?})", region_scope);
        self.scopes.push(Scope {
            source_scope: vis_scope,
            region_scope: region_scope.0,
            drops: vec![],
            moved_locals: vec![],
            cached_unwind_block: None,
            cached_coroutine_drop_block: None,
        });
    }

    fn pop_scope(&mut self, region_scope: (region::Scope, SourceInfo)) -> Scope {
        let scope = self.scopes.pop().unwrap();
        assert_eq!(scope.region_scope, region_scope.0);
        scope
    }

    fn scope_index(&self, region_scope: region::Scope, span: Span) -> usize {
        self.scopes
            .iter()
            .rposition(|scope| scope.region_scope == region_scope)
            .unwrap_or_else(|| span_bug!(span, "region_scope {:?} does not enclose", region_scope))
    }

    /// Returns the topmost active scope, which is known to be alive until
    /// the next scope expression.
    fn topmost(&self) -> region::Scope {
        self.scopes.last().expect("topmost_scope: no scopes present").region_scope
    }
}

/// Used by [`Builder::in_scope`] to create source scopes mapping from MIR back to HIR at points
/// where lint levels change.
#[derive(Copy, Clone, Debug)]
pub(crate) enum LintLevel {
    Inherited,
    Explicit(HirId),
}

impl<'a, 'tcx> Builder<'a, 'tcx> {
    // Adding and removing scopes
    // ==========================

    ///  Start a breakable scope, which tracks where `continue`, `break` and
    ///  `return` should branch to.
    pub(crate) fn in_breakable_scope<F>(
        &mut self,
        loop_block: Option<BasicBlock>,
        break_destination: Place<'tcx>,
        span: Span,
        f: F,
    ) -> BlockAnd<()>
    where
        F: FnOnce(&mut Builder<'a, 'tcx>) -> Option<BlockAnd<()>>,
    {
        let region_scope = self.scopes.topmost();
        let scope = BreakableScope {
            region_scope,
            break_destination,
            break_drops: DropTree::new(),
            continue_drops: loop_block.map(|_| DropTree::new()),
        };
        self.scopes.breakable_scopes.push(scope);
        let normal_exit_block = f(self);
        let breakable_scope = self.scopes.breakable_scopes.pop().unwrap();
        assert!(breakable_scope.region_scope == region_scope);
        let break_block =
            self.build_exit_tree(breakable_scope.break_drops, region_scope, span, None);
        if let Some(drops) = breakable_scope.continue_drops {
            self.build_exit_tree(drops, region_scope, span, loop_block);
        }
        match (normal_exit_block, break_block) {
            (Some(block), None) | (None, Some(block)) => block,
            (None, None) => self.cfg.start_new_block().unit(),
            (Some(normal_block), Some(exit_block)) => {
                let target = self.cfg.start_new_block();
                let source_info = self.source_info(span);
                self.cfg.terminate(
                    normal_block.into_block(),
                    source_info,
                    TerminatorKind::Goto { target },
                );
                self.cfg.terminate(
                    exit_block.into_block(),
                    source_info,
                    TerminatorKind::Goto { target },
                );
                target.unit()
            }
        }
    }

    /// Start a const-continuable scope, which tracks where `#[const_continue] break` should
    /// branch to.
    pub(crate) fn in_const_continuable_scope<F>(
        &mut self,
        arms: Box<[ArmId]>,
        built_match_tree: BuiltMatchTree<'tcx>,
        state_place: Place<'tcx>,
        span: Span,
        f: F,
    ) -> BlockAnd<()>
    where
        F: FnOnce(&mut Builder<'a, 'tcx>) -> BlockAnd<()>,
    {
        let region_scope = self.scopes.topmost();
        let scope = ConstContinuableScope {
            region_scope,
            state_place,
            const_continue_drops: DropTree::new(),
            arms,
            built_match_tree,
        };
        self.scopes.const_continuable_scopes.push(scope);
        let normal_exit_block = f(self);
        let const_continue_scope = self.scopes.const_continuable_scopes.pop().unwrap();
        assert!(const_continue_scope.region_scope == region_scope);

        let break_block = self.build_exit_tree(
            const_continue_scope.const_continue_drops,
            region_scope,
            span,
            None,
        );

        match (normal_exit_block, break_block) {
            (block, None) => block,
            (normal_block, Some(exit_block)) => {
                let target = self.cfg.start_new_block();
                let source_info = self.source_info(span);
                self.cfg.terminate(
                    normal_block.into_block(),
                    source_info,
                    TerminatorKind::Goto { target },
                );
                self.cfg.terminate(
                    exit_block.into_block(),
                    source_info,
                    TerminatorKind::Goto { target },
                );
                target.unit()
            }
        }
    }

    /// Start an if-then scope which tracks drops for `if` expressions and `if`
    /// guards.
    ///
    /// For an if-let chain:
    ///
    /// if let Some(x) = a && let Some(y) = b && let Some(z) = c { ... }
    ///
    /// There are three possible ways the condition can be false and we may have
    /// to drop `x`, `x` and `y`, or neither depending on which binding fails.
    /// To handle this correctly we use a `DropTree` in a similar way to a
    /// `loop` expression and 'break' out on all of the 'else' paths.
    ///
    /// Notes:
    /// - We don't need to keep a stack of scopes in the `Builder` because the
    ///   'else' paths will only leave the innermost scope.
    /// - This is also used for match guards.
    pub(crate) fn in_if_then_scope<F>(
        &mut self,
        region_scope: region::Scope,
        span: Span,
        f: F,
    ) -> (BasicBlock, BasicBlock)
    where
        F: FnOnce(&mut Builder<'a, 'tcx>) -> BlockAnd<()>,
    {
        let scope = IfThenScope { region_scope, else_drops: DropTree::new() };
        let previous_scope = mem::replace(&mut self.scopes.if_then_scope, Some(scope));

        let then_block = f(self).into_block();

        let if_then_scope = mem::replace(&mut self.scopes.if_then_scope, previous_scope).unwrap();
        assert!(if_then_scope.region_scope == region_scope);

        let else_block =
            self.build_exit_tree(if_then_scope.else_drops, region_scope, span, None).map_or_else(
                || self.cfg.start_new_block(),
                |else_block_and| else_block_and.into_block(),
            );

        (then_block, else_block)
    }

    /// Convenience wrapper that pushes a scope and then executes `f`
    /// to build its contents, popping the scope afterwards.
    #[instrument(skip(self, f), level = "debug")]
    pub(crate) fn in_scope<F, R>(
        &mut self,
        region_scope: (region::Scope, SourceInfo),
        lint_level: LintLevel,
        f: F,
    ) -> BlockAnd<R>
    where
        F: FnOnce(&mut Builder<'a, 'tcx>) -> BlockAnd<R>,
    {
        let source_scope = self.source_scope;
        if let LintLevel::Explicit(current_hir_id) = lint_level {
            let parent_id =
                self.source_scopes[source_scope].local_data.as_ref().unwrap_crate_local().lint_root;
            self.maybe_new_source_scope(region_scope.1.span, current_hir_id, parent_id);
        }
        self.push_scope(region_scope);
        let mut block;
        let rv = unpack!(block = f(self));
        block = self.pop_scope(region_scope, block).into_block();
        self.source_scope = source_scope;
        debug!(?block);
        block.and(rv)
    }

    /// Convenience wrapper that executes `f` either within the current scope or a new scope.
    /// Used for pattern matching, which introduces an additional scope for patterns with guards.
    pub(crate) fn opt_in_scope<R>(
        &mut self,
        opt_region_scope: Option<(region::Scope, SourceInfo)>,
        f: impl FnOnce(&mut Builder<'a, 'tcx>) -> BlockAnd<R>,
    ) -> BlockAnd<R> {
        if let Some(region_scope) = opt_region_scope {
            self.in_scope(region_scope, LintLevel::Inherited, f)
        } else {
            f(self)
        }
    }

    /// Push a scope onto the stack. You can then build code in this
    /// scope and call `pop_scope` afterwards. Note that these two
    /// calls must be paired; using `in_scope` as a convenience
    /// wrapper may be preferable.
    pub(crate) fn push_scope(&mut self, region_scope: (region::Scope, SourceInfo)) {
        self.scopes.push_scope(region_scope, self.source_scope);
    }

    /// Pops a scope, which should have region scope `region_scope`,
    /// adding any drops onto the end of `block` that are needed.
    /// This must match 1-to-1 with `push_scope`.
    pub(crate) fn pop_scope(
        &mut self,
        region_scope: (region::Scope, SourceInfo),
        mut block: BasicBlock,
    ) -> BlockAnd<()> {
        debug!("pop_scope({:?}, {:?})", region_scope, block);

        block = self.leave_top_scope(block);

        self.scopes.pop_scope(region_scope);

        block.unit()
    }

747    /// Sets up the drops for breaking from `block` to `target`.
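    ///
    /// For example (illustrative sketch; `done` is a placeholder): in
    /// ```ignore (illustrative)
    /// 'outer: loop {
    ///     let s = String::new();
    ///     if done() { break 'outer; } // the drop of `s` is recorded for this exit
    /// }
    /// ```
    /// the `break` records the drops scheduled so far (here `s`) in the breakable
    /// scope's drop tree and continues building from a fresh block.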
748    pub(crate) fn break_scope(
749        &mut self,
750        mut block: BasicBlock,
751        value: Option<ExprId>,
752        target: BreakableTarget,
753        source_info: SourceInfo,
754    ) -> BlockAnd<()> {
755        let span = source_info.span;
756
757        let get_scope_index = |scope: region::Scope| {
758            // find the loop-scope by its `region::Scope`.
759            self.scopes
760                .breakable_scopes
761                .iter()
762                .rposition(|breakable_scope| breakable_scope.region_scope == scope)
763                .unwrap_or_else(|| span_bug!(span, "no enclosing breakable scope found"))
764        };
765        let (break_index, destination) = match target {
766            BreakableTarget::Return => {
767                let scope = &self.scopes.breakable_scopes[0];
768                if scope.break_destination != Place::return_place() {
769                    span_bug!(span, "`return` in item with no return scope");
770                }
771                (0, Some(scope.break_destination))
772            }
773            BreakableTarget::Break(scope) => {
774                let break_index = get_scope_index(scope);
775                let scope = &self.scopes.breakable_scopes[break_index];
776                (break_index, Some(scope.break_destination))
777            }
778            BreakableTarget::Continue(scope) => {
779                let break_index = get_scope_index(scope);
780                (break_index, None)
781            }
782        };
783
784        match (destination, value) {
785            (Some(destination), Some(value)) => {
786                debug!("stmt_expr Break val block_context.push(SubExpr)");
787                self.block_context.push(BlockFrame::SubExpr);
788                block = self.expr_into_dest(destination, block, value).into_block();
789                self.block_context.pop();
790            }
791            (Some(destination), None) => {
792                self.cfg.push_assign_unit(block, source_info, destination, self.tcx)
793            }
794            (None, Some(_)) => {
795                panic!("`return`, `become` and `break` with value and must have a destination")
796            }
797            (None, None) => {
798                if self.tcx.sess.instrument_coverage() {
799                    // Normally we wouldn't build any MIR in this case, but that makes it
800                    // harder for coverage instrumentation to extract a relevant span for
801                    // `continue` expressions. So here we inject a dummy statement with the
802                    // desired span.
803                    self.cfg.push_coverage_span_marker(block, source_info);
804                }
805            }
806        }
807
808        let region_scope = self.scopes.breakable_scopes[break_index].region_scope;
809        let scope_index = self.scopes.scope_index(region_scope, span);
810        let drops = if destination.is_some() {
811            &mut self.scopes.breakable_scopes[break_index].break_drops
812        } else {
813            let Some(drops) = self.scopes.breakable_scopes[break_index].continue_drops.as_mut()
814            else {
815                self.tcx.dcx().span_delayed_bug(
816                    source_info.span,
817                    "unlabelled `continue` within labelled block",
818                );
819                self.cfg.terminate(block, source_info, TerminatorKind::Unreachable);
820
821                return self.cfg.start_new_block().unit();
822            };
823            drops
824        };
825
826        let mut drop_idx = ROOT_NODE;
827        for scope in &self.scopes.scopes[scope_index + 1..] {
828            for drop in &scope.drops {
829                drop_idx = drops.add_drop(*drop, drop_idx);
830            }
831        }
832        drops.add_entry_point(block, drop_idx);
833
834        // `build_drop_trees` doesn't have access to our source_info, so we
835        // create a dummy terminator now. `TerminatorKind::UnwindResume` is used
836        // because MIR type checking will panic if it hasn't been overwritten.
837        // (See `<ExitScopes as DropTreeBuilder>::link_entry_point`.)
838        self.cfg.terminate(block, source_info, TerminatorKind::UnwindResume);
839
840        self.cfg.start_new_block().unit()
841    }
842
843    /// Based on `FunctionCx::eval_unevaluated_mir_constant_to_valtree`.
844    fn eval_unevaluated_mir_constant_to_valtree(
845        &self,
846        constant: ConstOperand<'tcx>,
847    ) -> Result<(ty::ValTree<'tcx>, Ty<'tcx>), interpret::ErrorHandled> {
848        assert!(!constant.const_.ty().has_param());
849        let (uv, ty) = match constant.const_ {
850            mir::Const::Unevaluated(uv, ty) => (uv.shrink(), ty),
851            mir::Const::Ty(_, c) => match c.kind() {
852                // A constant that came from a const generic but was then used as an argument to
853                // old-style simd_shuffle (passing as argument instead of as a generic param).
854                ty::ConstKind::Value(cv) => return Ok((cv.valtree, cv.ty)),
855                other => span_bug!(constant.span, "{other:#?}"),
856            },
857            mir::Const::Val(mir::ConstValue::Scalar(mir::interpret::Scalar::Int(val)), ty) => {
858                return Ok((ValTree::from_scalar_int(self.tcx, val), ty));
859            }
860            // We should never encounter `Const::Val` unless MIR opts (like const prop) evaluate
861            // a constant and write that value back into `Operand`s. This could happen, but is
862            // unlikely. Also: all users of `simd_shuffle` are on unstable and already need to take
863            // a lot of care around intrinsics. For an issue to happen here, it would require a
864            // macro expanding to a `simd_shuffle` call without wrapping the constant argument in a
865            // `const {}` block, but with the user passing through arbitrary expressions.
866
867            // FIXME(oli-obk): Replace the magic const generic argument of `simd_shuffle` with a
868            // real const generic, and get rid of this entire function.
869            other => span_bug!(constant.span, "{other:#?}"),
870        };
871
872        match self.tcx.const_eval_resolve_for_typeck(self.typing_env(), uv, constant.span) {
873            Ok(Ok(valtree)) => Ok((valtree, ty)),
874            Ok(Err(ty)) => span_bug!(constant.span, "could not convert {ty:?} to a valtree"),
875            Err(e) => Err(e),
876        }
877    }
878
879    /// Sets up the drops for jumping from `block` to `scope`.
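    ///
    /// Illustrative sketch of the surface syntax this lowers (unstable `loop_match`
    /// feature; the exact syntax may differ):
    /// ```ignore (illustrative)
    /// #[loop_match]
    /// loop {
    ///     state = 'blk: {
    ///         match state {
    ///             State::A => {
    ///                 #[const_continue]
    ///                 break 'blk State::B // jumps directly to the `State::B` arm
    ///             }
    ///             State::B => break,
    ///         }
    ///     }
    /// }
    /// ```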
880    pub(crate) fn break_const_continuable_scope(
881        &mut self,
882        mut block: BasicBlock,
883        value: ExprId,
884        scope: region::Scope,
885        source_info: SourceInfo,
886    ) -> BlockAnd<()> {
887        let span = source_info.span;
888
889        // A break can only break out of a scope, so the value should be a scope.
890        let rustc_middle::thir::ExprKind::Scope { value, .. } = self.thir[value].kind else {
891            span_bug!(span, "break value must be a scope")
892        };
893
894        let expr = &self.thir[value];
895        let constant = match &expr.kind {
896            ExprKind::Adt(box AdtExpr { variant_index, fields, base, .. }) => {
897                assert!(matches!(base, AdtExprBase::None));
898                assert!(fields.is_empty());
899                ConstOperand {
900                    span: self.thir[value].span,
901                    user_ty: None,
902                    const_: Const::Ty(
903                        self.thir[value].ty,
904                        ty::Const::new_value(
905                            self.tcx,
906                            ValTree::from_branches(
907                                self.tcx,
908                                [ty::Const::new_value(
909                                    self.tcx,
910                                    ValTree::from_scalar_int(
911                                        self.tcx,
912                                        variant_index.as_u32().into(),
913                                    ),
914                                    self.tcx.types.u32,
915                                )],
916                            ),
917                            self.thir[value].ty,
918                        ),
919                    ),
920                }
921            }
922
923            ExprKind::Literal { .. }
924            | ExprKind::NonHirLiteral { .. }
925            | ExprKind::ZstLiteral { .. }
926            | ExprKind::NamedConst { .. } => self.as_constant(&self.thir[value]),
927
928            other => {
929                use crate::errors::ConstContinueNotMonomorphicConstReason as Reason;
930
931                let span = expr.span;
932                let reason = match other {
933                    ExprKind::ConstParam { .. } => Reason::ConstantParameter { span },
934                    ExprKind::ConstBlock { .. } => Reason::ConstBlock { span },
935                    _ => Reason::Other { span },
936                };
937
938                self.tcx
939                    .dcx()
940                    .emit_err(ConstContinueNotMonomorphicConst { span: expr.span, reason });
941                return block.unit();
942            }
943        };
944
945        let break_index = self
946            .scopes
947            .const_continuable_scopes
948            .iter()
949            .rposition(|const_continuable_scope| const_continuable_scope.region_scope == scope)
950            .unwrap_or_else(|| span_bug!(span, "no enclosing const-continuable scope found"));
951
952        let scope = &self.scopes.const_continuable_scopes[break_index];
953
954        let state_decl = &self.local_decls[scope.state_place.as_local().unwrap()];
955        let state_ty = state_decl.ty;
956        let (discriminant_ty, rvalue) = match state_ty.kind() {
957            ty::Adt(adt_def, _) if adt_def.is_enum() => {
958                (state_ty.discriminant_ty(self.tcx), Rvalue::Discriminant(scope.state_place))
959            }
960            ty::Uint(_) | ty::Int(_) | ty::Float(_) | ty::Bool | ty::Char => {
961                (state_ty, Rvalue::Use(Operand::Copy(scope.state_place)))
962            }
963            _ => span_bug!(state_decl.source_info.span, "unsupported #[loop_match] state"),
964        };
965
966        // The `PatCtxt` is normally used in pattern exhaustiveness checking, but reused
967        // here because it performs normalization and const evaluation.
968        let dropless_arena = rustc_arena::DroplessArena::default();
969        let typeck_results = self.tcx.typeck(self.def_id);
970        let cx = RustcPatCtxt {
971            tcx: self.tcx,
972            typeck_results,
973            module: self.tcx.parent_module(self.hir_id).to_def_id(),
974            // FIXME(#132279): We're in a body, should handle opaques.
975            typing_env: rustc_middle::ty::TypingEnv::non_body_analysis(self.tcx, self.def_id),
976            dropless_arena: &dropless_arena,
977            match_lint_level: self.hir_id,
978            whole_match_span: Some(rustc_span::Span::default()),
979            scrut_span: rustc_span::Span::default(),
980            refutable: true,
981            known_valid_scrutinee: true,
982            internal_state: Default::default(),
983        };
984
985        let valtree = match self.eval_unevaluated_mir_constant_to_valtree(constant) {
986            Ok((valtree, ty)) => {
987                // Defensively check that the type is monomorphic.
988                assert!(!ty.has_param());
989
990                valtree
991            }
992            Err(ErrorHandled::Reported(..)) => {
993                return block.unit();
994            }
995            Err(ErrorHandled::TooGeneric(_)) => {
996                self.tcx.dcx().emit_fatal(ConstContinueBadConst { span: constant.span });
997            }
998        };
999
1000        let Some(real_target) =
1001            self.static_pattern_match(&cx, valtree, &*scope.arms, &scope.built_match_tree)
1002        else {
1003            self.tcx.dcx().emit_fatal(ConstContinueUnknownJumpTarget { span })
1004        };
1005
1006        self.block_context.push(BlockFrame::SubExpr);
1007        let state_place = scope.state_place;
1008        block = self.expr_into_dest(state_place, block, value).into_block();
1009        self.block_context.pop();
1010
1011        let discr = self.temp(discriminant_ty, source_info.span);
1012        let scope_index = self
1013            .scopes
1014            .scope_index(self.scopes.const_continuable_scopes[break_index].region_scope, span);
1015        let scope = &mut self.scopes.const_continuable_scopes[break_index];
1016        self.cfg.push_assign(block, source_info, discr, rvalue);
1017        let drop_and_continue_block = self.cfg.start_new_block();
1018        let imaginary_target = self.cfg.start_new_block();
1019        self.cfg.terminate(
1020            block,
1021            source_info,
1022            TerminatorKind::FalseEdge { real_target: drop_and_continue_block, imaginary_target },
1023        );
1024
1025        let drops = &mut scope.const_continue_drops;
1026
1027        let drop_idx = self.scopes.scopes[scope_index + 1..]
1028            .iter()
1029            .flat_map(|scope| &scope.drops)
1030            .fold(ROOT_NODE, |drop_idx, &drop| drops.add_drop(drop, drop_idx));
1031
1032        drops.add_entry_point(imaginary_target, drop_idx);
1033
1034        self.cfg.terminate(imaginary_target, source_info, TerminatorKind::UnwindResume);
1035
1036        let region_scope = scope.region_scope;
1037        let scope_index = self.scopes.scope_index(region_scope, span);
1038        let mut drops = DropTree::new();
1039
1040        let drop_idx = self.scopes.scopes[scope_index + 1..]
1041            .iter()
1042            .flat_map(|scope| &scope.drops)
1043            .fold(ROOT_NODE, |drop_idx, &drop| drops.add_drop(drop, drop_idx));
1044
1045        drops.add_entry_point(drop_and_continue_block, drop_idx);
1046
1047        // `build_drop_trees` doesn't have access to our source_info, so we
1048        // create a dummy terminator now. `TerminatorKind::UnwindResume` is used
1049        // because MIR type checking will panic if it hasn't been overwritten.
1050        // (See `<ExitScopes as DropTreeBuilder>::link_entry_point`.)
1051        self.cfg.terminate(drop_and_continue_block, source_info, TerminatorKind::UnwindResume);
1052
1053        self.build_exit_tree(drops, region_scope, span, Some(real_target));
1054
1055        return self.cfg.start_new_block().unit();
1056    }
1057
1058    /// Sets up the drops for breaking from `block` due to an `if` condition
1059    /// that turned out to be false.
1060    ///
1061    /// Must be called in the context of [`Builder::in_if_then_scope`], so that
1062    /// there is an if-then scope to tell us what the target scope is.
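    ///
    /// For example (illustrative; `produce`, `use_it` and `fallback` are placeholders):
    /// ```ignore (illustrative)
    /// if let Some(x) = produce() { use_it(x) } else { fallback() }
    /// ```
    /// When the pattern fails to match, this method is called so that values already
    /// scheduled for drop in the if-then scope are dropped on the edge to the `else` block.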
1063    pub(crate) fn break_for_else(&mut self, block: BasicBlock, source_info: SourceInfo) {
1064        let if_then_scope = self
1065            .scopes
1066            .if_then_scope
1067            .as_ref()
1068            .unwrap_or_else(|| span_bug!(source_info.span, "no if-then scope found"));
1069
1070        let target = if_then_scope.region_scope;
1071        let scope_index = self.scopes.scope_index(target, source_info.span);
1072
1073        // Upgrade `if_then_scope` to `&mut`.
1074        let if_then_scope = self.scopes.if_then_scope.as_mut().expect("upgrading & to &mut");
1075
1076        let mut drop_idx = ROOT_NODE;
1077        let drops = &mut if_then_scope.else_drops;
1078        for scope in &self.scopes.scopes[scope_index + 1..] {
1079            for drop in &scope.drops {
1080                drop_idx = drops.add_drop(*drop, drop_idx);
1081            }
1082        }
1083        drops.add_entry_point(block, drop_idx);
1084
1085        // `build_drop_trees` doesn't have access to our source_info, so we
1086        // create a dummy terminator now. `TerminatorKind::UnwindResume` is used
1087        // because MIR type checking will panic if it hasn't been overwritten.
1088        // (See `<ExitScopes as DropTreeBuilder>::link_entry_point`.)
1089        self.cfg.terminate(block, source_info, TerminatorKind::UnwindResume);
1090    }
1091
1092    /// Sets up the drops for explicit tail calls.
1093    ///
1094    /// Unlike other kinds of early exits, tail calls do not go through the drop tree.
1095    /// Instead, all scheduled drops are immediately added to the CFG.
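    ///
    /// For example (illustrative sketch of the unstable `explicit_tail_calls` feature;
    /// `g` is a placeholder):
    /// ```ignore (illustrative)
    /// fn f(v: Vec<u8>) -> usize {
    ///     let tmp = String::from("x");
    ///     become g(v) // `tmp` is dropped here, before the tail call; `v` is moved into `g`
    /// }
    /// ```
    /// Only the moved arguments (`v`) survive until the call; their drops are registered
    /// on the unwind path in case one of the earlier drops panics.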
1096    pub(crate) fn break_for_tail_call(
1097        &mut self,
1098        mut block: BasicBlock,
1099        args: &[Spanned<Operand<'tcx>>],
1100        source_info: SourceInfo,
1101    ) -> BlockAnd<()> {
1102        let arg_drops: Vec<_> = args
1103            .iter()
1104            .rev()
1105            .filter_map(|arg| match &arg.node {
1106                Operand::Copy(_) => bug!("copy op in tail call args"),
1107                Operand::Move(place) => {
1108                    let local =
1109                        place.as_local().unwrap_or_else(|| bug!("projection in tail call args"));
1110
1111                    if !self.local_decls[local].ty.needs_drop(self.tcx, self.typing_env()) {
1112                        return None;
1113                    }
1114
1115                    Some(DropData { source_info, local, kind: DropKind::Value })
1116                }
1117                Operand::Constant(_) | Operand::RuntimeChecks(_) => None,
1118            })
1119            .collect();
1120
1121        let mut unwind_to = self.diverge_cleanup_target(
1122            self.scopes.scopes.iter().rev().nth(1).unwrap().region_scope,
1123            DUMMY_SP,
1124        );
1125        let typing_env = self.typing_env();
1126        let unwind_drops = &mut self.scopes.unwind_drops;
1127
1128        // The innermost scope contains only the destructors for the tail call arguments;
1129        // we only want to drop these in case of a panic, so we skip it here.
1130        for scope in self.scopes.scopes[1..].iter().rev().skip(1) {
1131            // FIXME(explicit_tail_calls) code duplication with `build_scope_drops`
1132            for drop_data in scope.drops.iter().rev() {
1133                let source_info = drop_data.source_info;
1134                let local = drop_data.local;
1135
1136                if !self.local_decls[local].ty.needs_drop(self.tcx, typing_env) {
1137                    continue;
1138                }
1139
1140                match drop_data.kind {
1141                    DropKind::Value => {
1142                        // `unwind_to` should drop the value that we're about to
1143                        // schedule. If dropping this value panics, then we continue
1144                        // with the *next* value on the unwind path.
1145                        debug_assert_eq!(
1146                            unwind_drops.drop_nodes[unwind_to].data.local,
1147                            drop_data.local
1148                        );
1149                        debug_assert_eq!(
1150                            unwind_drops.drop_nodes[unwind_to].data.kind,
1151                            drop_data.kind
1152                        );
1153                        unwind_to = unwind_drops.drop_nodes[unwind_to].next;
1154
1155                        let mut unwind_entry_point = unwind_to;
1156
1157                        // the tail call arguments must be dropped if any of these drops panic
1158                        for drop in arg_drops.iter().copied() {
1159                            unwind_entry_point = unwind_drops.add_drop(drop, unwind_entry_point);
1160                        }
1161
1162                        unwind_drops.add_entry_point(block, unwind_entry_point);
1163
1164                        let next = self.cfg.start_new_block();
1165                        self.cfg.terminate(
1166                            block,
1167                            source_info,
1168                            TerminatorKind::Drop {
1169                                place: local.into(),
1170                                target: next,
1171                                unwind: UnwindAction::Continue,
1172                                replace: false,
1173                                drop: None,
1174                                async_fut: None,
1175                            },
1176                        );
1177                        block = next;
1178                    }
1179                    DropKind::ForLint => {
1180                        self.cfg.push(
1181                            block,
1182                            Statement::new(
1183                                source_info,
1184                                StatementKind::BackwardIncompatibleDropHint {
1185                                    place: Box::new(local.into()),
1186                                    reason: BackwardIncompatibleDropReason::Edition2024,
1187                                },
1188                            ),
1189                        );
1190                    }
1191                    DropKind::Storage => {
1192                        // Only temps and vars need their storage dead.
1193                        assert!(local.index() > self.arg_count);
1194                        self.cfg.push(
1195                            block,
1196                            Statement::new(source_info, StatementKind::StorageDead(local)),
1197                        );
1198                    }
1199                }
1200            }
1201        }
1202
1203        block.unit()
1204    }
1205
1206    fn is_async_drop_impl(
1207        tcx: TyCtxt<'tcx>,
1208        local_decls: &IndexVec<Local, LocalDecl<'tcx>>,
1209        typing_env: ty::TypingEnv<'tcx>,
1210        local: Local,
1211    ) -> bool {
1212        let ty = local_decls[local].ty;
1213        if ty.is_async_drop(tcx, typing_env) || ty.is_coroutine() {
1214            return true;
1215        }
1216        ty.needs_async_drop(tcx, typing_env)
1217    }
1218    fn is_async_drop(&self, local: Local) -> bool {
1219        Self::is_async_drop_impl(self.tcx, &self.local_decls, self.typing_env(), local)
1220    }
1221
1222    fn leave_top_scope(&mut self, block: BasicBlock) -> BasicBlock {
1223        // If we are emitting a `drop` statement, we need to have the cached
1224        // diverge cleanup pads ready in case that drop panics.
1225        let needs_cleanup = self.scopes.scopes.last().is_some_and(|scope| scope.needs_cleanup());
1226        let is_coroutine = self.coroutine.is_some();
1227        let unwind_to = if needs_cleanup { self.diverge_cleanup() } else { DropIdx::MAX };
1228
1229        let scope = self.scopes.scopes.last().expect("leave_top_scope called with no scopes");
1230        let has_async_drops = is_coroutine
1231            && scope.drops.iter().any(|v| v.kind == DropKind::Value && self.is_async_drop(v.local));
1232        let dropline_to = if has_async_drops { Some(self.diverge_dropline()) } else { None };
1233        let scope = self.scopes.scopes.last().expect("leave_top_scope called with no scopes");
1234        let typing_env = self.typing_env();
1235        build_scope_drops(
1236            &mut self.cfg,
1237            &mut self.scopes.unwind_drops,
1238            &mut self.scopes.coroutine_drops,
1239            scope,
1240            block,
1241            unwind_to,
1242            dropline_to,
1243            is_coroutine && needs_cleanup,
1244            self.arg_count,
1245            |v: Local| Self::is_async_drop_impl(self.tcx, &self.local_decls, typing_env, v),
1246        )
1247        .into_block()
1248    }
1249
1250    /// Possibly creates a new source scope if `current_root` and `parent_root`
1251    /// are different, or if -Zmaximal-hir-to-mir-coverage is enabled.
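    ///
    /// For example (illustrative; `compute` is a placeholder): entering a node with an
    /// explicit lint attribute, such as
    /// ```ignore (illustrative)
    /// #[allow(unused_variables)]
    /// let x = compute();
    /// ```
    /// changes the lint root, so a new source scope is created to record the new lint level.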
1252    pub(crate) fn maybe_new_source_scope(
1253        &mut self,
1254        span: Span,
1255        current_id: HirId,
1256        parent_id: HirId,
1257    ) {
1258        let (current_root, parent_root) =
1259            if self.tcx.sess.opts.unstable_opts.maximal_hir_to_mir_coverage {
1260                // Some consumers of rustc need to map MIR locations back to HIR nodes. Currently
1261                // the only part of rustc that tracks MIR -> HIR is the
1262                // `SourceScopeLocalData::lint_root` field that tracks lint levels for MIR
1263                // locations. Normally the number of source scopes is limited to the set of nodes
1264                // with lint annotations. The -Zmaximal-hir-to-mir-coverage flag changes this
1265                // behavior to maximize the number of source scopes, increasing the granularity of
1266                // the MIR->HIR mapping.
1267                (current_id, parent_id)
1268            } else {
1269                // Use `maybe_lint_level_root_bounded` to avoid adding Hir dependencies on our
1270                // parents. We estimate the true lint roots here to avoid creating a lot of source
1271                // scopes.
1272                (
1273                    self.maybe_lint_level_root_bounded(current_id),
1274                    if parent_id == self.hir_id {
1275                        parent_id // this is very common
1276                    } else {
1277                        self.maybe_lint_level_root_bounded(parent_id)
1278                    },
1279                )
1280            };
1281
1282        if current_root != parent_root {
1283            let lint_level = LintLevel::Explicit(current_root);
1284            self.source_scope = self.new_source_scope(span, lint_level);
1285        }
1286    }
1287
1288    /// Walks upwards from `orig_id` to find a node which might change lint levels with attributes.
1289    /// It stops at `self.hir_id` and just returns it if reached.
1290    fn maybe_lint_level_root_bounded(&mut self, orig_id: HirId) -> HirId {
1291        // This assertion lets us just store `ItemLocalId` in the cache, rather
1292        // than the full `HirId`.
1293        assert_eq!(orig_id.owner, self.hir_id.owner);
1294
1295        let mut id = orig_id;
1296        loop {
1297            if id == self.hir_id {
1298                // This is a moderately common case, mostly hit for previously unseen nodes.
1299                break;
1300            }
1301
1302            if self.tcx.hir_attrs(id).iter().any(|attr| Level::from_attr(attr).is_some()) {
1303                // This is a rare case. It's for a node path that doesn't reach the root due to an
1304                // intervening lint level attribute. This result doesn't get cached.
1305                return id;
1306            }
1307
1308            let next = self.tcx.parent_hir_id(id);
1309            if next == id {
1310                bug!("lint traversal reached the root of the crate");
1311            }
1312            id = next;
1313
1314            // This lookup is just an optimization; it can be removed without affecting
1315            // functionality. It might seem strange to see this at the end of this loop, but the
1316            // `orig_id` passed in to this function is almost always previously unseen, for which a
1317            // lookup will be a miss. So we only do lookups for nodes up the parent chain, where
1318            // cache lookups have a very high hit rate.
1319            if self.lint_level_roots_cache.contains(id.local_id) {
1320                break;
1321            }
1322        }
1323
1324        // `orig_id` traced to `self.hir_id`; record this fact. If `orig_id` is a leaf node it will
1325        // rarely (never?) subsequently be searched for, but it's hard to know if that is the case.
1326        // The performance wins from the cache all come from caching non-leaf nodes.
1327        self.lint_level_roots_cache.insert(orig_id.local_id);
1328        self.hir_id
1329    }
1330
1331    /// Creates a new source scope, nested in the current one.
1332    pub(crate) fn new_source_scope(&mut self, span: Span, lint_level: LintLevel) -> SourceScope {
1333        let parent = self.source_scope;
1334        debug!(
1335            "new_source_scope({:?}, {:?}) - parent({:?})={:?}",
1336            span,
1337            lint_level,
1338            parent,
1339            self.source_scopes.get(parent)
1340        );
1341        let scope_local_data = SourceScopeLocalData {
1342            lint_root: if let LintLevel::Explicit(lint_root) = lint_level {
1343                lint_root
1344            } else {
1345                self.source_scopes[parent].local_data.as_ref().unwrap_crate_local().lint_root
1346            },
1347        };
1348        self.source_scopes.push(SourceScopeData {
1349            span,
1350            parent_scope: Some(parent),
1351            inlined: None,
1352            inlined_parent_scope: None,
1353            local_data: ClearCrossCrate::Set(scope_local_data),
1354        })
1355    }
1356
1357    /// Given a span and the current source scope, make a SourceInfo.
1358    pub(crate) fn source_info(&self, span: Span) -> SourceInfo {
1359        SourceInfo { span, scope: self.source_scope }
1360    }
1361
1362    // Finding scopes
1363    // ==============
1364
1365    /// Returns the scope that we should use as the lifetime of an
1366    /// operand. Basically, an operand must live until it is consumed.
1367    /// This is similar to, but not quite the same as, the temporary
1368    /// scope (which can be larger or smaller).
1369    ///
1370    /// Consider:
1371    /// ```ignore (illustrative)
1372    /// let x = foo(bar(X, Y));
1373    /// ```
1374    /// We wish to pop the storage for X and Y after `bar()` is
1375    /// called, not after the whole `let` is completed.
1376    ///
1377    /// As another example, if the second argument diverges:
1378    /// ```ignore (illustrative)
1379    /// foo(Box::new(2), panic!())
1380    /// ```
1381    /// We would allocate the box but then free it on the unwinding
1382    /// path; we would also emit a free on the 'success' path from
1383    /// panic, but that will turn out to be removed as dead-code.
1384    pub(crate) fn local_scope(&self) -> region::Scope {
1385        self.scopes.topmost()
1386    }
1387
1388    // Scheduling drops
1389    // ================
1390
1391    pub(crate) fn schedule_drop_storage_and_value(
1392        &mut self,
1393        span: Span,
1394        region_scope: region::Scope,
1395        local: Local,
1396    ) {
1397        self.schedule_drop(span, region_scope, local, DropKind::Storage);
1398        self.schedule_drop(span, region_scope, local, DropKind::Value);
1399    }
1400
1401    /// Indicates that `local` should be dropped on exit from `region_scope`.
1402    ///
1403    /// When called with `DropKind::Storage`, `local` shouldn't be the return
1404    /// place or a function parameter.
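    ///
    /// For example (illustrative sketch; `make_string`, `block_scope` and `s_local` are
    /// placeholder names): lowering `let s = make_string();` inside a block typically
    /// schedules both kinds of drop for `s`'s local, roughly:
    /// ```ignore (illustrative)
    /// this.schedule_drop(span, block_scope, s_local, DropKind::Storage); // StorageDead on exit
    /// this.schedule_drop(span, block_scope, s_local, DropKind::Value);   // run the destructor
    /// ```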
1405    pub(crate) fn schedule_drop(
1406        &mut self,
1407        span: Span,
1408        region_scope: region::Scope,
1409        local: Local,
1410        drop_kind: DropKind,
1411    ) {
1412        let needs_drop = match drop_kind {
1413            DropKind::Value | DropKind::ForLint => {
1414                if !self.local_decls[local].ty.needs_drop(self.tcx, self.typing_env()) {
1415                    return;
1416                }
1417                true
1418            }
1419            DropKind::Storage => {
1420                if local.index() <= self.arg_count {
1421                    span_bug!(
1422                        span,
1423                        "`schedule_drop` called with body argument {:?} \
1424                        but its storage does not require a drop",
1425                        local,
1426                    )
1427                }
1428                false
1429            }
1430        };
1431
1432        // When building drops, we try to cache chains of drops to reduce the
1433        // number of `DropTree::add_drop` calls. This, however, means that
1434        // whenever we add a drop into a scope which already had some entries
1435        // in the drop tree built (and thus, cached) for it, we must invalidate
1436        // all caches which might branch into the scope which had a drop just
1437        // added to it. This is necessary, because otherwise some other code
1438        // might use the cache to branch into already built chain of drops,
1439        // essentially ignoring the newly added drop.
1440        //
1441        // For example consider there’s two scopes with a drop in each. These
1442        // are built and thus the caches are filled:
1443        //
1444        // +--------------------------------------------------------+
1445        // | +---------------------------------+                    |
1446        // | | +--------+     +-------------+  |  +---------------+ |
1447        // | | | return | <-+ | drop(outer) | <-+ |  drop(middle) | |
1448        // | | +--------+     +-------------+  |  +---------------+ |
1449        // | +------------|outer_scope cache|--+                    |
1450        // +------------------------------|middle_scope cache|------+
1451        //
1452        // Now, a new, innermost scope is added along with a new drop into
1453        // both innermost and outermost scopes:
1454        //
1455        // +------------------------------------------------------------+
1456        // | +----------------------------------+                       |
1457        // | | +--------+      +-------------+  |   +---------------+   | +-------------+
1458        // | | | return | <+   | drop(new)   | <-+  |  drop(middle) | <--+| drop(inner) |
1459        // | | +--------+  |   | drop(outer) |  |   +---------------+   | +-------------+
1460        // | |             +-+ +-------------+  |                       |
1461        // | +---|invalid outer_scope cache|----+                       |
1462        // +----=----------------|invalid middle_scope cache|-----------+
1463        //
1464        // If, when adding `drop(new)` we do not invalidate the cached blocks for both
1465        // outer_scope and middle_scope, then, when building drops for the inner (rightmost)
1466        // scope, the old, cached blocks, without `drop(new)` will get used, producing the
1467        // wrong results.
1468        //
1469        // Note that this code iterates scopes from the innermost to the outermost,
1470        // invalidating caches of each scope visited. This way bare minimum of the
1471        // caches gets invalidated. i.e., if a new drop is added into the middle scope, the
1472        // cache of outer scope stays intact.
1473        //
1474        // Since we only cache drops for the unwind path and the coroutine drop
1475        // path, we only need to invalidate the cache for drops that happen on
1476        // the unwind or coroutine drop paths. This means that for
1477        // non-coroutines we don't need to invalidate caches for `DropKind::Storage`.
1478        let invalidate_caches = needs_drop || self.coroutine.is_some();
1479        for scope in self.scopes.scopes.iter_mut().rev() {
1480            if invalidate_caches {
1481                scope.invalidate_cache();
1482            }
1483
1484            if scope.region_scope == region_scope {
1485                let region_scope_span = region_scope.span(self.tcx, self.region_scope_tree);
1486                // Attribute scope exit drops to scope's closing brace.
1487                let scope_end = self.tcx.sess.source_map().end_point(region_scope_span);
1488
1489                scope.drops.push(DropData {
1490                    source_info: SourceInfo { span: scope_end, scope: scope.source_scope },
1491                    local,
1492                    kind: drop_kind,
1493                });
1494
1495                return;
1496            }
1497        }
1498
1499        span_bug!(span, "region scope {:?} not in scope to drop {:?}", region_scope, local);
1500    }
1501
1502    /// Schedule emission of a backwards incompatible drop lint hint.
1503    /// Applicable only to temporary values for now.
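    ///
    /// For example (illustrative; `mutex` is a placeholder): under Edition 2024 rules the
    /// scrutinee temporary in
    /// ```ignore (illustrative)
    /// if let Ok(guard) = mutex.lock() { /* ... */ }
    /// ```
    /// may be dropped at a different point than in earlier editions; the scheduled
    /// `ForLint` hint lets the lint point at the place whose drop timing changed.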
1504    #[instrument(level = "debug", skip(self))]
1505    pub(crate) fn schedule_backwards_incompatible_drop(
1506        &mut self,
1507        span: Span,
1508        region_scope: region::Scope,
1509        local: Local,
1510    ) {
1511        // Note that we are *not* gating BIDs here on whether they have a significant destructor.
1512        // We need to know all of them so that we can capture potential borrow-checking errors.
1513        for scope in self.scopes.scopes.iter_mut().rev() {
1514            // Since we are inserting a linting MIR statement, we have to invalidate the caches.
1515            scope.invalidate_cache();
1516            if scope.region_scope == region_scope {
1517                let region_scope_span = region_scope.span(self.tcx, self.region_scope_tree);
1518                let scope_end = self.tcx.sess.source_map().end_point(region_scope_span);
1519
1520                scope.drops.push(DropData {
1521                    source_info: SourceInfo { span: scope_end, scope: scope.source_scope },
1522                    local,
1523                    kind: DropKind::ForLint,
1524                });
1525
1526                return;
1527            }
1528        }
1529        span_bug!(
1530            span,
1531            "region scope {:?} not in scope to drop {:?} for linting",
1532            region_scope,
1533            local
1534        );
1535    }
1536
1537    /// Indicates that the "local operand" stored in `local` is
1538    /// *moved* at some point during execution (see `local_scope` for
1539    /// more information about what a "local operand" is -- in short,
1540    /// it's an intermediate operand created as part of preparing some
1541    /// MIR instruction). We use this information to suppress
1542    /// redundant drops on the non-unwind paths. This results in less
1543    /// MIR, but also avoids spurious borrow check errors
1544    /// (c.f. #64391).
1545    ///
1546    /// Example: when compiling the call to `foo` here:
1547    ///
1548    /// ```ignore (illustrative)
1549    /// foo(bar(), ...)
1550    /// ```
1551    ///
1552    /// we would evaluate `bar()` to an operand `_X`. We would also
1553    /// schedule `_X` to be dropped when the expression scope for
1554    /// `foo(bar())` is exited. This is relevant, for example, if the
1555    /// later arguments should unwind (it would ensure that `_X` gets
1556    /// dropped). However, if no unwind occurs, then `_X` will be
1557    /// unconditionally consumed by the `call`:
1558    ///
1559    /// ```ignore (illustrative)
1560    /// bb {
1561    ///   ...
1562    ///   _R = CALL(foo, _X, ...)
1563    /// }
1564    /// ```
1565    ///
1566    /// However, `_X` is still registered to be dropped, and so if we
1567    /// do nothing else, we would generate a `DROP(_X)` that occurs
1568    /// after the call. This will later be optimized out by the
1569    /// drop-elaboration code, but in the meantime it can lead to
1570    /// spurious borrow-check errors -- the problem, ironically, is
1571    /// not the `DROP(_X)` itself, but the (spurious) unwind pathways
1572    /// that it creates. See #64391 for an example.
1573    pub(crate) fn record_operands_moved(&mut self, operands: &[Spanned<Operand<'tcx>>]) {
1574        let local_scope = self.local_scope();
1575        let scope = self.scopes.scopes.last_mut().unwrap();
1576
1577        assert_eq!(scope.region_scope, local_scope, "local scope is not the topmost scope!");
1578
1579        // look for moves of a local variable, like `MOVE(_X)`
1580        let locals_moved = operands.iter().flat_map(|operand| match operand.node {
1581            Operand::Copy(_) | Operand::Constant(_) | Operand::RuntimeChecks(_) => None,
1582            Operand::Move(place) => place.as_local(),
1583        });
1584
1585        for local in locals_moved {
1586            // check if we have a Drop for this operand and -- if so
1587            // -- add it to the list of moved operands. Note that this
1588            // local might not have been an operand created for this
1589            // call, it could come from other places too.
1590            if scope.drops.iter().any(|drop| drop.local == local && drop.kind == DropKind::Value) {
1591                scope.moved_locals.push(local);
1592            }
1593        }
1594    }
1595
1596    // Other
1597    // =====
1598
1599    /// Returns the [DropIdx] for the innermost drop if the function unwound at
1600    /// this point. The `DropIdx` will be created if it doesn't already exist.
1601    fn diverge_cleanup(&mut self) -> DropIdx {
1602        // It is okay to use a dummy span because getting the scope index of the
1603        // topmost scope must always succeed.
1604        self.diverge_cleanup_target(self.scopes.topmost(), DUMMY_SP)
1605    }
1606
1607    /// This is similar to [diverge_cleanup](Self::diverge_cleanup) except its target is set to
1608    /// some ancestor scope instead of the current scope.
1609    /// It is possible to unwind to some ancestor scope if a drop panics as
1610    /// the program breaks out of an if-then scope.
1611    fn diverge_cleanup_target(&mut self, target_scope: region::Scope, span: Span) -> DropIdx {
1612        let target = self.scopes.scope_index(target_scope, span);
1613        let (uncached_scope, mut cached_drop) = self.scopes.scopes[..=target]
1614            .iter()
1615            .enumerate()
1616            .rev()
1617            .find_map(|(scope_idx, scope)| {
1618                scope.cached_unwind_block.map(|cached_block| (scope_idx + 1, cached_block))
1619            })
1620            .unwrap_or((0, ROOT_NODE));
1621
1622        if uncached_scope > target {
1623            return cached_drop;
1624        }
1625
1626        let is_coroutine = self.coroutine.is_some();
1627        for scope in &mut self.scopes.scopes[uncached_scope..=target] {
1628            for drop in &scope.drops {
1629                if is_coroutine || drop.kind == DropKind::Value {
1630                    cached_drop = self.scopes.unwind_drops.add_drop(*drop, cached_drop);
1631                }
1632            }
1633            scope.cached_unwind_block = Some(cached_drop);
1634        }
1635
1636        cached_drop
1637    }
1638
1639    /// Prepares to create a path that performs all required cleanup for a
1640    /// terminator that can unwind at the given basic block.
1641    ///
1642    /// This path terminates in Resume. The path isn't created until after all
1643    /// of the non-unwind paths in this item have been lowered.
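    ///
    /// For illustration only (a hypothetical sketch, not the lowering itself):
    /// a call like the one below produces a `Call` terminator, and the block
    /// holding it is passed to `diverge_from` so the scheduled drops run if
    /// the call unwinds:
    ///
    /// ```
    /// # fn may_panic() {}
    /// # fn g() {
    /// let s = String::from("still dropped if `may_panic` unwinds");
    /// may_panic(); // `Call` terminator; its unwind edge must drop `s`
    /// # }
    /// ```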
1644    pub(crate) fn diverge_from(&mut self, start: BasicBlock) {
1645        debug_assert!(
1646            matches!(
1647                self.cfg.block_data(start).terminator().kind,
1648                TerminatorKind::Assert { .. }
1649                    | TerminatorKind::Call { .. }
1650                    | TerminatorKind::Drop { .. }
1651                    | TerminatorKind::FalseUnwind { .. }
1652                    | TerminatorKind::InlineAsm { .. }
1653            ),
1654            "diverge_from called on block with terminator that cannot unwind."
1655        );
1656
1657        let next_drop = self.diverge_cleanup();
1658        self.scopes.unwind_drops.add_entry_point(start, next_drop);
1659    }
1660
1661    /// Returns the [DropIdx] for the innermost drop for the dropline (coroutine drop path).
1662    /// The `DropIdx` will be created if it doesn't already exist.
1663    fn diverge_dropline(&mut self) -> DropIdx {
1664        // It is okay to use a dummy span because looking up the scope index of the
1665        // topmost scope must always succeed.
1666        self.diverge_dropline_target(self.scopes.topmost(), DUMMY_SP)
1667    }
1668
1669    /// Similar to [diverge_cleanup_target](Self::diverge_cleanup_target), but for the dropline (coroutine drop path).
1670    fn diverge_dropline_target(&mut self, target_scope: region::Scope, span: Span) -> DropIdx {
1671        debug_assert!(
1672            self.coroutine.is_some(),
1673            "diverge_dropline_target is valid only for coroutine"
1674        );
1675        let target = self.scopes.scope_index(target_scope, span);
1676        let (uncached_scope, mut cached_drop) = self.scopes.scopes[..=target]
1677            .iter()
1678            .enumerate()
1679            .rev()
1680            .find_map(|(scope_idx, scope)| {
1681                scope.cached_coroutine_drop_block.map(|cached_block| (scope_idx + 1, cached_block))
1682            })
1683            .unwrap_or((0, ROOT_NODE));
1684
1685        if uncached_scope > target {
1686            return cached_drop;
1687        }
1688
1689        for scope in &mut self.scopes.scopes[uncached_scope..=target] {
1690            for drop in &scope.drops {
1691                cached_drop = self.scopes.coroutine_drops.add_drop(*drop, cached_drop);
1692            }
1693            scope.cached_coroutine_drop_block = Some(cached_drop);
1694        }
1695
1696        cached_drop
1697    }
1698
1699    /// Sets up a path that performs all required cleanup for dropping a
1700    /// coroutine, starting from the given block that ends in
1701    /// [TerminatorKind::Yield].
1702    ///
1703    /// This path terminates in CoroutineDrop.
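    ///
    /// For illustration only, a hedged sketch of the situation this covers
    /// (`step` and `h` are hypothetical):
    ///
    /// ```
    /// # async fn step() {}
    /// # async fn h() {
    /// let buf = vec![0u8; 16];
    /// step().await; // suspension point (a `Yield` in the coroutine's MIR);
    ///               // if the future is dropped while suspended here, the
    ///               // coroutine-drop path must still drop `buf`
    /// # }
    /// ```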
1704    pub(crate) fn coroutine_drop_cleanup(&mut self, yield_block: BasicBlock) {
1705        debug_assert!(
1706            matches!(
1707                self.cfg.block_data(yield_block).terminator().kind,
1708                TerminatorKind::Yield { .. }
1709            ),
1710            "coroutine_drop_cleanup called on block with non-yield terminator."
1711        );
1712        let cached_drop = self.diverge_dropline();
1713        self.scopes.coroutine_drops.add_entry_point(yield_block, cached_drop);
1714    }
1715
1716    /// Utility function for *non*-scope code to build its own drops.
1717    /// Forces a drop at this point in the MIR by creating a new block.
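    ///
    /// For illustration only, the kind of assignment that needs this shape of
    /// MIR (names are hypothetical): the old value in the destination is
    /// dropped, then the new value is written.
    ///
    /// ```
    /// # fn f(slot: &mut String, new: String) {
    /// *slot = new; // drop the old `*slot`, then assign the new value
    /// # }
    /// ```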
1718    pub(crate) fn build_drop_and_replace(
1719        &mut self,
1720        block: BasicBlock,
1721        span: Span,
1722        place: Place<'tcx>,
1723        value: Rvalue<'tcx>,
1724    ) -> BlockAnd<()> {
1725        let source_info = self.source_info(span);
1726
1727        // create the new block for the assignment
1728        let assign = self.cfg.start_new_block();
1729        self.cfg.push_assign(assign, source_info, place, value.clone());
1730
1731        // create the new block for the assignment in the case of unwinding
1732        let assign_unwind = self.cfg.start_new_cleanup_block();
1733        self.cfg.push_assign(assign_unwind, source_info, place, value.clone());
1734
1735        self.cfg.terminate(
1736            block,
1737            source_info,
1738            TerminatorKind::Drop {
1739                place,
1740                target: assign,
1741                unwind: UnwindAction::Cleanup(assign_unwind),
1742                replace: true,
1743                drop: None,
1744                async_fut: None,
1745            },
1746        );
1747        self.diverge_from(block);
1748
1749        assign.unit()
1750    }
1751
1752    /// Creates an `Assert` terminator and returns the success block.
1753    /// If the boolean condition operand is not the expected value,
1754    /// a runtime panic will be caused with the given message.
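    ///
    /// For illustration only: bounds checks are one source of such terminators;
    /// the exact message and operands are chosen by the caller.
    ///
    /// ```
    /// # fn f(xs: &[u8], i: usize) -> u8 {
    /// xs[i] // lowered with an `Assert` that `i < xs.len()`, panicking otherwise
    /// # }
    /// ```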
1755    pub(crate) fn assert(
1756        &mut self,
1757        block: BasicBlock,
1758        cond: Operand<'tcx>,
1759        expected: bool,
1760        msg: AssertMessage<'tcx>,
1761        span: Span,
1762    ) -> BasicBlock {
1763        let source_info = self.source_info(span);
1764        let success_block = self.cfg.start_new_block();
1765
1766        self.cfg.terminate(
1767            block,
1768            source_info,
1769            TerminatorKind::Assert {
1770                cond,
1771                expected,
1772                msg: Box::new(msg),
1773                target: success_block,
1774                unwind: UnwindAction::Continue,
1775            },
1776        );
1777        self.diverge_from(block);
1778
1779        success_block
1780    }
1781
1782    /// Unschedules any drops in the top two scopes.
1783    ///
1784    /// This is only needed for pattern-matches combining guards and or-patterns: or-patterns lead
1785    /// to guards being lowered multiple times before lowering the arm body, so we unschedule drops
1786    /// for the guard's temporaries and bindings between lowering each instance of a match arm's guard.
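    ///
    /// For illustration only, a sketch of the pattern this handles:
    ///
    /// ```
    /// # let x = 1;
    /// # let s = String::from("guard temp");
    /// match x {
    ///     // The guard is lowered once for `1` and once for `2`; drops scheduled
    ///     // for its temporaries during the first lowering are unscheduled before
    ///     // the guard is lowered again for the second alternative.
    ///     1 | 2 if s.len() < 10 => {}
    ///     _ => {}
    /// }
    /// ```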
1787    pub(crate) fn clear_match_arm_and_guard_scopes(&mut self, region_scope: region::Scope) {
1788        let [.., arm_scope, guard_scope] = &mut *self.scopes.scopes else {
1789            bug!("matches with guards should introduce separate scopes for the pattern and guard");
1790        };
1791
1792        assert_eq!(arm_scope.region_scope, region_scope);
1793        assert_eq!(guard_scope.region_scope.data, region::ScopeData::MatchGuard);
1794        assert_eq!(guard_scope.region_scope.local_id, region_scope.local_id);
1795
1796        arm_scope.drops.clear();
1797        arm_scope.invalidate_cache();
1798        guard_scope.drops.clear();
1799        guard_scope.invalidate_cache();
1800    }
1801}
1802
1803/// Builds drops for `pop_scope` and `leave_top_scope`.
1804///
1805/// # Parameters
1806///
1807/// * `unwind_drops`, the drop tree data structure storing what needs to be cleaned up if an unwind occurs
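/// * `coroutine_drops`, the drop tree data structure storing what needs to be cleaned up on the
///   coroutine drop path (the dropline)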
1808/// * `scope`, describes the drops that will occur on exiting the scope in regular execution
1809/// * `block`, the block to branch to once drops are complete (assuming no unwind occurs)
1810/// * `unwind_to`, describes the drops that would occur at this point in the code if a
1811///   panic occurred (a subset of the drops in `scope`, since we sometimes elide StorageDead and other
1812///   instructions on unwinding)
1813/// * `dropline_to`, describes the drops that would occur at this point in the code if a
1814///   coroutine drop occurred.
1815/// * `storage_dead_on_unwind`, if true, then we should emit `StorageDead` even when unwinding
1816/// * `arg_count`, number of MIR local variables corresponding to fn arguments (used to assert that we don't drop those)
1817fn build_scope_drops<'tcx, F>(
1818    cfg: &mut CFG<'tcx>,
1819    unwind_drops: &mut DropTree,
1820    coroutine_drops: &mut DropTree,
1821    scope: &Scope,
1822    block: BasicBlock,
1823    unwind_to: DropIdx,
1824    dropline_to: Option<DropIdx>,
1825    storage_dead_on_unwind: bool,
1826    arg_count: usize,
1827    is_async_drop: F,
1828) -> BlockAnd<()>
1829where
1830    F: Fn(Local) -> bool,
1831{
1832    debug!("build_scope_drops({:?} -> {:?}), dropline_to={:?}", block, scope, dropline_to);
1833
1834    // Build up the drops in evaluation order. The end result will
1835    // look like:
1836    //
1837    // [SDs, drops[n]] --..> [SDs, drop[1]] -> [SDs, drop[0]] -> [[SDs]]
1838    //               |                    |                 |
1839    //               :                    |                 |
1840    //                                    V                 V
1841    // [drop[n]] -...-> [drop[1]] ------> [drop[0]] ------> [last_unwind_to]
1842    //
1843    // The horizontal arrows represent the execution path when the drops return
1844    // successfully. The downwards arrows represent the execution path when the
1845    // drops panic (panicking while unwinding will abort, so there's no need for
1846    // another set of arrows).
1847    //
1848    // For coroutines, we unwind from a drop on a local to its StorageDead
1849    // statement. For other functions we don't worry about StorageDead. The
1850    // drops for the unwind path should have already been generated by
1851    // `diverge_cleanup_gen`.
1852
1853    // `unwind_to` indicates what needs to be dropped should unwinding occur.
1854    // This is a subset of what needs to be dropped when exiting the scope.
1855    // As we unwind the scope, we will also move `unwind_to` backwards to match,
1856    // so that we can use it should a destructor panic.
1857    let mut unwind_to = unwind_to;
1858
1859    // The block that we should jump to after drops complete. We start by building the final drop (`drops[n]`
1860    // in the diagram above) and then build the drops (e.g., `drop[1]`, `drop[0]`) that come before it.
1861    // block begins as the successor of `drops[n]` and then becomes `drops[n]` so that `drops[n-1]`
1862    // will branch to `drops[n]`.
1863    let mut block = block;
1864
1865    // `dropline_to` indicates what needs to be dropped should coroutine drop occur.
1866    let mut dropline_to = dropline_to;
1867
1868    for drop_data in scope.drops.iter().rev() {
1869        let source_info = drop_data.source_info;
1870        let local = drop_data.local;
1871
1872        match drop_data.kind {
1873            DropKind::Value => {
1874                // `unwind_to` should drop the value that we're about to
1875                // schedule. If dropping this value panics, then we continue
1876                // with the *next* value on the unwind path.
1877                //
1878                // We adjust this BEFORE we create the drop (e.g., `drops[n]`)
1879                // because `drops[n]` should unwind to `drops[n-1]`.
1880                debug_assert_eq!(unwind_drops.drop_nodes[unwind_to].data.local, drop_data.local);
1881                debug_assert_eq!(unwind_drops.drop_nodes[unwind_to].data.kind, drop_data.kind);
1882                unwind_to = unwind_drops.drop_nodes[unwind_to].next;
1883
1884                if let Some(idx) = dropline_to {
1885                    debug_assert_eq!(coroutine_drops.drop_nodes[idx].data.local, drop_data.local);
1886                    debug_assert_eq!(coroutine_drops.drop_nodes[idx].data.kind, drop_data.kind);
1887                    dropline_to = Some(coroutine_drops.drop_nodes[idx].next);
1888                }
1889
1890                // If the operand has been moved, and we are not on an unwind
1891                // path, then don't generate the drop. (We only take this into
1892                // account for non-unwind paths so as not to disturb the
1893                // caching mechanism.)
1894                if scope.moved_locals.contains(&local) {
1895                    continue;
1896                }
1897
1898                unwind_drops.add_entry_point(block, unwind_to);
1899                if let Some(to) = dropline_to
1900                    && is_async_drop(local)
1901                {
1902                    coroutine_drops.add_entry_point(block, to);
1903                }
1904
1905                let next = cfg.start_new_block();
1906                cfg.terminate(
1907                    block,
1908                    source_info,
1909                    TerminatorKind::Drop {
1910                        place: local.into(),
1911                        target: next,
1912                        unwind: UnwindAction::Continue,
1913                        replace: false,
1914                        drop: None,
1915                        async_fut: None,
1916                    },
1917                );
1918                block = next;
1919            }
1920            DropKind::ForLint => {
1921                // As in the `DropKind::Storage` case below:
1922                // normally lint-related drops are not emitted for unwind,
1923                // so we can just leave `unwind_to` unmodified, but in some
1924                // cases we emit things ALSO on the unwind path, so we need to adjust
1925                // `unwind_to` in that case.
1926                if storage_dead_on_unwind {
1927                    debug_assert_eq!(
1928                        unwind_drops.drop_nodes[unwind_to].data.local,
1929                        drop_data.local
1930                    );
1931                    debug_assert_eq!(unwind_drops.drop_nodes[unwind_to].data.kind, drop_data.kind);
1932                    unwind_to = unwind_drops.drop_nodes[unwind_to].next;
1933                }
1934
1935                // If the operand has been moved, and we are not on an unwind
1936                // path, then don't generate the drop. (We only take this into
1937                // account for non-unwind paths so as not to disturb the
1938                // caching mechanism.)
1939                if scope.moved_locals.contains(&local) {
1940                    continue;
1941                }
1942
1943                cfg.push(
1944                    block,
1945                    Statement::new(
1946                        source_info,
1947                        StatementKind::BackwardIncompatibleDropHint {
1948                            place: Box::new(local.into()),
1949                            reason: BackwardIncompatibleDropReason::Edition2024,
1950                        },
1951                    ),
1952                );
1953            }
1954            DropKind::Storage => {
1955                // Ordinarily, storage-dead nodes are not emitted on unwind, so we don't
1956                // need to adjust `unwind_to` on this path. However, in some specific cases
1957                // we *do* emit storage-dead nodes on the unwind path, and in that case now that
1958                // the storage-dead has completed, we need to adjust the `unwind_to` pointer
1959                // so that any future drops we emit will not register storage-dead.
1960                if storage_dead_on_unwind {
1961                    debug_assert_eq!(
1962                        unwind_drops.drop_nodes[unwind_to].data.local,
1963                        drop_data.local
1964                    );
1965                    debug_assert_eq!(unwind_drops.drop_nodes[unwind_to].data.kind, drop_data.kind);
1966                    unwind_to = unwind_drops.drop_nodes[unwind_to].next;
1967                }
1968                if let Some(idx) = dropline_to {
1969                    debug_assert_eq!(coroutine_drops.drop_nodes[idx].data.local, drop_data.local);
1970                    debug_assert_eq!(coroutine_drops.drop_nodes[idx].data.kind, drop_data.kind);
1971                    dropline_to = Some(coroutine_drops.drop_nodes[idx].next);
1972                }
1973                // Only temps and vars need their storage dead.
1974                assert!(local.index() > arg_count);
1975                cfg.push(block, Statement::new(source_info, StatementKind::StorageDead(local)));
1976            }
1977        }
1978    }
1979    block.unit()
1980}
1981
1982impl<'a, 'tcx: 'a> Builder<'a, 'tcx> {
1983    /// Build a drop tree for a breakable scope.
1984    ///
1985    /// If `continue_block` is `Some`, then the tree is for `continue` inside a
1986    /// loop. Otherwise this is for `break` or `return`.
1987    fn build_exit_tree(
1988        &mut self,
1989        mut drops: DropTree,
1990        else_scope: region::Scope,
1991        span: Span,
1992        continue_block: Option<BasicBlock>,
1993    ) -> Option<BlockAnd<()>> {
1994        let blocks = drops.build_mir::<ExitScopes>(&mut self.cfg, continue_block);
1995        let is_coroutine = self.coroutine.is_some();
1996
1997        // Link the exit drop tree to unwind drop tree.
1998        if drops.drop_nodes.iter().any(|drop_node| drop_node.data.kind == DropKind::Value) {
1999            let unwind_target = self.diverge_cleanup_target(else_scope, span);
2000            let mut unwind_indices = IndexVec::from_elem_n(unwind_target, 1);
2001            for (drop_idx, drop_node) in drops.drop_nodes.iter_enumerated().skip(1) {
2002                match drop_node.data.kind {
2003                    DropKind::Storage | DropKind::ForLint => {
2004                        if is_coroutine {
2005                            let unwind_drop = self
2006                                .scopes
2007                                .unwind_drops
2008                                .add_drop(drop_node.data, unwind_indices[drop_node.next]);
2009                            unwind_indices.push(unwind_drop);
2010                        } else {
2011                            unwind_indices.push(unwind_indices[drop_node.next]);
2012                        }
2013                    }
2014                    DropKind::Value => {
2015                        let unwind_drop = self
2016                            .scopes
2017                            .unwind_drops
2018                            .add_drop(drop_node.data, unwind_indices[drop_node.next]);
2019                        self.scopes.unwind_drops.add_entry_point(
2020                            blocks[drop_idx].unwrap(),
2021                            unwind_indices[drop_node.next],
2022                        );
2023                        unwind_indices.push(unwind_drop);
2024                    }
2025                }
2026            }
2027        }
2028        // Link the exit drop tree to dropline drop tree (coroutine drop path) for async drops
2029        if is_coroutine
2030            && drops.drop_nodes.iter().any(|DropNode { data, next: _ }| {
2031                data.kind == DropKind::Value && self.is_async_drop(data.local)
2032            })
2033        {
2034            let dropline_target = self.diverge_dropline_target(else_scope, span);
2035            let mut dropline_indices = IndexVec::from_elem_n(dropline_target, 1);
2036            for (drop_idx, drop_data) in drops.drop_nodes.iter_enumerated().skip(1) {
2037                let coroutine_drop = self
2038                    .scopes
2039                    .coroutine_drops
2040                    .add_drop(drop_data.data, dropline_indices[drop_data.next]);
2041                match drop_data.data.kind {
2042                    DropKind::Storage | DropKind::ForLint => {}
2043                    DropKind::Value => {
2044                        if self.is_async_drop(drop_data.data.local) {
2045                            self.scopes.coroutine_drops.add_entry_point(
2046                                blocks[drop_idx].unwrap(),
2047                                dropline_indices[drop_data.next],
2048                            );
2049                        }
2050                    }
2051                }
2052                dropline_indices.push(coroutine_drop);
2053            }
2054        }
2055        blocks[ROOT_NODE].map(BasicBlock::unit)
2056    }
2057
2058    /// Build the unwind and coroutine drop trees.
2059    pub(crate) fn build_drop_trees(&mut self) {
2060        if self.coroutine.is_some() {
2061            self.build_coroutine_drop_trees();
2062        } else {
2063            Self::build_unwind_tree(
2064                &mut self.cfg,
2065                &mut self.scopes.unwind_drops,
2066                self.fn_span,
2067                &mut None,
2068            );
2069        }
2070    }
2071
2072    fn build_coroutine_drop_trees(&mut self) {
2073        // Build the drop tree for dropping the coroutine while it's suspended.
2074        let drops = &mut self.scopes.coroutine_drops;
2075        let cfg = &mut self.cfg;
2076        let fn_span = self.fn_span;
2077        let blocks = drops.build_mir::<CoroutineDrop>(cfg, None);
2078        if let Some(root_block) = blocks[ROOT_NODE] {
2079            cfg.terminate(
2080                root_block,
2081                SourceInfo::outermost(fn_span),
2082                TerminatorKind::CoroutineDrop,
2083            );
2084        }
2085
2086        // Build the drop tree for unwinding in the normal control flow paths.
2087        let resume_block = &mut None;
2088        let unwind_drops = &mut self.scopes.unwind_drops;
2089        Self::build_unwind_tree(cfg, unwind_drops, fn_span, resume_block);
2090
2091        // Build the drop tree for unwinding when dropping a suspended
2092        // coroutine.
2093        //
2094        // This is a different tree to the standard unwind paths here to
2095        // prevent drop elaboration from creating drop flags that would have
2096        // to be captured by the coroutine. I'm not sure how important this
2097        // optimization is, but it is here.
2098        for (drop_idx, drop_node) in drops.drop_nodes.iter_enumerated() {
2099            if let DropKind::Value = drop_node.data.kind
2100                && let Some(bb) = blocks[drop_idx]
2101            {
2102                debug_assert!(drop_node.next < drops.drop_nodes.next_index());
2103                drops.entry_points.push((drop_node.next, bb));
2104            }
2105        }
2106        Self::build_unwind_tree(cfg, drops, fn_span, resume_block);
2107    }
2108
2109    fn build_unwind_tree(
2110        cfg: &mut CFG<'tcx>,
2111        drops: &mut DropTree,
2112        fn_span: Span,
2113        resume_block: &mut Option<BasicBlock>,
2114    ) {
2115        let blocks = drops.build_mir::<Unwind>(cfg, *resume_block);
2116        if let (None, Some(resume)) = (*resume_block, blocks[ROOT_NODE]) {
2117            cfg.terminate(resume, SourceInfo::outermost(fn_span), TerminatorKind::UnwindResume);
2118
2119            *resume_block = blocks[ROOT_NODE];
2120        }
2121    }
2122}
2123
2124// DropTreeBuilder implementations.
2125
2126struct ExitScopes;
2127
2128impl<'tcx> DropTreeBuilder<'tcx> for ExitScopes {
2129    fn make_block(cfg: &mut CFG<'tcx>) -> BasicBlock {
2130        cfg.start_new_block()
2131    }
2132    fn link_entry_point(cfg: &mut CFG<'tcx>, from: BasicBlock, to: BasicBlock) {
2133        // There should be an existing terminator with real source info and a
2134        // dummy TerminatorKind. Replace it with a proper goto.
2135        // (The dummy is added by `break_scope` and `break_for_else`.)
2136        let term = cfg.block_data_mut(from).terminator_mut();
2137        if let TerminatorKind::UnwindResume = term.kind {
2138            term.kind = TerminatorKind::Goto { target: to };
2139        } else {
2140            span_bug!(term.source_info.span, "unexpected dummy terminator kind: {:?}", term.kind);
2141        }
2142    }
2143}
2144
2145struct CoroutineDrop;
2146
2147impl<'tcx> DropTreeBuilder<'tcx> for CoroutineDrop {
2148    fn make_block(cfg: &mut CFG<'tcx>) -> BasicBlock {
2149        cfg.start_new_block()
2150    }
2151    fn link_entry_point(cfg: &mut CFG<'tcx>, from: BasicBlock, to: BasicBlock) {
2152        let term = cfg.block_data_mut(from).terminator_mut();
2153        if let TerminatorKind::Yield { ref mut drop, .. } = term.kind {
2154            *drop = Some(to);
2155        } else if let TerminatorKind::Drop { ref mut drop, .. } = term.kind {
2156            *drop = Some(to);
2157        } else {
2158            span_bug!(
2159                term.source_info.span,
2160                "cannot enter coroutine drop tree from {:?}",
2161                term.kind
2162            )
2163        }
2164    }
2165}
2166
2167struct Unwind;
2168
2169impl<'tcx> DropTreeBuilder<'tcx> for Unwind {
2170    fn make_block(cfg: &mut CFG<'tcx>) -> BasicBlock {
2171        cfg.start_new_cleanup_block()
2172    }
2173    fn link_entry_point(cfg: &mut CFG<'tcx>, from: BasicBlock, to: BasicBlock) {
2174        let term = &mut cfg.block_data_mut(from).terminator_mut();
2175        match &mut term.kind {
2176            TerminatorKind::Drop { unwind, .. } => {
2177                if let UnwindAction::Cleanup(unwind) = *unwind {
2178                    let source_info = term.source_info;
2179                    cfg.terminate(unwind, source_info, TerminatorKind::Goto { target: to });
2180                } else {
2181                    *unwind = UnwindAction::Cleanup(to);
2182                }
2183            }
2184            TerminatorKind::FalseUnwind { unwind, .. }
2185            | TerminatorKind::Call { unwind, .. }
2186            | TerminatorKind::Assert { unwind, .. }
2187            | TerminatorKind::InlineAsm { unwind, .. } => {
2188                *unwind = UnwindAction::Cleanup(to);
2189            }
2190            TerminatorKind::Goto { .. }
2191            | TerminatorKind::SwitchInt { .. }
2192            | TerminatorKind::UnwindResume
2193            | TerminatorKind::UnwindTerminate(_)
2194            | TerminatorKind::Return
2195            | TerminatorKind::TailCall { .. }
2196            | TerminatorKind::Unreachable
2197            | TerminatorKind::Yield { .. }
2198            | TerminatorKind::CoroutineDrop
2199            | TerminatorKind::FalseEdge { .. } => {
2200                span_bug!(term.source_info.span, "cannot unwind from {:?}", term.kind)
2201            }
2202        }
2203    }
2204}