alloc/boxed.rs

1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//!     Cons(T, Box<List<T>>),
31//!     Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
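//!
//! For example, here is a minimal sketch of that round trip: the pointer produced by
//! [`Box::into_raw`] is released manually through the global allocator, using the layout
//! computed with [`Layout::for_value(&*value)`]:
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//! use std::ptr;
//!
//! let raw: *mut i32 = Box::into_raw(Box::new(7));
//! unsafe {
//!     // Compute the layout before dropping the value, then drop and deallocate.
//!     let layout = Layout::for_value(&*raw);
//!     ptr::drop_in_place(raw);
//!     dealloc(raw as *mut u8, layout);
//! }
//! ```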
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
66//! recommended way to build a Box to a ZST if `Box::new` cannot be used is to use
67//! [`ptr::NonNull::dangling`].
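//!
//! For example, as a minimal sketch of that recommendation, a box of a zero-sized type can be
//! built from a dangling, well-aligned pointer without ever touching the allocator:
//!
//! ```
//! use std::ptr::NonNull;
//!
//! // `()` is zero-sized, so no allocation is performed; a dangling pointer suffices.
//! let zst: Box<()> = unsafe { Box::from_raw(NonNull::dangling().as_ptr()) };
//! assert_eq!(*zst, ());
//! ```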
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
//! (i.e. the C type `T*`). This means that if you have `extern "C"`
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//!     Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved, or borrowed as `&mut T`
//! is not allowed. For more guidance on working with boxes from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
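//!
//! As a minimal sketch of these rules, a raw pointer derived from a box may only be used
//! until the box itself is next moved, mutated through, or reborrowed:
//!
//! ```
//! let mut b = Box::new(1u8);
//! let p: *mut u8 = &raw mut *b;
//! // OK: the box has not been used since `p` was derived from it.
//! unsafe { *p = 2 };
//! // Using the box itself again invalidates `p`; do not use `p` past this point.
//! *b += 1;
//! assert_eq!(*b, 3);
//! ```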
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//!     let (i, x): (usize, &i32) = item;
155//!     println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//!     let (i, x): (usize, &i32) = item;
161//!     println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//!     let (i, x): (usize, i32) = item;
167//!     println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
//! As with the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
175//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187use core::clone::CloneToUninit;
188use core::cmp::Ordering;
189use core::error::{self, Error};
190use core::fmt;
191use core::future::Future;
192use core::hash::{Hash, Hasher};
193use core::marker::{Tuple, Unsize};
194#[cfg(not(no_global_oom_handling))]
195use core::mem::MaybeUninit;
196use core::mem::{self, SizedTypeProperties};
197use core::ops::{
198    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
199    DerefPure, DispatchFromDyn, LegacyReceiver,
200};
201#[cfg(not(no_global_oom_handling))]
202use core::ops::{Residual, Try};
203use core::pin::{Pin, PinCoerceUnsized};
204use core::ptr::{self, NonNull, Unique};
205use core::task::{Context, Poll};
206
207#[cfg(not(no_global_oom_handling))]
208use crate::alloc::handle_alloc_error;
209use crate::alloc::{AllocError, Allocator, Global, Layout};
210use crate::raw_vec::RawVec;
211#[cfg(not(no_global_oom_handling))]
212use crate::str::from_boxed_utf8_unchecked;
213
214/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
215mod convert;
216/// Iterator related impls for `Box<_>`.
217mod iter;
218/// [`ThinBox`] implementation.
219mod thin;
220
221#[unstable(feature = "thin_box", issue = "92791")]
222pub use thin::ThinBox;
223
224/// A pointer type that uniquely owns a heap allocation of type `T`.
225///
226/// See the [module-level documentation](../../std/boxed/index.html) for more.
227#[lang = "owned_box"]
228#[fundamental]
229#[stable(feature = "rust1", since = "1.0.0")]
230#[rustc_insignificant_dtor]
231#[doc(search_unbox)]
232// The declaration of the `Box` struct must be kept in sync with the
233// compiler or ICEs will happen.
234pub struct Box<
235    T: ?Sized,
236    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
237>(Unique<T>, A);
238
239/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
240/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
241///
242/// This is the surface syntax for `box <expr>` expressions.
243#[doc(hidden)]
244#[rustc_intrinsic]
245#[unstable(feature = "liballoc_internals", issue = "none")]
246pub fn box_new<T>(x: T) -> Box<T>;
247
248impl<T> Box<T> {
249    /// Allocates memory on the heap and then places `x` into it.
250    ///
251    /// This doesn't actually allocate if `T` is zero-sized.
252    ///
253    /// # Examples
254    ///
255    /// ```
256    /// let five = Box::new(5);
257    /// ```
258    #[cfg(not(no_global_oom_handling))]
259    #[inline(always)]
260    #[stable(feature = "rust1", since = "1.0.0")]
261    #[must_use]
262    #[rustc_diagnostic_item = "box_new"]
263    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
264    pub fn new(x: T) -> Self {
265        return box_new(x);
266    }
267
268    /// Constructs a new box with uninitialized contents.
269    ///
270    /// # Examples
271    ///
272    /// ```
273    /// let mut five = Box::<u32>::new_uninit();
274    /// // Deferred initialization:
275    /// five.write(5);
276    /// let five = unsafe { five.assume_init() };
277    ///
278    /// assert_eq!(*five, 5)
279    /// ```
280    #[cfg(not(no_global_oom_handling))]
281    #[stable(feature = "new_uninit", since = "1.82.0")]
282    #[must_use]
283    #[inline]
284    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
285        Self::new_uninit_in(Global)
286    }
287
288    /// Constructs a new `Box` with uninitialized contents, with the memory
289    /// being filled with `0` bytes.
290    ///
291    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
292    /// of this method.
293    ///
294    /// # Examples
295    ///
296    /// ```
297    /// let zero = Box::<u32>::new_zeroed();
298    /// let zero = unsafe { zero.assume_init() };
299    ///
300    /// assert_eq!(*zero, 0)
301    /// ```
302    ///
303    /// [zeroed]: mem::MaybeUninit::zeroed
304    #[cfg(not(no_global_oom_handling))]
305    #[inline]
306    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
307    #[must_use]
308    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
309        Self::new_zeroed_in(Global)
310    }
311
312    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
313    /// `x` will be pinned in memory and unable to be moved.
314    ///
315    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
316    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
317    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
318    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
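    ///
    /// # Examples
    ///
    /// A minimal example of pinning a value on the heap:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u32>> = Box::pin(5);
    /// assert_eq!(*pinned, 5);
    /// ```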
319    #[cfg(not(no_global_oom_handling))]
320    #[stable(feature = "pin", since = "1.33.0")]
321    #[must_use]
322    #[inline(always)]
323    pub fn pin(x: T) -> Pin<Box<T>> {
324        Box::new(x).into()
325    }
326
327    /// Allocates memory on the heap then places `x` into it,
    /// returning an error if the allocation fails.
329    ///
330    /// This doesn't actually allocate if `T` is zero-sized.
331    ///
332    /// # Examples
333    ///
334    /// ```
335    /// #![feature(allocator_api)]
336    ///
337    /// let five = Box::try_new(5)?;
338    /// # Ok::<(), std::alloc::AllocError>(())
339    /// ```
340    #[unstable(feature = "allocator_api", issue = "32838")]
341    #[inline]
342    pub fn try_new(x: T) -> Result<Self, AllocError> {
343        Self::try_new_in(x, Global)
344    }
345
346    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
348    ///
349    /// # Examples
350    ///
351    /// ```
352    /// #![feature(allocator_api)]
353    ///
354    /// let mut five = Box::<u32>::try_new_uninit()?;
355    /// // Deferred initialization:
356    /// five.write(5);
357    /// let five = unsafe { five.assume_init() };
358    ///
359    /// assert_eq!(*five, 5);
360    /// # Ok::<(), std::alloc::AllocError>(())
361    /// ```
362    #[unstable(feature = "allocator_api", issue = "32838")]
363    #[inline]
364    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
365        Box::try_new_uninit_in(Global)
366    }
367
368    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes on the heap.
370    ///
371    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
372    /// of this method.
373    ///
374    /// # Examples
375    ///
376    /// ```
377    /// #![feature(allocator_api)]
378    ///
379    /// let zero = Box::<u32>::try_new_zeroed()?;
380    /// let zero = unsafe { zero.assume_init() };
381    ///
382    /// assert_eq!(*zero, 0);
383    /// # Ok::<(), std::alloc::AllocError>(())
384    /// ```
385    ///
386    /// [zeroed]: mem::MaybeUninit::zeroed
387    #[unstable(feature = "allocator_api", issue = "32838")]
388    #[inline]
389    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
390        Box::try_new_zeroed_in(Global)
391    }
392
393    /// Maps the value in a box, reusing the allocation if possible.
394    ///
395    /// `f` is called on the value in the box, and the result is returned, also boxed.
396    ///
397    /// Note: this is an associated function, which means that you have
398    /// to call it as `Box::map(b, f)` instead of `b.map(f)`. This
399    /// is so that there is no conflict with a method on the inner type.
400    ///
401    /// # Examples
402    ///
403    /// ```
404    /// #![feature(smart_pointer_try_map)]
405    ///
406    /// let b = Box::new(7);
407    /// let new = Box::map(b, |i| i + 7);
408    /// assert_eq!(*new, 14);
409    /// ```
410    #[cfg(not(no_global_oom_handling))]
411    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
412    pub fn map<U>(this: Self, f: impl FnOnce(T) -> U) -> Box<U> {
413        if size_of::<T>() == size_of::<U>() && align_of::<T>() == align_of::<U>() {
414            let (value, allocation) = Box::take(this);
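            // SAFETY (for the transmute below): the branch condition guarantees that `T` and
            // `U` have the same size and alignment, so `MaybeUninit<T>` and `MaybeUninit<U>`
            // (and therefore boxes of them) share the same layout.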
415            Box::write(
416                unsafe { mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<U>>>(allocation) },
417                f(value),
418            )
419        } else {
420            Box::new(f(*this))
421        }
422    }
423
424    /// Attempts to map the value in a box, reusing the allocation if possible.
425    ///
426    /// `f` is called on the value in the box, and if the operation succeeds, the result is
427    /// returned, also boxed.
428    ///
429    /// Note: this is an associated function, which means that you have
430    /// to call it as `Box::try_map(b, f)` instead of `b.try_map(f)`. This
431    /// is so that there is no conflict with a method on the inner type.
432    ///
433    /// # Examples
434    ///
435    /// ```
436    /// #![feature(smart_pointer_try_map)]
437    ///
438    /// let b = Box::new(7);
439    /// let new = Box::try_map(b, u32::try_from).unwrap();
440    /// assert_eq!(*new, 7);
441    /// ```
442    #[cfg(not(no_global_oom_handling))]
443    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
444    pub fn try_map<R>(
445        this: Self,
446        f: impl FnOnce(T) -> R,
447    ) -> <R::Residual as Residual<Box<R::Output>>>::TryType
448    where
449        R: Try,
450        R::Residual: Residual<Box<R::Output>>,
451    {
452        if size_of::<T>() == size_of::<R::Output>() && align_of::<T>() == align_of::<R::Output>() {
453            let (value, allocation) = Box::take(this);
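            // SAFETY (for the transmute below): the branch condition guarantees that `T` and
            // `R::Output` have the same size and alignment, so the two `MaybeUninit` box types
            // share the same layout.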
454            try {
455                Box::write(
456                    unsafe {
457                        mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<R::Output>>>(
458                            allocation,
459                        )
460                    },
461                    f(value)?,
462                )
463            }
464        } else {
465            try { Box::new(f(*this)?) }
466        }
467    }
468}
469
470impl<T, A: Allocator> Box<T, A> {
471    /// Allocates memory in the given allocator then places `x` into it.
472    ///
473    /// This doesn't actually allocate if `T` is zero-sized.
474    ///
475    /// # Examples
476    ///
477    /// ```
478    /// #![feature(allocator_api)]
479    ///
480    /// use std::alloc::System;
481    ///
482    /// let five = Box::new_in(5, System);
483    /// ```
484    #[cfg(not(no_global_oom_handling))]
485    #[unstable(feature = "allocator_api", issue = "32838")]
486    #[must_use]
487    #[inline]
488    pub fn new_in(x: T, alloc: A) -> Self
489    where
490        A: Allocator,
491    {
492        let mut boxed = Self::new_uninit_in(alloc);
493        boxed.write(x);
494        unsafe { boxed.assume_init() }
495    }
496
497    /// Allocates memory in the given allocator then places `x` into it,
    /// returning an error if the allocation fails.
499    ///
500    /// This doesn't actually allocate if `T` is zero-sized.
501    ///
502    /// # Examples
503    ///
504    /// ```
505    /// #![feature(allocator_api)]
506    ///
507    /// use std::alloc::System;
508    ///
509    /// let five = Box::try_new_in(5, System)?;
510    /// # Ok::<(), std::alloc::AllocError>(())
511    /// ```
512    #[unstable(feature = "allocator_api", issue = "32838")]
513    #[inline]
514    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
515    where
516        A: Allocator,
517    {
518        let mut boxed = Self::try_new_uninit_in(alloc)?;
519        boxed.write(x);
520        unsafe { Ok(boxed.assume_init()) }
521    }
522
523    /// Constructs a new box with uninitialized contents in the provided allocator.
524    ///
525    /// # Examples
526    ///
527    /// ```
528    /// #![feature(allocator_api)]
529    ///
530    /// use std::alloc::System;
531    ///
532    /// let mut five = Box::<u32, _>::new_uninit_in(System);
533    /// // Deferred initialization:
534    /// five.write(5);
535    /// let five = unsafe { five.assume_init() };
536    ///
537    /// assert_eq!(*five, 5)
538    /// ```
539    #[unstable(feature = "allocator_api", issue = "32838")]
540    #[cfg(not(no_global_oom_handling))]
541    #[must_use]
542    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
543    where
544        A: Allocator,
545    {
546        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since the closure is sometimes not
        // inlineable, which would make the code size bigger.
549        match Box::try_new_uninit_in(alloc) {
550            Ok(m) => m,
551            Err(_) => handle_alloc_error(layout),
552        }
553    }
554
555    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
557    ///
558    /// # Examples
559    ///
560    /// ```
561    /// #![feature(allocator_api)]
562    ///
563    /// use std::alloc::System;
564    ///
565    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
566    /// // Deferred initialization:
567    /// five.write(5);
568    /// let five = unsafe { five.assume_init() };
569    ///
570    /// assert_eq!(*five, 5);
571    /// # Ok::<(), std::alloc::AllocError>(())
572    /// ```
573    #[unstable(feature = "allocator_api", issue = "32838")]
574    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
575    where
576        A: Allocator,
577    {
578        let ptr = if T::IS_ZST {
579            NonNull::dangling()
580        } else {
581            let layout = Layout::new::<mem::MaybeUninit<T>>();
582            alloc.allocate(layout)?.cast()
583        };
584        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
585    }
586
587    /// Constructs a new `Box` with uninitialized contents, with the memory
588    /// being filled with `0` bytes in the provided allocator.
589    ///
590    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
591    /// of this method.
592    ///
593    /// # Examples
594    ///
595    /// ```
596    /// #![feature(allocator_api)]
597    ///
598    /// use std::alloc::System;
599    ///
600    /// let zero = Box::<u32, _>::new_zeroed_in(System);
601    /// let zero = unsafe { zero.assume_init() };
602    ///
603    /// assert_eq!(*zero, 0)
604    /// ```
605    ///
606    /// [zeroed]: mem::MaybeUninit::zeroed
607    #[unstable(feature = "allocator_api", issue = "32838")]
608    #[cfg(not(no_global_oom_handling))]
609    #[must_use]
610    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
611    where
612        A: Allocator,
613    {
614        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since the closure is sometimes not
        // inlineable, which would make the code size bigger.
617        match Box::try_new_zeroed_in(alloc) {
618            Ok(m) => m,
619            Err(_) => handle_alloc_error(layout),
620        }
621    }
622
623    /// Constructs a new `Box` with uninitialized contents, with the memory
624    /// being filled with `0` bytes in the provided allocator,
    /// returning an error if the allocation fails.
626    ///
627    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
628    /// of this method.
629    ///
630    /// # Examples
631    ///
632    /// ```
633    /// #![feature(allocator_api)]
634    ///
635    /// use std::alloc::System;
636    ///
637    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
638    /// let zero = unsafe { zero.assume_init() };
639    ///
640    /// assert_eq!(*zero, 0);
641    /// # Ok::<(), std::alloc::AllocError>(())
642    /// ```
643    ///
644    /// [zeroed]: mem::MaybeUninit::zeroed
645    #[unstable(feature = "allocator_api", issue = "32838")]
646    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
647    where
648        A: Allocator,
649    {
650        let ptr = if T::IS_ZST {
651            NonNull::dangling()
652        } else {
653            let layout = Layout::new::<mem::MaybeUninit<T>>();
654            alloc.allocate_zeroed(layout)?.cast()
655        };
656        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
657    }
658
659    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
660    /// `x` will be pinned in memory and unable to be moved.
661    ///
662    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
663    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
664    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
665    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
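    ///
    /// # Examples
    ///
    /// A minimal sketch using the system allocator:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let pinned = Box::pin_in(5, System);
    /// assert_eq!(*pinned, 5);
    /// ```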
666    #[cfg(not(no_global_oom_handling))]
667    #[unstable(feature = "allocator_api", issue = "32838")]
668    #[must_use]
669    #[inline(always)]
670    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
671    where
672        A: 'static + Allocator,
673    {
674        Self::into_pin(Self::new_in(x, alloc))
675    }
676
    /// Converts a `Box<T>` into a `Box<[T]>`.
678    ///
679    /// This conversion does not allocate on the heap and happens in place.
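    ///
    /// # Examples
    ///
    /// A minimal example of the conversion (the resulting slice always has length 1):
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed: Box<u8> = Box::new(5);
    /// let slice: Box<[u8]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(*slice, [5]);
    /// ```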
680    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
681    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
682        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
683        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
684    }
685
686    /// Consumes the `Box`, returning the wrapped value.
687    ///
688    /// # Examples
689    ///
690    /// ```
691    /// #![feature(box_into_inner)]
692    ///
693    /// let c = Box::new(5);
694    ///
695    /// assert_eq!(Box::into_inner(c), 5);
696    /// ```
697    #[unstable(feature = "box_into_inner", issue = "80437")]
698    #[inline]
699    pub fn into_inner(boxed: Self) -> T {
700        *boxed
701    }
702
703    /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
704    /// to the uninitialized memory where the wrapped value used to live.
705    ///
706    /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
707    /// boxed values.
708    ///
709    /// # Examples
710    ///
711    /// ```
712    /// #![feature(box_take)]
713    ///
714    /// let c = Box::new(5);
715    ///
716    /// // take the value out of the box
717    /// let (value, uninit) = Box::take(c);
718    /// assert_eq!(value, 5);
719    ///
720    /// // reuse the box for a second value
721    /// let c = Box::write(uninit, 6);
722    /// assert_eq!(*c, 6);
723    /// ```
724    #[unstable(feature = "box_take", issue = "147212")]
725    pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
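        // SAFETY: `raw` points to the live value owned by `boxed`, so it is valid for a
        // by-value read, and re-wrapping it as `Box<MaybeUninit<T>>` ensures the moved-out
        // value is not dropped again when the returned box is dropped or reused.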
726        unsafe {
727            let (raw, alloc) = Box::into_non_null_with_allocator(boxed);
728            let value = raw.read();
729            let uninit = Box::from_non_null_in(raw.cast_uninit(), alloc);
730            (value, uninit)
731        }
732    }
733}
734
735impl<T: ?Sized + CloneToUninit> Box<T> {
736    /// Allocates memory on the heap then clones `src` into it.
737    ///
738    /// This doesn't actually allocate if `src` is zero-sized.
739    ///
740    /// # Examples
741    ///
742    /// ```
743    /// #![feature(clone_from_ref)]
744    ///
745    /// let hello: Box<str> = Box::clone_from_ref("hello");
746    /// ```
747    #[cfg(not(no_global_oom_handling))]
748    #[unstable(feature = "clone_from_ref", issue = "149075")]
749    #[must_use]
750    #[inline]
751    pub fn clone_from_ref(src: &T) -> Box<T> {
752        Box::clone_from_ref_in(src, Global)
753    }
754
755    /// Allocates memory on the heap then clones `src` into it, returning an error if allocation fails.
756    ///
757    /// This doesn't actually allocate if `src` is zero-sized.
758    ///
759    /// # Examples
760    ///
761    /// ```
762    /// #![feature(clone_from_ref)]
763    /// #![feature(allocator_api)]
764    ///
765    /// let hello: Box<str> = Box::try_clone_from_ref("hello")?;
766    /// # Ok::<(), std::alloc::AllocError>(())
767    /// ```
768    #[unstable(feature = "clone_from_ref", issue = "149075")]
769    //#[unstable(feature = "allocator_api", issue = "32838")]
770    #[must_use]
771    #[inline]
772    pub fn try_clone_from_ref(src: &T) -> Result<Box<T>, AllocError> {
773        Box::try_clone_from_ref_in(src, Global)
774    }
775}
776
777impl<T: ?Sized + CloneToUninit, A: Allocator> Box<T, A> {
778    /// Allocates memory in the given allocator then clones `src` into it.
779    ///
780    /// This doesn't actually allocate if `src` is zero-sized.
781    ///
782    /// # Examples
783    ///
784    /// ```
785    /// #![feature(clone_from_ref)]
786    /// #![feature(allocator_api)]
787    ///
788    /// use std::alloc::System;
789    ///
790    /// let hello: Box<str, System> = Box::clone_from_ref_in("hello", System);
791    /// ```
792    #[cfg(not(no_global_oom_handling))]
793    #[unstable(feature = "clone_from_ref", issue = "149075")]
794    //#[unstable(feature = "allocator_api", issue = "32838")]
795    #[must_use]
796    #[inline]
797    pub fn clone_from_ref_in(src: &T, alloc: A) -> Box<T, A> {
798        let layout = Layout::for_value::<T>(src);
799        match Box::try_clone_from_ref_in(src, alloc) {
800            Ok(bx) => bx,
801            Err(_) => handle_alloc_error(layout),
802        }
803    }
804
805    /// Allocates memory in the given allocator then clones `src` into it, returning an error if allocation fails.
806    ///
807    /// This doesn't actually allocate if `src` is zero-sized.
808    ///
809    /// # Examples
810    ///
811    /// ```
812    /// #![feature(clone_from_ref)]
813    /// #![feature(allocator_api)]
814    ///
815    /// use std::alloc::System;
816    ///
817    /// let hello: Box<str, System> = Box::try_clone_from_ref_in("hello", System)?;
818    /// # Ok::<(), std::alloc::AllocError>(())
819    /// ```
820    #[unstable(feature = "clone_from_ref", issue = "149075")]
821    //#[unstable(feature = "allocator_api", issue = "32838")]
822    #[must_use]
823    #[inline]
824    pub fn try_clone_from_ref_in(src: &T, alloc: A) -> Result<Box<T, A>, AllocError> {
825        struct DeallocDropGuard<'a, A: Allocator>(Layout, &'a A, NonNull<u8>);
826        impl<'a, A: Allocator> Drop for DeallocDropGuard<'a, A> {
827            fn drop(&mut self) {
828                let &mut DeallocDropGuard(layout, alloc, ptr) = self;
829                // Safety: `ptr` was allocated by `*alloc` with layout `layout`
830                unsafe {
831                    alloc.deallocate(ptr, layout);
832                }
833            }
834        }
835        let layout = Layout::for_value::<T>(src);
836        let (ptr, guard) = if layout.size() == 0 {
837            (layout.dangling(), None)
838        } else {
839            // Safety: layout is non-zero-sized
840            let ptr = alloc.allocate(layout)?.cast();
841            (ptr, Some(DeallocDropGuard(layout, &alloc, ptr)))
842        };
843        let ptr = ptr.as_ptr();
844        // Safety: `*ptr` is newly allocated, correctly aligned to `align_of_val(src)`,
845        // and is valid for writes for `size_of_val(src)`.
        // If this panics, then `guard` will deallocate for us (if allocation occurred).
847        unsafe {
848            <T as CloneToUninit>::clone_to_uninit(src, ptr);
849        }
850        // Defuse the deallocate guard
851        core::mem::forget(guard);
852        // Safety: We just initialized `*ptr` as a clone of `src`
853        Ok(unsafe { Box::from_raw_in(ptr.with_metadata_of(src), alloc) })
854    }
855}
856
857impl<T> Box<[T]> {
858    /// Constructs a new boxed slice with uninitialized contents.
859    ///
860    /// # Examples
861    ///
862    /// ```
863    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
864    /// // Deferred initialization:
865    /// values[0].write(1);
866    /// values[1].write(2);
867    /// values[2].write(3);
868    /// let values = unsafe { values.assume_init() };
869    ///
870    /// assert_eq!(*values, [1, 2, 3])
871    /// ```
872    #[cfg(not(no_global_oom_handling))]
873    #[stable(feature = "new_uninit", since = "1.82.0")]
874    #[must_use]
875    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
876        unsafe { RawVec::with_capacity(len).into_box(len) }
877    }
878
879    /// Constructs a new boxed slice with uninitialized contents, with the memory
880    /// being filled with `0` bytes.
881    ///
882    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
883    /// of this method.
884    ///
885    /// # Examples
886    ///
887    /// ```
888    /// let values = Box::<[u32]>::new_zeroed_slice(3);
889    /// let values = unsafe { values.assume_init() };
890    ///
891    /// assert_eq!(*values, [0, 0, 0])
892    /// ```
893    ///
894    /// [zeroed]: mem::MaybeUninit::zeroed
895    #[cfg(not(no_global_oom_handling))]
896    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
897    #[must_use]
898    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
899        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
900    }
901
902    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
903    /// the allocation fails.
904    ///
905    /// # Examples
906    ///
907    /// ```
908    /// #![feature(allocator_api)]
909    ///
910    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
911    /// // Deferred initialization:
912    /// values[0].write(1);
913    /// values[1].write(2);
914    /// values[2].write(3);
915    /// let values = unsafe { values.assume_init() };
916    ///
917    /// assert_eq!(*values, [1, 2, 3]);
918    /// # Ok::<(), std::alloc::AllocError>(())
919    /// ```
920    #[unstable(feature = "allocator_api", issue = "32838")]
921    #[inline]
922    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
923        let ptr = if T::IS_ZST || len == 0 {
924            NonNull::dangling()
925        } else {
926            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
927                Ok(l) => l,
928                Err(_) => return Err(AllocError),
929            };
930            Global.allocate(layout)?.cast()
931        };
932        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
933    }
934
935    /// Constructs a new boxed slice with uninitialized contents, with the memory
936    /// being filled with `0` bytes. Returns an error if the allocation fails.
937    ///
938    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
939    /// of this method.
940    ///
941    /// # Examples
942    ///
943    /// ```
944    /// #![feature(allocator_api)]
945    ///
946    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
947    /// let values = unsafe { values.assume_init() };
948    ///
949    /// assert_eq!(*values, [0, 0, 0]);
950    /// # Ok::<(), std::alloc::AllocError>(())
951    /// ```
952    ///
953    /// [zeroed]: mem::MaybeUninit::zeroed
954    #[unstable(feature = "allocator_api", issue = "32838")]
955    #[inline]
956    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
957        let ptr = if T::IS_ZST || len == 0 {
958            NonNull::dangling()
959        } else {
960            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
961                Ok(l) => l,
962                Err(_) => return Err(AllocError),
963            };
964            Global.allocate_zeroed(layout)?.cast()
965        };
966        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
967    }
968
969    /// Converts the boxed slice into a boxed array.
970    ///
971    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
972    ///
973    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
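    ///
    /// # Examples
    ///
    /// A minimal sketch of both the matching and the non-matching case:
    ///
    /// ```
    /// #![feature(alloc_slice_into_array)]
    ///
    /// let boxed: Box<[u32]> = Box::new([1, 2, 3]);
    /// let array: Box<[u32; 3]> = boxed.into_array().unwrap();
    /// assert_eq!(*array, [1, 2, 3]);
    ///
    /// let boxed: Box<[u32]> = Box::new([1, 2, 3]);
    /// assert!(boxed.into_array::<2>().is_none());
    /// ```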
974    #[unstable(feature = "alloc_slice_into_array", issue = "148082")]
975    #[inline]
976    #[must_use]
977    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
978        if self.len() == N {
979            let ptr = Self::into_raw(self) as *mut [T; N];
980
981            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
982            let me = unsafe { Box::from_raw(ptr) };
983            Some(me)
984        } else {
985            None
986        }
987    }
988}
989
990impl<T, A: Allocator> Box<[T], A> {
991    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
992    ///
993    /// # Examples
994    ///
995    /// ```
996    /// #![feature(allocator_api)]
997    ///
998    /// use std::alloc::System;
999    ///
1000    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
1001    /// // Deferred initialization:
1002    /// values[0].write(1);
1003    /// values[1].write(2);
1004    /// values[2].write(3);
1005    /// let values = unsafe { values.assume_init() };
1006    ///
1007    /// assert_eq!(*values, [1, 2, 3])
1008    /// ```
1009    #[cfg(not(no_global_oom_handling))]
1010    #[unstable(feature = "allocator_api", issue = "32838")]
1011    #[must_use]
1012    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1013        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
1014    }
1015
1016    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
1017    /// with the memory being filled with `0` bytes.
1018    ///
1019    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1020    /// of this method.
1021    ///
1022    /// # Examples
1023    ///
1024    /// ```
1025    /// #![feature(allocator_api)]
1026    ///
1027    /// use std::alloc::System;
1028    ///
1029    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
1030    /// let values = unsafe { values.assume_init() };
1031    ///
1032    /// assert_eq!(*values, [0, 0, 0])
1033    /// ```
1034    ///
1035    /// [zeroed]: mem::MaybeUninit::zeroed
1036    #[cfg(not(no_global_oom_handling))]
1037    #[unstable(feature = "allocator_api", issue = "32838")]
1038    #[must_use]
1039    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1040        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
1041    }
1042
1043    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
1044    /// the allocation fails.
1045    ///
1046    /// # Examples
1047    ///
1048    /// ```
1049    /// #![feature(allocator_api)]
1050    ///
1051    /// use std::alloc::System;
1052    ///
1053    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
1054    /// // Deferred initialization:
1055    /// values[0].write(1);
1056    /// values[1].write(2);
1057    /// values[2].write(3);
1058    /// let values = unsafe { values.assume_init() };
1059    ///
1060    /// assert_eq!(*values, [1, 2, 3]);
1061    /// # Ok::<(), std::alloc::AllocError>(())
1062    /// ```
1063    #[unstable(feature = "allocator_api", issue = "32838")]
1064    #[inline]
1065    pub fn try_new_uninit_slice_in(
1066        len: usize,
1067        alloc: A,
1068    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1069        let ptr = if T::IS_ZST || len == 0 {
1070            NonNull::dangling()
1071        } else {
1072            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1073                Ok(l) => l,
1074                Err(_) => return Err(AllocError),
1075            };
1076            alloc.allocate(layout)?.cast()
1077        };
1078        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1079    }
1080
1081    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
1082    /// being filled with `0` bytes. Returns an error if the allocation fails.
1083    ///
1084    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1085    /// of this method.
1086    ///
1087    /// # Examples
1088    ///
1089    /// ```
1090    /// #![feature(allocator_api)]
1091    ///
1092    /// use std::alloc::System;
1093    ///
1094    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
1095    /// let values = unsafe { values.assume_init() };
1096    ///
1097    /// assert_eq!(*values, [0, 0, 0]);
1098    /// # Ok::<(), std::alloc::AllocError>(())
1099    /// ```
1100    ///
1101    /// [zeroed]: mem::MaybeUninit::zeroed
1102    #[unstable(feature = "allocator_api", issue = "32838")]
1103    #[inline]
1104    pub fn try_new_zeroed_slice_in(
1105        len: usize,
1106        alloc: A,
1107    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1108        let ptr = if T::IS_ZST || len == 0 {
1109            NonNull::dangling()
1110        } else {
1111            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1112                Ok(l) => l,
1113                Err(_) => return Err(AllocError),
1114            };
1115            alloc.allocate_zeroed(layout)?.cast()
1116        };
1117        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1118    }
1119}
1120
1121impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
1122    /// Converts to `Box<T, A>`.
1123    ///
1124    /// # Safety
1125    ///
1126    /// As with [`MaybeUninit::assume_init`],
1127    /// it is up to the caller to guarantee that the value
1128    /// really is in an initialized state.
1129    /// Calling this when the content is not yet fully initialized
1130    /// causes immediate undefined behavior.
1131    ///
1132    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1133    ///
1134    /// # Examples
1135    ///
1136    /// ```
1137    /// let mut five = Box::<u32>::new_uninit();
1138    /// // Deferred initialization:
1139    /// five.write(5);
1140    /// let five: Box<u32> = unsafe { five.assume_init() };
1141    ///
1142    /// assert_eq!(*five, 5)
1143    /// ```
1144    #[stable(feature = "new_uninit", since = "1.82.0")]
1145    #[inline]
1146    pub unsafe fn assume_init(self) -> Box<T, A> {
1147        let (raw, alloc) = Box::into_raw_with_allocator(self);
1148        unsafe { Box::from_raw_in(raw as *mut T, alloc) }
1149    }
1150
1151    /// Writes the value and converts to `Box<T, A>`.
1152    ///
    /// This method converts the box similarly to [`Box::assume_init`] but
    /// writes `value` into it before the conversion, thus guaranteeing safety.
    /// In some scenarios use of this method may improve performance because
    /// the compiler may be able to optimize copying from the stack.
1157    ///
1158    /// # Examples
1159    ///
1160    /// ```
1161    /// let big_box = Box::<[usize; 1024]>::new_uninit();
1162    ///
1163    /// let mut array = [0; 1024];
1164    /// for (i, place) in array.iter_mut().enumerate() {
1165    ///     *place = i;
1166    /// }
1167    ///
    /// // The optimizer may be able to elide this copy, so the previous code writes
    /// // to the heap directly.
1170    /// let big_box = Box::write(big_box, array);
1171    ///
1172    /// for (i, x) in big_box.iter().enumerate() {
1173    ///     assert_eq!(*x, i);
1174    /// }
1175    /// ```
1176    #[stable(feature = "box_uninit_write", since = "1.87.0")]
1177    #[inline]
1178    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
1179        unsafe {
1180            (*boxed).write(value);
1181            boxed.assume_init()
1182        }
1183    }
1184}
1185
1186impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
1187    /// Converts to `Box<[T], A>`.
1188    ///
1189    /// # Safety
1190    ///
1191    /// As with [`MaybeUninit::assume_init`],
1192    /// it is up to the caller to guarantee that the values
1193    /// really are in an initialized state.
1194    /// Calling this when the content is not yet fully initialized
1195    /// causes immediate undefined behavior.
1196    ///
1197    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1198    ///
1199    /// # Examples
1200    ///
1201    /// ```
1202    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
1203    /// // Deferred initialization:
1204    /// values[0].write(1);
1205    /// values[1].write(2);
1206    /// values[2].write(3);
1207    /// let values = unsafe { values.assume_init() };
1208    ///
1209    /// assert_eq!(*values, [1, 2, 3])
1210    /// ```
1211    #[stable(feature = "new_uninit", since = "1.82.0")]
1212    #[inline]
1213    pub unsafe fn assume_init(self) -> Box<[T], A> {
1214        let (raw, alloc) = Box::into_raw_with_allocator(self);
1215        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
1216    }
1217}
1218
1219impl<T: ?Sized> Box<T> {
1220    /// Constructs a box from a raw pointer.
1221    ///
1222    /// After calling this function, the raw pointer is owned by the
1223    /// resulting `Box`. Specifically, the `Box` destructor will call
1224    /// the destructor of `T` and free the allocated memory. For this
1225    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1227    ///
1228    /// # Safety
1229    ///
1230    /// This function is unsafe because improper use may lead to
1231    /// memory problems. For example, a double-free may occur if the
1232    /// function is called twice on the same raw pointer.
1233    ///
1234    /// The raw pointer must point to a block of memory allocated by the global allocator.
1235    ///
1236    /// The safety conditions are described in the [memory layout] section.
1237    ///
1238    /// # Examples
1239    ///
1240    /// Recreate a `Box` which was previously converted to a raw pointer
1241    /// using [`Box::into_raw`]:
1242    /// ```
1243    /// let x = Box::new(5);
1244    /// let ptr = Box::into_raw(x);
1245    /// let x = unsafe { Box::from_raw(ptr) };
1246    /// ```
1247    /// Manually create a `Box` from scratch by using the global allocator:
1248    /// ```
1249    /// use std::alloc::{alloc, Layout};
1250    ///
1251    /// unsafe {
1252    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1253    ///     // In general .write is required to avoid attempting to destruct
1254    ///     // the (uninitialized) previous contents of `ptr`, though for this
1255    ///     // simple example `*ptr = 5` would have worked as well.
1256    ///     ptr.write(5);
1257    ///     let x = Box::from_raw(ptr);
1258    /// }
1259    /// ```
1260    ///
1261    /// [memory layout]: self#memory-layout
1262    #[stable(feature = "box_raw", since = "1.4.0")]
1263    #[inline]
1264    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1265    pub unsafe fn from_raw(raw: *mut T) -> Self {
1266        unsafe { Self::from_raw_in(raw, Global) }
1267    }
1268
1269    /// Constructs a box from a `NonNull` pointer.
1270    ///
1271    /// After calling this function, the `NonNull` pointer is owned by
1272    /// the resulting `Box`. Specifically, the `Box` destructor will call
1273    /// the destructor of `T` and free the allocated memory. For this
1274    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1276    ///
1277    /// # Safety
1278    ///
1279    /// This function is unsafe because improper use may lead to
1280    /// memory problems. For example, a double-free may occur if the
1281    /// function is called twice on the same `NonNull` pointer.
1282    ///
1283    /// The non-null pointer must point to a block of memory allocated by the global allocator.
1284    ///
1285    /// The safety conditions are described in the [memory layout] section.
1286    ///
1287    /// # Examples
1288    ///
1289    /// Recreate a `Box` which was previously converted to a `NonNull`
1290    /// pointer using [`Box::into_non_null`]:
1291    /// ```
1292    /// #![feature(box_vec_non_null)]
1293    ///
1294    /// let x = Box::new(5);
1295    /// let non_null = Box::into_non_null(x);
1296    /// let x = unsafe { Box::from_non_null(non_null) };
1297    /// ```
1298    /// Manually create a `Box` from scratch by using the global allocator:
1299    /// ```
1300    /// #![feature(box_vec_non_null)]
1301    ///
1302    /// use std::alloc::{alloc, Layout};
1303    /// use std::ptr::NonNull;
1304    ///
1305    /// unsafe {
1306    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1307    ///         .expect("allocation failed");
1308    ///     // In general .write is required to avoid attempting to destruct
1309    ///     // the (uninitialized) previous contents of `non_null`.
1310    ///     non_null.write(5);
1311    ///     let x = Box::from_non_null(non_null);
1312    /// }
1313    /// ```
1314    ///
1315    /// [memory layout]: self#memory-layout
1316    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1317    #[inline]
1318    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1319    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1320        unsafe { Self::from_raw(ptr.as_ptr()) }
1321    }
1322
1323    /// Consumes the `Box`, returning a wrapped raw pointer.
1324    ///
1325    /// The pointer will be properly aligned and non-null.
1326    ///
1327    /// After calling this function, the caller is responsible for the
1328    /// memory previously managed by the `Box`. In particular, the
1329    /// caller should properly destroy `T` and release the memory, taking
1330    /// into account the [memory layout] used by `Box`. The easiest way to
1331    /// do this is to convert the raw pointer back into a `Box` with the
1332    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1333    /// the cleanup.
1334    ///
1335    /// Note: this is an associated function, which means that you have
1336    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1337    /// is so that there is no conflict with a method on the inner type.
1338    ///
1339    /// # Examples
1340    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1341    /// for automatic cleanup:
1342    /// ```
1343    /// let x = Box::new(String::from("Hello"));
1344    /// let ptr = Box::into_raw(x);
1345    /// let x = unsafe { Box::from_raw(ptr) };
1346    /// ```
1347    /// Manual cleanup by explicitly running the destructor and deallocating
1348    /// the memory:
1349    /// ```
1350    /// use std::alloc::{dealloc, Layout};
1351    /// use std::ptr;
1352    ///
1353    /// let x = Box::new(String::from("Hello"));
1354    /// let ptr = Box::into_raw(x);
1355    /// unsafe {
1356    ///     ptr::drop_in_place(ptr);
1357    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
1358    /// }
1359    /// ```
1360    /// Note: This is equivalent to the following:
1361    /// ```
1362    /// let x = Box::new(String::from("Hello"));
1363    /// let ptr = Box::into_raw(x);
1364    /// unsafe {
1365    ///     drop(Box::from_raw(ptr));
1366    /// }
1367    /// ```
1368    ///
1369    /// [memory layout]: self#memory-layout
1370    #[must_use = "losing the pointer will leak memory"]
1371    #[stable(feature = "box_raw", since = "1.4.0")]
1372    #[inline]
1373    pub fn into_raw(b: Self) -> *mut T {
1374        // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1375        let mut b = mem::ManuallyDrop::new(b);
1376        // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
        // operation for its alias tracking.
1378        &raw mut **b
1379    }
1380
1381    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1382    ///
1383    /// The pointer will be properly aligned.
1384    ///
1385    /// After calling this function, the caller is responsible for the
1386    /// memory previously managed by the `Box`. In particular, the
1387    /// caller should properly destroy `T` and release the memory, taking
1388    /// into account the [memory layout] used by `Box`. The easiest way to
1389    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1390    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1391    /// perform the cleanup.
1392    ///
1393    /// Note: this is an associated function, which means that you have
1394    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1395    /// This is so that there is no conflict with a method on the inner type.
1396    ///
1397    /// # Examples
1398    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1399    /// for automatic cleanup:
1400    /// ```
1401    /// #![feature(box_vec_non_null)]
1402    ///
1403    /// let x = Box::new(String::from("Hello"));
1404    /// let non_null = Box::into_non_null(x);
1405    /// let x = unsafe { Box::from_non_null(non_null) };
1406    /// ```
1407    /// Manual cleanup by explicitly running the destructor and deallocating
1408    /// the memory:
1409    /// ```
1410    /// #![feature(box_vec_non_null)]
1411    ///
1412    /// use std::alloc::{dealloc, Layout};
1413    ///
1414    /// let x = Box::new(String::from("Hello"));
1415    /// let non_null = Box::into_non_null(x);
1416    /// unsafe {
1417    ///     non_null.drop_in_place();
1418    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1419    /// }
1420    /// ```
1421    /// Note: This is equivalent to the following:
1422    /// ```
1423    /// #![feature(box_vec_non_null)]
1424    ///
1425    /// let x = Box::new(String::from("Hello"));
1426    /// let non_null = Box::into_non_null(x);
1427    /// unsafe {
1428    ///     drop(Box::from_non_null(non_null));
1429    /// }
1430    /// ```
1431    ///
1432    /// [memory layout]: self#memory-layout
1433    #[must_use = "losing the pointer will leak memory"]
1434    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1435    #[inline]
1436    pub fn into_non_null(b: Self) -> NonNull<T> {
1437        // SAFETY: `Box` is guaranteed to be non-null.
1438        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1439    }
1440}
1441
1442impl<T: ?Sized, A: Allocator> Box<T, A> {
1443    /// Constructs a box from a raw pointer in the given allocator.
1444    ///
1445    /// After calling this function, the raw pointer is owned by the
1446    /// resulting `Box`. Specifically, the `Box` destructor will call
1447    /// the destructor of `T` and free the allocated memory. For this
1448    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1450    ///
1451    /// # Safety
1452    ///
1453    /// This function is unsafe because improper use may lead to
1454    /// memory problems. For example, a double-free may occur if the
1455    /// function is called twice on the same raw pointer.
1456    ///
1457    /// The raw pointer must point to a block of memory allocated by `alloc`.
1458    ///
1459    /// # Examples
1460    ///
1461    /// Recreate a `Box` which was previously converted to a raw pointer
1462    /// using [`Box::into_raw_with_allocator`]:
1463    /// ```
1464    /// #![feature(allocator_api)]
1465    ///
1466    /// use std::alloc::System;
1467    ///
1468    /// let x = Box::new_in(5, System);
1469    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1470    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1471    /// ```
1472    /// Manually create a `Box` from scratch by using the system allocator:
1473    /// ```
1474    /// #![feature(allocator_api, slice_ptr_get)]
1475    ///
1476    /// use std::alloc::{Allocator, Layout, System};
1477    ///
1478    /// unsafe {
1479    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1480    ///     // In general .write is required to avoid attempting to destruct
1481    ///     // the (uninitialized) previous contents of `ptr`, though for this
1482    ///     // simple example `*ptr = 5` would have worked as well.
1483    ///     ptr.write(5);
1484    ///     let x = Box::from_raw_in(ptr, System);
1485    /// }
1486    /// # Ok::<(), std::alloc::AllocError>(())
1487    /// ```
1488    ///
1489    /// [memory layout]: self#memory-layout
1490    #[unstable(feature = "allocator_api", issue = "32838")]
1491    #[inline]
1492    pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1493        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1494    }
1495
1496    /// Constructs a box from a `NonNull` pointer in the given allocator.
1497    ///
1498    /// After calling this function, the `NonNull` pointer is owned by
1499    /// the resulting `Box`. Specifically, the `Box` destructor will call
1500    /// the destructor of `T` and free the allocated memory. For this
1501    /// to be safe, the memory must have been allocated in accordance
1502    /// with the [memory layout] used by `Box`.
1503    ///
1504    /// # Safety
1505    ///
1506    /// This function is unsafe because improper use may lead to
1507    /// memory problems. For example, a double-free may occur if the
1508    /// function is called twice on the same raw pointer.
1509    ///
1510    /// The non-null pointer must point to a block of memory allocated by `alloc`.
1511    ///
1512    /// # Examples
1513    ///
1514    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1515    /// using [`Box::into_non_null_with_allocator`]:
1516    /// ```
1517    /// #![feature(allocator_api, box_vec_non_null)]
1518    ///
1519    /// use std::alloc::System;
1520    ///
1521    /// let x = Box::new_in(5, System);
1522    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1523    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1524    /// ```
1525    /// Manually create a `Box` from scratch by using the system allocator:
1526    /// ```
1527    /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1528    ///
1529    /// use std::alloc::{Allocator, Layout, System};
1530    ///
1531    /// unsafe {
1532    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1533    ///     // In general, `.write` is required to avoid attempting to drop
1534    ///     // the (uninitialized) previous contents of `non_null`.
1535    ///     non_null.write(5);
1536    ///     let x = Box::from_non_null_in(non_null, System);
1537    /// }
1538    /// # Ok::<(), std::alloc::AllocError>(())
1539    /// ```
1540    ///
1541    /// [memory layout]: self#memory-layout
1542    #[unstable(feature = "allocator_api", issue = "32838")]
1543    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1544    #[inline]
1545    pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1546        // SAFETY: guaranteed by the caller.
1547        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1548    }
1549
1550    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1551    ///
1552    /// The pointer will be properly aligned and non-null.
1553    ///
1554    /// After calling this function, the caller is responsible for the
1555    /// memory previously managed by the `Box`. In particular, the
1556    /// caller should properly destroy `T` and release the memory, taking
1557    /// into account the [memory layout] used by `Box`. The easiest way to
1558    /// do this is to convert the raw pointer back into a `Box` with the
1559    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1560    /// the cleanup.
1561    ///
1562    /// Note: this is an associated function, which means that you have
1563    /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1564    /// is so that there is no conflict with a method on the inner type.
1565    ///
1566    /// # Examples
1567    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1568    /// for automatic cleanup:
1569    /// ```
1570    /// #![feature(allocator_api)]
1571    ///
1572    /// use std::alloc::System;
1573    ///
1574    /// let x = Box::new_in(String::from("Hello"), System);
1575    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1576    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1577    /// ```
1578    /// Manual cleanup by explicitly running the destructor and deallocating
1579    /// the memory:
1580    /// ```
1581    /// #![feature(allocator_api)]
1582    ///
1583    /// use std::alloc::{Allocator, Layout, System};
1584    /// use std::ptr::{self, NonNull};
1585    ///
1586    /// let x = Box::new_in(String::from("Hello"), System);
1587    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1588    /// unsafe {
1589    ///     ptr::drop_in_place(ptr);
1590    ///     let non_null = NonNull::new_unchecked(ptr);
1591    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
1592    /// }
1593    /// ```
1594    ///
1595    /// [memory layout]: self#memory-layout
1596    #[must_use = "losing the pointer will leak memory"]
1597    #[unstable(feature = "allocator_api", issue = "32838")]
1598    #[inline]
1599    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1600        let mut b = mem::ManuallyDrop::new(b);
1601        // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1602        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1603        // want *no* aliasing requirements here!
1604        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1605        // works around that.
1606        let ptr = &raw mut **b;
1607        let alloc = unsafe { ptr::read(&b.1) };
1608        (ptr, alloc)
1609    }
1610
1611    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1612    ///
1613    /// The pointer will be properly aligned.
1614    ///
1615    /// After calling this function, the caller is responsible for the
1616    /// memory previously managed by the `Box`. In particular, the
1617    /// caller should properly destroy `T` and release the memory, taking
1618    /// into account the [memory layout] used by `Box`. The easiest way to
1619    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1620    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1621    /// perform the cleanup.
1622    ///
1623    /// Note: this is an associated function, which means that you have
1624    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1625    /// `b.into_non_null_with_allocator()`. This is so that there is no
1626    /// conflict with a method on the inner type.
1627    ///
1628    /// # Examples
1629    /// Converting the `NonNull` pointer back into a `Box` with
1630    /// [`Box::from_non_null_in`] for automatic cleanup:
1631    /// ```
1632    /// #![feature(allocator_api, box_vec_non_null)]
1633    ///
1634    /// use std::alloc::System;
1635    ///
1636    /// let x = Box::new_in(String::from("Hello"), System);
1637    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1638    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1639    /// ```
1640    /// Manual cleanup by explicitly running the destructor and deallocating
1641    /// the memory:
1642    /// ```
1643    /// #![feature(allocator_api, box_vec_non_null)]
1644    ///
1645    /// use std::alloc::{Allocator, Layout, System};
1646    ///
1647    /// let x = Box::new_in(String::from("Hello"), System);
1648    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1649    /// unsafe {
1650    ///     non_null.drop_in_place();
1651    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1652    /// }
1653    /// ```
1654    ///
1655    /// [memory layout]: self#memory-layout
1656    #[must_use = "losing the pointer will leak memory"]
1657    #[unstable(feature = "allocator_api", issue = "32838")]
1658    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1659    #[inline]
1660    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1661        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1662        // SAFETY: `Box` is guaranteed to be non-null.
1663        unsafe { (NonNull::new_unchecked(ptr), alloc) }
1664    }
1665
1666    #[unstable(
1667        feature = "ptr_internals",
1668        issue = "none",
1669        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1670    )]
1671    #[inline]
1672    #[doc(hidden)]
1673    pub fn into_unique(b: Self) -> (Unique<T>, A) {
1674        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1675        unsafe { (Unique::from(&mut *ptr), alloc) }
1676    }
1677
1678    /// Returns a raw mutable pointer to the `Box`'s contents.
1679    ///
1680    /// The caller must ensure that the `Box` outlives the pointer this
1681    /// function returns, or else it will end up dangling.
1682    ///
1683    /// This method guarantees that, for the purposes of the aliasing model, it
1684    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1685    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1686    /// Note that calling other methods that materialize references to the memory
1687    /// may still invalidate this pointer.
1688    /// See the example below for how this guarantee can be used.
1689    ///
1690    /// # Examples
1691    ///
1692    /// Due to the aliasing guarantee, the following code is legal:
1693    ///
1694    /// ```rust
1695    /// #![feature(box_as_ptr)]
1696    ///
1697    /// unsafe {
1698    ///     let mut b = Box::new(0);
1699    ///     let ptr1 = Box::as_mut_ptr(&mut b);
1700    ///     ptr1.write(1);
1701    ///     let ptr2 = Box::as_mut_ptr(&mut b);
1702    ///     ptr2.write(2);
1703    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1704    ///     ptr1.write(3);
1705    /// }
1706    /// ```
1707    ///
1708    /// [`as_mut_ptr`]: Self::as_mut_ptr
1709    /// [`as_ptr`]: Self::as_ptr
1710    #[unstable(feature = "box_as_ptr", issue = "129090")]
1711    #[rustc_never_returns_null_ptr]
1712    #[rustc_as_ptr]
1713    #[inline]
1714    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1715        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1716        // any references.
1717        &raw mut **b
1718    }
1719
1720    /// Returns a raw pointer to the `Box`'s contents.
1721    ///
1722    /// The caller must ensure that the `Box` outlives the pointer this
1723    /// function returns, or else it will end up dangling.
1724    ///
1725    /// The caller must also ensure that the memory the pointer (non-transitively) points to
1726    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1727    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1728    ///
1729    /// This method guarantees that, for the purposes of the aliasing model, it
1730    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1731    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1732    /// Note that calling other methods that materialize mutable references to the memory,
1733    /// as well as writing to this memory, may still invalidate this pointer.
1734    /// See the example below for how this guarantee can be used.
1735    ///
1736    /// # Examples
1737    ///
1738    /// Due to the aliasing guarantee, the following code is legal:
1739    ///
1740    /// ```rust
1741    /// #![feature(box_as_ptr)]
1742    ///
1743    /// unsafe {
1744    ///     let mut v = Box::new(0);
1745    ///     let ptr1 = Box::as_ptr(&v);
1746    ///     let ptr2 = Box::as_mut_ptr(&mut v);
1747    ///     let _val = ptr2.read();
1748    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
1749    ///     let _val = ptr1.read();
1750    ///     // However, once we do a write...
1751    ///     ptr2.write(1);
1752    ///     // ... `ptr1` is no longer valid.
1753    ///     // This would be UB: let _val = ptr1.read();
1754    /// }
1755    /// ```
1756    ///
1757    /// [`as_mut_ptr`]: Self::as_mut_ptr
1758    /// [`as_ptr`]: Self::as_ptr
1759    #[unstable(feature = "box_as_ptr", issue = "129090")]
1760    #[rustc_never_returns_null_ptr]
1761    #[rustc_as_ptr]
1762    #[inline]
1763    pub fn as_ptr(b: &Self) -> *const T {
1764        // This is a primitive deref, not going through `Deref`, and therefore not materializing
1765        // any references.
1766        &raw const **b
1767    }
1768
1769    /// Returns a reference to the underlying allocator.
1770    ///
1771    /// Note: this is an associated function, which means that you have
1772    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1773    /// is so that there is no conflict with a method on the inner type.
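    ///
    /// # Examples
    ///
    /// A minimal sketch of retrieving the allocator from a box created with
    /// [`Box::new_in`] (any allocator works the same way):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let b = Box::new_in(5, System);
    /// let _alloc: &System = Box::allocator(&b);
    /// ```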
1774    #[unstable(feature = "allocator_api", issue = "32838")]
1775    #[inline]
1776    pub fn allocator(b: &Self) -> &A {
1777        &b.1
1778    }
1779
1780    /// Consumes and leaks the `Box`, returning a mutable reference,
1781    /// `&'a mut T`.
1782    ///
1783    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1784    /// has only static references, or none at all, then this may be chosen to be
1785    /// `'static`.
1786    ///
1787    /// This function is mainly useful for data that lives for the remainder of
1788    /// the program's life. Dropping the returned reference will cause a memory
1789    /// leak. If this is not acceptable, the reference should first be wrapped
1790    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1791    /// then be dropped which will properly destroy `T` and release the
1792    /// allocated memory.
1793    ///
1794    /// Note: this is an associated function, which means that you have
1795    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1796    /// is so that there is no conflict with a method on the inner type.
1797    ///
1798    /// # Examples
1799    ///
1800    /// Simple usage:
1801    ///
1802    /// ```
1803    /// let x = Box::new(41);
1804    /// let static_ref: &'static mut usize = Box::leak(x);
1805    /// *static_ref += 1;
1806    /// assert_eq!(*static_ref, 42);
1807    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1808    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1809    /// # drop(unsafe { Box::from_raw(static_ref) });
1810    /// ```
1811    ///
1812    /// Unsized data:
1813    ///
1814    /// ```
1815    /// let x = vec![1, 2, 3].into_boxed_slice();
1816    /// let static_ref = Box::leak(x);
1817    /// static_ref[0] = 4;
1818    /// assert_eq!(*static_ref, [4, 2, 3]);
1819    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1820    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1821    /// # drop(unsafe { Box::from_raw(static_ref) });
1822    /// ```
1823    #[stable(feature = "box_leak", since = "1.26.0")]
1824    #[inline]
1825    pub fn leak<'a>(b: Self) -> &'a mut T
1826    where
1827        A: 'a,
1828    {
1829        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1830        mem::forget(alloc);
1831        unsafe { &mut *ptr }
1832    }
1833
1834    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1835    /// `*boxed` will be pinned in memory and unable to be moved.
1836    ///
1837    /// This conversion does not allocate on the heap and happens in place.
1838    ///
1839    /// This is also available via [`From`].
1840    ///
1841    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1842    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1843    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1844    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
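    ///
    /// # Examples
    ///
    /// A brief illustration of pinning a value that is already boxed:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let boxed: Box<u32> = Box::new(5);
    /// let pinned: Pin<Box<u32>> = Box::into_pin(boxed);
    /// assert_eq!(*pinned, 5);
    /// ```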
1845    ///
1846    /// # Notes
1847    ///
1848    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1849    /// as it'll introduce an ambiguity when calling `Pin::from`.
1850    /// A demonstration of such a poor impl is shown below.
1851    ///
1852    /// ```compile_fail
1853    /// # use std::pin::Pin;
1854    /// struct Foo; // A type defined in this crate.
1855    /// impl From<Box<()>> for Pin<Foo> {
1856    ///     fn from(_: Box<()>) -> Pin<Foo> {
1857    ///         Pin::new(Foo)
1858    ///     }
1859    /// }
1860    ///
1861    /// let foo = Box::new(());
1862    /// let bar = Pin::from(foo);
1863    /// ```
1864    #[stable(feature = "box_into_pin", since = "1.63.0")]
1865    pub fn into_pin(boxed: Self) -> Pin<Self>
1866    where
1867        A: 'static,
1868    {
1869        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1870        // when `T: !Unpin`, so it's safe to pin it directly without any
1871        // additional requirements.
1872        unsafe { Pin::new_unchecked(boxed) }
1873    }
1874}
1875
1876#[stable(feature = "rust1", since = "1.0.0")]
1877unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1878    #[inline]
1879    fn drop(&mut self) {
1880        // the T in the Box is dropped by the compiler before the destructor is run
1881
1882        let ptr = self.0;
1883
1884        unsafe {
1885            let layout = Layout::for_value_raw(ptr.as_ptr());
1886            if layout.size() != 0 {
1887                self.1.deallocate(From::from(ptr.cast()), layout);
1888            }
1889        }
1890    }
1891}
1892
1893#[cfg(not(no_global_oom_handling))]
1894#[stable(feature = "rust1", since = "1.0.0")]
1895impl<T: Default> Default for Box<T> {
1896    /// Creates a `Box<T>` with the `Default` value for `T`.
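    ///
    /// # Examples
    ///
    /// A minimal sketch of the default value for a boxed integer:
    ///
    /// ```
    /// let b: Box<i32> = Box::default();
    /// assert_eq!(*b, 0);
    /// ```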
1897    #[inline]
1898    fn default() -> Self {
1899        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1900        unsafe {
1901            // SAFETY: `x` is valid for writing and has the same layout as `T`.
1902            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1903            // does not have a destructor.
1904            //
1905            // We use `ptr::write` as `MaybeUninit::write` creates
1906            // extra stack copies of `T` in debug mode.
1907            //
1908            // See https://github.com/rust-lang/rust/issues/136043 for more context.
1909            ptr::write(&raw mut *x as *mut T, T::default());
1910            // SAFETY: `x` was just initialized above.
1911            x.assume_init()
1912        }
1913    }
1914}
1915
1916#[cfg(not(no_global_oom_handling))]
1917#[stable(feature = "rust1", since = "1.0.0")]
1918impl<T> Default for Box<[T]> {
1919    /// Creates an empty `[T]` inside a `Box`.
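    ///
    /// # Examples
    ///
    /// A short illustration of the empty default slice:
    ///
    /// ```
    /// let slice: Box<[i32]> = Box::default();
    /// assert!(slice.is_empty());
    /// ```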
1920    #[inline]
1921    fn default() -> Self {
1922        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1923        Box(ptr, Global)
1924    }
1925}
1926
1927#[cfg(not(no_global_oom_handling))]
1928#[stable(feature = "default_box_extra", since = "1.17.0")]
1929impl Default for Box<str> {
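    /// Creates an empty `str` inside a `Box`.
    ///
    /// A short illustration of the default value:
    ///
    /// ```
    /// let s: Box<str> = Box::default();
    /// assert!(s.is_empty());
    /// ```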
1930    #[inline]
1931    fn default() -> Self {
1932        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1933        let ptr: Unique<str> = unsafe {
1934            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1935            Unique::new_unchecked(bytes.as_ptr() as *mut str)
1936        };
1937        Box(ptr, Global)
1938    }
1939}
1940
1941#[cfg(not(no_global_oom_handling))]
1942#[stable(feature = "pin_default_impls", since = "1.91.0")]
1943impl<T> Default for Pin<Box<T>>
1944where
1945    T: ?Sized,
1946    Box<T>: Default,
1947{
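    /// Creates a pinned box holding the `Default` value of `Box<T>`.
    ///
    /// A minimal sketch:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u8>> = Default::default();
    /// assert_eq!(*pinned, 0);
    /// ```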
1948    #[inline]
1949    fn default() -> Self {
1950        Box::into_pin(Box::<T>::default())
1951    }
1952}
1953
1954#[cfg(not(no_global_oom_handling))]
1955#[stable(feature = "rust1", since = "1.0.0")]
1956impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1957    /// Returns a new box with a `clone()` of this box's contents.
1958    ///
1959    /// # Examples
1960    ///
1961    /// ```
1962    /// let x = Box::new(5);
1963    /// let y = x.clone();
1964    ///
1965    /// // The value is the same
1966    /// assert_eq!(x, y);
1967    ///
1968    /// // But they are unique objects
1969    /// assert_ne!(&*x as *const i32, &*y as *const i32);
1970    /// ```
1971    #[inline]
1972    fn clone(&self) -> Self {
1973        // Pre-allocate memory to allow writing the cloned value directly.
1974        let mut boxed = Self::new_uninit_in(self.1.clone());
1975        unsafe {
1976            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1977            boxed.assume_init()
1978        }
1979    }
1980
1981    /// Copies `source`'s contents into `self` without creating a new allocation.
1982    ///
1983    /// # Examples
1984    ///
1985    /// ```
1986    /// let x = Box::new(5);
1987    /// let mut y = Box::new(10);
1988    /// let yp: *const i32 = &*y;
1989    ///
1990    /// y.clone_from(&x);
1991    ///
1992    /// // The value is the same
1993    /// assert_eq!(x, y);
1994    ///
1995    /// // And no allocation occurred
1996    /// assert_eq!(yp, &*y);
1997    /// ```
1998    #[inline]
1999    fn clone_from(&mut self, source: &Self) {
2000        (**self).clone_from(&(**source));
2001    }
2002}
2003
2004#[cfg(not(no_global_oom_handling))]
2005#[stable(feature = "box_slice_clone", since = "1.3.0")]
2006impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
2007    fn clone(&self) -> Self {
2008        let alloc = Box::allocator(self).clone();
2009        self.to_vec_in(alloc).into_boxed_slice()
2010    }
2011
2012    /// Copies `source`'s contents into `self` without creating a new allocation,
2013    /// so long as the two are of the same length.
2014    ///
2015    /// # Examples
2016    ///
2017    /// ```
2018    /// let x = Box::new([5, 6, 7]);
2019    /// let mut y = Box::new([8, 9, 10]);
2020    /// let yp: *const [i32] = &*y;
2021    ///
2022    /// y.clone_from(&x);
2023    ///
2024    /// // The value is the same
2025    /// assert_eq!(x, y);
2026    ///
2027    /// // And no allocation occurred
2028    /// assert_eq!(yp, &*y);
2029    /// ```
2030    fn clone_from(&mut self, source: &Self) {
2031        if self.len() == source.len() {
2032            self.clone_from_slice(&source);
2033        } else {
2034            *self = source.clone();
2035        }
2036    }
2037}
2038
2039#[cfg(not(no_global_oom_handling))]
2040#[stable(feature = "box_slice_clone", since = "1.3.0")]
2041impl Clone for Box<str> {
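    /// Returns a new boxed `str` with a copy of this box's contents.
    ///
    /// A brief illustration:
    ///
    /// ```
    /// let s: Box<str> = "hello".into();
    /// let t = s.clone();
    /// assert_eq!(s, t);
    /// ```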
2042    fn clone(&self) -> Self {
2043        // this makes a copy of the data
2044        let buf: Box<[u8]> = self.as_bytes().into();
2045        unsafe { from_boxed_utf8_unchecked(buf) }
2046    }
2047}
2048
2049#[stable(feature = "rust1", since = "1.0.0")]
2050impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
2051    #[inline]
2052    fn eq(&self, other: &Self) -> bool {
2053        PartialEq::eq(&**self, &**other)
2054    }
2055    #[inline]
2056    fn ne(&self, other: &Self) -> bool {
2057        PartialEq::ne(&**self, &**other)
2058    }
2059}
2060
2061#[stable(feature = "rust1", since = "1.0.0")]
2062impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
2063    #[inline]
2064    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
2065        PartialOrd::partial_cmp(&**self, &**other)
2066    }
2067    #[inline]
2068    fn lt(&self, other: &Self) -> bool {
2069        PartialOrd::lt(&**self, &**other)
2070    }
2071    #[inline]
2072    fn le(&self, other: &Self) -> bool {
2073        PartialOrd::le(&**self, &**other)
2074    }
2075    #[inline]
2076    fn ge(&self, other: &Self) -> bool {
2077        PartialOrd::ge(&**self, &**other)
2078    }
2079    #[inline]
2080    fn gt(&self, other: &Self) -> bool {
2081        PartialOrd::gt(&**self, &**other)
2082    }
2083}
2084
2085#[stable(feature = "rust1", since = "1.0.0")]
2086impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
2087    #[inline]
2088    fn cmp(&self, other: &Self) -> Ordering {
2089        Ord::cmp(&**self, &**other)
2090    }
2091}
2092
2093#[stable(feature = "rust1", since = "1.0.0")]
2094impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
2095
2096#[stable(feature = "rust1", since = "1.0.0")]
2097impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
2098    fn hash<H: Hasher>(&self, state: &mut H) {
2099        (**self).hash(state);
2100    }
2101}
2102
2103#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
2104impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
2105    fn finish(&self) -> u64 {
2106        (**self).finish()
2107    }
2108    fn write(&mut self, bytes: &[u8]) {
2109        (**self).write(bytes)
2110    }
2111    fn write_u8(&mut self, i: u8) {
2112        (**self).write_u8(i)
2113    }
2114    fn write_u16(&mut self, i: u16) {
2115        (**self).write_u16(i)
2116    }
2117    fn write_u32(&mut self, i: u32) {
2118        (**self).write_u32(i)
2119    }
2120    fn write_u64(&mut self, i: u64) {
2121        (**self).write_u64(i)
2122    }
2123    fn write_u128(&mut self, i: u128) {
2124        (**self).write_u128(i)
2125    }
2126    fn write_usize(&mut self, i: usize) {
2127        (**self).write_usize(i)
2128    }
2129    fn write_i8(&mut self, i: i8) {
2130        (**self).write_i8(i)
2131    }
2132    fn write_i16(&mut self, i: i16) {
2133        (**self).write_i16(i)
2134    }
2135    fn write_i32(&mut self, i: i32) {
2136        (**self).write_i32(i)
2137    }
2138    fn write_i64(&mut self, i: i64) {
2139        (**self).write_i64(i)
2140    }
2141    fn write_i128(&mut self, i: i128) {
2142        (**self).write_i128(i)
2143    }
2144    fn write_isize(&mut self, i: isize) {
2145        (**self).write_isize(i)
2146    }
2147    fn write_length_prefix(&mut self, len: usize) {
2148        (**self).write_length_prefix(len)
2149    }
2150    fn write_str(&mut self, s: &str) {
2151        (**self).write_str(s)
2152    }
2153}
2154
2155#[stable(feature = "rust1", since = "1.0.0")]
2156impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
2157    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2158        fmt::Display::fmt(&**self, f)
2159    }
2160}
2161
2162#[stable(feature = "rust1", since = "1.0.0")]
2163impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
2164    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2165        fmt::Debug::fmt(&**self, f)
2166    }
2167}
2168
2169#[stable(feature = "rust1", since = "1.0.0")]
2170impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
2171    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2172        // It's not possible to extract the inner Unique directly from the Box;
2173        // instead we cast it to a *const, which aliases the Unique.
2174        let ptr: *const T = &**self;
2175        fmt::Pointer::fmt(&ptr, f)
2176    }
2177}
2178
2179#[stable(feature = "rust1", since = "1.0.0")]
2180impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
2181    type Target = T;
2182
2183    fn deref(&self) -> &T {
2184        &**self
2185    }
2186}
2187
2188#[stable(feature = "rust1", since = "1.0.0")]
2189impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
2190    fn deref_mut(&mut self) -> &mut T {
2191        &mut **self
2192    }
2193}
2194
2195#[unstable(feature = "deref_pure_trait", issue = "87121")]
2196unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
2197
2198#[unstable(feature = "legacy_receiver_trait", issue = "none")]
2199impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
2200
2201#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2202impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
2203    type Output = <F as FnOnce<Args>>::Output;
2204
2205    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
2206        <F as FnOnce<Args>>::call_once(*self, args)
2207    }
2208}
2209
2210#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2211impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
2212    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
2213        <F as FnMut<Args>>::call_mut(self, args)
2214    }
2215}
2216
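/// Boxed closures and `Fn` trait objects can be called directly. A short
/// illustration:
///
/// ```
/// let f: Box<dyn Fn(i32) -> i32> = Box::new(|x| x + 1);
/// assert_eq!(f(5), 6);
/// ```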
2217#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2218impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
2219    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
2220        <F as Fn<Args>>::call(self, args)
2221    }
2222}
2223
2224#[stable(feature = "async_closure", since = "1.85.0")]
2225impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
2226    type Output = F::Output;
2227    type CallOnceFuture = F::CallOnceFuture;
2228
2229    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
2230        F::async_call_once(*self, args)
2231    }
2232}
2233
2234#[stable(feature = "async_closure", since = "1.85.0")]
2235impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2236    type CallRefFuture<'a>
2237        = F::CallRefFuture<'a>
2238    where
2239        Self: 'a;
2240
2241    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2242        F::async_call_mut(self, args)
2243    }
2244}
2245
2246#[stable(feature = "async_closure", since = "1.85.0")]
2247impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2248    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2249        F::async_call(self, args)
2250    }
2251}
2252
2253#[unstable(feature = "coerce_unsized", issue = "18598")]
2254impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2255
2256#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2257unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2258
2259// It is quite crucial that we only allow the `Global` allocator here.
2260// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2261// would need a lot of codegen and interpreter adjustments.
2262#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2263impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2264
2265#[stable(feature = "box_borrow", since = "1.1.0")]
2266impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2267    fn borrow(&self) -> &T {
2268        &**self
2269    }
2270}
2271
2272#[stable(feature = "box_borrow", since = "1.1.0")]
2273impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2274    fn borrow_mut(&mut self) -> &mut T {
2275        &mut **self
2276    }
2277}
2278
2279#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2280impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2281    fn as_ref(&self) -> &T {
2282        &**self
2283    }
2284}
2285
2286#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2287impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2288    fn as_mut(&mut self) -> &mut T {
2289        &mut **self
2290    }
2291}
2292
2293/* Nota bene
2294 *
2295 *  We could have chosen not to add this impl, and instead have written a
2296 *  function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2297 *  because Box<T> implements Unpin even when T does not, as a result of
2298 *  this impl.
2299 *
2300 *  We chose this API instead of the alternative for a few reasons:
2301 *      - Logically, it is helpful to understand pinning in regard to the
2302 *        memory region being pointed to. For this reason none of the
2303 *        standard library pointer types support projecting through a pin
2304 *        (Box<T> is the only pointer type in std for which this would be
2305 *        safe.)
2306 *      - It is in practice very useful to have Box<T> be unconditionally
2307 *        Unpin because of trait objects, for which the structural auto
2308 *        trait functionality does not apply (e.g., Box<dyn Foo> would
2309 *        otherwise not be Unpin).
2310 *
2311 *  Another type with the same semantics as Box but only a conditional
2312 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2313 *  could have a method to project a Pin<T> from it.
2314 */
2315#[stable(feature = "pin", since = "1.33.0")]
2316impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2317
2318#[unstable(feature = "coroutine_trait", issue = "43122")]
2319impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2320    type Yield = G::Yield;
2321    type Return = G::Return;
2322
2323    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2324        G::resume(Pin::new(&mut *self), arg)
2325    }
2326}
2327
2328#[unstable(feature = "coroutine_trait", issue = "43122")]
2329impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2330where
2331    A: 'static,
2332{
2333    type Yield = G::Yield;
2334    type Return = G::Return;
2335
2336    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2337        G::resume((*self).as_mut(), arg)
2338    }
2339}
2340
2341#[stable(feature = "futures_api", since = "1.36.0")]
2342impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2343    type Output = F::Output;
2344
2345    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2346        F::poll(Pin::new(&mut *self), cx)
2347    }
2348}
2349
2350#[stable(feature = "box_error", since = "1.8.0")]
2351impl<E: Error> Error for Box<E> {
2352    #[allow(deprecated)]
2353    fn cause(&self) -> Option<&dyn Error> {
2354        Error::cause(&**self)
2355    }
2356
2357    fn source(&self) -> Option<&(dyn Error + 'static)> {
2358        Error::source(&**self)
2359    }
2360
2361    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2362        Error::provide(&**self, request);
2363    }
2364}
2365
2366#[unstable(feature = "allocator_api", issue = "32838")]
2367unsafe impl<T: ?Sized + Allocator, A: Allocator> Allocator for Box<T, A> {
2368    #[inline]
2369    fn allocate(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
2370        (**self).allocate(layout)
2371    }
2372
2373    #[inline]
2374    fn allocate_zeroed(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
2375        (**self).allocate_zeroed(layout)
2376    }
2377
2378    #[inline]
2379    unsafe fn deallocate(&self, ptr: NonNull<u8>, layout: Layout) {
2380        // SAFETY: the safety contract must be upheld by the caller
2381        unsafe { (**self).deallocate(ptr, layout) }
2382    }
2383
2384    #[inline]
2385    unsafe fn grow(
2386        &self,
2387        ptr: NonNull<u8>,
2388        old_layout: Layout,
2389        new_layout: Layout,
2390    ) -> Result<NonNull<[u8]>, AllocError> {
2391        // SAFETY: the safety contract must be upheld by the caller
2392        unsafe { (**self).grow(ptr, old_layout, new_layout) }
2393    }
2394
2395    #[inline]
2396    unsafe fn grow_zeroed(
2397        &self,
2398        ptr: NonNull<u8>,
2399        old_layout: Layout,
2400        new_layout: Layout,
2401    ) -> Result<NonNull<[u8]>, AllocError> {
2402        // SAFETY: the safety contract must be upheld by the caller
2403        unsafe { (**self).grow_zeroed(ptr, old_layout, new_layout) }
2404    }
2405
2406    #[inline]
2407    unsafe fn shrink(
2408        &self,
2409        ptr: NonNull<u8>,
2410        old_layout: Layout,
2411        new_layout: Layout,
2412    ) -> Result<NonNull<[u8]>, AllocError> {
2413        // SAFETY: the safety contract must be upheld by the caller
2414        unsafe { (**self).shrink(ptr, old_layout, new_layout) }
2415    }
2416}