// alloc/boxed.rs
//! The `Box<T>` type for heap allocation.
//!
//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
//! heap allocation in Rust. Boxes provide ownership for this allocation, and
//! drop their contents when they go out of scope. Boxes also ensure that they
//! never allocate more than `isize::MAX` bytes.
//!
//! # Examples
//!
//! Move a value from the stack to the heap by creating a [`Box`]:
//!
//! ```
//! let val: u8 = 5;
//! let boxed: Box<u8> = Box::new(val);
//! ```
//!
//! Move a value from a [`Box`] back to the stack by [dereferencing]:
//!
//! ```
//! let boxed: Box<u8> = Box::new(5);
//! let val: u8 = *boxed;
//! ```
//!
//! Creating a recursive data structure:
//!
//! ```
//! # #[allow(dead_code)]
//! #[derive(Debug)]
//! enum List<T> {
//!     Cons(T, Box<List<T>>),
//!     Nil,
//! }
//!
//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
//! println!("{list:?}");
//! ```
//!
//! This will print `Cons(1, Cons(2, Nil))`.
//!
//! Recursive structures must be boxed, because if the definition of `Cons`
//! looked like this:
//!
//! ```compile_fail,E0072
//! # enum List<T> {
//! Cons(T, List<T>),
//! # }
//! ```
//!
//! It wouldn't work. This is because the size of a `List` depends on how many
//! elements are in the list, and so we don't know how much memory to allocate
//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
//! big `Cons` needs to be.
//!
//! # Memory layout
//!
//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
//! [`Layout::for_value(&*value)`].
//!
//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
//! recommended way to build a `Box` to a ZST if `Box::new` cannot be used is to use
//! [`ptr::NonNull::dangling`].
//!
//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
//!
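//!
//! For example, a sketch of a round trip through the raw-pointer APIs:
//!
//! ```
//! let b = Box::new(7u32);
//! let raw: *mut u32 = Box::into_raw(b);
//! // SAFETY: `raw` came from `Box::into_raw` and has not been freed or aliased.
//! let b = unsafe { Box::from_raw(raw) };
//! assert_eq!(*b, 7);
//! ```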
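//!
//! For instance, a sketch of building a box to a hypothetical zero-sized type `Zst`:
//!
//! ```
//! use std::ptr::NonNull;
//!
//! struct Zst;
//!
//! // SAFETY: for a zero-sized type, any non-null, well-aligned pointer is valid,
//! // and `NonNull::dangling` yields exactly such a pointer. No allocation occurs,
//! // and none is freed when the box is dropped.
//! let b: Box<Zst> = unsafe { Box::from_raw(NonNull::<Zst>::dangling().as_ptr()) };
//! ```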
//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
//! as a single pointer and is also ABI-compatible with C pointers
//! (i.e. the C type `T*`). This means that if you have extern "C"
//! Rust functions that will be called from C, you can define those
//! Rust functions using `Box<T>` types, and use `T*` as corresponding
//! type on the C side. As an example, consider this C header which
//! declares functions that create and destroy some kind of `Foo`
//! value:
//!
//! ```c
//! /* C header */
//!
//! /* Returns ownership to the caller */
//! struct Foo* foo_new(void);
//!
//! /* Takes ownership from the caller; no-op when invoked with null */
//! void foo_delete(struct Foo*);
//! ```
//!
//! These two functions might be implemented in Rust as follows. Here, the
//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
//! the ownership constraints. Note also that the nullable argument to
//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
//! cannot be null.
//!
//! ```
//! #[repr(C)]
//! pub struct Foo;
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_new() -> Box<Foo> {
//!     Box::new(Foo)
//! }
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
//! ```
//!
//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
//! and expect things to work. `Box<T>` values will always be fully aligned,
//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
//! free the value with the global allocator. In general, the best practice
//! is to only use `Box<T>` for pointers that originated from the global
//! allocator.
//!
//! **Important.** At least at present, you should avoid using
//! `Box<T>` types for functions that are defined in C but invoked
//! from Rust. In those cases, you should directly mirror the C types
//! as closely as possible. Using types like `Box<T>` where the C
//! definition is just using `T*` can lead to undefined behavior, as
//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
//!
//! # Considerations for unsafe code
//!
//! **Warning: This section is not normative and is subject to change, possibly
//! being relaxed in the future! It is a simplified summary of the rules
//! currently implemented in the compiler.**
//!
//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved, or borrowed as `&mut T`
//! is not allowed. For more guidance on working with boxes from unsafe code, see
//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
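//!
//! As a sketch, the following raw-pointer access respects this rule, because the
//! raw pointer is only used before the box is touched again:
//!
//! ```
//! let mut b = Box::new(1u8);
//! let p: *mut u8 = &mut *b as *mut u8;
//! // OK: `b` has not been moved, mutated, or reborrowed since `p` was derived.
//! unsafe { *p = 2 };
//! // Using `b` again ends `p`'s validity; `p` is not used afterwards.
//! assert_eq!(*b, 2);
//! ```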
//!
//! # Editions
//!
//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
//!
//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
//! 2024:
//!
//! ```rust,edition2021
//! // Rust 2015, 2018, and 2021:
//!
//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
//!
//! // This creates a slice iterator, producing references to each value.
//! for item in boxed_slice.into_iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
//! for item in boxed_slice.iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
//!     let (i, x): (usize, i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//! ```
//!
//! Similar to the array implementation, this may be modified in the future to remove this override,
//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
//! compatibility with future versions of the compiler.
//!
//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
//! [dereferencing]: core::ops::Deref
//! [`Box::<T>::from_raw(value)`]: Box::from_raw
//! [`Global`]: crate::alloc::Global
//! [`Layout`]: crate::alloc::Layout
//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
//! [valid]: ptr#safety

#![stable(feature = "rust1", since = "1.0.0")]

use core::borrow::{Borrow, BorrowMut};
use core::clone::CloneToUninit;
use core::cmp::Ordering;
use core::error::{self, Error};
use core::fmt;
use core::future::Future;
use core::hash::{Hash, Hasher};
use core::marker::{Tuple, Unsize};
#[cfg(not(no_global_oom_handling))]
use core::mem::MaybeUninit;
use core::mem::{self, SizedTypeProperties};
use core::ops::{
    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
    DerefPure, DispatchFromDyn, LegacyReceiver,
};
#[cfg(not(no_global_oom_handling))]
use core::ops::{Residual, Try};
use core::pin::{Pin, PinCoerceUnsized};
use core::ptr::{self, NonNull, Unique};
use core::task::{Context, Poll};

#[cfg(not(no_global_oom_handling))]
use crate::alloc::handle_alloc_error;
use crate::alloc::{AllocError, Allocator, Global, Layout};
use crate::raw_vec::RawVec;
#[cfg(not(no_global_oom_handling))]
use crate::str::from_boxed_utf8_unchecked;

/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
mod convert;
/// Iterator related impls for `Box<_>`.
mod iter;
/// [`ThinBox`] implementation.
mod thin;

#[unstable(feature = "thin_box", issue = "92791")]
pub use thin::ThinBox;

/// A pointer type that uniquely owns a heap allocation of type `T`.
///
/// See the [module-level documentation](../../std/boxed/index.html) for more.
#[lang = "owned_box"]
#[fundamental]
#[stable(feature = "rust1", since = "1.0.0")]
#[rustc_insignificant_dtor]
#[doc(search_unbox)]
// The declaration of the `Box` struct must be kept in sync with the
// compiler or ICEs will happen.
pub struct Box<
    T: ?Sized,
    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
>(Unique<T>, A);

/// Monomorphic function for allocating an uninit `Box`.
///
/// # Safety
///
/// `size` and `align` must be valid for `Layout::from_size_align_unchecked`.
#[inline]
#[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
#[cfg(not(no_global_oom_handling))]
unsafe fn box_new_uninit(size: usize, align: usize) -> *mut u8 {
    let layout = unsafe { Layout::from_size_align_unchecked(size, align) };
    match Global.allocate(layout) {
        Ok(ptr) => ptr.as_mut_ptr(),
        Err(_) => handle_alloc_error(layout),
    }
}

/// Helper for `vec!`.
///
/// This is unsafe, but has to be marked as safe or else we couldn't use it in `vec!`.
#[doc(hidden)]
#[unstable(feature = "liballoc_internals", issue = "none")]
#[inline(always)]
#[cfg(not(no_global_oom_handling))]
#[rustc_diagnostic_item = "box_assume_init_into_vec_unsafe"]
pub fn box_assume_init_into_vec_unsafe<T, const N: usize>(
    b: Box<MaybeUninit<[T; N]>>,
) -> crate::vec::Vec<T> {
    unsafe { (b.assume_init() as Box<[T]>).into_vec() }
}

impl<T> Box<T> {
    /// Allocates memory on the heap and then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// let five = Box::new(5);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[inline(always)]
    #[stable(feature = "rust1", since = "1.0.0")]
    #[must_use]
    #[rustc_diagnostic_item = "box_new"]
    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
    pub fn new(x: T) -> Self {
        // This is `Box::new_uninit` but inlined to avoid build time regressions.
        // SAFETY: The size and align of a valid type `T` are always valid for `Layout`.
        let ptr = unsafe {
            box_new_uninit(<T as SizedTypeProperties>::SIZE, <T as SizedTypeProperties>::ALIGN)
        } as *mut T;
        // Nothing below can panic so we do not have to worry about deallocating `ptr`.
        // SAFETY: we just allocated the box to store `x`.
        unsafe { core::intrinsics::write_via_move(ptr, x) };
        // SAFETY: we just initialized the value behind `ptr`.
        unsafe { mem::transmute(ptr) }
    }

    /// Constructs a new box with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut five = Box::<u32>::new_uninit();
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    #[inline(always)]
    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
        // This is the same as `Self::new_uninit_in(Global)`, but manually inlined (just like
        // `Box::new`).

        // SAFETY:
        // - The size and align of a valid type `T` are always valid for `Layout`.
        // - If `allocate` succeeds, the returned pointer exactly matches what `Box` needs.
        unsafe {
            mem::transmute(box_new_uninit(
                <T as SizedTypeProperties>::SIZE,
                <T as SizedTypeProperties>::ALIGN,
            ))
        }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let zero = Box::<u32>::new_zeroed();
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[inline]
    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
    #[must_use]
    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
        Self::new_zeroed_in(Global)
    }

    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "pin", since = "1.33.0")]
    #[must_use]
    #[inline(always)]
    pub fn pin(x: T) -> Pin<Box<T>> {
        Box::new(x).into()
    }
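    ///
    /// # Examples
    ///
    /// A minimal example; for `Unpin` types such as `u32`, pinning imposes no
    /// restrictions:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u32>> = Box::pin(5);
    /// assert_eq!(*pinned, 5);
    /// ```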

    /// Allocates memory on the heap then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let five = Box::try_new(5)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new(x: T) -> Result<Self, AllocError> {
        Self::try_new_in(x, Global)
    }

    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let mut five = Box::<u32>::try_new_uninit()?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_uninit_in(Global)
    }

    /// Constructs a new `Box` with uninitialized contents on the heap, with the memory
    /// being filled with `0` bytes,
    /// returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let zero = Box::<u32>::try_new_zeroed()?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_zeroed_in(Global)
    }

    /// Maps the value in a box, reusing the allocation if possible.
    ///
    /// `f` is called on the value in the box, and the result is returned, also boxed.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::map(b, f)` instead of `b.map(f)`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(smart_pointer_try_map)]
    ///
    /// let b = Box::new(7);
    /// let new = Box::map(b, |i| i + 7);
    /// assert_eq!(*new, 14);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
    pub fn map<U>(this: Self, f: impl FnOnce(T) -> U) -> Box<U> {
        if size_of::<T>() == size_of::<U>() && align_of::<T>() == align_of::<U>() {
            let (value, allocation) = Box::take(this);
            Box::write(
                unsafe { mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<U>>>(allocation) },
                f(value),
            )
        } else {
            Box::new(f(*this))
        }
    }

    /// Attempts to map the value in a box, reusing the allocation if possible.
    ///
    /// `f` is called on the value in the box, and if the operation succeeds, the result is
    /// returned, also boxed.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::try_map(b, f)` instead of `b.try_map(f)`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(smart_pointer_try_map)]
    ///
    /// let b = Box::new(7);
    /// let new = Box::try_map(b, u32::try_from).unwrap();
    /// assert_eq!(*new, 7);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
    pub fn try_map<R>(
        this: Self,
        f: impl FnOnce(T) -> R,
    ) -> <R::Residual as Residual<Box<R::Output>>>::TryType
    where
        R: Try,
        R::Residual: Residual<Box<R::Output>>,
    {
        if size_of::<T>() == size_of::<R::Output>() && align_of::<T>() == align_of::<R::Output>() {
            let (value, allocation) = Box::take(this);
            try {
                Box::write(
                    unsafe {
                        mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<R::Output>>>(
                            allocation,
                        )
                    },
                    f(value)?,
                )
            }
        } else {
            try { Box::new(f(*this)?) }
        }
    }
}

impl<T, A: Allocator> Box<T, A> {
    /// Allocates memory in the given allocator then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::new_in(5, System);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn new_in(x: T, alloc: A) -> Self
    where
        A: Allocator,
    {
        let mut boxed = Self::new_uninit_in(alloc);
        boxed.write(x);
        unsafe { boxed.assume_init() }
    }

    /// Allocates memory in the given allocator then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::try_new_in(5, System)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
    where
        A: Allocator,
    {
        let mut boxed = Self::try_new_uninit_in(alloc)?;
        boxed.write(x);
        unsafe { Ok(boxed.assume_init()) }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::new_uninit_in(System);
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_uninit_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::new_zeroed_in(System);
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_zeroed_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline(always)]
    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
    where
        A: 'static + Allocator,
    {
        Self::into_pin(Self::new_in(x, alloc))
    }

    /// Converts a `Box<T>` into a `Box<[T]>`.
    ///
    /// This conversion does not allocate on the heap and happens in place.
    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
    }

    /// Consumes the `Box`, returning the wrapped value.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_into_inner)]
    ///
    /// let c = Box::new(5);
    ///
    /// assert_eq!(Box::into_inner(c), 5);
    /// ```
    #[unstable(feature = "box_into_inner", issue = "80437")]
    #[inline]
    pub fn into_inner(boxed: Self) -> T {
        *boxed
    }

    /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
    /// to the uninitialized memory where the wrapped value used to live.
    ///
    /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
    /// boxed values.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_take)]
    ///
    /// let c = Box::new(5);
    ///
    /// // take the value out of the box
    /// let (value, uninit) = Box::take(c);
    /// assert_eq!(value, 5);
    ///
    /// // reuse the box for a second value
    /// let c = Box::write(uninit, 6);
    /// assert_eq!(*c, 6);
    /// ```
    #[unstable(feature = "box_take", issue = "147212")]
    pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
        unsafe {
            let (raw, alloc) = Box::into_non_null_with_allocator(boxed);
            let value = raw.read();
            let uninit = Box::from_non_null_in(raw.cast_uninit(), alloc);
            (value, uninit)
        }
    }
}

impl<T: ?Sized + CloneToUninit> Box<T> {
    /// Allocates memory on the heap then clones `src` into it.
    ///
    /// This doesn't actually allocate if `src` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(clone_from_ref)]
    ///
    /// let hello: Box<str> = Box::clone_from_ref("hello");
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "clone_from_ref", issue = "149075")]
    #[must_use]
    #[inline]
    pub fn clone_from_ref(src: &T) -> Box<T> {
        Box::clone_from_ref_in(src, Global)
    }

    /// Allocates memory on the heap then clones `src` into it, returning an error if allocation fails.
    ///
    /// This doesn't actually allocate if `src` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(clone_from_ref)]
    /// #![feature(allocator_api)]
    ///
    /// let hello: Box<str> = Box::try_clone_from_ref("hello")?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "clone_from_ref", issue = "149075")]
    //#[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn try_clone_from_ref(src: &T) -> Result<Box<T>, AllocError> {
        Box::try_clone_from_ref_in(src, Global)
    }
}

impl<T: ?Sized + CloneToUninit, A: Allocator> Box<T, A> {
    /// Allocates memory in the given allocator then clones `src` into it.
    ///
    /// This doesn't actually allocate if `src` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(clone_from_ref)]
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let hello: Box<str, System> = Box::clone_from_ref_in("hello", System);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "clone_from_ref", issue = "149075")]
    //#[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn clone_from_ref_in(src: &T, alloc: A) -> Box<T, A> {
        let layout = Layout::for_value::<T>(src);
        match Box::try_clone_from_ref_in(src, alloc) {
            Ok(bx) => bx,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Allocates memory in the given allocator then clones `src` into it, returning an error if allocation fails.
    ///
    /// This doesn't actually allocate if `src` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(clone_from_ref)]
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let hello: Box<str, System> = Box::try_clone_from_ref_in("hello", System)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "clone_from_ref", issue = "149075")]
    //#[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn try_clone_from_ref_in(src: &T, alloc: A) -> Result<Box<T, A>, AllocError> {
        struct DeallocDropGuard<'a, A: Allocator>(Layout, &'a A, NonNull<u8>);
        impl<'a, A: Allocator> Drop for DeallocDropGuard<'a, A> {
            fn drop(&mut self) {
                let &mut DeallocDropGuard(layout, alloc, ptr) = self;
                // SAFETY: `ptr` was allocated by `*alloc` with layout `layout`
                unsafe {
                    alloc.deallocate(ptr, layout);
                }
            }
        }
        let layout = Layout::for_value::<T>(src);
        let (ptr, guard) = if layout.size() == 0 {
            (layout.dangling_ptr(), None)
        } else {
            // SAFETY: `layout` is non-zero-sized
            let ptr = alloc.allocate(layout)?.cast();
            (ptr, Some(DeallocDropGuard(layout, &alloc, ptr)))
        };
        let ptr = ptr.as_ptr();
        // SAFETY: `*ptr` is newly allocated, correctly aligned to `align_of_val(src)`,
        // and is valid for writes for `size_of_val(src)`.
        // If this panics, then `guard` will deallocate for us (if allocation occurred).
        unsafe {
            <T as CloneToUninit>::clone_to_uninit(src, ptr);
        }
        // Defuse the deallocate guard
        core::mem::forget(guard);
        // SAFETY: we just initialized `*ptr` as a clone of `src`
        Ok(unsafe { Box::from_raw_in(ptr.with_metadata_of(src), alloc) })
    }
}

impl<T> Box<[T]> {
    /// Constructs a new boxed slice with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
    /// // Deferred initialization:
    /// values[0].write(1);
    /// values[1].write(2);
    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
        unsafe { RawVec::with_capacity(len).into_box(len) }
    }

    /// Constructs a new boxed slice with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let values = Box::<[u32]>::new_zeroed_slice(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
    #[must_use]
    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
    }
943
944 /// Constructs a new boxed slice with uninitialized contents. Returns an error if
945 /// the allocation fails.
946 ///
947 /// # Examples
948 ///
949 /// ```
950 /// #![feature(allocator_api)]
951 ///
952 /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
953 /// // Deferred initialization:
954 /// values[0].write(1);
955 /// values[1].write(2);
956 /// values[2].write(3);
957 /// let values = unsafe { values.assume_init() };
958 ///
959 /// assert_eq!(*values, [1, 2, 3]);
960 /// # Ok::<(), std::alloc::AllocError>(())
961 /// ```
962 #[unstable(feature = "allocator_api", issue = "32838")]
963 #[inline]
964 pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
965 let ptr = if T::IS_ZST || len == 0 {
966 NonNull::dangling()
967 } else {
968 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
969 Ok(l) => l,
970 Err(_) => return Err(AllocError),
971 };
972 Global.allocate(layout)?.cast()
973 };
974 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
975 }
976
977 /// Constructs a new boxed slice with uninitialized contents, with the memory
978 /// being filled with `0` bytes. Returns an error if the allocation fails.
979 ///
980 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
981 /// of this method.
982 ///
983 /// # Examples
984 ///
985 /// ```
986 /// #![feature(allocator_api)]
987 ///
988 /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
989 /// let values = unsafe { values.assume_init() };
990 ///
991 /// assert_eq!(*values, [0, 0, 0]);
992 /// # Ok::<(), std::alloc::AllocError>(())
993 /// ```
994 ///
995 /// [zeroed]: mem::MaybeUninit::zeroed
996 #[unstable(feature = "allocator_api", issue = "32838")]
997 #[inline]
998 pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
999 let ptr = if T::IS_ZST || len == 0 {
1000 NonNull::dangling()
1001 } else {
1002 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1003 Ok(l) => l,
1004 Err(_) => return Err(AllocError),
1005 };
1006 Global.allocate_zeroed(layout)?.cast()
1007 };
1008 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
1009 }
1010
1011 /// Converts the boxed slice into a boxed array.
1012 ///
1013 /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
1014 ///
1015 /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
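///
/// # Examples
///
/// A minimal sketch of the round trip (the exact-length requirement is
/// checked at runtime):
///
/// ```
/// #![feature(alloc_slice_into_array)]
///
/// let slice: Box<[i32]> = Box::new([1, 2, 3]);
/// let array: Box<[i32; 3]> = slice.into_array().unwrap();
/// assert_eq!(*array, [1, 2, 3]);
///
/// // A mismatched length returns `None`:
/// let slice: Box<[i32]> = Box::new([1, 2, 3]);
/// assert!(slice.into_array::<4>().is_none());
/// ```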
1016 #[unstable(feature = "alloc_slice_into_array", issue = "148082")]
1017 #[inline]
1018 #[must_use]
1019 pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
1020 if self.len() == N {
1021 let ptr = Self::into_raw(self) as *mut [T; N];
1022
1023 // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
1024 let me = unsafe { Box::from_raw(ptr) };
1025 Some(me)
1026 } else {
1027 None
1028 }
1029 }
1030}
1031
1032impl<T, A: Allocator> Box<[T], A> {
1033 /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
1034 ///
1035 /// # Examples
1036 ///
1037 /// ```
1038 /// #![feature(allocator_api)]
1039 ///
1040 /// use std::alloc::System;
1041 ///
1042 /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
1043 /// // Deferred initialization:
1044 /// values[0].write(1);
1045 /// values[1].write(2);
1046 /// values[2].write(3);
1047 /// let values = unsafe { values.assume_init() };
1048 ///
1049 /// assert_eq!(*values, [1, 2, 3])
1050 /// ```
1051 #[cfg(not(no_global_oom_handling))]
1052 #[unstable(feature = "allocator_api", issue = "32838")]
1053 #[must_use]
1054 pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1055 unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
1056 }
1057
1058 /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
1059 /// with the memory being filled with `0` bytes.
1060 ///
1061 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1062 /// of this method.
1063 ///
1064 /// # Examples
1065 ///
1066 /// ```
1067 /// #![feature(allocator_api)]
1068 ///
1069 /// use std::alloc::System;
1070 ///
1071 /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
1072 /// let values = unsafe { values.assume_init() };
1073 ///
1074 /// assert_eq!(*values, [0, 0, 0])
1075 /// ```
1076 ///
1077 /// [zeroed]: mem::MaybeUninit::zeroed
1078 #[cfg(not(no_global_oom_handling))]
1079 #[unstable(feature = "allocator_api", issue = "32838")]
1080 #[must_use]
1081 pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1082 unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
1083 }
1084
1085 /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
1086 /// the allocation fails.
1087 ///
1088 /// # Examples
1089 ///
1090 /// ```
1091 /// #![feature(allocator_api)]
1092 ///
1093 /// use std::alloc::System;
1094 ///
1095 /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
1096 /// // Deferred initialization:
1097 /// values[0].write(1);
1098 /// values[1].write(2);
1099 /// values[2].write(3);
1100 /// let values = unsafe { values.assume_init() };
1101 ///
1102 /// assert_eq!(*values, [1, 2, 3]);
1103 /// # Ok::<(), std::alloc::AllocError>(())
1104 /// ```
1105 #[unstable(feature = "allocator_api", issue = "32838")]
1106 #[inline]
1107 pub fn try_new_uninit_slice_in(
1108 len: usize,
1109 alloc: A,
1110 ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1111 let ptr = if T::IS_ZST || len == 0 {
1112 NonNull::dangling()
1113 } else {
1114 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1115 Ok(l) => l,
1116 Err(_) => return Err(AllocError),
1117 };
1118 alloc.allocate(layout)?.cast()
1119 };
1120 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1121 }
1122
1123 /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
1124 /// being filled with `0` bytes. Returns an error if the allocation fails.
1125 ///
1126 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1127 /// of this method.
1128 ///
1129 /// # Examples
1130 ///
1131 /// ```
1132 /// #![feature(allocator_api)]
1133 ///
1134 /// use std::alloc::System;
1135 ///
1136 /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
1137 /// let values = unsafe { values.assume_init() };
1138 ///
1139 /// assert_eq!(*values, [0, 0, 0]);
1140 /// # Ok::<(), std::alloc::AllocError>(())
1141 /// ```
1142 ///
1143 /// [zeroed]: mem::MaybeUninit::zeroed
1144 #[unstable(feature = "allocator_api", issue = "32838")]
1145 #[inline]
1146 pub fn try_new_zeroed_slice_in(
1147 len: usize,
1148 alloc: A,
1149 ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1150 let ptr = if T::IS_ZST || len == 0 {
1151 NonNull::dangling()
1152 } else {
1153 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1154 Ok(l) => l,
1155 Err(_) => return Err(AllocError),
1156 };
1157 alloc.allocate_zeroed(layout)?.cast()
1158 };
1159 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1160 }
1161}
1162
1163impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
1164 /// Converts to `Box<T, A>`.
1165 ///
1166 /// # Safety
1167 ///
1168 /// As with [`MaybeUninit::assume_init`],
1169 /// it is up to the caller to guarantee that the value
1170 /// really is in an initialized state.
1171 /// Calling this when the content is not yet fully initialized
1172 /// causes immediate undefined behavior.
1173 ///
1174 /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1175 ///
1176 /// # Examples
1177 ///
1178 /// ```
1179 /// let mut five = Box::<u32>::new_uninit();
1180 /// // Deferred initialization:
1181 /// five.write(5);
1182 /// let five: Box<u32> = unsafe { five.assume_init() };
1183 ///
1184 /// assert_eq!(*five, 5)
1185 /// ```
1186 #[stable(feature = "new_uninit", since = "1.82.0")]
1187 #[inline(always)]
1188 pub unsafe fn assume_init(self) -> Box<T, A> {
1189 // This is used in the `vec!` macro, so we optimize for minimal IR generation
1190 // even in debug builds.
1191 // SAFETY: `Box<T>` and `Box<MaybeUninit<T>>` have the same layout.
1192 unsafe { core::intrinsics::transmute_unchecked(self) }
1193 }
1194
1195 /// Writes the value and converts to `Box<T, A>`.
1196 ///
1197 /// This method converts the box similarly to [`Box::assume_init`], but
1198 /// writes `value` into it before the conversion, thus guaranteeing safety.
1199 /// In some scenarios, use of this method may improve performance because
1200 /// the compiler may be able to optimize away the copy from the stack.
1201 ///
1202 /// # Examples
1203 ///
1204 /// ```
1205 /// let big_box = Box::<[usize; 1024]>::new_uninit();
1206 ///
1207 /// let mut array = [0; 1024];
1208 /// for (i, place) in array.iter_mut().enumerate() {
1209 /// *place = i;
1210 /// }
1211 ///
1212 /// // The optimizer may be able to elide this copy, so the previous code
1213 /// // writes directly to the heap.
1214 /// let big_box = Box::write(big_box, array);
1215 ///
1216 /// for (i, x) in big_box.iter().enumerate() {
1217 /// assert_eq!(*x, i);
1218 /// }
1219 /// ```
1220 #[stable(feature = "box_uninit_write", since = "1.87.0")]
1221 #[inline]
1222 pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
1223 unsafe {
1224 (*boxed).write(value);
1225 boxed.assume_init()
1226 }
1227 }
1228}
1229
1230impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
1231 /// Converts to `Box<[T], A>`.
1232 ///
1233 /// # Safety
1234 ///
1235 /// As with [`MaybeUninit::assume_init`],
1236 /// it is up to the caller to guarantee that the values
1237 /// really are in an initialized state.
1238 /// Calling this when the content is not yet fully initialized
1239 /// causes immediate undefined behavior.
1240 ///
1241 /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1242 ///
1243 /// # Examples
1244 ///
1245 /// ```
1246 /// let mut values = Box::<[u32]>::new_uninit_slice(3);
1247 /// // Deferred initialization:
1248 /// values[0].write(1);
1249 /// values[1].write(2);
1250 /// values[2].write(3);
1251 /// let values = unsafe { values.assume_init() };
1252 ///
1253 /// assert_eq!(*values, [1, 2, 3])
1254 /// ```
1255 #[stable(feature = "new_uninit", since = "1.82.0")]
1256 #[inline]
1257 pub unsafe fn assume_init(self) -> Box<[T], A> {
1258 let (raw, alloc) = Box::into_raw_with_allocator(self);
1259 unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
1260 }
1261}
1262
1263impl<T: ?Sized> Box<T> {
1264 /// Constructs a box from a raw pointer.
1265 ///
1266 /// After calling this function, the raw pointer is owned by the
1267 /// resulting `Box`. Specifically, the `Box` destructor will call
1268 /// the destructor of `T` and free the allocated memory. For this
1269 /// to be safe, the memory must have been allocated in accordance
1270 /// with the [memory layout] used by `Box`.
1271 ///
1272 /// # Safety
1273 ///
1274 /// This function is unsafe because improper use may lead to
1275 /// memory problems. For example, a double-free may occur if the
1276 /// function is called twice on the same raw pointer.
1277 ///
1278 /// The raw pointer must point to a block of memory allocated by the global allocator.
1279 ///
1280 /// The safety conditions are described in the [memory layout] section.
1281 ///
1282 /// # Examples
1283 ///
1284 /// Recreate a `Box` which was previously converted to a raw pointer
1285 /// using [`Box::into_raw`]:
1286 /// ```
1287 /// let x = Box::new(5);
1288 /// let ptr = Box::into_raw(x);
1289 /// let x = unsafe { Box::from_raw(ptr) };
1290 /// ```
1291 /// Manually create a `Box` from scratch by using the global allocator:
1292 /// ```
1293 /// use std::alloc::{alloc, Layout};
1294 ///
1295 /// unsafe {
1296 /// let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1297 /// // In general .write is required to avoid attempting to destruct
1298 /// // the (uninitialized) previous contents of `ptr`, though for this
1299 /// // simple example `*ptr = 5` would have worked as well.
1300 /// ptr.write(5);
1301 /// let x = Box::from_raw(ptr);
1302 /// }
1303 /// ```
1304 ///
1305 /// [memory layout]: self#memory-layout
1306 #[stable(feature = "box_raw", since = "1.4.0")]
1307 #[inline]
1308 #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1309 pub unsafe fn from_raw(raw: *mut T) -> Self {
1310 unsafe { Self::from_raw_in(raw, Global) }
1311 }
1312
1313 /// Constructs a box from a `NonNull` pointer.
1314 ///
1315 /// After calling this function, the `NonNull` pointer is owned by
1316 /// the resulting `Box`. Specifically, the `Box` destructor will call
1317 /// the destructor of `T` and free the allocated memory. For this
1318 /// to be safe, the memory must have been allocated in accordance
1319 /// with the [memory layout] used by `Box`.
1320 ///
1321 /// # Safety
1322 ///
1323 /// This function is unsafe because improper use may lead to
1324 /// memory problems. For example, a double-free may occur if the
1325 /// function is called twice on the same `NonNull` pointer.
1326 ///
1327 /// The non-null pointer must point to a block of memory allocated by the global allocator.
1328 ///
1329 /// The safety conditions are described in the [memory layout] section.
1330 ///
1331 /// # Examples
1332 ///
1333 /// Recreate a `Box` which was previously converted to a `NonNull`
1334 /// pointer using [`Box::into_non_null`]:
1335 /// ```
1336 /// #![feature(box_vec_non_null)]
1337 ///
1338 /// let x = Box::new(5);
1339 /// let non_null = Box::into_non_null(x);
1340 /// let x = unsafe { Box::from_non_null(non_null) };
1341 /// ```
1342 /// Manually create a `Box` from scratch by using the global allocator:
1343 /// ```
1344 /// #![feature(box_vec_non_null)]
1345 ///
1346 /// use std::alloc::{alloc, Layout};
1347 /// use std::ptr::NonNull;
1348 ///
1349 /// unsafe {
1350 /// let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1351 /// .expect("allocation failed");
1352 /// // In general .write is required to avoid attempting to destruct
1353 /// // the (uninitialized) previous contents of `non_null`.
1354 /// non_null.write(5);
1355 /// let x = Box::from_non_null(non_null);
1356 /// }
1357 /// ```
1358 ///
1359 /// [memory layout]: self#memory-layout
1360 #[unstable(feature = "box_vec_non_null", issue = "130364")]
1361 #[inline]
1362 #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1363 pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1364 unsafe { Self::from_raw(ptr.as_ptr()) }
1365 }
1366
1367 /// Consumes the `Box`, returning a wrapped raw pointer.
1368 ///
1369 /// The pointer will be properly aligned and non-null.
1370 ///
1371 /// After calling this function, the caller is responsible for the
1372 /// memory previously managed by the `Box`. In particular, the
1373 /// caller should properly destroy `T` and release the memory, taking
1374 /// into account the [memory layout] used by `Box`. The easiest way to
1375 /// do this is to convert the raw pointer back into a `Box` with the
1376 /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1377 /// the cleanup.
1378 ///
1379 /// Note: this is an associated function, which means that you have
1380 /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1381 /// is so that there is no conflict with a method on the inner type.
1382 ///
1383 /// # Examples
1384 /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1385 /// for automatic cleanup:
1386 /// ```
1387 /// let x = Box::new(String::from("Hello"));
1388 /// let ptr = Box::into_raw(x);
1389 /// let x = unsafe { Box::from_raw(ptr) };
1390 /// ```
1391 /// Manual cleanup by explicitly running the destructor and deallocating
1392 /// the memory:
1393 /// ```
1394 /// use std::alloc::{dealloc, Layout};
1395 /// use std::ptr;
1396 ///
1397 /// let x = Box::new(String::from("Hello"));
1398 /// let ptr = Box::into_raw(x);
1399 /// unsafe {
1400 /// ptr::drop_in_place(ptr);
1401 /// dealloc(ptr as *mut u8, Layout::new::<String>());
1402 /// }
1403 /// ```
1404 /// Note: This is equivalent to the following:
1405 /// ```
1406 /// let x = Box::new(String::from("Hello"));
1407 /// let ptr = Box::into_raw(x);
1408 /// unsafe {
1409 /// drop(Box::from_raw(ptr));
1410 /// }
1411 /// ```
1412 ///
1413 /// [memory layout]: self#memory-layout
1414 #[must_use = "losing the pointer will leak memory"]
1415 #[stable(feature = "box_raw", since = "1.4.0")]
1416 #[inline]
1417 pub fn into_raw(b: Self) -> *mut T {
1418 // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1419 let mut b = mem::ManuallyDrop::new(b);
1420 // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
1421 // operation for its alias tracking.
1422 &raw mut **b
1423 }
1424
1425 /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1426 ///
1427 /// The pointer will be properly aligned.
1428 ///
1429 /// After calling this function, the caller is responsible for the
1430 /// memory previously managed by the `Box`. In particular, the
1431 /// caller should properly destroy `T` and release the memory, taking
1432 /// into account the [memory layout] used by `Box`. The easiest way to
1433 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1434 /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1435 /// perform the cleanup.
1436 ///
1437 /// Note: this is an associated function, which means that you have
1438 /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1439 /// This is so that there is no conflict with a method on the inner type.
1440 ///
1441 /// # Examples
1442 /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1443 /// for automatic cleanup:
1444 /// ```
1445 /// #![feature(box_vec_non_null)]
1446 ///
1447 /// let x = Box::new(String::from("Hello"));
1448 /// let non_null = Box::into_non_null(x);
1449 /// let x = unsafe { Box::from_non_null(non_null) };
1450 /// ```
1451 /// Manual cleanup by explicitly running the destructor and deallocating
1452 /// the memory:
1453 /// ```
1454 /// #![feature(box_vec_non_null)]
1455 ///
1456 /// use std::alloc::{dealloc, Layout};
1457 ///
1458 /// let x = Box::new(String::from("Hello"));
1459 /// let non_null = Box::into_non_null(x);
1460 /// unsafe {
1461 /// non_null.drop_in_place();
1462 /// dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1463 /// }
1464 /// ```
1465 /// Note: This is equivalent to the following:
1466 /// ```
1467 /// #![feature(box_vec_non_null)]
1468 ///
1469 /// let x = Box::new(String::from("Hello"));
1470 /// let non_null = Box::into_non_null(x);
1471 /// unsafe {
1472 /// drop(Box::from_non_null(non_null));
1473 /// }
1474 /// ```
1475 ///
1476 /// [memory layout]: self#memory-layout
1477 #[must_use = "losing the pointer will leak memory"]
1478 #[unstable(feature = "box_vec_non_null", issue = "130364")]
1479 #[inline]
1480 pub fn into_non_null(b: Self) -> NonNull<T> {
1481 // SAFETY: `Box` is guaranteed to be non-null.
1482 unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1483 }
1484}
1485
1486impl<T: ?Sized, A: Allocator> Box<T, A> {
1487 /// Constructs a box from a raw pointer in the given allocator.
1488 ///
1489 /// After calling this function, the raw pointer is owned by the
1490 /// resulting `Box`. Specifically, the `Box` destructor will call
1491 /// the destructor of `T` and free the allocated memory. For this
1492 /// to be safe, the memory must have been allocated in accordance
1493 /// with the [memory layout] used by `Box`.
1494 ///
1495 /// # Safety
1496 ///
1497 /// This function is unsafe because improper use may lead to
1498 /// memory problems. For example, a double-free may occur if the
1499 /// function is called twice on the same raw pointer.
1500 ///
1501 /// The raw pointer must point to a block of memory allocated by `alloc`.
1502 ///
1503 /// # Examples
1504 ///
1505 /// Recreate a `Box` which was previously converted to a raw pointer
1506 /// using [`Box::into_raw_with_allocator`]:
1507 /// ```
1508 /// #![feature(allocator_api)]
1509 ///
1510 /// use std::alloc::System;
1511 ///
1512 /// let x = Box::new_in(5, System);
1513 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1514 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1515 /// ```
1516 /// Manually create a `Box` from scratch by using the system allocator:
1517 /// ```
1518 /// #![feature(allocator_api, slice_ptr_get)]
1519 ///
1520 /// use std::alloc::{Allocator, Layout, System};
1521 ///
1522 /// unsafe {
1523 /// let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1524 /// // In general .write is required to avoid attempting to destruct
1525 /// // the (uninitialized) previous contents of `ptr`, though for this
1526 /// // simple example `*ptr = 5` would have worked as well.
1527 /// ptr.write(5);
1528 /// let x = Box::from_raw_in(ptr, System);
1529 /// }
1530 /// # Ok::<(), std::alloc::AllocError>(())
1531 /// ```
1532 ///
1533 /// [memory layout]: self#memory-layout
1534 #[unstable(feature = "allocator_api", issue = "32838")]
1535 #[inline]
1536 pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1537 Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1538 }
1539
1540 /// Constructs a box from a `NonNull` pointer in the given allocator.
1541 ///
1542 /// After calling this function, the `NonNull` pointer is owned by
1543 /// the resulting `Box`. Specifically, the `Box` destructor will call
1544 /// the destructor of `T` and free the allocated memory. For this
1545 /// to be safe, the memory must have been allocated in accordance
1546 /// with the [memory layout] used by `Box`.
1547 ///
1548 /// # Safety
1549 ///
1550 /// This function is unsafe because improper use may lead to
1551 /// memory problems. For example, a double-free may occur if the
1552 /// function is called twice on the same raw pointer.
1553 ///
1554 /// The non-null pointer must point to a block of memory allocated by `alloc`.
1555 ///
1556 /// # Examples
1557 ///
1558 /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1559 /// using [`Box::into_non_null_with_allocator`]:
1560 /// ```
1561 /// #![feature(allocator_api)]
1562 ///
1563 /// use std::alloc::System;
1564 ///
1565 /// let x = Box::new_in(5, System);
1566 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1567 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1568 /// ```
1569 /// Manually create a `Box` from scratch by using the system allocator:
1570 /// ```
1571 /// #![feature(allocator_api)]
1572 ///
1573 /// use std::alloc::{Allocator, Layout, System};
1574 ///
1575 /// unsafe {
1576 /// let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1577 /// // In general .write is required to avoid attempting to destruct
1578 /// // the (uninitialized) previous contents of `non_null`.
1579 /// non_null.write(5);
1580 /// let x = Box::from_non_null_in(non_null, System);
1581 /// }
1582 /// # Ok::<(), std::alloc::AllocError>(())
1583 /// ```
1584 ///
1585 /// [memory layout]: self#memory-layout
1586 #[unstable(feature = "allocator_api", issue = "32838")]
1587 // #[unstable(feature = "box_vec_non_null", issue = "130364")]
1588 #[inline]
1589 pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1590 // SAFETY: guaranteed by the caller.
1591 unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1592 }
1593
1594 /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1595 ///
1596 /// The pointer will be properly aligned and non-null.
1597 ///
1598 /// After calling this function, the caller is responsible for the
1599 /// memory previously managed by the `Box`. In particular, the
1600 /// caller should properly destroy `T` and release the memory, taking
1601 /// into account the [memory layout] used by `Box`. The easiest way to
1602 /// do this is to convert the raw pointer back into a `Box` with the
1603 /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1604 /// the cleanup.
1605 ///
1606 /// Note: this is an associated function, which means that you have
1607 /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1608 /// is so that there is no conflict with a method on the inner type.
1609 ///
1610 /// # Examples
1611 /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1612 /// for automatic cleanup:
1613 /// ```
1614 /// #![feature(allocator_api)]
1615 ///
1616 /// use std::alloc::System;
1617 ///
1618 /// let x = Box::new_in(String::from("Hello"), System);
1619 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1620 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1621 /// ```
1622 /// Manual cleanup by explicitly running the destructor and deallocating
1623 /// the memory:
1624 /// ```
1625 /// #![feature(allocator_api)]
1626 ///
1627 /// use std::alloc::{Allocator, Layout, System};
1628 /// use std::ptr::{self, NonNull};
1629 ///
1630 /// let x = Box::new_in(String::from("Hello"), System);
1631 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1632 /// unsafe {
1633 /// ptr::drop_in_place(ptr);
1634 /// let non_null = NonNull::new_unchecked(ptr);
1635 /// alloc.deallocate(non_null.cast(), Layout::new::<String>());
1636 /// }
1637 /// ```
1638 ///
1639 /// [memory layout]: self#memory-layout
1640 #[must_use = "losing the pointer will leak memory"]
1641 #[unstable(feature = "allocator_api", issue = "32838")]
1642 #[inline]
1643 pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1644 let mut b = mem::ManuallyDrop::new(b);
1645 // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1646 // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1647 // want *no* aliasing requirements here!
1648 // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1649 // works around that.
1650 let ptr = &raw mut **b;
1651 let alloc = unsafe { ptr::read(&b.1) };
1652 (ptr, alloc)
1653 }
1654
1655 /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1656 ///
1657 /// The pointer will be properly aligned.
1658 ///
1659 /// After calling this function, the caller is responsible for the
1660 /// memory previously managed by the `Box`. In particular, the
1661 /// caller should properly destroy `T` and release the memory, taking
1662 /// into account the [memory layout] used by `Box`. The easiest way to
1663 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1664 /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1665 /// perform the cleanup.
1666 ///
1667 /// Note: this is an associated function, which means that you have
1668 /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1669 /// `b.into_non_null_with_allocator()`. This is so that there is no
1670 /// conflict with a method on the inner type.
1671 ///
1672 /// # Examples
1673 /// Converting the `NonNull` pointer back into a `Box` with
1674 /// [`Box::from_non_null_in`] for automatic cleanup:
1675 /// ```
1676 /// #![feature(allocator_api)]
1677 ///
1678 /// use std::alloc::System;
1679 ///
1680 /// let x = Box::new_in(String::from("Hello"), System);
1681 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1682 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1683 /// ```
1684 /// Manual cleanup by explicitly running the destructor and deallocating
1685 /// the memory:
1686 /// ```
1687 /// #![feature(allocator_api)]
1688 ///
1689 /// use std::alloc::{Allocator, Layout, System};
1690 ///
1691 /// let x = Box::new_in(String::from("Hello"), System);
1692 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1693 /// unsafe {
1694 /// non_null.drop_in_place();
1695 /// alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1696 /// }
1697 /// ```
1698 ///
1699 /// [memory layout]: self#memory-layout
1700 #[must_use = "losing the pointer will leak memory"]
1701 #[unstable(feature = "allocator_api", issue = "32838")]
1702 // #[unstable(feature = "box_vec_non_null", issue = "130364")]
1703 #[inline]
1704 pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1705 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1706 // SAFETY: `Box` is guaranteed to be non-null.
1707 unsafe { (NonNull::new_unchecked(ptr), alloc) }
1708 }
1709
1710 #[unstable(
1711 feature = "ptr_internals",
1712 issue = "none",
1713 reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1714 )]
1715 #[inline]
1716 #[doc(hidden)]
1717 pub fn into_unique(b: Self) -> (Unique<T>, A) {
1718 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1719 unsafe { (Unique::from(&mut *ptr), alloc) }
1720 }
1721
1722 /// Returns a raw mutable pointer to the `Box`'s contents.
1723 ///
1724 /// The caller must ensure that the `Box` outlives the pointer this
1725 /// function returns, or else it will end up dangling.
1726 ///
1727 /// This method guarantees that for the purpose of the aliasing model, this method
1728 /// does not materialize a reference to the underlying memory, and thus the returned pointer
    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
    /// Note that calling other methods that materialize references to the memory
    /// may still invalidate this pointer.
    /// See the example below for how this guarantee can be used.
    ///
    /// # Examples
    ///
    /// Due to the aliasing guarantee, the following code is legal:
    ///
    /// ```rust
    /// #![feature(box_as_ptr)]
    ///
    /// unsafe {
    ///     let mut b = Box::new(0);
    ///     let ptr1 = Box::as_mut_ptr(&mut b);
    ///     ptr1.write(1);
    ///     let ptr2 = Box::as_mut_ptr(&mut b);
    ///     ptr2.write(2);
    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
    ///     ptr1.write(3);
    /// }
    /// ```
    ///
    /// [`as_mut_ptr`]: Self::as_mut_ptr
    /// [`as_ptr`]: Self::as_ptr
    #[unstable(feature = "box_as_ptr", issue = "129090")]
    #[rustc_never_returns_null_ptr]
    #[rustc_as_ptr]
    #[inline]
    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
        // any references.
        &raw mut **b
    }

    /// Returns a raw pointer to the `Box`'s contents.
    ///
    /// The caller must ensure that the `Box` outlives the pointer this
    /// function returns, or else it will end up dangling.
    ///
    /// The caller must also ensure that the memory the pointer (non-transitively) points to
    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
    ///
    /// This method guarantees that for the purpose of the aliasing model, this method
    /// does not materialize a reference to the underlying memory, and thus the returned pointer
    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
    /// Note that calling other methods that materialize mutable references to the memory,
    /// as well as writing to this memory, may still invalidate this pointer.
    /// See the example below for how this guarantee can be used.
    ///
    /// # Examples
    ///
    /// Due to the aliasing guarantee, the following code is legal:
    ///
    /// ```rust
    /// #![feature(box_as_ptr)]
    ///
    /// unsafe {
    ///     let mut v = Box::new(0);
    ///     let ptr1 = Box::as_ptr(&v);
    ///     let ptr2 = Box::as_mut_ptr(&mut v);
    ///     let _val = ptr2.read();
    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
    ///     let _val = ptr1.read();
    ///     // However, once we do a write...
    ///     ptr2.write(1);
    ///     // ... `ptr1` is no longer valid.
    ///     // This would be UB: let _val = ptr1.read();
    /// }
    /// ```
    ///
    /// [`as_mut_ptr`]: Self::as_mut_ptr
    /// [`as_ptr`]: Self::as_ptr
    #[unstable(feature = "box_as_ptr", issue = "129090")]
    #[rustc_never_returns_null_ptr]
    #[rustc_as_ptr]
    #[inline]
    pub fn as_ptr(b: &Self) -> *const T {
        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
        // any references.
        &raw const **b
    }

    /// Returns a reference to the underlying allocator.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
    /// is so that there is no conflict with a method on the inner type.
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn allocator(b: &Self) -> &A {
        &b.1
    }

    /// Consumes and leaks the `Box`, returning a mutable reference,
    /// `&'a mut T`.
    ///
    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
    /// has only static references, or none at all, then this may be chosen to be
    /// `'static`.
    ///
    /// This function is mainly useful for data that lives for the remainder of
    /// the program's life. Dropping the returned reference will cause a memory
    /// leak. If this is not acceptable, the reference should first be wrapped
    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
    /// then be dropped which will properly destroy `T` and release the
    /// allocated memory.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// Simple usage:
    ///
    /// ```
    /// let x = Box::new(41);
    /// let static_ref: &'static mut usize = Box::leak(x);
    /// *static_ref += 1;
    /// assert_eq!(*static_ref, 42);
    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
    /// # drop(unsafe { Box::from_raw(static_ref) });
    /// ```
    ///
    /// Unsized data:
    ///
    /// ```
    /// let x = vec![1, 2, 3].into_boxed_slice();
    /// let static_ref = Box::leak(x);
    /// static_ref[0] = 4;
    /// assert_eq!(*static_ref, [4, 2, 3]);
    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
    /// # drop(unsafe { Box::from_raw(static_ref) });
    /// ```
    #[stable(feature = "box_leak", since = "1.26.0")]
    #[inline]
    pub fn leak<'a>(b: Self) -> &'a mut T
    where
        A: 'a,
    {
        let (ptr, alloc) = Box::into_raw_with_allocator(b);
        mem::forget(alloc);
        unsafe { &mut *ptr }
    }

    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
    /// `*boxed` will be pinned in memory and unable to be moved.
    ///
    /// This conversion does not allocate on the heap and happens in place.
    ///
    /// This is also available via [`From`].
    ///
    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
    ///
    /// # Notes
    ///
    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
    /// as it'll introduce an ambiguity when calling `Pin::from`.
    /// A demonstration of such a poor impl is shown below.
    ///
    /// ```compile_fail
    /// # use std::pin::Pin;
    /// struct Foo; // A type defined in this crate.
    /// impl From<Box<()>> for Pin<Foo> {
    ///     fn from(_: Box<()>) -> Pin<Foo> {
    ///         Pin::new(Foo)
    ///     }
    /// }
    ///
    /// let foo = Box::new(());
    /// let bar = Pin::from(foo);
    /// ```
    #[stable(feature = "box_into_pin", since = "1.63.0")]
    pub fn into_pin(boxed: Self) -> Pin<Self>
    where
        A: 'static,
    {
        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
        // when `T: !Unpin`, so it's safe to pin it directly without any
        // additional requirements.
        unsafe { Pin::new_unchecked(boxed) }
    }
}
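// A minimal caller-side sketch of `Box::into_pin`: pinning a `Box` we already
// have, in place and without a new allocation. Uses only the stable public API.
//
// ```rust
// use std::pin::Pin;
//
// fn main() {
//     let b: Box<u8> = Box::new(7);
//     // Equivalent to `Box::pin(7)`, but starting from an existing `Box`.
//     let pinned: Pin<Box<u8>> = Box::into_pin(b);
//     assert_eq!(*pinned, 7);
// }
// ```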

#[stable(feature = "rust1", since = "1.0.0")]
unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
    #[inline]
    fn drop(&mut self) {
        // the T in the Box is dropped by the compiler before the destructor is run

        let ptr = self.0;

        unsafe {
            let layout = Layout::for_value_raw(ptr.as_ptr());
            if layout.size() != 0 {
                self.1.deallocate(From::from(ptr.cast()), layout);
            }
        }
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: Default> Default for Box<T> {
    /// Creates a `Box<T>`, with the `Default` value for `T`.
    #[inline]
    fn default() -> Self {
        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
        unsafe {
            // SAFETY: `x` is valid for writing and has the same layout as `T`.
            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
            // does not have a destructor.
            //
            // We use `ptr::write` as `MaybeUninit::write` creates
            // extra stack copies of `T` in debug mode.
            //
            // See https://github.com/rust-lang/rust/issues/136043 for more context.
            ptr::write(&raw mut *x as *mut T, T::default());
            // SAFETY: `x` was just initialized above.
            x.assume_init()
        }
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "rust1", since = "1.0.0")]
impl<T> Default for Box<[T]> {
    /// Creates an empty `[T]` inside a `Box`.
    #[inline]
    fn default() -> Self {
        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
        Box(ptr, Global)
    }
}
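// A short sketch of the observable behavior of these `Default` impls:
// `Box<T>` boxes `T::default()`, while `Box<[T]>` yields an empty slice
// (built from a dangling `Unique`, so the allocator is never invoked for
// the zero-sized payload).
//
// ```rust
// fn main() {
//     // `Box<T>: Default` boxes `T::default()`:
//     let n: Box<i32> = Box::default();
//     assert_eq!(*n, 0);
//     // `Box<[T]>: Default` produces an empty slice:
//     let s: Box<[i32]> = Box::default();
//     assert!(s.is_empty());
// }
// ```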

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "default_box_extra", since = "1.17.0")]
impl Default for Box<str> {
    #[inline]
    fn default() -> Self {
        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
        let ptr: Unique<str> = unsafe {
            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
            Unique::new_unchecked(bytes.as_ptr() as *mut str)
        };
        Box(ptr, Global)
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "pin_default_impls", since = "1.91.0")]
impl<T> Default for Pin<Box<T>>
where
    T: ?Sized,
    Box<T>: Default,
{
    #[inline]
    fn default() -> Self {
        Box::into_pin(Box::<T>::default())
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
    /// Returns a new box with a `clone()` of this box's contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let x = Box::new(5);
    /// let y = x.clone();
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // But they are unique objects
    /// assert_ne!(&*x as *const i32, &*y as *const i32);
    /// ```
    #[inline]
    fn clone(&self) -> Self {
        // Pre-allocate memory to allow writing the cloned value directly.
        let mut boxed = Self::new_uninit_in(self.1.clone());
        unsafe {
            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
            boxed.assume_init()
        }
    }

    /// Copies `source`'s contents into `self` without creating a new allocation.
    ///
    /// # Examples
    ///
    /// ```
    /// let x = Box::new(5);
    /// let mut y = Box::new(10);
    /// let yp: *const i32 = &*y;
    ///
    /// y.clone_from(&x);
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // And no allocation occurred
    /// assert_eq!(yp, &*y);
    /// ```
    #[inline]
    fn clone_from(&mut self, source: &Self) {
        (**self).clone_from(&(**source));
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "box_slice_clone", since = "1.3.0")]
impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
    fn clone(&self) -> Self {
        let alloc = Box::allocator(self).clone();
        self.to_vec_in(alloc).into_boxed_slice()
    }

    /// Copies `source`'s contents into `self` without creating a new allocation,
    /// so long as the two are of the same length.
    ///
    /// # Examples
    ///
    /// ```
    /// let x = Box::new([5, 6, 7]);
    /// let mut y = Box::new([8, 9, 10]);
    /// let yp: *const [i32] = &*y;
    ///
    /// y.clone_from(&x);
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // And no allocation occurred
    /// assert_eq!(yp, &*y);
    /// ```
    fn clone_from(&mut self, source: &Self) {
        if self.len() == source.len() {
            self.clone_from_slice(&source);
        } else {
            *self = source.clone();
        }
    }
}

#[cfg(not(no_global_oom_handling))]
#[stable(feature = "box_slice_clone", since = "1.3.0")]
impl Clone for Box<str> {
    fn clone(&self) -> Self {
        // this makes a copy of the data
        let buf: Box<[u8]> = self.as_bytes().into();
        unsafe { from_boxed_utf8_unchecked(buf) }
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
    #[inline]
    fn eq(&self, other: &Self) -> bool {
        PartialEq::eq(&**self, &**other)
    }
    #[inline]
    fn ne(&self, other: &Self) -> bool {
        PartialEq::ne(&**self, &**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
    #[inline]
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        PartialOrd::partial_cmp(&**self, &**other)
    }
    #[inline]
    fn lt(&self, other: &Self) -> bool {
        PartialOrd::lt(&**self, &**other)
    }
    #[inline]
    fn le(&self, other: &Self) -> bool {
        PartialOrd::le(&**self, &**other)
    }
    #[inline]
    fn ge(&self, other: &Self) -> bool {
        PartialOrd::ge(&**self, &**other)
    }
    #[inline]
    fn gt(&self, other: &Self) -> bool {
        PartialOrd::gt(&**self, &**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
    #[inline]
    fn cmp(&self, other: &Self) -> Ordering {
        Ord::cmp(&**self, &**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
    fn hash<H: Hasher>(&self, state: &mut H) {
        (**self).hash(state);
    }
}

#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
    fn finish(&self) -> u64 {
        (**self).finish()
    }
    fn write(&mut self, bytes: &[u8]) {
        (**self).write(bytes)
    }
    fn write_u8(&mut self, i: u8) {
        (**self).write_u8(i)
    }
    fn write_u16(&mut self, i: u16) {
        (**self).write_u16(i)
    }
    fn write_u32(&mut self, i: u32) {
        (**self).write_u32(i)
    }
    fn write_u64(&mut self, i: u64) {
        (**self).write_u64(i)
    }
    fn write_u128(&mut self, i: u128) {
        (**self).write_u128(i)
    }
    fn write_usize(&mut self, i: usize) {
        (**self).write_usize(i)
    }
    fn write_i8(&mut self, i: i8) {
        (**self).write_i8(i)
    }
    fn write_i16(&mut self, i: i16) {
        (**self).write_i16(i)
    }
    fn write_i32(&mut self, i: i32) {
        (**self).write_i32(i)
    }
    fn write_i64(&mut self, i: i64) {
        (**self).write_i64(i)
    }
    fn write_i128(&mut self, i: i128) {
        (**self).write_i128(i)
    }
    fn write_isize(&mut self, i: isize) {
        (**self).write_isize(i)
    }
    fn write_length_prefix(&mut self, len: usize) {
        (**self).write_length_prefix(len)
    }
    fn write_str(&mut self, s: &str) {
        (**self).write_str(s)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Display::fmt(&**self, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Debug::fmt(&**self, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // It's not possible to extract the inner Uniq directly from the Box,
        // instead we cast it to a *const which aliases the Unique
        let ptr: *const T = &**self;
        fmt::Pointer::fmt(&ptr, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
    type Target = T;

    fn deref(&self) -> &T {
        &**self
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
    fn deref_mut(&mut self) -> &mut T {
        &mut **self
    }
}

#[unstable(feature = "deref_pure_trait", issue = "87121")]
unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}

#[unstable(feature = "legacy_receiver_trait", issue = "none")]
impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}

#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
    type Output = <F as FnOnce<Args>>::Output;

    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
        <F as FnOnce<Args>>::call_once(*self, args)
    }
}

#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
        <F as FnMut<Args>>::call_mut(self, args)
    }
}

#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
        <F as Fn<Args>>::call(self, args)
    }
}
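// A caller-side sketch of what these `Fn*` impls buy you: a boxed closure,
// including a `dyn` trait object, can be invoked with ordinary call syntax,
// and a `Box<dyn FnOnce>` can be consumed by calling it.
//
// ```rust
// fn main() {
//     let add_one: Box<dyn Fn(i32) -> i32> = Box::new(|x| x + 1);
//     assert_eq!(add_one(41), 42);
//
//     let mut counter = 0;
//     let mut bump: Box<dyn FnMut() -> i32> = Box::new(move || {
//         counter += 1;
//         counter
//     });
//     assert_eq!(bump(), 1);
//     assert_eq!(bump(), 2);
//
//     // `call_once` moves the closure out of the box (`*self`), so the
//     // box itself is consumed by the call.
//     let once: Box<dyn FnOnce() -> String> = Box::new(|| String::from("done"));
//     assert_eq!(once(), "done");
// }
// ```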

#[stable(feature = "async_closure", since = "1.85.0")]
impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
    type Output = F::Output;
    type CallOnceFuture = F::CallOnceFuture;

    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
        F::async_call_once(*self, args)
    }
}

#[stable(feature = "async_closure", since = "1.85.0")]
impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
    type CallRefFuture<'a>
        = F::CallRefFuture<'a>
    where
        Self: 'a;

    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
        F::async_call_mut(self, args)
    }
}

#[stable(feature = "async_closure", since = "1.85.0")]
impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
        F::async_call(self, args)
    }
}

#[unstable(feature = "coerce_unsized", issue = "18598")]
impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}

#[unstable(feature = "pin_coerce_unsized_trait", issue = "150112")]
unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}

// It is quite crucial that we only allow the `Global` allocator here.
// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
// would need a lot of codegen and interpreter adjustments.
#[unstable(feature = "dispatch_from_dyn", issue = "none")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}

#[stable(feature = "box_borrow", since = "1.1.0")]
impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
    fn borrow(&self) -> &T {
        &**self
    }
}

#[stable(feature = "box_borrow", since = "1.1.0")]
impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
    fn borrow_mut(&mut self) -> &mut T {
        &mut **self
    }
}

#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
    fn as_ref(&self) -> &T {
        &**self
    }
}

#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
    fn as_mut(&mut self) -> &mut T {
        &mut **self
    }
}

/* Nota bene
 *
 *  We could have chosen not to add this impl, and instead have written a
 *  function of Pin<Box<T>> to Pin<T>. Such a function would not be sound,
 *  because Box<T> implements Unpin even when T does not, as a result of
 *  this impl.
 *
 *  We chose this API instead of the alternative for a few reasons:
 *      - Logically, it is helpful to understand pinning in regard to the
 *        memory region being pointed to. For this reason none of the
 *        standard library pointer types support projecting through a pin
 *        (Box<T> is the only pointer type in std for which this would be
 *        safe.)
 *      - It is in practice very useful to have Box<T> be unconditionally
 *        Unpin because of trait objects, for which the structural auto
 *        trait functionality does not apply (e.g., Box<dyn Foo> would
 *        otherwise not be Unpin).
 *
 *  Another type with the same semantics as Box but only a conditional
 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
 *  could have a method to project a Pin<T> from it.
 */
#[stable(feature = "pin", since = "1.33.0")]
impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
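// A minimal sketch of the unconditional `Unpin` impl in action: even when the
// pointee is `!Unpin` (here `PhantomPinned`), the box itself is `Unpin`,
// because moving the box moves only the pointer, never the pinned pointee.
// The `assert_unpin` helper is hypothetical, introduced just for the check.
//
// ```rust
// use std::marker::PhantomPinned;
// use std::pin::Pin;
//
// fn assert_unpin<T: Unpin>(_: &T) {}
//
// fn main() {
//     let mut b = Box::new(PhantomPinned);
//     assert_unpin(&b);
//     // Hence the safe constructor `Pin::new` accepts `&mut Box<_>`:
//     let _p: Pin<&mut Box<PhantomPinned>> = Pin::new(&mut b);
// }
// ```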

#[unstable(feature = "coroutine_trait", issue = "43122")]
impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
    type Yield = G::Yield;
    type Return = G::Return;

    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
        G::resume(Pin::new(&mut *self), arg)
    }
}

#[unstable(feature = "coroutine_trait", issue = "43122")]
impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
where
    A: 'static,
{
    type Yield = G::Yield;
    type Return = G::Return;

    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
        G::resume((*self).as_mut(), arg)
    }
}

#[stable(feature = "futures_api", since = "1.36.0")]
impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
    type Output = F::Output;

    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
        F::poll(Pin::new(&mut *self), cx)
    }
}

#[stable(feature = "box_error", since = "1.8.0")]
impl<E: Error> Error for Box<E> {
    #[allow(deprecated)]
    fn cause(&self) -> Option<&dyn Error> {
        Error::cause(&**self)
    }

    fn source(&self) -> Option<&(dyn Error + 'static)> {
        Error::source(&**self)
    }

    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
        Error::provide(&**self, request);
    }
}
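// A small sketch of the `Error` delegation: a `Box<E: Error>` is itself an
// `Error`, forwarding `source` (and the deprecated `cause`) to the inner
// value. `ParseFailure` is a hypothetical error type defined only for the
// example.
//
// ```rust
// use std::error::Error;
// use std::fmt;
//
// #[derive(Debug)]
// struct ParseFailure;
//
// impl fmt::Display for ParseFailure {
//     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
//         write!(f, "parse failure")
//     }
// }
//
// impl Error for ParseFailure {}
//
// fn main() {
//     let boxed: Box<ParseFailure> = Box::new(ParseFailure);
//     // `Display` and `Error` both go through to the inner value:
//     assert_eq!(boxed.to_string(), "parse failure");
//     assert!(boxed.source().is_none());
// }
// ```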

#[unstable(feature = "allocator_api", issue = "32838")]
unsafe impl<T: ?Sized + Allocator, A: Allocator> Allocator for Box<T, A> {
    #[inline]
    fn allocate(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
        (**self).allocate(layout)
    }

    #[inline]
    fn allocate_zeroed(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
        (**self).allocate_zeroed(layout)
    }

    #[inline]
    unsafe fn deallocate(&self, ptr: NonNull<u8>, layout: Layout) {
        // SAFETY: the safety contract must be upheld by the caller
        unsafe { (**self).deallocate(ptr, layout) }
    }

    #[inline]
    unsafe fn grow(
        &self,
        ptr: NonNull<u8>,
        old_layout: Layout,
        new_layout: Layout,
    ) -> Result<NonNull<[u8]>, AllocError> {
        // SAFETY: the safety contract must be upheld by the caller
        unsafe { (**self).grow(ptr, old_layout, new_layout) }
    }

    #[inline]
    unsafe fn grow_zeroed(
        &self,
        ptr: NonNull<u8>,
        old_layout: Layout,
        new_layout: Layout,
    ) -> Result<NonNull<[u8]>, AllocError> {
        // SAFETY: the safety contract must be upheld by the caller
        unsafe { (**self).grow_zeroed(ptr, old_layout, new_layout) }
    }

    #[inline]
    unsafe fn shrink(
        &self,
        ptr: NonNull<u8>,
        old_layout: Layout,
        new_layout: Layout,
    ) -> Result<NonNull<[u8]>, AllocError> {
        // SAFETY: the safety contract must be upheld by the caller
        unsafe { (**self).shrink(ptr, old_layout, new_layout) }
    }
}