alloc/boxed.rs
1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//! Cons(T, Box<List<T>>),
31//! Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
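//!
//! As a sketch of that round-trip (this example uses the unstable `allocator_api` feature
//! to call the [`Global`] allocator directly):
//!
//! ```
//! #![feature(allocator_api)]
//!
//! use std::alloc::{Allocator, Global, Layout};
//! use std::ptr::{self, NonNull};
//!
//! let boxed = Box::new(42_i32);
//! let value: *mut i32 = Box::into_raw(boxed);
//! unsafe {
//!     // Compute the layout of the pointed-to value, run its destructor
//!     // (a no-op for `i32`), then free the memory through the same allocator.
//!     let layout = Layout::for_value(&*value);
//!     ptr::drop_in_place(value);
//!     Global.deallocate(NonNull::new_unchecked(value).cast(), layout);
//! }
//! ```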
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
66//! recommended way to build a `Box` to a ZST, if `Box::new` cannot be used, is to use
67//! [`ptr::NonNull::dangling`].
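//!
//! For example, a `Box` to a zero-sized type can be built without allocating at all
//! (a minimal sketch of the pattern described above):
//!
//! ```
//! use std::ptr::NonNull;
//!
//! // `NonNull::dangling` is non-null and suitably aligned, which is all a ZST box requires.
//! let ptr: NonNull<()> = NonNull::dangling();
//! let boxed: Box<()> = unsafe { Box::from_raw(ptr.as_ptr()) };
//! drop(boxed); // Dropping is fine: there is no memory to free for a zero-sized value.
//! ```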
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
73//! (i.e. the C type `T*`). This means that if you have extern "C"
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//! Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
132//! after that box has been mutated through, moved, or borrowed as `&mut T`
133//! is not allowed. For more guidance on working with boxes from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
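//!
//! As a rough sketch of the allowed pattern: a raw pointer derived from a box may only be
//! used until the box is next mutated through, moved, or borrowed as `&mut T`.
//!
//! ```
//! let mut boxed = Box::new(0_u8);
//! let raw: *mut u8 = &raw mut *boxed;
//!
//! // OK: the box is not touched between deriving and using `raw`.
//! unsafe { raw.write(1) };
//! assert_eq!(*boxed, 1);
//!
//! // Not OK: once the box is borrowed as `&mut u8` again, `raw` must not be used.
//! // let r = &mut *boxed;
//! // unsafe { raw.write(2) }; // undefined behavior
//! ```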
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//! let (i, x): (usize, &i32) = item;
155//! println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//! let (i, x): (usize, &i32) = item;
161//! println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//! let (i, x): (usize, i32) = item;
167//! println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
171//! Similar to the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
175//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187#[cfg(not(no_global_oom_handling))]
188use core::clone::CloneToUninit;
189use core::cmp::Ordering;
190use core::error::{self, Error};
191use core::fmt;
192use core::future::Future;
193use core::hash::{Hash, Hasher};
194use core::marker::{Tuple, Unsize};
195use core::mem::{self, SizedTypeProperties};
196use core::ops::{
197 AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
198 DerefPure, DispatchFromDyn, LegacyReceiver,
199};
200use core::pin::{Pin, PinCoerceUnsized};
201use core::ptr::{self, NonNull, Unique};
202use core::task::{Context, Poll};
203
204#[cfg(not(no_global_oom_handling))]
205use crate::alloc::handle_alloc_error;
206use crate::alloc::{AllocError, Allocator, Global, Layout};
207use crate::raw_vec::RawVec;
208#[cfg(not(no_global_oom_handling))]
209use crate::str::from_boxed_utf8_unchecked;
210
211/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
212mod convert;
213/// Iterator related impls for `Box<_>`.
214mod iter;
215/// [`ThinBox`] implementation.
216mod thin;
217
218#[unstable(feature = "thin_box", issue = "92791")]
219pub use thin::ThinBox;
220
221/// A pointer type that uniquely owns a heap allocation of type `T`.
222///
223/// See the [module-level documentation](../../std/boxed/index.html) for more.
224#[lang = "owned_box"]
225#[fundamental]
226#[stable(feature = "rust1", since = "1.0.0")]
227#[rustc_insignificant_dtor]
228#[doc(search_unbox)]
229// The declaration of the `Box` struct must be kept in sync with the
230// compiler or ICEs will happen.
231pub struct Box<
232 T: ?Sized,
233 #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
234>(Unique<T>, A);
235
236/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
237/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
238///
239/// This is the surface syntax for `box <expr>` expressions.
240#[doc(hidden)]
241#[rustc_intrinsic]
242#[unstable(feature = "liballoc_internals", issue = "none")]
243pub fn box_new<T>(x: T) -> Box<T>;
244
245impl<T> Box<T> {
246 /// Allocates memory on the heap and then places `x` into it.
247 ///
248 /// This doesn't actually allocate if `T` is zero-sized.
249 ///
250 /// # Examples
251 ///
252 /// ```
253 /// let five = Box::new(5);
254 /// ```
255 #[cfg(not(no_global_oom_handling))]
256 #[inline(always)]
257 #[stable(feature = "rust1", since = "1.0.0")]
258 #[must_use]
259 #[rustc_diagnostic_item = "box_new"]
260 #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
261 pub fn new(x: T) -> Self {
262 return box_new(x);
263 }
264
265 /// Constructs a new box with uninitialized contents.
266 ///
267 /// # Examples
268 ///
269 /// ```
270 /// let mut five = Box::<u32>::new_uninit();
271 /// // Deferred initialization:
272 /// five.write(5);
273 /// let five = unsafe { five.assume_init() };
274 ///
275 /// assert_eq!(*five, 5)
276 /// ```
277 #[cfg(not(no_global_oom_handling))]
278 #[stable(feature = "new_uninit", since = "1.82.0")]
279 #[must_use]
280 #[inline]
281 pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
282 Self::new_uninit_in(Global)
283 }
284
285 /// Constructs a new `Box` with uninitialized contents, with the memory
286 /// being filled with `0` bytes.
287 ///
288 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
289 /// of this method.
290 ///
291 /// # Examples
292 ///
293 /// ```
294 /// let zero = Box::<u32>::new_zeroed();
295 /// let zero = unsafe { zero.assume_init() };
296 ///
297 /// assert_eq!(*zero, 0)
298 /// ```
299 ///
300 /// [zeroed]: mem::MaybeUninit::zeroed
301 #[cfg(not(no_global_oom_handling))]
302 #[inline]
303 #[stable(feature = "new_zeroed_alloc", since = "CURRENT_RUSTC_VERSION")]
304 #[must_use]
305 pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
306 Self::new_zeroed_in(Global)
307 }
308
309 /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
310 /// `x` will be pinned in memory and unable to be moved.
311 ///
312 /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
313 /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
314 /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
315 /// construct a (pinned) `Box` in a different way than with [`Box::new`].
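    ///
    /// # Examples
    ///
    /// A minimal example, pinning the value as it is constructed:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u32>> = Box::pin(5);
    /// assert_eq!(*pinned, 5);
    /// ```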
316 #[cfg(not(no_global_oom_handling))]
317 #[stable(feature = "pin", since = "1.33.0")]
318 #[must_use]
319 #[inline(always)]
320 pub fn pin(x: T) -> Pin<Box<T>> {
321 Box::new(x).into()
322 }
323
324 /// Allocates memory on the heap then places `x` into it,
325 /// returning an error if the allocation fails
326 /// returning an error if the allocation fails.
327 /// This doesn't actually allocate if `T` is zero-sized.
328 ///
329 /// # Examples
330 ///
331 /// ```
332 /// #![feature(allocator_api)]
333 ///
334 /// let five = Box::try_new(5)?;
335 /// # Ok::<(), std::alloc::AllocError>(())
336 /// ```
337 #[unstable(feature = "allocator_api", issue = "32838")]
338 #[inline]
339 pub fn try_new(x: T) -> Result<Self, AllocError> {
340 Self::try_new_in(x, Global)
341 }
342
343 /// Constructs a new box with uninitialized contents on the heap,
344 /// returning an error if the allocation fails
345 /// returning an error if the allocation fails.
346 /// # Examples
347 ///
348 /// ```
349 /// #![feature(allocator_api)]
350 ///
351 /// let mut five = Box::<u32>::try_new_uninit()?;
352 /// // Deferred initialization:
353 /// five.write(5);
354 /// let five = unsafe { five.assume_init() };
355 ///
356 /// assert_eq!(*five, 5);
357 /// # Ok::<(), std::alloc::AllocError>(())
358 /// ```
359 #[unstable(feature = "allocator_api", issue = "32838")]
360 #[inline]
361 pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
362 Box::try_new_uninit_in(Global)
363 }
364
365 /// Constructs a new `Box` with uninitialized contents, with the memory
366 /// being filled with `0` bytes on the heap.
367 ///
368 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
369 /// of this method.
370 ///
371 /// # Examples
372 ///
373 /// ```
374 /// #![feature(allocator_api)]
375 ///
376 /// let zero = Box::<u32>::try_new_zeroed()?;
377 /// let zero = unsafe { zero.assume_init() };
378 ///
379 /// assert_eq!(*zero, 0);
380 /// # Ok::<(), std::alloc::AllocError>(())
381 /// ```
382 ///
383 /// [zeroed]: mem::MaybeUninit::zeroed
384 #[unstable(feature = "allocator_api", issue = "32838")]
385 #[inline]
386 pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
387 Box::try_new_zeroed_in(Global)
388 }
389}
390
391impl<T, A: Allocator> Box<T, A> {
392 /// Allocates memory in the given allocator then places `x` into it.
393 ///
394 /// This doesn't actually allocate if `T` is zero-sized.
395 ///
396 /// # Examples
397 ///
398 /// ```
399 /// #![feature(allocator_api)]
400 ///
401 /// use std::alloc::System;
402 ///
403 /// let five = Box::new_in(5, System);
404 /// ```
405 #[cfg(not(no_global_oom_handling))]
406 #[unstable(feature = "allocator_api", issue = "32838")]
407 #[must_use]
408 #[inline]
409 pub fn new_in(x: T, alloc: A) -> Self
410 where
411 A: Allocator,
412 {
413 let mut boxed = Self::new_uninit_in(alloc);
414 boxed.write(x);
415 unsafe { boxed.assume_init() }
416 }
417
418 /// Allocates memory in the given allocator then places `x` into it,
419 /// returning an error if the allocation fails
420 /// returning an error if the allocation fails.
421 /// This doesn't actually allocate if `T` is zero-sized.
422 ///
423 /// # Examples
424 ///
425 /// ```
426 /// #![feature(allocator_api)]
427 ///
428 /// use std::alloc::System;
429 ///
430 /// let five = Box::try_new_in(5, System)?;
431 /// # Ok::<(), std::alloc::AllocError>(())
432 /// ```
433 #[unstable(feature = "allocator_api", issue = "32838")]
434 #[inline]
435 pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
436 where
437 A: Allocator,
438 {
439 let mut boxed = Self::try_new_uninit_in(alloc)?;
440 boxed.write(x);
441 unsafe { Ok(boxed.assume_init()) }
442 }
443
444 /// Constructs a new box with uninitialized contents in the provided allocator.
445 ///
446 /// # Examples
447 ///
448 /// ```
449 /// #![feature(allocator_api)]
450 ///
451 /// use std::alloc::System;
452 ///
453 /// let mut five = Box::<u32, _>::new_uninit_in(System);
454 /// // Deferred initialization:
455 /// five.write(5);
456 /// let five = unsafe { five.assume_init() };
457 ///
458 /// assert_eq!(*five, 5)
459 /// ```
460 #[unstable(feature = "allocator_api", issue = "32838")]
461 #[cfg(not(no_global_oom_handling))]
462 #[must_use]
463 pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
464 where
465 A: Allocator,
466 {
467 let layout = Layout::new::<mem::MaybeUninit<T>>();
468 // NOTE: Prefer `match` over `unwrap_or_else`, since the closure is sometimes not inlineable,
469 // which would make the code size bigger.
470 match Box::try_new_uninit_in(alloc) {
471 Ok(m) => m,
472 Err(_) => handle_alloc_error(layout),
473 }
474 }
475
476 /// Constructs a new box with uninitialized contents in the provided allocator,
477 /// returning an error if the allocation fails.
478 ///
479 /// # Examples
480 ///
481 /// ```
482 /// #![feature(allocator_api)]
483 ///
484 /// use std::alloc::System;
485 ///
486 /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
487 /// // Deferred initialization:
488 /// five.write(5);
489 /// let five = unsafe { five.assume_init() };
490 ///
491 /// assert_eq!(*five, 5);
492 /// # Ok::<(), std::alloc::AllocError>(())
493 /// ```
494 #[unstable(feature = "allocator_api", issue = "32838")]
495 pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
496 where
497 A: Allocator,
498 {
499 let ptr = if T::IS_ZST {
500 NonNull::dangling()
501 } else {
502 let layout = Layout::new::<mem::MaybeUninit<T>>();
503 alloc.allocate(layout)?.cast()
504 };
505 unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
506 }
507
508 /// Constructs a new `Box` with uninitialized contents, with the memory
509 /// being filled with `0` bytes in the provided allocator.
510 ///
511 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
512 /// of this method.
513 ///
514 /// # Examples
515 ///
516 /// ```
517 /// #![feature(allocator_api)]
518 ///
519 /// use std::alloc::System;
520 ///
521 /// let zero = Box::<u32, _>::new_zeroed_in(System);
522 /// let zero = unsafe { zero.assume_init() };
523 ///
524 /// assert_eq!(*zero, 0)
525 /// ```
526 ///
527 /// [zeroed]: mem::MaybeUninit::zeroed
528 #[unstable(feature = "allocator_api", issue = "32838")]
529 #[cfg(not(no_global_oom_handling))]
530 #[must_use]
531 pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
532 where
533 A: Allocator,
534 {
535 let layout = Layout::new::<mem::MaybeUninit<T>>();
536 // NOTE: Prefer `match` over `unwrap_or_else`, since the closure is sometimes not inlineable,
537 // which would make the code size bigger.
538 match Box::try_new_zeroed_in(alloc) {
539 Ok(m) => m,
540 Err(_) => handle_alloc_error(layout),
541 }
542 }
543
544 /// Constructs a new `Box` with uninitialized contents, with the memory
545 /// being filled with `0` bytes in the provided allocator,
546 /// returning an error if the allocation fails.
547 ///
548 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
549 /// of this method.
550 ///
551 /// # Examples
552 ///
553 /// ```
554 /// #![feature(allocator_api)]
555 ///
556 /// use std::alloc::System;
557 ///
558 /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
559 /// let zero = unsafe { zero.assume_init() };
560 ///
561 /// assert_eq!(*zero, 0);
562 /// # Ok::<(), std::alloc::AllocError>(())
563 /// ```
564 ///
565 /// [zeroed]: mem::MaybeUninit::zeroed
566 #[unstable(feature = "allocator_api", issue = "32838")]
567 pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
568 where
569 A: Allocator,
570 {
571 let ptr = if T::IS_ZST {
572 NonNull::dangling()
573 } else {
574 let layout = Layout::new::<mem::MaybeUninit<T>>();
575 alloc.allocate_zeroed(layout)?.cast()
576 };
577 unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
578 }
579
580 /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
581 /// `x` will be pinned in memory and unable to be moved.
582 ///
583 /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
584 /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
585 /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
586 /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
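    ///
    /// # Examples
    ///
    /// A minimal example using the `System` allocator (this method is part of the unstable
    /// `allocator_api` feature):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let pinned = Box::pin_in(5, System);
    /// assert_eq!(*pinned, 5);
    /// ```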
587 #[cfg(not(no_global_oom_handling))]
588 #[unstable(feature = "allocator_api", issue = "32838")]
589 #[must_use]
590 #[inline(always)]
591 pub fn pin_in(x: T, alloc: A) -> Pin<Self>
592 where
593 A: 'static + Allocator,
594 {
595 Self::into_pin(Self::new_in(x, alloc))
596 }
597
598 /// Converts a `Box<T>` into a `Box<[T]>`.
599 ///
600 /// This conversion does not allocate on the heap and happens in place.
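    ///
    /// # Examples
    ///
    /// A minimal example (this method is gated behind the unstable `box_into_boxed_slice`
    /// feature):
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed: Box<u8> = Box::new(5);
    /// let slice: Box<[u8]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(*slice, [5]);
    /// ```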
601 #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
602 pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
603 let (raw, alloc) = Box::into_raw_with_allocator(boxed);
604 unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
605 }
606
607 /// Consumes the `Box`, returning the wrapped value.
608 ///
609 /// # Examples
610 ///
611 /// ```
612 /// #![feature(box_into_inner)]
613 ///
614 /// let c = Box::new(5);
615 ///
616 /// assert_eq!(Box::into_inner(c), 5);
617 /// ```
618 #[unstable(feature = "box_into_inner", issue = "80437")]
619 #[inline]
620 pub fn into_inner(boxed: Self) -> T {
621 *boxed
622 }
623
624 /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
625 /// to the uninitialized memory where the wrapped value used to live.
626 ///
627 /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
628 /// boxed values.
629 ///
630 /// # Examples
631 ///
632 /// ```
633 /// #![feature(box_take)]
634 ///
635 /// let c = Box::new(5);
636 ///
637 /// // take the value out of the box
638 /// let (value, uninit) = Box::take(c);
639 /// assert_eq!(value, 5);
640 ///
641 /// // reuse the box for a second value
642 /// let c = Box::write(uninit, 6);
643 /// assert_eq!(*c, 6);
644 /// ```
645 #[unstable(feature = "box_take", issue = "147212")]
646 pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
647 unsafe {
648 let (raw, alloc) = Box::into_raw_with_allocator(boxed);
649 let value = raw.read();
650 let uninit = Box::from_raw_in(raw.cast::<mem::MaybeUninit<T>>(), alloc);
651 (value, uninit)
652 }
653 }
654}
655
656impl<T> Box<[T]> {
657 /// Constructs a new boxed slice with uninitialized contents.
658 ///
659 /// # Examples
660 ///
661 /// ```
662 /// let mut values = Box::<[u32]>::new_uninit_slice(3);
663 /// // Deferred initialization:
664 /// values[0].write(1);
665 /// values[1].write(2);
666 /// values[2].write(3);
667 /// let values = unsafe { values.assume_init() };
668 ///
669 /// assert_eq!(*values, [1, 2, 3])
670 /// ```
671 #[cfg(not(no_global_oom_handling))]
672 #[stable(feature = "new_uninit", since = "1.82.0")]
673 #[must_use]
674 pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
675 unsafe { RawVec::with_capacity(len).into_box(len) }
676 }
677
678 /// Constructs a new boxed slice with uninitialized contents, with the memory
679 /// being filled with `0` bytes.
680 ///
681 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
682 /// of this method.
683 ///
684 /// # Examples
685 ///
686 /// ```
687 /// let values = Box::<[u32]>::new_zeroed_slice(3);
688 /// let values = unsafe { values.assume_init() };
689 ///
690 /// assert_eq!(*values, [0, 0, 0])
691 /// ```
692 ///
693 /// [zeroed]: mem::MaybeUninit::zeroed
694 #[cfg(not(no_global_oom_handling))]
695 #[stable(feature = "new_zeroed_alloc", since = "CURRENT_RUSTC_VERSION")]
696 #[must_use]
697 pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
698 unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
699 }
700
701 /// Constructs a new boxed slice with uninitialized contents. Returns an error if
702 /// the allocation fails.
703 ///
704 /// # Examples
705 ///
706 /// ```
707 /// #![feature(allocator_api)]
708 ///
709 /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
710 /// // Deferred initialization:
711 /// values[0].write(1);
712 /// values[1].write(2);
713 /// values[2].write(3);
714 /// let values = unsafe { values.assume_init() };
715 ///
716 /// assert_eq!(*values, [1, 2, 3]);
717 /// # Ok::<(), std::alloc::AllocError>(())
718 /// ```
719 #[unstable(feature = "allocator_api", issue = "32838")]
720 #[inline]
721 pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
722 let ptr = if T::IS_ZST || len == 0 {
723 NonNull::dangling()
724 } else {
725 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
726 Ok(l) => l,
727 Err(_) => return Err(AllocError),
728 };
729 Global.allocate(layout)?.cast()
730 };
731 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
732 }
733
734 /// Constructs a new boxed slice with uninitialized contents, with the memory
735 /// being filled with `0` bytes. Returns an error if the allocation fails.
736 ///
737 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
738 /// of this method.
739 ///
740 /// # Examples
741 ///
742 /// ```
743 /// #![feature(allocator_api)]
744 ///
745 /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
746 /// let values = unsafe { values.assume_init() };
747 ///
748 /// assert_eq!(*values, [0, 0, 0]);
749 /// # Ok::<(), std::alloc::AllocError>(())
750 /// ```
751 ///
752 /// [zeroed]: mem::MaybeUninit::zeroed
753 #[unstable(feature = "allocator_api", issue = "32838")]
754 #[inline]
755 pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
756 let ptr = if T::IS_ZST || len == 0 {
757 NonNull::dangling()
758 } else {
759 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
760 Ok(l) => l,
761 Err(_) => return Err(AllocError),
762 };
763 Global.allocate_zeroed(layout)?.cast()
764 };
765 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
766 }
767
768 /// Converts the boxed slice into a boxed array.
769 ///
770 /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
771 ///
772 /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
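    ///
    /// # Examples
    ///
    /// A minimal example (this method is gated behind the unstable `slice_as_array` feature):
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// let slice: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
    ///
    /// let array: Box<[i32; 3]> = slice.into_array().expect("length should be 3");
    /// assert_eq!(*array, [1, 2, 3]);
    /// ```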
773 #[unstable(feature = "slice_as_array", issue = "133508")]
774 #[inline]
775 #[must_use]
776 pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
777 if self.len() == N {
778 let ptr = Self::into_raw(self) as *mut [T; N];
779
780 // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
781 let me = unsafe { Box::from_raw(ptr) };
782 Some(me)
783 } else {
784 None
785 }
786 }
787}
788
789impl<T, A: Allocator> Box<[T], A> {
790 /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
791 ///
792 /// # Examples
793 ///
794 /// ```
795 /// #![feature(allocator_api)]
796 ///
797 /// use std::alloc::System;
798 ///
799 /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
800 /// // Deferred initialization:
801 /// values[0].write(1);
802 /// values[1].write(2);
803 /// values[2].write(3);
804 /// let values = unsafe { values.assume_init() };
805 ///
806 /// assert_eq!(*values, [1, 2, 3])
807 /// ```
808 #[cfg(not(no_global_oom_handling))]
809 #[unstable(feature = "allocator_api", issue = "32838")]
810 #[must_use]
811 pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
812 unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
813 }
814
815 /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
816 /// with the memory being filled with `0` bytes.
817 ///
818 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
819 /// of this method.
820 ///
821 /// # Examples
822 ///
823 /// ```
824 /// #![feature(allocator_api)]
825 ///
826 /// use std::alloc::System;
827 ///
828 /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
829 /// let values = unsafe { values.assume_init() };
830 ///
831 /// assert_eq!(*values, [0, 0, 0])
832 /// ```
833 ///
834 /// [zeroed]: mem::MaybeUninit::zeroed
835 #[cfg(not(no_global_oom_handling))]
836 #[unstable(feature = "allocator_api", issue = "32838")]
837 #[must_use]
838 pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
839 unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
840 }
841
842 /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
843 /// the allocation fails.
844 ///
845 /// # Examples
846 ///
847 /// ```
848 /// #![feature(allocator_api)]
849 ///
850 /// use std::alloc::System;
851 ///
852 /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
853 /// // Deferred initialization:
854 /// values[0].write(1);
855 /// values[1].write(2);
856 /// values[2].write(3);
857 /// let values = unsafe { values.assume_init() };
858 ///
859 /// assert_eq!(*values, [1, 2, 3]);
860 /// # Ok::<(), std::alloc::AllocError>(())
861 /// ```
862 #[unstable(feature = "allocator_api", issue = "32838")]
863 #[inline]
864 pub fn try_new_uninit_slice_in(
865 len: usize,
866 alloc: A,
867 ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
868 let ptr = if T::IS_ZST || len == 0 {
869 NonNull::dangling()
870 } else {
871 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
872 Ok(l) => l,
873 Err(_) => return Err(AllocError),
874 };
875 alloc.allocate(layout)?.cast()
876 };
877 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
878 }
879
880 /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
881 /// being filled with `0` bytes. Returns an error if the allocation fails.
882 ///
883 /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
884 /// of this method.
885 ///
886 /// # Examples
887 ///
888 /// ```
889 /// #![feature(allocator_api)]
890 ///
891 /// use std::alloc::System;
892 ///
893 /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
894 /// let values = unsafe { values.assume_init() };
895 ///
896 /// assert_eq!(*values, [0, 0, 0]);
897 /// # Ok::<(), std::alloc::AllocError>(())
898 /// ```
899 ///
900 /// [zeroed]: mem::MaybeUninit::zeroed
901 #[unstable(feature = "allocator_api", issue = "32838")]
902 #[inline]
903 pub fn try_new_zeroed_slice_in(
904 len: usize,
905 alloc: A,
906 ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
907 let ptr = if T::IS_ZST || len == 0 {
908 NonNull::dangling()
909 } else {
910 let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
911 Ok(l) => l,
912 Err(_) => return Err(AllocError),
913 };
914 alloc.allocate_zeroed(layout)?.cast()
915 };
916 unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
917 }
918}
919
920impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
921 /// Converts to `Box<T, A>`.
922 ///
923 /// # Safety
924 ///
925 /// As with [`MaybeUninit::assume_init`],
926 /// it is up to the caller to guarantee that the value
927 /// really is in an initialized state.
928 /// Calling this when the content is not yet fully initialized
929 /// causes immediate undefined behavior.
930 ///
931 /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
932 ///
933 /// # Examples
934 ///
935 /// ```
936 /// let mut five = Box::<u32>::new_uninit();
937 /// // Deferred initialization:
938 /// five.write(5);
939 /// let five: Box<u32> = unsafe { five.assume_init() };
940 ///
941 /// assert_eq!(*five, 5)
942 /// ```
943 #[stable(feature = "new_uninit", since = "1.82.0")]
944 #[inline]
945 pub unsafe fn assume_init(self) -> Box<T, A> {
946 let (raw, alloc) = Box::into_raw_with_allocator(self);
947 unsafe { Box::from_raw_in(raw as *mut T, alloc) }
948 }
949
950 /// Writes the value and converts to `Box<T, A>`.
951 ///
952 /// This method converts the box similarly to [`Box::assume_init`] but
953 /// writes `value` into it before conversion, thus guaranteeing safety.
954 /// In some scenarios, use of this method may improve performance because
955 /// the compiler may be able to optimize away the copy from the stack.
956 ///
957 /// # Examples
958 ///
959 /// ```
960 /// let big_box = Box::<[usize; 1024]>::new_uninit();
961 ///
962 /// let mut array = [0; 1024];
963 /// for (i, place) in array.iter_mut().enumerate() {
964 /// *place = i;
965 /// }
966 ///
967 /// // The optimizer may be able to elide this copy, so the previous code writes
968 /// // to the heap directly.
969 /// let big_box = Box::write(big_box, array);
970 ///
971 /// for (i, x) in big_box.iter().enumerate() {
972 /// assert_eq!(*x, i);
973 /// }
974 /// ```
975 #[stable(feature = "box_uninit_write", since = "1.87.0")]
976 #[inline]
977 pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
978 unsafe {
979 (*boxed).write(value);
980 boxed.assume_init()
981 }
982 }
983}
984
985impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
986 /// Converts to `Box<[T], A>`.
987 ///
988 /// # Safety
989 ///
990 /// As with [`MaybeUninit::assume_init`],
991 /// it is up to the caller to guarantee that the values
992 /// really are in an initialized state.
993 /// Calling this when the content is not yet fully initialized
994 /// causes immediate undefined behavior.
995 ///
996 /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
997 ///
998 /// # Examples
999 ///
1000 /// ```
1001 /// let mut values = Box::<[u32]>::new_uninit_slice(3);
1002 /// // Deferred initialization:
1003 /// values[0].write(1);
1004 /// values[1].write(2);
1005 /// values[2].write(3);
1006 /// let values = unsafe { values.assume_init() };
1007 ///
1008 /// assert_eq!(*values, [1, 2, 3])
1009 /// ```
1010 #[stable(feature = "new_uninit", since = "1.82.0")]
1011 #[inline]
1012 pub unsafe fn assume_init(self) -> Box<[T], A> {
1013 let (raw, alloc) = Box::into_raw_with_allocator(self);
1014 unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
1015 }
1016}
1017
1018impl<T: ?Sized> Box<T> {
1019 /// Constructs a box from a raw pointer.
1020 ///
1021 /// After calling this function, the raw pointer is owned by the
1022 /// resulting `Box`. Specifically, the `Box` destructor will call
1023 /// the destructor of `T` and free the allocated memory. For this
1024 /// to be safe, the memory must have been allocated in accordance
1025 /// with the [memory layout] used by `Box`.
1026 ///
1027 /// # Safety
1028 ///
1029 /// This function is unsafe because improper use may lead to
1030 /// memory problems. For example, a double-free may occur if the
1031 /// function is called twice on the same raw pointer.
1032 ///
1033 /// The raw pointer must point to a block of memory allocated by the global allocator.
1034 ///
1035 /// The safety conditions are described in the [memory layout] section.
1036 ///
1037 /// # Examples
1038 ///
1039 /// Recreate a `Box` which was previously converted to a raw pointer
1040 /// using [`Box::into_raw`]:
1041 /// ```
1042 /// let x = Box::new(5);
1043 /// let ptr = Box::into_raw(x);
1044 /// let x = unsafe { Box::from_raw(ptr) };
1045 /// ```
1046 /// Manually create a `Box` from scratch by using the global allocator:
1047 /// ```
1048 /// use std::alloc::{alloc, Layout};
1049 ///
1050 /// unsafe {
1051 /// let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1052 /// // In general .write is required to avoid attempting to destruct
1053 /// // the (uninitialized) previous contents of `ptr`, though for this
1054 /// // simple example `*ptr = 5` would have worked as well.
1055 /// ptr.write(5);
1056 /// let x = Box::from_raw(ptr);
1057 /// }
1058 /// ```
1059 ///
1060 /// [memory layout]: self#memory-layout
1061 #[stable(feature = "box_raw", since = "1.4.0")]
1062 #[inline]
1063 #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1064 pub unsafe fn from_raw(raw: *mut T) -> Self {
1065 unsafe { Self::from_raw_in(raw, Global) }
1066 }
1067
1068 /// Constructs a box from a `NonNull` pointer.
1069 ///
1070 /// After calling this function, the `NonNull` pointer is owned by
1071 /// the resulting `Box`. Specifically, the `Box` destructor will call
1072 /// the destructor of `T` and free the allocated memory. For this
1073 /// to be safe, the memory must have been allocated in accordance
1074 /// with the [memory layout] used by `Box`.
1075 ///
1076 /// # Safety
1077 ///
1078 /// This function is unsafe because improper use may lead to
1079 /// memory problems. For example, a double-free may occur if the
1080 /// function is called twice on the same `NonNull` pointer.
1081 ///
1082 /// The non-null pointer must point to a block of memory allocated by the global allocator.
1083 ///
1084 /// The safety conditions are described in the [memory layout] section.
1085 ///
1086 /// # Examples
1087 ///
1088 /// Recreate a `Box` which was previously converted to a `NonNull`
1089 /// pointer using [`Box::into_non_null`]:
1090 /// ```
1091 /// #![feature(box_vec_non_null)]
1092 ///
1093 /// let x = Box::new(5);
1094 /// let non_null = Box::into_non_null(x);
1095 /// let x = unsafe { Box::from_non_null(non_null) };
1096 /// ```
1097 /// Manually create a `Box` from scratch by using the global allocator:
1098 /// ```
1099 /// #![feature(box_vec_non_null)]
1100 ///
1101 /// use std::alloc::{alloc, Layout};
1102 /// use std::ptr::NonNull;
1103 ///
1104 /// unsafe {
1105 /// let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1106 /// .expect("allocation failed");
1107 /// // In general .write is required to avoid attempting to destruct
1108 /// // the (uninitialized) previous contents of `non_null`.
1109 /// non_null.write(5);
1110 /// let x = Box::from_non_null(non_null);
1111 /// }
1112 /// ```
1113 ///
1114 /// [memory layout]: self#memory-layout
1115 #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1116 #[inline]
1117 #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1118 pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1119 unsafe { Self::from_raw(ptr.as_ptr()) }
1120 }
1121
1122 /// Consumes the `Box`, returning a wrapped raw pointer.
1123 ///
1124 /// The pointer will be properly aligned and non-null.
1125 ///
1126 /// After calling this function, the caller is responsible for the
1127 /// memory previously managed by the `Box`. In particular, the
1128 /// caller should properly destroy `T` and release the memory, taking
1129 /// into account the [memory layout] used by `Box`. The easiest way to
1130 /// do this is to convert the raw pointer back into a `Box` with the
1131 /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1132 /// the cleanup.
1133 ///
1134 /// Note: this is an associated function, which means that you have
1135 /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1136 /// is so that there is no conflict with a method on the inner type.
1137 ///
1138 /// # Examples
1139 /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1140 /// for automatic cleanup:
1141 /// ```
1142 /// let x = Box::new(String::from("Hello"));
1143 /// let ptr = Box::into_raw(x);
1144 /// let x = unsafe { Box::from_raw(ptr) };
1145 /// ```
1146 /// Manual cleanup by explicitly running the destructor and deallocating
1147 /// the memory:
1148 /// ```
1149 /// use std::alloc::{dealloc, Layout};
1150 /// use std::ptr;
1151 ///
1152 /// let x = Box::new(String::from("Hello"));
1153 /// let ptr = Box::into_raw(x);
1154 /// unsafe {
1155 /// ptr::drop_in_place(ptr);
1156 /// dealloc(ptr as *mut u8, Layout::new::<String>());
1157 /// }
1158 /// ```
1159 /// Note: This is equivalent to the following:
1160 /// ```
1161 /// let x = Box::new(String::from("Hello"));
1162 /// let ptr = Box::into_raw(x);
1163 /// unsafe {
1164 /// drop(Box::from_raw(ptr));
1165 /// }
1166 /// ```
1167 ///
1168 /// [memory layout]: self#memory-layout
1169 #[must_use = "losing the pointer will leak memory"]
1170 #[stable(feature = "box_raw", since = "1.4.0")]
1171 #[inline]
1172 pub fn into_raw(b: Self) -> *mut T {
1173 // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1174 let mut b = mem::ManuallyDrop::new(b);
1175 // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
1176 // operation for its alias tracking.
1177 &raw mut **b
1178 }
1179
1180 /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1181 ///
1182 /// The pointer will be properly aligned.
1183 ///
1184 /// After calling this function, the caller is responsible for the
1185 /// memory previously managed by the `Box`. In particular, the
1186 /// caller should properly destroy `T` and release the memory, taking
1187 /// into account the [memory layout] used by `Box`. The easiest way to
1188 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1189 /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1190 /// perform the cleanup.
1191 ///
1192 /// Note: this is an associated function, which means that you have
1193 /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1194 /// This is so that there is no conflict with a method on the inner type.
1195 ///
1196 /// # Examples
1197 /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1198 /// for automatic cleanup:
1199 /// ```
1200 /// #![feature(box_vec_non_null)]
1201 ///
1202 /// let x = Box::new(String::from("Hello"));
1203 /// let non_null = Box::into_non_null(x);
1204 /// let x = unsafe { Box::from_non_null(non_null) };
1205 /// ```
1206 /// Manual cleanup by explicitly running the destructor and deallocating
1207 /// the memory:
1208 /// ```
1209 /// #![feature(box_vec_non_null)]
1210 ///
1211 /// use std::alloc::{dealloc, Layout};
1212 ///
1213 /// let x = Box::new(String::from("Hello"));
1214 /// let non_null = Box::into_non_null(x);
1215 /// unsafe {
1216 /// non_null.drop_in_place();
1217 /// dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1218 /// }
1219 /// ```
1220 /// Note: This is equivalent to the following:
1221 /// ```
1222 /// #![feature(box_vec_non_null)]
1223 ///
1224 /// let x = Box::new(String::from("Hello"));
1225 /// let non_null = Box::into_non_null(x);
1226 /// unsafe {
1227 /// drop(Box::from_non_null(non_null));
1228 /// }
1229 /// ```
1230 ///
1231 /// [memory layout]: self#memory-layout
1232 #[must_use = "losing the pointer will leak memory"]
1233 #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1234 #[inline]
1235 pub fn into_non_null(b: Self) -> NonNull<T> {
1236 // SAFETY: `Box` is guaranteed to be non-null.
1237 unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1238 }
1239}
1240
1241impl<T: ?Sized, A: Allocator> Box<T, A> {
1242 /// Constructs a box from a raw pointer in the given allocator.
1243 ///
1244 /// After calling this function, the raw pointer is owned by the
1245 /// resulting `Box`. Specifically, the `Box` destructor will call
1246 /// the destructor of `T` and free the allocated memory. For this
1247 /// to be safe, the memory must have been allocated in accordance
1248 /// with the [memory layout] used by `Box`.
1249 ///
1250 /// # Safety
1251 ///
1252 /// This function is unsafe because improper use may lead to
1253 /// memory problems. For example, a double-free may occur if the
1254 /// function is called twice on the same raw pointer.
1255 ///
1256 /// The raw pointer must point to a block of memory allocated by `alloc`.
1257 ///
1258 /// # Examples
1259 ///
1260 /// Recreate a `Box` which was previously converted to a raw pointer
1261 /// using [`Box::into_raw_with_allocator`]:
1262 /// ```
1263 /// #![feature(allocator_api)]
1264 ///
1265 /// use std::alloc::System;
1266 ///
1267 /// let x = Box::new_in(5, System);
1268 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1269 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1270 /// ```
1271 /// Manually create a `Box` from scratch by using the system allocator:
1272 /// ```
1273 /// #![feature(allocator_api, slice_ptr_get)]
1274 ///
1275 /// use std::alloc::{Allocator, Layout, System};
1276 ///
1277 /// unsafe {
1278 /// let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1279 /// // In general .write is required to avoid attempting to destruct
1280 /// // the (uninitialized) previous contents of `ptr`, though for this
1281 /// // simple example `*ptr = 5` would have worked as well.
1282 /// ptr.write(5);
1283 /// let x = Box::from_raw_in(ptr, System);
1284 /// }
1285 /// # Ok::<(), std::alloc::AllocError>(())
1286 /// ```
1287 ///
1288 /// [memory layout]: self#memory-layout
1289 #[unstable(feature = "allocator_api", issue = "32838")]
1290 #[inline]
1291 pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1292 Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1293 }
1294
1295 /// Constructs a box from a `NonNull` pointer in the given allocator.
1296 ///
1297 /// After calling this function, the `NonNull` pointer is owned by
1298 /// the resulting `Box`. Specifically, the `Box` destructor will call
1299 /// the destructor of `T` and free the allocated memory. For this
1300 /// to be safe, the memory must have been allocated in accordance
1301 /// with the [memory layout] used by `Box`.
1302 ///
1303 /// # Safety
1304 ///
1305 /// This function is unsafe because improper use may lead to
1306 /// memory problems. For example, a double-free may occur if the
1307 /// function is called twice on the same raw pointer.
1308 ///
1309 /// The non-null pointer must point to a block of memory allocated by `alloc`.
1310 ///
1311 /// # Examples
1312 ///
1313 /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1314 /// using [`Box::into_non_null_with_allocator`]:
1315 /// ```
1316 /// #![feature(allocator_api, box_vec_non_null)]
1317 ///
1318 /// use std::alloc::System;
1319 ///
1320 /// let x = Box::new_in(5, System);
1321 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1322 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1323 /// ```
1324 /// Manually create a `Box` from scratch by using the system allocator:
1325 /// ```
1326 /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1327 ///
1328 /// use std::alloc::{Allocator, Layout, System};
1329 ///
1330 /// unsafe {
1331 /// let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1332 /// // In general .write is required to avoid attempting to destruct
1333 /// // the (uninitialized) previous contents of `non_null`.
1334 /// non_null.write(5);
1335 /// let x = Box::from_non_null_in(non_null, System);
1336 /// }
1337 /// # Ok::<(), std::alloc::AllocError>(())
1338 /// ```
1339 ///
1340 /// [memory layout]: self#memory-layout
1341 #[unstable(feature = "allocator_api", issue = "32838")]
1342 // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1343 #[inline]
1344 pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1345 // SAFETY: guaranteed by the caller.
1346 unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1347 }
1348
1349 /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1350 ///
1351 /// The pointer will be properly aligned and non-null.
1352 ///
1353 /// After calling this function, the caller is responsible for the
1354 /// memory previously managed by the `Box`. In particular, the
1355 /// caller should properly destroy `T` and release the memory, taking
1356 /// into account the [memory layout] used by `Box`. The easiest way to
1357 /// do this is to convert the raw pointer back into a `Box` with the
1358 /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1359 /// the cleanup.
1360 ///
1361 /// Note: this is an associated function, which means that you have
1362 /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1363 /// is so that there is no conflict with a method on the inner type.
1364 ///
1365 /// # Examples
1366 /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1367 /// for automatic cleanup:
1368 /// ```
1369 /// #![feature(allocator_api)]
1370 ///
1371 /// use std::alloc::System;
1372 ///
1373 /// let x = Box::new_in(String::from("Hello"), System);
1374 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1375 /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1376 /// ```
1377 /// Manual cleanup by explicitly running the destructor and deallocating
1378 /// the memory:
1379 /// ```
1380 /// #![feature(allocator_api)]
1381 ///
1382 /// use std::alloc::{Allocator, Layout, System};
1383 /// use std::ptr::{self, NonNull};
1384 ///
1385 /// let x = Box::new_in(String::from("Hello"), System);
1386 /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1387 /// unsafe {
1388 /// ptr::drop_in_place(ptr);
1389 /// let non_null = NonNull::new_unchecked(ptr);
1390 /// alloc.deallocate(non_null.cast(), Layout::new::<String>());
1391 /// }
1392 /// ```
1393 ///
1394 /// [memory layout]: self#memory-layout
1395 #[must_use = "losing the pointer will leak memory"]
1396 #[unstable(feature = "allocator_api", issue = "32838")]
1397 #[inline]
1398 pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1399 let mut b = mem::ManuallyDrop::new(b);
1400 // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1401 // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1402 // want *no* aliasing requirements here!
1403 // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1404 // works around that.
1405 let ptr = &raw mut **b;
1406 let alloc = unsafe { ptr::read(&b.1) };
1407 (ptr, alloc)
1408 }
1409
1410 /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1411 ///
1412 /// The pointer will be properly aligned.
1413 ///
1414 /// After calling this function, the caller is responsible for the
1415 /// memory previously managed by the `Box`. In particular, the
1416 /// caller should properly destroy `T` and release the memory, taking
1417 /// into account the [memory layout] used by `Box`. The easiest way to
1418 /// do this is to convert the `NonNull` pointer back into a `Box` with the
1419 /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1420 /// perform the cleanup.
1421 ///
1422 /// Note: this is an associated function, which means that you have
1423 /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1424 /// `b.into_non_null_with_allocator()`. This is so that there is no
1425 /// conflict with a method on the inner type.
1426 ///
1427 /// # Examples
1428 /// Converting the `NonNull` pointer back into a `Box` with
1429 /// [`Box::from_non_null_in`] for automatic cleanup:
1430 /// ```
1431 /// #![feature(allocator_api, box_vec_non_null)]
1432 ///
1433 /// use std::alloc::System;
1434 ///
1435 /// let x = Box::new_in(String::from("Hello"), System);
1436 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1437 /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1438 /// ```
1439 /// Manual cleanup by explicitly running the destructor and deallocating
1440 /// the memory:
1441 /// ```
1442 /// #![feature(allocator_api, box_vec_non_null)]
1443 ///
1444 /// use std::alloc::{Allocator, Layout, System};
1445 ///
1446 /// let x = Box::new_in(String::from("Hello"), System);
1447 /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1448 /// unsafe {
1449 /// non_null.drop_in_place();
1450 /// alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1451 /// }
1452 /// ```
1453 ///
1454 /// [memory layout]: self#memory-layout
1455 #[must_use = "losing the pointer will leak memory"]
1456 #[unstable(feature = "allocator_api", issue = "32838")]
1457 // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1458 #[inline]
1459 pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1460 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1461 // SAFETY: `Box` is guaranteed to be non-null.
1462 unsafe { (NonNull::new_unchecked(ptr), alloc) }
1463 }
1464
1465 #[unstable(
1466 feature = "ptr_internals",
1467 issue = "none",
1468 reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1469 )]
1470 #[inline]
1471 #[doc(hidden)]
1472 pub fn into_unique(b: Self) -> (Unique<T>, A) {
1473 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1474 unsafe { (Unique::from(&mut *ptr), alloc) }
1475 }
1476
1477 /// Returns a raw mutable pointer to the `Box`'s contents.
1478 ///
1479 /// The caller must ensure that the `Box` outlives the pointer this
1480 /// function returns, or else it will end up dangling.
1481 ///
1482 /// For the purpose of the aliasing model, this method guarantees that it
1483 /// does not materialize a reference to the underlying memory, and thus the returned pointer
1484 /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1485 /// Note that calling other methods that materialize references to the memory
1486 /// may still invalidate this pointer.
1487 /// See the example below for how this guarantee can be used.
1488 ///
1489 /// # Examples
1490 ///
1491 /// Due to the aliasing guarantee, the following code is legal:
1492 ///
1493 /// ```rust
1494 /// #![feature(box_as_ptr)]
1495 ///
1496 /// unsafe {
1497 /// let mut b = Box::new(0);
1498 /// let ptr1 = Box::as_mut_ptr(&mut b);
1499 /// ptr1.write(1);
1500 /// let ptr2 = Box::as_mut_ptr(&mut b);
1501 /// ptr2.write(2);
1502 /// // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1503 /// ptr1.write(3);
1504 /// }
1505 /// ```
1506 ///
1507 /// [`as_mut_ptr`]: Self::as_mut_ptr
1508 /// [`as_ptr`]: Self::as_ptr
1509 #[unstable(feature = "box_as_ptr", issue = "129090")]
1510 #[rustc_never_returns_null_ptr]
1511 #[rustc_as_ptr]
1512 #[inline]
1513 pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1514 // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1515 // any references.
1516 &raw mut **b
1517 }
1518
1519 /// Returns a raw pointer to the `Box`'s contents.
1520 ///
1521 /// The caller must ensure that the `Box` outlives the pointer this
1522 /// function returns, or else it will end up dangling.
1523 ///
1524 /// The caller must also ensure that the memory the pointer (non-transitively) points to
1525 /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1526 /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1527 ///
1528 /// For the purpose of the aliasing model, this method guarantees that it
1529 /// does not materialize a reference to the underlying memory, and thus the returned pointer
1530 /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1531 /// Note that calling other methods that materialize mutable references to the memory,
1532 /// as well as writing to this memory, may still invalidate this pointer.
1533 /// See the example below for how this guarantee can be used.
1534 ///
1535 /// # Examples
1536 ///
1537 /// Due to the aliasing guarantee, the following code is legal:
1538 ///
1539 /// ```rust
1540 /// #![feature(box_as_ptr)]
1541 ///
1542 /// unsafe {
1543 /// let mut v = Box::new(0);
1544 /// let ptr1 = Box::as_ptr(&v);
1545 /// let ptr2 = Box::as_mut_ptr(&mut v);
1546 /// let _val = ptr2.read();
1547 /// // No write to this memory has happened yet, so `ptr1` is still valid.
1548 /// let _val = ptr1.read();
1549 /// // However, once we do a write...
1550 /// ptr2.write(1);
1551 /// // ... `ptr1` is no longer valid.
1552 /// // This would be UB: let _val = ptr1.read();
1553 /// }
1554 /// ```
1555 ///
1556 /// [`as_mut_ptr`]: Self::as_mut_ptr
1557 /// [`as_ptr`]: Self::as_ptr
1558 #[unstable(feature = "box_as_ptr", issue = "129090")]
1559 #[rustc_never_returns_null_ptr]
1560 #[rustc_as_ptr]
1561 #[inline]
1562 pub fn as_ptr(b: &Self) -> *const T {
1563 // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1564 // any references.
1565 &raw const **b
1566 }
1567
1568 /// Returns a reference to the underlying allocator.
1569 ///
1570 /// Note: this is an associated function, which means that you have
1571 /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1572 /// is so that there is no conflict with a method on the inner type.
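///
/// # Examples
///
/// A minimal sketch, assuming the unstable `allocator_api` feature and the
/// `System` allocator from `std::alloc`:
///
/// ```
/// #![feature(allocator_api)]
///
/// use std::alloc::System;
///
/// let b = Box::new_in(5, System);
/// // Borrow the allocator without touching the boxed value.
/// let _alloc: &System = Box::allocator(&b);
/// ```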
1573 #[unstable(feature = "allocator_api", issue = "32838")]
1574 #[inline]
1575 pub fn allocator(b: &Self) -> &A {
1576 &b.1
1577 }
1578
1579 /// Consumes and leaks the `Box`, returning a mutable reference,
1580 /// `&'a mut T`.
1581 ///
1582 /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1583 /// has only static references, or none at all, then this may be chosen to be
1584 /// `'static`.
1585 ///
1586 /// This function is mainly useful for data that lives for the remainder of
1587 /// the program's life. Dropping the returned reference will cause a memory
1588 /// leak. If this is not acceptable, the reference should first be wrapped
1589 /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1590 /// then be dropped which will properly destroy `T` and release the
1591 /// allocated memory.
1592 ///
1593 /// Note: this is an associated function, which means that you have
1594 /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1595 /// is so that there is no conflict with a method on the inner type.
1596 ///
1597 /// # Examples
1598 ///
1599 /// Simple usage:
1600 ///
1601 /// ```
1602 /// let x = Box::new(41);
1603 /// let static_ref: &'static mut usize = Box::leak(x);
1604 /// *static_ref += 1;
1605 /// assert_eq!(*static_ref, 42);
1606 /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1607 /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1608 /// # drop(unsafe { Box::from_raw(static_ref) });
1609 /// ```
1610 ///
1611 /// Unsized data:
1612 ///
1613 /// ```
1614 /// let x = vec![1, 2, 3].into_boxed_slice();
1615 /// let static_ref = Box::leak(x);
1616 /// static_ref[0] = 4;
1617 /// assert_eq!(*static_ref, [4, 2, 3]);
1618 /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1619 /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1620 /// # drop(unsafe { Box::from_raw(static_ref) });
1621 /// ```
1622 #[stable(feature = "box_leak", since = "1.26.0")]
1623 #[inline]
1624 pub fn leak<'a>(b: Self) -> &'a mut T
1625 where
1626 A: 'a,
1627 {
1628 let (ptr, alloc) = Box::into_raw_with_allocator(b);
1629 mem::forget(alloc);
1630 unsafe { &mut *ptr }
1631 }
1632
1633 /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1634 /// `*boxed` will be pinned in memory and unable to be moved.
1635 ///
1636 /// This conversion does not allocate on the heap and happens in place.
1637 ///
1638 /// This is also available via [`From`].
1639 ///
1640 /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1641 /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1642 /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1643 /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
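///
/// # Examples
///
/// A minimal sketch of pinning a `Box` you already have:
///
/// ```
/// use std::pin::Pin;
///
/// let boxed: Box<u8> = Box::new(5);
/// let pinned: Pin<Box<u8>> = Box::into_pin(boxed);
/// assert_eq!(*pinned, 5);
/// ```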
1644 ///
1645 /// # Notes
1646 ///
1647 /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1648 /// as it'll introduce an ambiguity when calling `Pin::from`.
1649 /// A demonstration of such a poor impl is shown below.
1650 ///
1651 /// ```compile_fail
1652 /// # use std::pin::Pin;
1653 /// struct Foo; // A type defined in this crate.
1654 /// impl From<Box<()>> for Pin<Foo> {
1655 /// fn from(_: Box<()>) -> Pin<Foo> {
1656 /// Pin::new(Foo)
1657 /// }
1658 /// }
1659 ///
1660 /// let foo = Box::new(());
1661 /// let bar = Pin::from(foo);
1662 /// ```
1663 #[stable(feature = "box_into_pin", since = "1.63.0")]
1664 pub fn into_pin(boxed: Self) -> Pin<Self>
1665 where
1666 A: 'static,
1667 {
1668 // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1669 // when `T: !Unpin`, so it's safe to pin it directly without any
1670 // additional requirements.
1671 unsafe { Pin::new_unchecked(boxed) }
1672 }
1673}
1674
1675#[stable(feature = "rust1", since = "1.0.0")]
1676unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1677 #[inline]
1678 fn drop(&mut self) {
1679 // the T in the Box is dropped by the compiler before the destructor is run
1680
1681 let ptr = self.0;
1682
1683 unsafe {
1684 let layout = Layout::for_value_raw(ptr.as_ptr());
1685 if layout.size() != 0 {
1686 self.1.deallocate(From::from(ptr.cast()), layout);
1687 }
1688 }
1689 }
1690}
1691
1692#[cfg(not(no_global_oom_handling))]
1693#[stable(feature = "rust1", since = "1.0.0")]
1694impl<T: Default> Default for Box<T> {
1695 /// Creates a `Box<T>` with the `Default` value for `T`.
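///
/// # Examples
///
/// For example, the boxed default of an integer type is zero:
///
/// ```
/// let b: Box<i32> = Box::default();
/// assert_eq!(*b, 0);
/// ```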
1696 #[inline]
1697 fn default() -> Self {
1698 let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1699 unsafe {
1700 // SAFETY: `x` is valid for writing and has the same layout as `T`.
1701 // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1702 // does not have a destructor.
1703 //
1704 // We use `ptr::write` as `MaybeUninit::write` creates
1705 // extra stack copies of `T` in debug mode.
1706 //
1707 // See https://github.com/rust-lang/rust/issues/136043 for more context.
1708 ptr::write(&raw mut *x as *mut T, T::default());
1709 // SAFETY: `x` was just initialized above.
1710 x.assume_init()
1711 }
1712 }
1713}
1714
1715#[cfg(not(no_global_oom_handling))]
1716#[stable(feature = "rust1", since = "1.0.0")]
1717impl<T> Default for Box<[T]> {
1718 /// Creates an empty `[T]` inside a `Box`.
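///
/// # Examples
///
/// For example, the default boxed slice is empty:
///
/// ```
/// let b: Box<[i32]> = Box::default();
/// assert!(b.is_empty());
/// ```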
1719 #[inline]
1720 fn default() -> Self {
1721 let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1722 Box(ptr, Global)
1723 }
1724}
1725
1726#[cfg(not(no_global_oom_handling))]
1727#[stable(feature = "default_box_extra", since = "1.17.0")]
1728impl Default for Box<str> {
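/// Creates an empty `str` inside a `Box`.
///
/// For example:
///
/// ```
/// let s: Box<str> = Box::default();
/// assert!(s.is_empty());
/// ```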
1729 #[inline]
1730 fn default() -> Self {
1731 // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1732 let ptr: Unique<str> = unsafe {
1733 let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1734 Unique::new_unchecked(bytes.as_ptr() as *mut str)
1735 };
1736 Box(ptr, Global)
1737 }
1738}
1739
1740#[cfg(not(no_global_oom_handling))]
1741#[stable(feature = "pin_default_impls", since = "1.91.0")]
1742impl<T> Default for Pin<Box<T>>
1743where
1744 T: ?Sized,
1745 Box<T>: Default,
1746{
1747 #[inline]
1748 fn default() -> Self {
1749 Box::into_pin(Box::<T>::default())
1750 }
1751}
1752
1753#[cfg(not(no_global_oom_handling))]
1754#[stable(feature = "rust1", since = "1.0.0")]
1755impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1756 /// Returns a new box with a `clone()` of this box's contents.
1757 ///
1758 /// # Examples
1759 ///
1760 /// ```
1761 /// let x = Box::new(5);
1762 /// let y = x.clone();
1763 ///
1764 /// // The value is the same
1765 /// assert_eq!(x, y);
1766 ///
1767 /// // But they are unique objects
1768 /// assert_ne!(&*x as *const i32, &*y as *const i32);
1769 /// ```
1770 #[inline]
1771 fn clone(&self) -> Self {
1772 // Pre-allocate memory to allow writing the cloned value directly.
1773 let mut boxed = Self::new_uninit_in(self.1.clone());
1774 unsafe {
1775 (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1776 boxed.assume_init()
1777 }
1778 }
1779
1780 /// Copies `source`'s contents into `self` without creating a new allocation.
1781 ///
1782 /// # Examples
1783 ///
1784 /// ```
1785 /// let x = Box::new(5);
1786 /// let mut y = Box::new(10);
1787 /// let yp: *const i32 = &*y;
1788 ///
1789 /// y.clone_from(&x);
1790 ///
1791 /// // The value is the same
1792 /// assert_eq!(x, y);
1793 ///
1794 /// // And no allocation occurred
1795 /// assert_eq!(yp, &*y);
1796 /// ```
1797 #[inline]
1798 fn clone_from(&mut self, source: &Self) {
1799 (**self).clone_from(&(**source));
1800 }
1801}
1802
1803#[cfg(not(no_global_oom_handling))]
1804#[stable(feature = "box_slice_clone", since = "1.3.0")]
1805impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
1806 fn clone(&self) -> Self {
1807 let alloc = Box::allocator(self).clone();
1808 self.to_vec_in(alloc).into_boxed_slice()
1809 }
1810
1811 /// Copies `source`'s contents into `self` without creating a new allocation,
1812 /// so long as the two are of the same length.
1813 ///
1814 /// # Examples
1815 ///
1816 /// ```
1817 /// let x = Box::new([5, 6, 7]);
1818 /// let mut y = Box::new([8, 9, 10]);
1819 /// let yp: *const [i32] = &*y;
1820 ///
1821 /// y.clone_from(&x);
1822 ///
1823 /// // The value is the same
1824 /// assert_eq!(x, y);
1825 ///
1826 /// // And no allocation occurred
1827 /// assert_eq!(yp, &*y);
1828 /// ```
1829 fn clone_from(&mut self, source: &Self) {
1830 if self.len() == source.len() {
1831 self.clone_from_slice(source);
1832 } else {
1833 *self = source.clone();
1834 }
1835 }
1836}
1837
1838#[cfg(not(no_global_oom_handling))]
1839#[stable(feature = "box_slice_clone", since = "1.3.0")]
1840impl Clone for Box<str> {
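/// Returns a new boxed `str` with a copy of this box's contents.
///
/// For example:
///
/// ```
/// let s: Box<str> = "hello".into();
/// let t = s.clone();
/// assert_eq!(s, t);
/// // The clone owns its own copy of the bytes.
/// assert_ne!(s.as_ptr(), t.as_ptr());
/// ```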
1841 fn clone(&self) -> Self {
1842 // this makes a copy of the data
1843 let buf: Box<[u8]> = self.as_bytes().into();
1844 unsafe { from_boxed_utf8_unchecked(buf) }
1845 }
1846}
1847
1848#[stable(feature = "rust1", since = "1.0.0")]
1849impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
1850 #[inline]
1851 fn eq(&self, other: &Self) -> bool {
1852 PartialEq::eq(&**self, &**other)
1853 }
1854 #[inline]
1855 fn ne(&self, other: &Self) -> bool {
1856 PartialEq::ne(&**self, &**other)
1857 }
1858}
1859
1860#[stable(feature = "rust1", since = "1.0.0")]
1861impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
1862 #[inline]
1863 fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
1864 PartialOrd::partial_cmp(&**self, &**other)
1865 }
1866 #[inline]
1867 fn lt(&self, other: &Self) -> bool {
1868 PartialOrd::lt(&**self, &**other)
1869 }
1870 #[inline]
1871 fn le(&self, other: &Self) -> bool {
1872 PartialOrd::le(&**self, &**other)
1873 }
1874 #[inline]
1875 fn ge(&self, other: &Self) -> bool {
1876 PartialOrd::ge(&**self, &**other)
1877 }
1878 #[inline]
1879 fn gt(&self, other: &Self) -> bool {
1880 PartialOrd::gt(&**self, &**other)
1881 }
1882}
1883
1884#[stable(feature = "rust1", since = "1.0.0")]
1885impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
1886 #[inline]
1887 fn cmp(&self, other: &Self) -> Ordering {
1888 Ord::cmp(&**self, &**other)
1889 }
1890}
1891
1892#[stable(feature = "rust1", since = "1.0.0")]
1893impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
1894
1895#[stable(feature = "rust1", since = "1.0.0")]
1896impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
1897 fn hash<H: Hasher>(&self, state: &mut H) {
1898 (**self).hash(state);
1899 }
1900}
1901
1902#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
1903impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
1904 fn finish(&self) -> u64 {
1905 (**self).finish()
1906 }
1907 fn write(&mut self, bytes: &[u8]) {
1908 (**self).write(bytes)
1909 }
1910 fn write_u8(&mut self, i: u8) {
1911 (**self).write_u8(i)
1912 }
1913 fn write_u16(&mut self, i: u16) {
1914 (**self).write_u16(i)
1915 }
1916 fn write_u32(&mut self, i: u32) {
1917 (**self).write_u32(i)
1918 }
1919 fn write_u64(&mut self, i: u64) {
1920 (**self).write_u64(i)
1921 }
1922 fn write_u128(&mut self, i: u128) {
1923 (**self).write_u128(i)
1924 }
1925 fn write_usize(&mut self, i: usize) {
1926 (**self).write_usize(i)
1927 }
1928 fn write_i8(&mut self, i: i8) {
1929 (**self).write_i8(i)
1930 }
1931 fn write_i16(&mut self, i: i16) {
1932 (**self).write_i16(i)
1933 }
1934 fn write_i32(&mut self, i: i32) {
1935 (**self).write_i32(i)
1936 }
1937 fn write_i64(&mut self, i: i64) {
1938 (**self).write_i64(i)
1939 }
1940 fn write_i128(&mut self, i: i128) {
1941 (**self).write_i128(i)
1942 }
1943 fn write_isize(&mut self, i: isize) {
1944 (**self).write_isize(i)
1945 }
1946 fn write_length_prefix(&mut self, len: usize) {
1947 (**self).write_length_prefix(len)
1948 }
1949 fn write_str(&mut self, s: &str) {
1950 (**self).write_str(s)
1951 }
1952}
1953
1954#[stable(feature = "rust1", since = "1.0.0")]
1955impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
1956 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1957 fmt::Display::fmt(&**self, f)
1958 }
1959}
1960
1961#[stable(feature = "rust1", since = "1.0.0")]
1962impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
1963 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1964 fmt::Debug::fmt(&**self, f)
1965 }
1966}
1967
1968#[stable(feature = "rust1", since = "1.0.0")]
1969impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
1970 fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1971 // It's not possible to extract the inner Unique directly from the Box;
1972 // instead we cast it to a `*const`, which aliases the Unique.
1973 let ptr: *const T = &**self;
1974 fmt::Pointer::fmt(&ptr, f)
1975 }
1976}
1977
1978#[stable(feature = "rust1", since = "1.0.0")]
1979impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
1980 type Target = T;
1981
1982 fn deref(&self) -> &T {
1983 &**self
1984 }
1985}
1986
1987#[stable(feature = "rust1", since = "1.0.0")]
1988impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
1989 fn deref_mut(&mut self) -> &mut T {
1990 &mut **self
1991 }
1992}
1993
1994#[unstable(feature = "deref_pure_trait", issue = "87121")]
1995unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
1996
1997#[unstable(feature = "legacy_receiver_trait", issue = "none")]
1998impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
1999
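/// A boxed callable forwards calls to its contents, so trait objects such as
/// `Box<dyn FnOnce()>` can be invoked directly; calling consumes the box, matching
/// `FnOnce::call_once`. A minimal sketch:
///
/// ```
/// let f: Box<dyn FnOnce() -> i32> = Box::new(|| 42);
/// assert_eq!(f(), 42);
/// ```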
2000#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2001impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
2002 type Output = <F as FnOnce<Args>>::Output;
2003
2004 extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
2005 <F as FnOnce<Args>>::call_once(*self, args)
2006 }
2007}
2008
2009#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2010impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
2011 extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
2012 <F as FnMut<Args>>::call_mut(self, args)
2013 }
2014}
2015
2016#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2017impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
2018 extern "rust-call" fn call(&self, args: Args) -> Self::Output {
2019 <F as Fn<Args>>::call(self, args)
2020 }
2021}
2022
2023#[stable(feature = "async_closure", since = "1.85.0")]
2024impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
2025 type Output = F::Output;
2026 type CallOnceFuture = F::CallOnceFuture;
2027
2028 extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
2029 F::async_call_once(*self, args)
2030 }
2031}
2032
2033#[stable(feature = "async_closure", since = "1.85.0")]
2034impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2035 type CallRefFuture<'a>
2036 = F::CallRefFuture<'a>
2037 where
2038 Self: 'a;
2039
2040 extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2041 F::async_call_mut(self, args)
2042 }
2043}
2044
2045#[stable(feature = "async_closure", since = "1.85.0")]
2046impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2047 extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2048 F::async_call(self, args)
2049 }
2050}
2051
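/// This impl enables unsizing coercions on boxes, e.g. from `Box<T>` to
/// `Box<dyn Trait>` when `T: Trait`. A minimal sketch:
///
/// ```
/// use std::fmt::Debug;
///
/// let concrete: Box<i32> = Box::new(5);
/// // Coerce the box into a boxed trait object.
/// let object: Box<dyn Debug> = concrete;
/// assert_eq!(format!("{object:?}"), "5");
/// ```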
2052#[unstable(feature = "coerce_unsized", issue = "18598")]
2053impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2054
2055#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2056unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2057
2058// It is quite crucial that we only allow the `Global` allocator here.
2059// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2060// would need a lot of codegen and interpreter adjustments.
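/// Among other things, this impl is what makes methods taking `self: Box<Self>`
/// dispatchable on trait objects. A minimal sketch:
///
/// ```
/// trait Greet {
///     fn greet(self: Box<Self>) -> String;
/// }
///
/// struct World;
///
/// impl Greet for World {
///     fn greet(self: Box<Self>) -> String {
///         String::from("hello world")
///     }
/// }
///
/// let b: Box<dyn Greet> = Box::new(World);
/// assert_eq!(b.greet(), "hello world");
/// ```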
2061#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2062impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2063
2064#[stable(feature = "box_borrow", since = "1.1.0")]
2065impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2066 fn borrow(&self) -> &T {
2067 &**self
2068 }
2069}
2070
2071#[stable(feature = "box_borrow", since = "1.1.0")]
2072impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2073 fn borrow_mut(&mut self) -> &mut T {
2074 &mut **self
2075 }
2076}
2077
2078#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2079impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2080 fn as_ref(&self) -> &T {
2081 &**self
2082 }
2083}
2084
2085#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2086impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2087 fn as_mut(&mut self) -> &mut T {
2088 &mut **self
2089 }
2090}
2091
2092/* Nota bene
2093 *
2094 * We could have chosen not to add this impl, and instead have written a
2095 * function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2096 * because Box<T> implements Unpin even when T does not, as a result of
2097 * this impl.
2098 *
2099 * We chose this API instead of the alternative for a few reasons:
2100 * - Logically, it is helpful to understand pinning in regard to the
2101 * memory region being pointed to. For this reason none of the
2102 * standard library pointer types support projecting through a pin
2103 * (Box<T> is the only pointer type in std for which this would be
2104 * safe.)
2105 * - It is in practice very useful to have Box<T> be unconditionally
2106 * Unpin because of trait objects, for which the structural auto
2107 * trait functionality does not apply (e.g., Box<dyn Foo> would
2108 * otherwise not be Unpin).
2109 *
2110 * Another type with the same semantics as Box but only a conditional
2111 * implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2112 * could have a method to project a Pin<T> from it.
2113 */
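/// As the note above explains, `Box<T, A>` is `Unpin` for every `T`, even when `T`
/// itself is not. A minimal demonstration:
///
/// ```
/// use std::marker::PhantomPinned;
///
/// fn assert_unpin<T: Unpin>(_: &T) {}
///
/// // `PhantomPinned` is `!Unpin`, yet the box around it is `Unpin`.
/// let b = Box::new(PhantomPinned);
/// assert_unpin(&b);
/// ```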
2114#[stable(feature = "pin", since = "1.33.0")]
2115impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2116
2117#[unstable(feature = "coroutine_trait", issue = "43122")]
2118impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2119 type Yield = G::Yield;
2120 type Return = G::Return;
2121
2122 fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2123 G::resume(Pin::new(&mut *self), arg)
2124 }
2125}
2126
2127#[unstable(feature = "coroutine_trait", issue = "43122")]
2128impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2129where
2130 A: 'static,
2131{
2132 type Yield = G::Yield;
2133 type Return = G::Return;
2134
2135 fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2136 G::resume((*self).as_mut(), arg)
2137 }
2138}
2139
2140#[stable(feature = "futures_api", since = "1.36.0")]
2141impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2142 type Output = F::Output;
2143
2144 fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2145 F::poll(Pin::new(&mut *self), cx)
2146 }
2147}
2148
2149#[stable(feature = "box_error", since = "1.8.0")]
2150impl<E: Error> Error for Box<E> {
2151 #[allow(deprecated)]
2152 fn cause(&self) -> Option<&dyn Error> {
2153 Error::cause(&**self)
2154 }
2155
2156 fn source(&self) -> Option<&(dyn Error + 'static)> {
2157 Error::source(&**self)
2158 }
2159
2160 fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2161 Error::provide(&**self, request);
2162 }
2163}