don't inhibit random field reordering on repr(packed(1)) #125360

Merged 1 commit on May 29, 2024
10 changes: 6 additions & 4 deletions compiler/rustc_abi/src/layout.rs
@@ -970,7 +970,7 @@ fn univariant<
     let mut align = if pack.is_some() { dl.i8_align } else { dl.aggregate_align };
     let mut max_repr_align = repr.align;
     let mut inverse_memory_index: IndexVec<u32, FieldIdx> = fields.indices().collect();
-    let optimize = !repr.inhibit_struct_field_reordering_opt();
+    let optimize = !repr.inhibit_struct_field_reordering();
     if optimize && fields.len() > 1 {
         let end = if let StructKind::MaybeUnsized = kind { fields.len() - 1 } else { fields.len() };
         let optimizing = &mut inverse_memory_index.raw[..end];
@@ -1007,13 +1007,15 @@ fn univariant<
         // Calculates a sort key to group fields by their alignment or possibly some
         // size-derived pseudo-alignment.
         let alignment_group_key = |layout: &F| {
+            // The two branches here return values that cannot be meaningfully compared with
+            // each other. However, we know that consistently for all executions of
+            // `alignment_group_key`, one or the other branch will be taken, so this is okay.
             if let Some(pack) = pack {
                 // Return the packed alignment in bytes.
                 layout.align.abi.min(pack).bytes()
             } else {
-                // Returns `log2(effective-align)`. This is ok since `pack` applies to all
-                // fields equally. The calculation assumes that size is an integer multiple of
-                // align, except for ZSTs.
+                // Returns `log2(effective-align)`. The calculation assumes that size is an
+                // integer multiple of align, except for ZSTs.
                 let align = layout.align.abi.bytes();
                 let size = layout.size.bytes();
                 let niche_size = layout.largest_niche.map(|n| n.available(dl)).unwrap_or(0);
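To make the grouping above easier to follow, here is a minimal standalone sketch — not the compiler's code; the function name and the simplified key are assumptions, and it ignores the size/niche refinements the real `univariant` code applies — of how such an alignment-group key can drive field reordering:

```rust
use std::cmp::Reverse;

// Hypothetical simplification of the sort key described above: with `packed(N)` the key
// is the alignment capped at N bytes; otherwise it is log2(align). For any one struct,
// only one of the two branches is ever taken, so the keys stay mutually comparable.
fn alignment_group_key(align_bytes: u64, pack_bytes: Option<u64>) -> u64 {
    match pack_bytes {
        Some(pack) => align_bytes.min(pack),
        // Alignments are powers of two, so trailing_zeros == log2.
        None => u64::from(align_bytes.trailing_zeros()),
    }
}

fn main() {
    // Fields with alignments 1, 4, 2 (e.g. u8, u32, u16) and no `packed` attribute.
    let mut fields = [("a", 1u64), ("b", 4), ("c", 2)];
    // Put larger alignment groups first, mirroring the reordering optimization.
    fields.sort_by_key(|&(_, align)| Reverse(alignment_group_key(align, None)));
    assert_eq!(fields.map(|(name, _)| name), ["b", "c", "a"]);
}
```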
15 changes: 4 additions & 11 deletions compiler/rustc_abi/src/lib.rs
@@ -137,23 +137,16 @@ impl ReprOptions
         self.c() || self.int.is_some()
     }

-    /// Returns `true` if this `#[repr()]` should inhibit struct field reordering
-    /// optimizations, such as with `repr(C)`, `repr(packed(1))`, or `repr(<int>)`.
-    pub fn inhibit_struct_field_reordering_opt(&self) -> bool {
-        if let Some(pack) = self.pack {
-            if pack.bytes() == 1 {
-                return true;
-            }
-        }
-
+    /// Returns `true` if this `#[repr()]` guarantees a fixed field order,
+    /// e.g. `repr(C)` or `repr(<int>)`.
+    pub fn inhibit_struct_field_reordering(&self) -> bool {
         self.flags.intersects(ReprFlags::IS_UNOPTIMISABLE) || self.int.is_some()
     }

     /// Returns `true` if this type is valid for reordering and `-Z randomize-layout`
     /// was enabled for its declaration crate.
     pub fn can_randomize_type_layout(&self) -> bool {
-        !self.inhibit_struct_field_reordering_opt()
-            && self.flags.contains(ReprFlags::RANDOMIZE_LAYOUT)
+        !self.inhibit_struct_field_reordering() && self.flags.contains(ReprFlags::RANDOMIZE_LAYOUT)
@RalfJung (Member, Author) commented on May 21, 2024:
This is still a bit silly, the only place where can_randomize_type_layout is called is effectively inside an if !self.inhibit_struct_field_reordering()... but I didn't want to do a larger refactoring here, not sure what even would be the better structure.

     }

     /// Returns `true` if this `#[repr()]` should inhibit union ABI optimisations.
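For context on what the renamed predicate now means for users (a rough illustration, not part of this PR): only `repr(C)` and `repr(<int>)` pin the declaration order, while the default repr — and, after this change, `repr(packed(1))` as well — may reorder fields, e.g. to eliminate padding. A small sketch of that effect:

```rust
use std::mem::size_of;

// Declaration order would force padding around `b`; the default repr is free to
// reorder fields and shrink the type.
struct Reorderable {
    a: u8,
    b: u32,
    c: u8,
}

// `repr(C)` guarantees declaration order, so the padding stays.
#[repr(C)]
struct Pinned {
    a: u8,
    b: u32,
    c: u8,
}

fn main() {
    assert_eq!(size_of::<Pinned>(), 12); // 1 + 3 (pad) + 4 + 1 + 3 (pad)
    // In practice the reorderable layout is at most as large (typically 8 bytes).
    assert!(size_of::<Reorderable>() <= size_of::<Pinned>());
}
```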
@@ -278,7 +278,7 @@ fn reduce_ty<'tcx>(cx: &LateContext<'tcx>, mut ty: Ty<'tcx>) -> ReducedTy<'tcx>
                 ty = sized_ty;
                 continue;
             }
-            if def.repr().inhibit_struct_field_reordering_opt() {
+            if def.repr().inhibit_struct_field_reordering() {
@RalfJung (Member, Author) commented:
FWIW clippy already used to assume inhibit_struct_field_reordering_opt means that the field order is guaranteed, which was not the case before this PR.

                 ReducedTy::OrderedFields(Some(sized_ty))
             } else {
                 ReducedTy::UnorderedFields(ty)
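A rough user-level illustration (not clippy's own test case) of why the lint treats `repr(C)` fields as ordered: the guaranteed field order makes a transmute to an equivalently laid-out type predictable, which is exactly what cannot be assumed for reorderable default-repr structs.

```rust
// `repr(C)`: `a` is at offset 0 and `b` at offset 2, with no padding, so the layout
// matches `[u16; 2]` exactly.
#[repr(C)]
struct Ordered {
    a: u16,
    b: u16,
}

fn main() {
    let x = Ordered { a: 1, b: 2 };
    // Same size and alignment, and the field order is guaranteed, so the result is
    // well-defined. With the default repr the order could differ.
    let words: [u16; 2] = unsafe { std::mem::transmute(x) };
    assert_eq!(words, [1, 2]);
}
```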
2 changes: 1 addition & 1 deletion src/tools/miri/tests/fail/reading_half_a_pointer.rs
@@ -1,7 +1,7 @@
 #![allow(dead_code)]

 // We use packed structs to get around alignment restrictions
-#[repr(packed)]
+#[repr(C, packed)]
 struct Data {
     pad: u8,
     ptr: &'static i32,
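As a reminder of what the test relies on (a sketch assuming a recent Rust with `offset_of!`, not the miri test itself): packing drops the field alignment requirement to 1 so `ptr` lands at byte offset 1, and adding `repr(C)` is what keeps it after `pad` now that plain `packed` no longer guarantees field order.

```rust
use std::mem::offset_of;

#[repr(C, packed)]
struct Data {
    pad: u8,
    ptr: &'static i32,
}

static VALUE: i32 = 42;

fn main() {
    let d = Data { pad: 0, ptr: &VALUE };
    // `repr(C)` pins `ptr` right after `pad`; `packed` lets it sit at an odd offset.
    assert_eq!(offset_of!(Data, ptr), 1);
    // Copying the field out by value is fine even though it is stored misaligned.
    let p = d.ptr;
    assert_eq!(*p, 42);
}
```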
@@ -7,7 +7,7 @@ pub struct Aligned {
     _pad: [u8; 11],
     packed: Packed,
 }
-#[repr(packed)]
+#[repr(C, packed)]
 #[derive(Default, Copy, Clone)]
 pub struct Packed {
     _pad: [u8; 5],
2 changes: 1 addition & 1 deletion tests/mir-opt/const_allocation3.rs
@@ -7,7 +7,7 @@ fn main() {
     FOO;
 }

-#[repr(packed)]
+#[repr(C, packed)]
 struct Packed {
     a: [u8; 28],
     b: &'static i32,