
Segmentation fault on Julia master #322

Closed
giordano opened this issue Oct 17, 2017 · 4 comments
@giordano commented Oct 17, 2017

I'm not sure whether this is a bug in StaticArrays.jl or actually a bug in Julia itself, but since it involves only StaticArrays, let me try reporting it here first :-)

This code on Julia master causes a segmentation fault:

julia> using StaticArrays

julia> sin.((SMatrix{3,2}(1.0, 2.0, 3.0, 4.0, 5.0, 6.0) * SVector(1.0, 2.0)))
3-element SVector{3,Float64}:
  0.4121184852417566
 -0.5365729180004349
  0.6502878401571168

julia> SVector(1, -1, -4) - [1, -1, -4] # ← segfault happens here

signal (11): Segmentation fault
in expression starting at no file:0
_ZNK4llvm4Type9isEmptyTyEv at /home/mose/repo/julia/usr/bin/../lib/libLLVM-3.9.so (unknown line)
type_is_ghost at /home/mose/repo/julia/src/codegen.cpp:271 [inlined]
emit_bits_compare at /home/mose/repo/julia/src/codegen.cpp:2156
emit_f_is at /home/mose/repo/julia/src/codegen.cpp:2223 [inlined]
emit_builtin_call at /home/mose/repo/julia/src/codegen.cpp:2288
emit_call at /home/mose/repo/julia/src/codegen.cpp:3053
emit_expr at /home/mose/repo/julia/src/codegen.cpp:3767
emit_condition at /home/mose/repo/julia/src/codegen.cpp:3637 [inlined]
emit_function at /home/mose/repo/julia/src/codegen.cpp:5740
jl_compile_linfo at /home/mose/repo/julia/src/codegen.cpp:1155
jl_compile_for_dispatch at /home/mose/repo/julia/src/gf.c:1704
jl_compile_method_internal at /home/mose/repo/julia/src/julia_internal.h:334 [inlined]
jl_call_method_internal at /home/mose/repo/julia/src/julia_internal.h:381 [inlined]
jl_apply_generic at /home/mose/repo/julia/src/gf.c:2003
do_call at /home/mose/repo/julia/src/interpreter.c:70
eval at /home/mose/repo/julia/src/interpreter.c:262
jl_interpret_toplevel_expr_in at /home/mose/repo/julia/src/interpreter.c:50
jl_toplevel_eval_flex at /home/mose/repo/julia/src/toplevel.c:640
jl_toplevel_eval_in at /home/mose/repo/julia/src/builtins.c:626
eval at ./repl/REPL.jl:3
jl_call_fptr_internal at /home/mose/repo/julia/src/julia_internal.h:366 [inlined]
jl_call_method_internal at /home/mose/repo/julia/src/julia_internal.h:385 [inlined]
jl_apply_generic at /home/mose/repo/julia/src/gf.c:2003
eval_user_input at ./repl/REPL.jl:69
jl_call_fptr_internal at /home/mose/repo/julia/src/julia_internal.h:366 [inlined]
jl_call_method_internal at /home/mose/repo/julia/src/julia_internal.h:385 [inlined]
jl_apply_generic at /home/mose/repo/julia/src/gf.c:2003
macro expansion at ./repl/REPL.jl:100 [inlined]
#1 at ./event.jl:96
jl_call_fptr_internal at /home/mose/repo/julia/src/julia_internal.h:366 [inlined]
jl_call_method_internal at /home/mose/repo/julia/src/julia_internal.h:385 [inlined]
jl_apply_generic at /home/mose/repo/julia/src/gf.c:2003
jl_apply at /home/mose/repo/julia/src/julia.h:1451 [inlined]
start_task at /home/mose/repo/julia/src/task.c:268
unknown function (ip: 0xffffffffffffffff)
Allocations: 6406588 (Pool: 6406196; Big: 392); GC: 13

No problem on Julia 0.6, no problem with regular arrays.
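
A hedged workaround sketch (not part of the original report, and only useful until the upstream fix lands): since the report says regular arrays work fine, converting the SVector to a plain Vector before subtracting keeps both operands as regular arrays and avoids the mixed static/regular code path entirely:

julia> using StaticArrays

julia> Vector(SVector(1, -1, -4)) - [1, -1, -4]  # both operands are now regular arrays
3-element Array{Int64,1}:
 0
 0
 0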

@mschauer (Collaborator)

This is worth reporting (Julia shouldn't segfault), but of course it could ultimately be caused by bad code in this package.

@c42f (Member) commented Oct 18, 2017

The segfault in type_is_ghost looks suspiciously familiar; I'm trying to remember where I've seen this before.

I think there's a reasonable chance this is a bug in julia master; it's been changing fairly rapidly and has been a bit unstable for me lately.

@giordano (Author)

OK, ticket opened at JuliaLang/julia#24193.

[julia master has] been changing fairly rapidly and has been a bit unstable for me lately.

For me too ;-)

@c42f (Member) commented Oct 21, 2017

Looks like this is fixed upstream now.
