
det(I) #23364

Merged (12 commits) on Aug 22, 2017
Conversation

@ChrisRackauckas (Member)

The determinant for the UniformScaling operator was missing.
@@ -167,6 +167,7 @@ end

inv(J::UniformScaling) = UniformScaling(inv(J.λ))
norm(J::UniformScaling, p::Real=2) = abs(J.λ)
det(J::UniformScaling) = 1.0
@musm (Contributor), Aug 20, 2017:
Actually, I don't think this handles the case where λ is not 1 correctly.

@ChrisRackauckas (Member, Author):
yeah that's better.

@ararslan added the "domain:linear algebra" and "needs tests" labels on Aug 20, 2017
@ararslan (Member):

Could also add logdet(::UniformScaling{T}) where {T} = log(one(T))

@ChrisRackauckas (Member, Author):

I made it zero(T) instead.

@@ -167,6 +167,8 @@ end

inv(J::UniformScaling) = UniformScaling(inv(J.λ))
norm(J::UniformScaling, p::Real=2) = abs(J.λ)
det(J::UniformScaling{T}) where {T} = one(T)
@musm (Contributor), Aug 20, 2017:

Right, but det(10*eye(n,n)) == 10^n, not 1.0, so I don't think this will work.
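The objection above is the heart of the issue: a UniformScaling carries no dimension, and det(λ·I_n) = λ^n depends on n. A minimal sketch, using current Julia syntax (Matrix{Float64}(I, n, n) in place of the old eye(n, n)):

```julia
using LinearAlgebra

# det(10 * I_n) depends on the dimension n, which a UniformScaling
# does not know, so no single return value is correct for λ = 10.
for n in 2:4
    @assert det(10 * Matrix{Float64}(I, n, n)) == 10.0^n
end
```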

@ChrisRackauckas (Member, Author):

oh wait, forgot about the lambda. My bad.

@musm (Contributor):

I guess you could check whether λ == 1 and throw otherwise, but that doesn't seem very useful?

@ChrisRackauckas (Member, Author):

Well, it is useful if you don't want to special case it. I opened this PR because I ran into that exact issue:

https://github.com/JuliaDiffEq/DiffEqBase.jl/blob/master/src/problems/dde_problems.jl#L36

@ararslan (Member):

zero(T) isn't right; typeof(log(det(eye(Int, 3, 3)))) is Float64.

@@ -167,6 +167,8 @@ end

inv(J::UniformScaling) = UniformScaling(inv(J.λ))
norm(J::UniformScaling, p::Real=2) = abs(J.λ)
det(J::UniformScaling{T}) where {T} = J.λ==one(T) ? one(T) : error("Determinant of UniformScaling is only well-defined when λ=1")
Member:
ArgumentError would be better than error

@@ -167,6 +167,8 @@ end

inv(J::UniformScaling) = UniformScaling(inv(J.λ))
norm(J::UniformScaling, p::Real=2) = abs(J.λ)
det(J::UniformScaling{T}) where {T} = J.λ==one(T) ? one(T) : ArgumentError("Determinant of UniformScaling is only well-defined when λ=1")
Contributor:

isone(J.λ) ? one(T) : throw(ArgumentError("Determinant of UniformScaling is only well-defined when λ = 1."))

@@ -167,6 +167,8 @@ end

inv(J::UniformScaling) = UniformScaling(inv(J.λ))
norm(J::UniformScaling, p::Real=2) = abs(J.λ)
det(J::UniformScaling{T}) where {T} = isone(J.λ) ? one(T) : throw(ArgumentError("Determinant of UniformScaling is only well-defined when λ=1."))
Member:

Actually, one(T) is wrong as well: det(eye(Int, 3, 3)) is Float64, whereas det(I) will be Int.

Member:

det(eye(Int, 3, 3)) is a Float64 only because it is promoted to perform an LU factorization. In this case we don't need that, so I think one(T) is perfectly fine. We have, for example:

julia> det(Diagonal([1,2,3]))
6

for the same reason; no need to promote for LU.
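The point above can be checked directly; a small sketch in current Julia syntax (where Matrix(I, n, n) replaces eye):

```julia
using LinearAlgebra

# det of a Diagonal is just the product of the diagonal entries,
# so no LU factorization and no promotion to Float64 happens.
@assert det(Diagonal([1, 2, 3])) === 6
# A dense integer matrix goes through LU and therefore promotes:
@assert det([1 0; 0 2]) isa Float64
```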

Contributor:

0I should also be fine?

@@ -167,6 +167,8 @@ end

inv(J::UniformScaling) = UniformScaling(inv(J.λ))
norm(J::UniformScaling, p::Real=2) = abs(J.λ)
det(J::UniformScaling{T}) where {T} = isone(J.λ) ? one(T) : throw(ArgumentError("Determinant of UniformScaling is only well-defined when λ=1."))
logdet(::UniformScaling{T}) where {T} = isone(J.λ) ? log(one(T)) : throw(ArgumentError("Determinant of UniformScaling is only well-defined when λ=1."))
Member:

Missing J for the argument. But we actually do not need this since we have

logdet(A) = log(det(A))
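A sketch of that fallback: LinearAlgebra's generic logdet(A) = log(det(A)) method picks up UniformScaling once det is defined for it, so no dedicated logdet method is needed (assuming the generic method, which current releases provide):

```julia
using LinearAlgebra

# With det(J::UniformScaling) defined, the generic fallback
# logdet(A) = log(det(A)) covers UniformScaling for free.
@assert log(det(1.0I)) == 0.0
@assert logdet(1.0I) == 0.0
```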

@codecov-io:

Codecov Report

Merging #23364 into master will decrease coverage by <.01%.
The diff coverage is 100%.

@@            Coverage Diff             @@
##           master   #23364      +/-   ##
==========================================
- Coverage   54.98%   54.98%   -0.01%     
==========================================
  Files         275      275              
  Lines       51130    51132       +2     
==========================================
- Hits        28115    28114       -1     
- Misses      23015    23018       +3
Impacted Files                 Coverage          Δ
base/twiceprecision.jl         57.79% <100%>     (+0.16%) ⬆️
base/distributed/managers.jl   16.75% <0%>       (-2.24%) ⬇️
base/range.jl                  49.37% <0%>       (-0.13%) ⬇️
base/printf.jl                 60.3%  <0%>       (+0.15%) ⬆️
base/channels.jl               47.89% <0%>       (+0.84%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@giordano (Contributor):

> zero(T) isn't right; typeof(log(det(eye(Int, 3, 3)))) is Float64.

What about zero(float(T))? Anyway, the case pointed out by @ararslan would be a good test to include.

@ChrisRackauckas (Member, Author):

Why should it match eye in type? I don't see why a priori that should be the case. If it can return an Int that's better?

@fredrikekre (Member):

I think the current implementation is OK. There is trailing whitespace (which fails CI), and it would be great to have some test cases:

@test typeof(det(I)) == Int
@test typeof(det(1.0I)) == Float64
@test iszero(det(0I))
@test isone(logdet(I))
Member:

This should be iszero, shouldn't it?

@ChrisRackauckas (Member, Author):

oops!

@fredrikekre (Member) left a review comment:

🇸🇪

@ararslan removed the "needs tests" label on Aug 22, 2017
@@ -43,6 +43,15 @@ end
@test UniformScaling(α)./α == UniformScaling(1.0)
end

@testset "det and logdet" begin
@test isone(det(I))
Member:

Perhaps we should actually change this to @test det(I) === 1 instead. Since isone(I) is true, this will still succeed if det(::UniformScaling) becomes a no-op :)
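To see why === is the stricter check: isone(I) is itself true, so isone(det(I)) would keep passing even if det(::UniformScaling) degenerated into a no-op returning I. A sketch for the Float64 case (the element type of bare I has varied across Julia versions):

```julia
using LinearAlgebra

@assert isone(I)            # I satisfies isone without det being involved
@assert det(1.0I) === 1.0   # === pins both the value and the Float64 type
@assert det(1.0I) !== 1     # ... and rules out an Int (or no-op) result
```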

@andreasnoack andreasnoack merged commit bc66f02 into JuliaLang:master Aug 22, 2017
@ChrisRackauckas ChrisRackauckas deleted the patch-7 branch August 22, 2017 14:09
aviatesk added a commit that referenced this pull request Jun 6, 2024
When an input line like `[import|using] AAA: xxx` is entered, if `AAA`
is already loaded in the REPL session, it will now autocomplete to the
names available in `AAA` that start with `xxx`.
For example, it will autocomplete `using Base.Experimental: @ov|` to
`using Base.Experimental: @overlay`.

- fixes #23364