Function to return any eigenvector #49885

Open

ParadaCarleton opened this issue May 18, 2023 · 10 comments

@ParadaCarleton
Contributor

Sometimes it's useful to compute just a single eigenvector (e.g. by the power method) and any eigenvector will do; for instance, I have an irreducible Markov chain, where the eigenvector is unique. I couldn't find any functions for this; should we add one to LinearAlgebra?
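For concreteness, here is a minimal sketch of the workaround I have in mind today, assuming a column-stochastic transition matrix `P` (so the target eigenvalue is 1). The full eigendecomposition is computed just to extract one vector:

```julia
using LinearAlgebra

# Column-stochastic transition matrix of a small irreducible Markov chain.
P = [0.9 0.2;
     0.1 0.8]

# Current workaround: compute the *full* eigendecomposition even though only
# the eigenvector associated with the eigenvalue 1 is wanted.
F = eigen(P)
i = argmin(abs.(F.values .- 1))   # index of the eigenvalue closest to 1
v = real.(F.vectors[:, i])
v ./= sum(v)                      # normalize to a probability vector
```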

@oscardssmith
Member

See #49487, which was my attempt to write something similar for making opnorm2 faster.

@cafaxo
Contributor

cafaxo commented May 19, 2023

Packages that offer partial eigendecompositions:

I am not sure that such iterative methods fit into LinearAlgebra when we already have specialized packages for this.

@ParadaCarleton
Contributor Author

Packages that offer partial eigendecompositions:

I am not sure that such iterative methods fit into LinearAlgebra when we already have specialized packages for this.

Definitely true for some of these! Fitting the entirety of KrylovKit.jl into LinearAlgebra (including the GPUArrays dependency) would certainly not be the best idea 😅

But in some cases, I think they'd fit in perfectly. The power method seems basic, small, and well-established enough that I think it'd be a good fit for LinearAlgebra; I was actually very surprised to learn it isn't included.

@stevengj
Member

stevengj commented May 23, 2023

The power method seems basic, small, and well-established

Also mostly obsolete. Krylov methods are generally strictly better, even with restarting.

@stevengj stevengj added the domain:linear algebra Linear algebra label May 23, 2023
@ParadaCarleton
Contributor Author

The power method seems basic, small, and well-established

Also mostly obsolete. Krylov methods are generally strictly better, even with restarting.

Huh, that's surprising, I thought the power method was still standard for the largest eigenvalue problem.

@aravindh-krishnamoorthy
Contributor

Sometimes it's useful to compute just a single eigenvector (e.g. by the power method), and any eigenvector will do, e.g. I have an irreducible Markov chain, where the eigenvector is unique. I couldn't find any functions for this; should we add one to LinearAlgebra?

If your matrix is symmetric, kindly take a look at eigen(A::RealHermSymComplexHerm, irange::UnitRange) in symmetriceigen.jl, which can optionally return a single eigenvalue and associated eigenvector. It uses LAPACK's XSYEVR, which is quite efficient.
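A rough usage sketch of that interface for the largest eigenpair (assuming a real symmetric matrix; eigenvalues are returned in ascending order, so the last index gives the largest):

```julia
using LinearAlgebra

A = Symmetric(randn(100, 100))
n = size(A, 1)

# Request only the largest eigenvalue and its eigenvector via the irange method.
F = eigen(A, n:n)
λmax = F.values[1]
vmax = F.vectors[:, 1]
```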

Huh, that's surprising, I thought the power method was still standard for the largest eigenvalue problem.

If deemed sufficient, a basic version of the power algorithm can be easily implemented by users themselves. Please take a look at the code in #49487 (mentioned in the earlier comments) for an example implementation. I feel it might be a great idea to add this algorithm to the Julia documentation.
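For reference, a minimal power-iteration sketch along those lines (not a robust implementation; it assumes the dominant eigenvalue is real, simple, and well separated):

```julia
using LinearAlgebra

# Basic power iteration: approximate dominant eigenpair (λ, x) of A.
function poweriter(A; maxiter = 1_000, tol = 1e-10)
    x = normalize!(randn(size(A, 1)))
    λ = zero(float(eltype(A)))
    for _ in 1:maxiter
        y = A * x
        λ = dot(x, y)                  # Rayleigh-quotient estimate
        normalize!(y)
        converged = norm(y - x) < tol || norm(y + x) < tol  # allow sign flips
        x = y
        converged && break
    end
    return λ, x
end
```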

@ParadaCarleton
Contributor Author

ParadaCarleton commented Jun 9, 2023

If deemed sufficient, a basic version of the power algorithm can be easily implemented by users themselves.

Definitely can be, but didn't @stevengj just say it's obsolete? The largest eigenvalue problem is common enough that I'd think LinearAlgebra should include it, but I'm not sure what algorithm it should use.

@aravindh-krishnamoorthy
Contributor

aravindh-krishnamoorthy commented Jun 19, 2023

If deemed sufficient, a basic version of the power algorithm can be easily implemented by users themselves.

Definitely can be, but didn't @stevengj just say it's obsolete?

It's one of the first algorithms taught in numerical linear algebra classes. The advantages and disadvantages are well known; please see, e.g., the Wikipedia page and references therein. The method works well when the dominant eigenvalue is simple and well separated from the rest; the error decays roughly like |λ₂/λ₁|^k, so a small gap means very slow convergence. Since it is only beneficial for a relatively narrow class of matrices and non-convergent (or very slowly convergent) for others, it may not be suitable for the stdlib, especially when several better alternatives targeting specialised applications are available. In my experience, it is hardly ever used in serious work.

Edit: R's matlib package, which is aimed at teaching, has a slightly modified version with scaling. It may be worth looking at if there's sufficient interest, but this is probably better done in a specialised package than in the stdlib.

@ParadaCarleton
Contributor Author

ParadaCarleton commented Jun 20, 2023

Yeah, if it's only really useful for educational purposes, it should go in a package. But the largest-eigenvalue problem is common enough that I'd guess there should be a good default option in LinearAlgebra, or people might just reach for the first algorithm they know (usually power iteration).

@stevengj
Member

stevengj commented Jun 20, 2023

We used to have an eigs function in base (to return extremal eigenvalues/vectors), but it was moved to a package: #27616

I don't really see us reversing this decision and putting Krylov iterative methods back into Base. There are now several good packages for this. The algorithms are qualitatively rather different from the dense linear algebra in LinearAlgebra, or even from the sparse-direct methods in SparseArrays.
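For anyone landing here: with one of those packages the call is short. A sketch with KrylovKit.jl (the eigsolve signature below reflects my understanding of its API):

```julia
using KrylovKit

A = randn(1000, 1000)

# Largest-magnitude eigenvalue and eigenvector via an Arnoldi/Krylov method.
vals, vecs, info = eigsolve(A, 1, :LM)
λ, x = vals[1], vecs[1]
```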
