
promote input data type to float64 #41

Open
floswald opened this issue Jul 3, 2019 · 1 comment
floswald (Contributor) commented Jul 3, 2019

julia> using GaussianMixtures, RDatasets
julia> f = dataset("datasets","faithful")
272×2 DataFrame
│ Row │ Eruptions │ Waiting │
│     │ Float64   │ Int64   │
├─────┼───────────┼─────────┤
│ 1   │ 3.6       │ 79      │
│ 2   │ 1.8       │ 54      │
│ 3   │ 3.333     │ 74      │
│ 4   │ 2.283     │ 62      │
│ 5   │ 4.533     │ 85      │
⋮

julia> GMM(2,f.Waiting)
[ Info: Initializing GMM, 2 Gaussians LinearAlgebra.diag covariance 1 dimensions using 272 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.350600e+04
      1       8.878464e+03      -4.627536e+03 |        2
      2       8.855791e+03      -2.267287e+01 |        0
      3       8.855791e+03       0.000000e+00 |        0
K-means converged with 3 iterations (objv = 8855.79069767458)
ERROR: InexactError: Int64(54.75)
Stacktrace:
 [1] Type at ./float.jl:703 [inlined]
 [2] convert at ./number.jl:7 [inlined]
 [3] setindex! at ./array.jl:767 [inlined]
 [4] copyto!(::IndexLinear, ::Array{Int64,2}, ::IndexCartesian, ::LinearAlgebra.Adjoint{Float64,Array{Float64,2}}) at ./abstractarray.jl:764
 [5] copyto! at ./abstractarray.jl:745 [inlined]
 [6] Type at ./array.jl:482 [inlined]
 [7] convert(::Type{Array{Int64,2}}, ::LinearAlgebra.Adjoint{Float64,Array{Float64,2}}) at ./array.jl:474
 [8] #GMMk#9(::Symbol, ::Int64, ::Int64, ::Int64, ::Function, ::Int64, ::Array{Int64,2}) at /Users/florian.oswald/.julia/dev/GaussianMixtures/src/train.jl:110
 [9] (::getfield(GaussianMixtures, Symbol("#kw##GMMk")))(::NamedTuple{(:kind, :nInit, :nIter, :sparse),Tuple{Symbol,Int64,Int64,Int64}}, ::typeof(GaussianMixtures.GMMk), ::Int64, ::Array{Int64,2}) at ./none:0
 [10] #GMM#7(::Symbol, ::Symbol, ::Int64, ::Int64, ::Int64, ::Int64, ::Type, ::Int64, ::Array{Int64,2}) at /Users/florian.oswald/.julia/dev/GaussianMixtures/src/train.jl:37
 [11] (::getfield(Core, Symbol("#kw#Type")))(::NamedTuple{(:method, :kind, :nInit, :nIter, :nFinal, :sparse),Tuple{Symbol,Symbol,Int64,Int64,Int64,Int64}}, ::Type{GMM}, ::Int64, ::Array{Int64,2}) at ./none:0
 [12] #GMM#8(::Symbol, ::Int64, ::Int64, ::Int64, ::Int64, ::Type, ::Int64, ::Array{Int64,1}) at /Users/florian.oswald/.julia/dev/GaussianMixtures/src/train.jl:43
 [13] GMM(::Int64, ::Array{Int64,1}) at /Users/florian.oswald/.julia/dev/GaussianMixtures/src/train.jl:43
 [14] top-level scope at none:0
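The failure above happens because `GMMk` converts the input to a matrix with the caller's element type (`Array{Int64,2}` here), so the `Float64` k-means centroids cannot be stored back without truncation. A minimal sketch of the requested promotion, using a hypothetical `promote_data` helper at the entry point (not part of the package):

```julia
# Hypothetical helper: promote integer data to floating point before training.
# Base.float turns an Int64 vector into Float64 and leaves
# Float32/Float64 input untouched.
promote_data(x::AbstractVector{<:Real}) = float(x)

w = [79, 54, 74, 62, 85]   # Int64 waiting times, like f.Waiting
y = promote_data(w)
# eltype(y) == Float64, so downstream code never hits the InexactError
```

If the package applied something like this inside the `GMM` constructor, integer-valued columns such as `f.Waiting` would work without a manual `convert`.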

julia> y = convert(Vector{Float64},f.Waiting)

julia> GMM(2,y)
[ Info: Initializing GMM, 2 Gaussians LinearAlgebra.diag covariance 1 dimensions using 272 data points
  Iters               objv        objv-change | affected 
-------------------------------------------------------------
      0       1.790200e+04
      1       8.942126e+03      -8.959874e+03 |        2
      2       8.858336e+03      -8.378957e+01 |        2
      3       8.855791e+03      -2.545360e+00 |        0
      4       8.855791e+03       0.000000e+00 |        0
K-means converged with 4 iterations (objv = 8855.79069767458)
┌ Info: K-means with 272 data points using 4 iterations
└ 68.0 data points per parameter
[ Info: Running 10 iterations EM on diag cov GMM with 2 Gaussians in 1 dimensions
[ Info: iteration 1, average log likelihood -3.802376
[ Info: iteration 2, average log likelihood -3.801642
[ Info: iteration 3, average log likelihood -3.801544
[ Info: iteration 4, average log likelihood -3.801506
[ Info: iteration 5, average log likelihood -3.801489
[ Info: iteration 6, average log likelihood -3.801482
[ Info: iteration 7, average log likelihood -3.801479
[ Info: iteration 8, average log likelihood -3.801478
[ Info: iteration 9, average log likelihood -3.801477
[ Info: iteration 10, average log likelihood -3.801477
┌ Info: EM with 272 data points 10 iterations avll -3.801477
└ 54.4 data points per parameter
GMM{Float64} with 2 components in 1 dimensions and diag covariance
Mix 1: weight 0.639023
  mean: [80.093]
  variance: [34.4079]
Mix 2: weight 0.360977
  mean: [54.6179]
  variance: [34.5016]
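As a user-side workaround, the `convert` call above can also be written more compactly; these are equivalent ways to obtain a `Float64` vector (illustrative values only):

```julia
w = [79, 54, 74, 62, 85]            # stands in for f.Waiting (Int64)

y1 = convert(Vector{Float64}, w)    # as in the report above
y2 = Float64.(w)                    # broadcast the constructor
y3 = float(w)                       # promotes integers, keeps Float32 as-is

# all three produce the same Float64 vector
```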

davidavdav (Owner) commented:

did the commit help?
