
RuntimeError: memory access out of bounds #16

Open
chidokun opened this issue Dec 29, 2018 · 3 comments

Comments

@chidokun

I have a big problem when loading a serialized model string from a file on the first run: libsvm cannot deserialize the model and throws the RuntimeError below. But when I train on a dataset first and then load my model from the file, it works fine.

RuntimeError: memory access out of bounds
    at wasm-function[84]:51
    at wasm-function[256]:11
    at SVM.Module._deserialize_model (D:\Documents\Thesis\sources\verizone-api\node_modules\libsvm-js\out\wasm\libsvm.js:79:107637)
    at ccall (D:\Documents\Thesis\sources\verizone-api\node_modules\libsvm-js\out\wasm\libsvm.js:79:4914)
    at D:\Documents\Thesis\sources\verizone-api\node_modules\libsvm-js\out\wasm\libsvm.js:79:5295
    at Function.load (D:\Documents\Thesis\sources\verizone-api\node_modules\libsvm-js\src\loadSVM.js:252:19)
    at Function.<anonymous> (D:\Documents\Thesis\sources\verizone-api\src\ml\MLModel.ts:17:30)
    at Generator.next (<anonymous>)
    at fulfilled (D:\Documents\Thesis\sources\verizone-api\src\ml\MLModel.ts:4:58)

Any solution?

@stropitek
Member

stropitek commented Jan 3, 2019

Hello @chidokun,

What exactly is the difference between the case that works and the one that doesn't?

Can you provide a model file and a code snippet that reproduces the error?

@AndreBreh

AndreBreh commented Jan 18, 2019

I have the same error. The difference between the case that works and the one that doesn't is the file size: I tested a serialized string of 1.5 MB and it works, but a 15 MB file does not.

@therealglazou

I confirm this issue and the diagnosis above. Beyond a certain size, loading a model no longer works. I'm unable to say whether serialization generated incorrect output or whether it's a memory-size issue on load; I would bet on the latter.
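The symptoms are consistent with the WASM module running out of its fixed linear memory during deserialization. One possible direction (an assumption about how `out/wasm/libsvm.js` was compiled, not a confirmed fix) would be rebuilding the Emscripten output with memory growth enabled, so the heap can expand past its initial allocation:

```shell
# Hypothetical rebuild of the libsvm-js WASM artifact with Emscripten's
# memory-growth flag. The actual source files and existing build flags
# used by libsvm-js may differ from this sketch.
emcc svm.cpp -O3 \
  -s WASM=1 \
  -s ALLOW_MEMORY_GROWTH=1 \
  -o out/wasm/libsvm.js
```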
