
Make loading of OpenNLP model location configurable #11

Open
chrismattmann opened this issue Jul 8, 2017 · 2 comments

@chrismattmann
Contributor

Don't always assume there is a model/* root-level dir. Load from the classpath.

@smadha
Member

smadha commented Jul 8, 2017

@chrismattmann - #12 should fix this; I tested it yesterday. I'll work on Tika to initialise an instance of AgePredicterLocal with additional parameters. Or do you think searching the classpath for model names is better?

@smadha
Member

smadha commented Jul 8, 2017

If we do it the classpath way, we either do something like

SentenceTokenizer.class.getResource("en-pos-maxent.bin")

and keep en-pos-maxent.bin on the classpath under the same package as SentenceTokenizer (opennlp.tools.tokenize), or we make a fully qualified package exclusively for storing models, like opennlp.model, and use

SentenceTokenizer.class.getResource("/opennlp/model/en-pos-maxent.bin")

Either way we'd have to keep the model at a specific path, so I think making it configurable in AgePredicter is probably better. We can locate it on the classpath in Tika like we do now. What say?
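For illustration, the two options could be combined: prefer an explicitly configured directory, and fall back to a classpath lookup only when no path is given. This is a minimal sketch, not the project's actual API; the class name ModelResolver, the method resolveModel, and the /opennlp/model/ resource prefix are all hypothetical.

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical sketch: resolve an OpenNLP model binary either from a
// user-configured directory (as AgePredicter could expose) or, as a
// fallback, from a fixed classpath location such as /opennlp/model/.
public class ModelResolver {

    /** Opens a stream for the named model, preferring the configured directory. */
    public static InputStream resolveModel(String configuredDir, String modelName)
            throws Exception {
        if (configuredDir != null) {
            Path p = Paths.get(configuredDir, modelName);
            if (Files.exists(p)) {
                // Configured location wins when the file is actually there.
                return Files.newInputStream(p);
            }
        }
        // Fallback: look the model up on the classpath (assumed prefix).
        InputStream in =
                ModelResolver.class.getResourceAsStream("/opennlp/model/" + modelName);
        if (in == null) {
            throw new IllegalStateException("Model not found: " + modelName);
        }
        return in;
    }
}
```

With this shape, Tika (or any caller) can pass the directory it discovers at runtime, while standalone use still works as long as the model is bundled on the classpath.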
