Hyperparameter optimization #187
Comments
Hey Bhargav, thanks for starting this discussion here. Yes, automatic hyperparameter search is necessary for NN models, and I plan to implement such a feature for all models in PyPOTS. So far, though, I prefer the NNI toolkit by Microsoft. Why? Because it is an advanced toolkit and provides visualization in your web browser, which can be more user-friendly. I used NNI in my previous research project SAITS to optimize hyperparameters; you can see my code in that repository. I'm also investigating another framework, Optuna, for hyperparameter optimization, and will also take a look at Skorch as you suggest. I can see your enthusiasm for open source 😊, especially here in the PyPOTS community. I invite you to build this feature together with me. Let's keep in touch on Slack.

Regarding the loss functions in PyPOTS: yes, most losses used in PyPOTS are not standard PyTorch criteria, because we have to take the missing parts into consideration. We can certainly make them PyTorch-compatible in the future, but this is not urgent. We also have a planned feature to let users customize their loss functions; refer to issue #137. This is an important feature, especially for imputation models.
We've added hyperparameter tuning functionality in PyPOTS v0.2, implemented with the Microsoft NNI framework. We're collecting feedback from the community on how to improve this function.
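For readers unfamiliar with NNI, a search space is normally declared in a `search_space.json` file consumed by the NNI experiment. The sketch below shows that format as a Python dict; the parameter names (`n_layers`, `d_model`, `lr`, `batch_size`) are illustrative placeholders, not PyPOTS's actual tuning keys.

```python
# A hypothetical NNI-style search space, expressed as the Python dict that
# would normally be serialized to search_space.json for an NNI experiment.
# Parameter names here are illustrative, not PyPOTS's actual config keys.
search_space = {
    "n_layers":   {"_type": "choice",     "_value": [1, 2, 3]},
    "d_model":    {"_type": "choice",     "_value": [64, 128, 256]},
    "lr":         {"_type": "loguniform", "_value": [1e-4, 1e-2]},
    "batch_size": {"_type": "choice",     "_value": [32, 64, 128]},
}

# Each entry pairs a sampling strategy ("_type") with its candidate values
# or range ("_value"); the NNI tuner draws trial configs from this space.
```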
It looks like v0.3 didn't add anything new for hyperparameter tuning? What are the plans going forward? Looking forward to updates!
Hi @levisocool, Happy New Year! The v0.3 release was preparation for a new paper; the paper's code will be open-sourced separately in another repo. The hyperparameter-search configuration files change with the data of each application scenario, so they won't be placed in the PyPOTS repo. As for the code, please be patient for a few more days; it will be released as soon as the work is finished. If you're in a hurry, add me on WeChat and I can show you how to use the tuning functionality.
@levisocool Please refer to the repo https://github.com/WenjieDu/Awesome_Imputation to learn how to use the PyPOTS tuning functionality.
1. Feature description
Hyperparameter optimization functionality (grid/random search) using a library like skorch
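As a rough sketch of the random-search half of this request, here is a self-contained loop in plain Python. Everything in it (the search space, the toy objective) is hypothetical and stands in for a real PyPOTS training-and-evaluation run; it is not PyPOTS or skorch API.

```python
import random

def random_search(space, objective, n_trials=20, seed=0):
    """Sample n_trials configs from `space`; return the best (config, score).

    `space` maps each hyperparameter name to a list of candidate values;
    `objective` maps a sampled config dict to a score to minimize
    (in practice this would train a model and return a validation loss).
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = objective(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Stand-in objective: pretend lr=1e-3 and n_clusters=4 are the sweet spot.
def toy_objective(cfg):
    return abs(cfg["lr"] - 1e-3) * 1000 + abs(cfg["n_clusters"] - 4)

space = {"lr": [1e-2, 1e-3, 1e-4], "n_clusters": [2, 3, 4, 5]}
best_cfg, best_score = random_search(space, toy_objective, n_trials=30)
```

Grid search is the same loop with `itertools.product` over the value lists instead of random sampling; random search is usually preferred once the space has more than a few dimensions.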
2. Motivation
Clustering algorithm performance depends on selecting good values for hyperparameters such as the number of clusters, number of layers, nodes per layer, learning rate, and batch size.
The VaDER paper used prediction strength to select hyperparameter values. The authors mention that:
3. Your contribution
I am trying to implement this but am having trouble, since the PyPOTS clustering algorithms have their own internally designed loss functions rather than a standard PyTorch criterion (I think...), and for unsupervised clustering problems there is no ground truth to pass to .fit() (as in the skorch example). I'll keep thinking about it, but maybe there's an obvious solution I'm not familiar with for these use cases.
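On the masked-loss point: the reason a plain PyTorch criterion doesn't fit is that the error must only be computed over observed entries. The numpy sketch below is my own illustration of that idea, not PyPOTS's actual implementation; a PyTorch version would look the same with `torch` ops substituted for numpy ones.

```python
import numpy as np

def masked_mae(predictions, targets, mask):
    """Mean absolute error computed only where mask == 1 (observed values).

    Illustrative sketch of a missingness-aware loss, not PyPOTS code.
    """
    mask = mask.astype(float)
    # Sum absolute errors over observed entries, then normalize by their
    # count; the small epsilon guards against an all-missing mask.
    return np.sum(np.abs(predictions - targets) * mask) / (np.sum(mask) + 1e-12)

pred = np.array([1.0, 2.0, 3.0, 4.0])
true = np.array([1.5, 2.0, 0.0, 4.0])  # the 0.0 is a missing-value placeholder
mask = np.array([1, 1, 0, 1])          # the third value was never observed
loss = masked_mae(pred, true, mask)    # only |1-1.5|, |2-2|, |4-4| count
```

Because the mask zeroes out the unobserved position, the placeholder value there never influences the gradient, which is exactly why these losses can't be a stock `nn.L1Loss`.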