comparison to xbatcher #38
Hi Dhruv,

XBatcher (as I understand it) is for iteratively fetching different chunks of an xarray dataset. Each chunk is yielded by a Python iterator, as defined by the XBatcher arguments. You can do whatever you want with each chunk as it comes; the main usage appears to be feeding different chunks of a dataset into a single instance of an ML algorithm one at a time, to save on memory, etc.

XCast is specifically for fitting ML / AI / statistics models at each spatial grid point represented in an xarray object. To be clear: it fits one instance of MLR, ELM, MLP, etc. at each lat/lon point, rather than fitting one model across all lat/lon points as you would with XBatcher. XCast does this efficiently by using Dask multiprocessing to distribute each geographically contiguous chunk to a different processor / core, where models are then trained at each point within that contiguous region.

In a nutshell: XCast distributes the training of many independent models across cores, whereas XBatcher seems to use Dask's futures API / dask.delayed patterns to facilitate iterating over xarray objects in fancy ways. Does that clear things up?
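To make the distinction concrete, here is a minimal pure-Python sketch (hypothetical, and not XCast's or xbatcher's actual API): the "global" fit stands in for the xbatcher pattern of feeding all chunks to one model, while the per-point dictionary stands in for XCast fitting a separate model at each lat/lon point. The toy grid, the `fit_linear` helper, and the coefficients are all invented for illustration.

```python
def fit_linear(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    return a, mean_y - a * mean_x

# Toy "dataset": a predictor/target time series at each (lat, lon) point,
# where the true relationship deliberately differs from point to point.
times = (0.0, 1.0, 2.0, 3.0)
grid = {
    (lat, lon): (list(times), [lat + lon * t for t in times])
    for lat in (10, 20)
    for lon in (1, 2)
}

# xbatcher-style usage: iterate over chunks and feed ONE shared model
# with data pooled from every grid point.
all_x = [x for xs, _ in grid.values() for x in xs]
all_y = [y for _, ys in grid.values() for y in ys]
global_model = fit_linear(all_x, all_y)

# XCast-style usage: fit a SEPARATE model at each grid point (XCast would
# distribute these fits across Dask workers, one contiguous chunk per core).
per_point_models = {pt: fit_linear(xs, ys) for pt, (xs, ys) in grid.items()}
```

Each per-point model recovers that point's own slope and intercept exactly, while the single global model is forced to compromise across all points, which is the trade-off the two libraries sit on either side of.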
That does clear things up, thank you for the description. I had definitely not picked up the distinction of one model for all grid points (xbatcher) vs one model per grid point (xcast).
Glad I could help! It also seems like XBatcher doesn't actually implement the models, whereas XCast does implement some and provides wrapper classes for third-party ones. If it's alright with you, I'll close this issue.
I was wondering how the scope of xcast differs from xbatcher.