Loss function #121
Question:
The Hypotenuse example's output shows an Accuracy, an Error, and a Loss. In TensorFlow, the loss function is used to train the neural network, but as the Loss here is zero, I guess the Error is used instead? Are Accuracy and Loss for information only? Is the default error abs(desired_output - neural_network_output)? How can I define another loss function, such as RMSE or a percentile, so that I can compare the neural-api output to previous TensorFlow runs?

Reply:
"but as the Loss is zero I guess Error is used?" In this API, at the moment, loss is calculated only for classification tasks where the last layer is SoftMax. As I work a lot with classification problems, classification accuracy is the metric I use most frequently for comparison. Other metrics are: number of epochs to convergence, validation/test accuracy, price/performance on certain hardware, and the speed of processing one sample. Regarding RMSE, I'll treat this as a feature request (it's a good idea - thank you). Any function defined on top of the predicted and the desired output can be computed, since you can get the raw NN output with the GetOutput method. If you do compare this API with TensorFlow, please share your results, good or bad.
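To illustrate the point made in the reply, once you have the raw predicted and desired output vectors, any comparison metric can be computed outside the API. Below is a minimal Python sketch of RMSE over paired output vectors; the `rmse` function and the example values are illustrative and not part of neural-api (whose own code is Pascal):

```python
import math

def rmse(predicted, desired):
    """Root-mean-square error between two equal-length output vectors."""
    assert len(predicted) == len(desired), "vectors must have the same length"
    total = sum((p - d) ** 2 for p, d in zip(predicted, desired))
    return math.sqrt(total / len(predicted))

# Example: compare raw network outputs (as returned by something like
# GetOutput in neural-api) against the desired targets.
print(rmse([0.9, 0.1, 0.0], [1.0, 0.0, 0.0]))
```

The same pattern works for any other metric (percentile error, MAE, and so on): collect the raw outputs per sample, then aggregate however your previous TensorFlow runs did.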