Basically, going by this diagram found online: the input is x and the output is f(x) + x, where f(x) is called the residual, i.e. predicted value minus observed value. Many articles mention the identity mapping, so at first I fell into a misconception: f(x) + x = x — how could that possibly hold? The video keeps bringing up vanishing gradients, and combining several videos, my guess is:
When the gradient vanishes, f(x) is close to 0, so approximately f(x) + x = x, and that is when the block acts as an identity mapping. If the gradient does not vanish, there is nothing to worry about: just backpropagate through f(x) + x as usual. That is my working understanding for now; at least it is self-consistent.
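The intuition above (when f(x) ≈ 0 the block reduces to the identity, and the skip connection keeps gradients flowing) can be sketched numerically. This is a toy illustration, not ResNet itself: f(x) = w·x with a tiny weight w is a hypothetical stand-in for a residual branch whose gradient has nearly vanished.

```python
# Toy residual block: output = f(x) + x, with f(x) = w * x as a stand-in
# for a residual branch (hypothetical, chosen so f'(x) = w is tiny).
def residual_block(x, w):
    return w * x + x  # f(x) + x

# Key point: d(f(x) + x)/dx = f'(x) + 1. Even when f'(x) = w is near 0
# (the gradient through f has "vanished"), the skip connection adds a
# constant 1, so the gradient reaching earlier layers stays close to 1.
x, w = 2.0, 1e-6
eps = 1e-6
numeric_grad = (residual_block(x + eps, w) - residual_block(x - eps, w)) / (2 * eps)
analytic_grad = w + 1.0

print(round(numeric_grad, 4))   # close to 1.0: the "+ x" path keeps the gradient alive
print(round(analytic_grad, 4))
```

With w ≈ 0 the block is approximately the identity (f(x) + x ≈ x), matching the guess above; when w is not small, nothing special happens and backpropagation through f(x) + x proceeds normally.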