Check differentiability of custom loss function before training #17753
Frightera wants to merge 13 commits into keras-team:master
Conversation
Will continue the discussion on issue keras-team/tf-keras#52 to re-scope this PR.
@haifeng-jin Can you take a look at it again? This now supports custom layers as well. While checking custom layers (which can throw any kind of error during the check), it uses nested try-except blocks, which may not be the best practice.
Hi @haifeng-jin, is there an update on this? Every couple of weeks on Stack Overflow I see questions from users new to Keras who try to use a non-differentiable loss function.
Need a review from @qlzh727, since I do not have enough knowledge to review this PR.
Hello, Thank you for submitting a pull request. We're currently in the process of migrating the new |
Feature request was made in keras-team/tf-keras#52.
This is a very common mistake: users define custom loss functions / classes that are not differentiable, which leads to getting None gradients during fitting. This check makes it easier for users to interpret the problem. Example usage:
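The PR's original snippet is not preserved in this excerpt. As a rough sketch of the idea, assuming TensorFlow's GradientTape (the helper name check_loss_differentiability and its signature are hypothetical, not the PR's actual API), the check could look like:

```python
import tensorflow as tf

def check_loss_differentiability(loss_fn, y_shape=(4, 1)):
    # Hypothetical helper: run the loss on dummy data and verify
    # that the gradients w.r.t. the predictions are not None.
    y_true = tf.random.uniform(y_shape)
    y_pred = tf.Variable(tf.random.uniform(y_shape))
    with tf.GradientTape() as tape:
        loss = loss_fn(y_true, y_pred)
    if tape.gradient(loss, y_pred) is None:
        raise ValueError(
            "The loss function is not differentiable: "
            "gradients w.r.t. y_pred are None."
        )

def rounded_mae(y_true, y_pred):
    # tf.round is registered as NotDifferentiable, so any loss built
    # on it yields None gradients and would silently break training.
    return tf.reduce_mean(tf.abs(tf.round(y_pred) - y_true))
```

A differentiable loss such as mean squared error passes the check, while rounded_mae raises the ValueError shown above before any training step runs.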
Raises:
You can see the other usages in this gist.