Notice that we had an overall accuracy greater than 96% in the training data, but the overall accuracy was lower in the test data. This can often happen if we overtrain. In fact, it could be the case that a single feature is not the best choice. For example, a combination of features might be optimal. Using a single feature and optimizing the cutoff as we did on our training data can lead to overfitting. Given that we know the test data, we can treat it like we did our training data to see if the same feature with a different cutoff will optimize our predictions. Which feature best optimizes our overall accuracy?
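For reference, here is a minimal sketch of the cutoff-search idea described in the question, written in Python with scikit-learn. The dataset (iris), the binary target, and the train/test split used below are assumptions for illustration only; the original exercise may use different data and a different language.

```python
# Minimal sketch of single-feature cutoff search (assumption: scikit-learn's
# iris data as a stand-in for the exercise's dataset; the original may differ).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

data = load_iris()
X, y = data.data, (data.target == 2).astype(int)  # binary task: virginica vs. rest
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

def best_cutoff_accuracy(x, y):
    """Search cutoffs for one feature; return (best_cutoff, best_accuracy)."""
    cutoffs = np.unique(x)
    return max(((c, np.mean((x > c).astype(int) == y)) for c in cutoffs),
               key=lambda t: t[1])

# Optimize each feature's cutoff on the training data, then check how that
# cutoff generalizes to the test data (the gap illustrates overtraining).
for i, name in enumerate(data.feature_names):
    cut, train_acc = best_cutoff_accuracy(X_train[:, i], y_train)
    test_acc = np.mean((X_test[:, i] > cut).astype(int) == y_test)
    print(f"{name}: cutoff={cut:.2f}  train acc={train_acc:.3f}  test acc={test_acc:.3f}")

# Treating the test data the way we treated the training data -- searching
# cutoffs directly on it -- shows which feature, with its own cutoff,
# maximizes test-set accuracy, as the question asks.
for i, name in enumerate(data.feature_names):
    cut, test_acc = best_cutoff_accuracy(X_test[:, i], y_test)
    print(f"{name}: cutoff tuned on test data -> acc={test_acc:.3f}")
```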