A good point. Confidence and certainty when you have the required knowledge is a virtue. What's far more common today is a faux-certainty expressed without knowing what the f*** you are talking about: jumping to conclusions based on limited or even no evidence, guided instead by a mental projection of what you want the facts to mean. You see that most clearly when you read the comments on, say, a Stuff article, particularly when it's a subject you happen to know in some detail.
The sign of true intelligence is the ability to say "That's outside my area of expertise". Humans are hard-wired to want answers, not questions, and the ability to say that you don't have the answers is just about the rarest ability one can have.
Putting an answer down on paper during a test that may well be wrong, but that you hope is right, is one thing. Pretending in discussion that something you're positing is certainly correct, when you really have no idea, is another. You can posit an uncertain hypothesis, or even a guess, and still qualify it by explaining your degree of certainty or the assumptions it's based on.
To pretend you're certain is far worse than simply saying "I don't know". For one thing, others will eventually learn that your expressions of certainty can't be trusted, and won't believe you even when you really are certain. Secondly, and possibly more importantly, you never establish in your own mind a distinction between uncertainty and certainty, and what's required to shift from one to the other. Your path to actual knowledge and certainty is therefore blocked. You're not only trying to fool others, you end up sabotaging yourself.
There is a counterpoint to this general theme however, and a balance to be found. In business meetings, for instance, you'll sometimes see those who express their views confidently get their way, even when in reality their conclusions are wrong. At the other extreme, those who agonise over being 100% sure of something before speaking up will get left behind, even though they're closer to the truth. Reality eventually reasserts itself, and the ones full of bluster who mostly get it wrong will be found out over time and regarded accordingly. But you sometimes have to back yourself, move forward and make a decision whilst still uncertain. In many instances in life, particularly business, a confident, prompt decision that has an 80% chance of being right is better than delaying the decision until you're 100% certain. To put it another way: you shouldn't let the perfect plan stop you from implementing a good plan.
Mark, I completely agree with all you have written, and I even agree with what Prof Feynman says above.
But I think I may not have made myself very clear:
All through our education system (a system in which Prof Feynman excelled and contributed greatly) not knowing is equated with being wrong. If a lecturer posits a question in class or in a test, the reward (of a good grade or public recognition) goes to those who know, and know confidently.
How can you expect any other result after 12 to 21 years of this education than that people will be indoctrinated to trust implicitly those who "know" and do so confidently?
So my point is: Prof Feynman, and every other educator, by promoting this behavior and behaving this way themselves, have contributed greatly to a society in which uncertainty and skepticism are frowned upon, and considered weaknesses.
I know that I am perhaps rather harsh on Prof Feynman, but I encountered this quote at the tail end of a long series of discussions with those in tertiary education who on the one hand lament the lack of inquisitiveness and skepticism in their students, while simultaneously acting in a way that actively undermines those same traits.
One more quick note, at the risk of becoming verbose: notice how public scientific discourse today no longer promotes skepticism but rather trust in established facts.
Speakers like Neil deGrasse Tyson and Brian Cox do not lament a lack of questioning, but rather a lack of pre-existing knowledge, showing a strong preference for an audience that knows, rather than questions.
To be fair, there is a general lack of scientific knowledge among the population at large today (the resurgence of flat-earth belief being a strong indicator), but more rote learning of established facts is not the answer to that problem, in my humble opinion.
I'm not sure I follow, Roedolf. If you're asked during an exam to answer a question, and your response is "I don't know", you deserve to be scored 0%, because you've exhibited zero knowledge of the subject. If instead you posit an uncertain response that's a mixture of right and wrong, you might get 50%. Alternatively, you might be completely wrong, but frame your argument in a coherent way, so you deserve some credit for that. Are you saying that approach is wrong, and if so, what are the alternatives?
I do wonder: if one of Feynman's students had answered "I don't know" on a paper in a class he taught, would he have accepted it?
The entire educational system (including the classes Feynman taught) teaches students that not knowing earns the same marks as being wrong.
So to Feynman I'd like to paraphrase Emerson: "What you do speaks so loudly I can't hear what you're saying."