A recent discussion about programming languages reminded me of a story not often told: namely, how I was turned off from learning C#. We were considering it as a replacement for Java because it was much better integrated into Windows, which would have allowed my employer at the time (a web agency) to expand into desktop software with an attractive line-up that required less explaining.
So I sat down and figured out the basics. The way Microsoft had chosen to "fix Java" was quirky at best, but C# did have some welcome features added back in. And one of them just happened to be a good fit for our very first application.
My one team-mate on this project balked. "What the hell is this? I have no clue what you're doing here."
It was a delegate. A goddamn delegate. A core feature of C#, and one of its major selling points at the time. (C++ didn't get lambdas until 2011; this was years earlier.) But my colleague seemed to have taken a shortcut: instead of properly learning the language, he was using the official IDE as a glorified Visual Basic, filling in the blanks with the simplest possible code to get his mouse-designed forms working at all...
Not that this bothered my colleague. He was a very pragmatic person, you see. Always pointing out that he didn't really enjoy programming, but only saw it as a means to an end. Guess he had better things to think about, too, like a wife and kids. So he'd only bother to learn the absolute minimum he could get away with.
Sherlock Holmes, too, would purge from his mind all knowledge that didn't help his detective work, such as the fact that Earth orbits the sun.
So how do you suppose he dealt with crime that spanned timezones?