Lately, whenever a friend and I talk about programming, we keep ending up at the topic of programmer errors versus compiler errors. In other words: if I tell the compiler to add 1 and 2 when I mean subtract, that's obviously my problem. If I tell it to use a non-existent library, it had better stop and complain right away. But if I tell it to add 1 and "2"? How far do you go to protect people from themselves?
The question is serious. Back in my days as a pro web developer, I found that the most common bug in PHP was a misspelled variable name. I started working with notices enabled, and bam! No more mysterious null values.
At the opposite end there's Go, where if you import a module but don't use it, that's an error. If you define a variable or function but don't use it, that's an error. If you try to access a field of an interface type outside of the correct branch in a type switch... that's an error.
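To make that strictness concrete, here is a small sketch (mine, not from the post): the blank identifier `_` and the blank import are the only reasons this compiles, and the type switch narrows the interface value per branch.

```go
package main

import (
	"fmt"
	_ "os" // a plain unused import is a compile error; the blank form opts out
)

// kind reports the dynamic type of v. Inside each case, the
// narrowed value t is only usable as that case's type; treating
// a string as an int outside its branch would not compile.
func kind(v interface{}) string {
	switch t := v.(type) {
	case int:
		return fmt.Sprintf("int %d", t)
	case string:
		return fmt.Sprintf("string %q", t)
	default:
		return fmt.Sprintf("%T", t)
	}
}

func main() {
	unused := 1
	_ = unused // without this line, "declared and not used" fails the build
	fmt.Println(kind(3), kind("x"))
}
```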
PHP assumes programmers don't make typos. Go assumes programmers are literal code monkeys who can't be trusted to have a clue. Good job banging on the keys, Koko! Have a banana.
I prefer dynamic languages, yet I'm always careful and explicit about types. What I mind is not the types themselves so much as having to reassure the compiler that yes, if my code says var i = 0, that really does mean an integer. (Swearing on my pinky, teach. Fingers crossed behind my back.) If a language can infer that for me, who cares that it's not actually dynamic? Until it gets too fussy, anyway. But what counts as too fussy? Is it wrong if an empty string evaluates to boolean false? What about the floating-point number 0.0? It matters. Space rockets fall from the sky when the programmer means one thing and the compiler understands another.
Context matters. That's why I prefer languages where "+" means addition in the arithmetic sense, and there's another, different operator to join strings, be it "&", ".." or whatever.
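Go, for what it's worth, takes the other route and overloads "+" for strings, but its strict typing keeps the two meanings from colliding; a quick sketch:

```go
package main

import "fmt"

func main() {
	fmt.Println(1 + 2)     // arithmetic: 3
	fmt.Println("1" + "2") // concatenation: "12"
	// 1 + "2" does not compile: both operands must share a type,
	// so a given "+" is never ambiguous between the two meanings.
}
```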
Erecting prison bars around people "for their own safety" is a lie twice over. Guardrails, though, save lives.