Developers put a lot of thought into the languages, tools, and frameworks they use. That discussion and attention to detail would be better spent on what you're actually making. The "best tool for the job" mentality doesn't apply as often as developers want it to: you can use a different code editor, a different IDE, a different framework, and still get the same results. It's just a matter of translation between environments. Computers all behave the same way, performing the same basic operations, regardless of which syntactic sugar the developer chooses to use.
Sure, I'm completely ignoring factors like scaling and performance. But in every situation I've been in as a web developer (i.e., not working at a big company like Facebook or Google), those factors could safely be ignored, because they never mattered in the cases I've seen.
This rule of thumb holds in my experience, which is limited to small, self-contained applications. If you're working on something with hundreds of developers, maybe there's something to be said for using TypeScript, the fancier features of ES6, or even building your own virtual machine like HHVM; I can't really say. Basically, I love functional programming as much as anyone who's used Haskell, but for putting together a small, interactive UI application, there's nothing wrong with the standard languages that work in the same imperative paradigm as the hardware beneath all the abstractions.
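To make the "it's just translation" point concrete, here is a minimal sketch (the function names are mine, invented for illustration) of the same computation written twice in plain JavaScript: once in a functional style and once in the imperative style that mirrors what the machine actually does. Both produce identical results.

```javascript
// Functional style: map/reduce, no mutation.
function sumOfSquaresFunctional(xs) {
  return xs.map(x => x * x).reduce((acc, x) => acc + x, 0);
}

// Imperative style: an explicit loop and a mutable accumulator,
// much closer to what runs on the hardware underneath.
function sumOfSquaresImperative(xs) {
  let total = 0;
  for (let i = 0; i < xs.length; i++) {
    total += xs[i] * xs[i];
  }
  return total;
}
```

Neither version is more "correct" than the other; translating between the two is mechanical, which is the whole point.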
That said, I've personally loved diving down rabbit holes related to programming languages, though maybe not for the same reasons. Things like L.in.oleum, built by an obscure developer with an ambitious idea completely disconnected from the mainstream, are an essential part of what makes the web interesting.