As I do my coding I sometimes wonder if I’m doing things the best way or just the way it’s always been done. Does what I’m doing make sense anymore?
For example, declaring all your variables at the top of the function. If I try to declare a variable twice, or below where I start using it, my IDE will bark at me at design time – so what’s the big deal? It seems like it would make more sense to declare variables right above the block where they’d be used.
Another one would be Hungarian notation. I hate that all my variables related to a particular object are scattered throughout my IntelliSense.
With modern advancements in frameworks and IDEs, are there some coding practices that don’t really apply anymore, and others that may be just plain wrong now?
Don’t declare variables above the block where they’ll be used – declare them in the narrowest scope available, at the point of first use, assuming that’s feasible in your language.
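Here's a minimal sketch of the difference (in Java, purely for illustration – the question sounds like C#, but the principle is language-agnostic):

```java
public class ScopeDemo {
    // Old style: everything declared at the top of the method,
    // far from where it's actually used.
    static int totalOldStyle(int[] amounts) {
        int total;
        int i;
        total = 0;
        for (i = 0; i < amounts.length; i++) {
            total += amounts[i];
        }
        return total; // note: i is still in scope here, though unused
    }

    // Narrowest scope: each variable appears at its point of first use,
    // declared and initialized in one step.
    static int totalNarrowScope(int[] amounts) {
        int total = 0;
        for (int amount : amounts) { // loop variable scoped to the loop
            total += amount;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] amounts = {10, 20, 30};
        System.out.println(totalOldStyle(amounts));    // 60
        System.out.println(totalNarrowScope(amounts)); // 60
    }
}
```

The narrow-scope version makes it impossible to use a variable before it means anything, and the compiler can tell you when something is no longer needed.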
Hungarian notation will depend on the conventions for your language/platform. It also depends on which variety of Hungarian you’re using – the sensible one that encodes meaning (“Apps Hungarian”, which I’m still not fond of) or the version which merely restates the type information already available (“Systems Hungarian”).
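To make the two varieties concrete, here's a contrived Java sketch (all names are hypothetical):

```java
public class HungarianDemo {
    // Minimal stand-in for a real HTML-escaping routine, just for the demo.
    static String escapeHtml(String raw) {
        return raw.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }

    public static void main(String[] args) {
        // Systems Hungarian: the prefix merely restates the declared type,
        // which the IDE already shows you on hover. Pure noise today.
        String strCustomerName = "Ada";
        int intRetryCount = 3;

        // Apps Hungarian: the prefix encodes meaning the type system can't
        // express – here, "us" = unsafe (raw user input), "s" = safe (escaped).
        String usComment = "<b>hi</b>";          // raw, must not be rendered
        String sComment = escapeHtml(usComment); // safe to render

        // Modern convention in most curly-brace languages: descriptive
        // names with no type prefixes at all.
        String customerName = strCustomerName;
        int retryCount = intRetryCount;
        System.out.println(customerName + " " + retryCount + " " + sComment);
    }
}
```

The Systems variety is the one that clutters autocomplete by grouping variables by type prefix rather than by what they mean.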
One thing to watch out for: when you take up a new language, make sure you take up the idioms for it at the same time, particularly the naming conventions. This will help your code fit in with the new language, rather than with your old (probably unrelated) code. I find it also helps me to think in tune with the new language as well, rather than fighting against it.
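As a small (hypothetical, Java) illustration of code written with a foreign “accent” versus code that fits its language:

```java
public class IdiomDemo {
    // C-accented Java: snake_case name, index loop, manual delimiter logic.
    static String join_with_commas(String[] parts) {
        String result = "";
        for (int i = 0; i < parts.length; i++) {
            result = result + parts[i];
            if (i < parts.length - 1) {
                result = result + ",";
            }
        }
        return result;
    }

    // Idiomatic Java: camelCase name, standard-library String.join.
    static String joinWithCommas(String[] parts) {
        return String.join(",", parts);
    }

    public static void main(String[] args) {
        String[] parts = {"alpha", "beta", "gamma"};
        System.out.println(join_with_commas(parts)); // alpha,beta,gamma
        System.out.println(joinWithCommas(parts));   // alpha,beta,gamma
    }
}
```

Both work, but the second is what a reader fluent in the language expects to see.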
But yes, it’s certainly worth revisiting coding practices periodically. If you can’t decide why something’s a good idea, try doing without it for a while…