I'm a programmer who writes mostly desktop applications, and I've recently started thinking about cross-platform development as an issue. At work I write C# apps; I come from a C++ and CS background, and of course I've written several things in Qt/C++. But now I'm confused about web applications. I've done some work in PHP, so I know how things go there, and as a long-time Gmail and Google Docs user I've seen how much web applications have improved with Web 2.0 technologies such as Ajax, XML, and so on. My confusion is this: should I start moving toward web application development and keep exploring the power of Web 2.0, or should I stick with my old world, where I feel very comfortable with parallelism and other such concerns? Believe me, I've had plenty of offers to work as a web application developer, but I never took the opportunity, and now I'm unsure whether I should start writing web apps. Have you been writing desktop applications and switched to web? Or does anyone have experience with this scenario?
Thank you.
The boundaries between desktop and web applications have really blurred. Whilst developing for the web was once totally different to developing for the desktop, nowadays you find the same concepts (such as the parallelism you referred to) cropping up in both. Don't think of web application development as a huge step away from traditional software development: you'll employ many of the same skills and concepts you already use. With your C# experience you wouldn't need to learn a whole lot more to get involved, as you could write the backends of web applications in a very similar way to how you currently work. If you wanted or needed to get involved on the UI side of things, there are new technologies you'd have to pick up, but they aren't essential to get a job in web development (unless, obviously, you were looking for a frontend role).
To follow up Dustman's comments about companies wanting to keep tight control of their data: bear in mind that not all 'web applications' involve the use of the internet. Really, all the term means is 'applications developed on web-based technologies', and as well as being deployed publicly on the web, they're commonly deployed on intranets and other closed-access environments. I work for a software company which develops 'web applications', but a large number of our systems are hosted by clients for use on their internal networks for the very reason Dustman refers to: they want to keep tight control of their data. The beauty of web-based technologies is that you can achieve this whilst still reaping the benefits of a centralised system: there is no need to manage deployment across hundreds of workstations, little need to worry about the specifications of client devices, the system can be accessed from different types of device (mobile etc.), and updates are regular and easily deployed, and so I could continue.