Most software these days is delivered as web applications, and the move towards cloud computing will only reinforce this trend.
Web apps consist of client and server parts, where the client part has been getting bigger lately to deliver a richer user experience.
This split has implications for developers, because the technologies used on the client and server parts are often different.
Disadvantages of Different Client and Server Technologies
Developers of web applications risk becoming either specialists confined to a single part of the stack or polyglot programmers.
Polyglot programming is the practice of knowing and using many programming languages. It has both advantages and disadvantages. I believe the overriding disadvantage is the constant context switching involved, which degrades productivity and opens the door to additional bugs.
Being a specialist has advantages and disadvantages as well. A big disadvantage I see is the “us versus them”, or “not my problem”, culture that can arise. In general, Agile teams prefer generalists.
Bringing Server Technologies to the Client
Many attempts have been made at bridging the gap between client and server. Most of these attempts were about bringing server-side technologies to the client.
Java on the client has failed to reach widespread adoption, and now that many people advise disabling Java applets altogether for security reasons, it seems increasingly unlikely that it ever will.
Bringing .NET to the client has likewise failed, as Silverlight adoption continues to drop.
All in all, I don’t feel there is currently a satisfactory way of using server technologies on the client.
Bringing Client Technologies to the Server
The opposite approach is to bring client technologies to the server. The most prominent example is Node.js, which runs JavaScript, the language of the browser, on the server, so that a single language, and potentially even shared code, can be used on both sides of the divide.
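As a minimal sketch of what this looks like in practice (the greet function and the port number are just illustrative placeholders, not part of any particular application), the snippet below uses Node.js's built-in http module to serve a JSON response, written in the same JavaScript a developer would otherwise only use in the browser:

```javascript
// Minimal Node.js HTTP server: the same JavaScript skills (and potentially
// the same modules) used in the browser now run on the server.
const http = require('http');

// A tiny piece of logic that could just as well be shared with client-side code.
function greet(name) {
  return { message: `Hello, ${name}!` };
}

const server = http.createServer((req, res) => {
  // Respond with JSON, the lingua franca between client and server.
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify(greet('world')));
});

// Port 3000 is just an example value.
server.listen(3000, () => {
  console.log('Listening on http://localhost:3000');
});
```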
What Do You Think?
If I had to put my money on any unification approach, it would be Node.js.
Do you agree? What needs to happen to make this a common way of developing web apps? Please let me know your thoughts in the comments.