There are several potential problems with making the client in a client-server environment as heavy as possible.

  • You may not trust the client. In network games, cheats using altered clients are notorious. On Web sites, doing all your data validation using client-side scripting leaves the server vulnerable to bogus data and exploits such as buffer overflows and SQL Injection from malicious clients.

Validating data on the client side is generally a good idea: it catches the usual honest mistakes quickly, without a round trip to the server. But relying on that validation is not a good idea. The server should also check the input.
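As a rough sketch of what re-checking the input on the server might look like in PHP, using PDO prepared statements (the table, column and credential names here are invented for the example):

    <?php
    // Server-side re-check: never trust what the browser sent, even if
    // client-side scripting has already validated it.
    $age = isset($_POST['age']) ? $_POST['age'] : '';
    if (!preg_match('/^\d{1,3}$/', $age)) {
        http_response_code(400);
        exit('Invalid input');
    }

    // Parameterised query: the value is bound, not pasted into the SQL
    // string, so a malicious client cannot inject SQL.
    $db   = new PDO('mysql:host=localhost;dbname=example', 'appuser', 'secret');
    $stmt = $db->prepare('UPDATE members SET age = :age WHERE id = :id');
    $stmt->execute(array(':age' => (int)$age, ':id' => (int)$_POST['id']));
    ?>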

  • It is easier to reverse-engineer a client. The great thing about PHP (and, I suppose, ASP and other server-side scripting languages) is that the client never sees my source, only the output, so they never know how I am generating that output. I can store sensitive information, like the database login name and password that the script needs, in the .php file.
  • Jetifi pointed out that there have been exploits that expose the script source to the client, and gave the wise advice: "It's better to have files containing usercodes/passwords outside of the http root directory for this reason."

    But these are not the norm; they are errors and security holes. With client-side scripting, sending the script source to the client is a required first step, because the client has to run it.
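    For what it's worth, a minimal sketch of Jetifi's suggestion, assuming an invented directory layout: the credentials live in a file outside the http root, and the public script pulls them in with require, so even if the public file's source leaks, the password is not in it.

        <?php
        // /var/www/example.com/private/db_config.php -- kept OUTSIDE the
        // http root, so the web server will never serve it as a page.
        $dsn    = 'mysql:host=localhost;dbname=example';
        $dbUser = 'appuser';
        $dbPass = 'not-in-the-http-tree';
        ?>

        <?php
        // /var/www/example.com/htdocs/page.php -- inside the http root.
        // Pull the secrets in from outside the served tree.
        require '/var/www/example.com/private/db_config.php';
        $db = new PDO($dsn, $dbUser, $dbPass);
        ?>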

  • The client is not a controlled environment. Leaving aside malicious clients, legitimate clients vary a lot, and you usually have less control over their hardware and software than over the server's. Even in a locked-down corporate environment, it is almost impossible to make and keep hundreds of machines identical. On the internet you could be dealing with any hardware, any OS, any browser. This is why many websites carry those disclaimers: "best viewed with ...". This drives the client side towards the lowest common denominator.
  • For instance, client-side Javascript gives me endless headaches - what works in Mozilla clients doesn't work in IE clients, and what works in IE5 fails in IE4. In contrast, my PHP works fine on my server. End of story. I am beginning to think that Javascript is the Antichrist.

  • It can be faster to do it on the server. In 1990s-style, pre-internet client-server database applications, where a single server serves tens or at most a few hundred clients, the common wisdom is that it is faster to do the heavy lifting on the server.

    In my experience, this is entirely correct, as network bandwidth is usually more of a bottleneck than processor speed or disk IO. If you are going to update a thousand rows, it is far faster for the client to call a stored procedure that does it all on the server than to have the server send the rows to the client, process them there and send them all back again. Or even worse - the client might fetch a row, process it and send it back, then go on to the next row, incurring a network round-trip for each row.
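    As a rough sketch of the difference (the stored procedure, table and column names are invented for the example; the config file is the one from the earlier sketch):

        <?php
        require '/var/www/example.com/private/db_config.php';   // as above
        $db = new PDO($dsn, $dbUser, $dbPass);

        // One network round trip: a (hypothetical) stored procedure does
        // all thousand updates on the server itself.
        $db->exec('CALL apply_price_increase(10)');

        // Versus roughly a thousand round trips: fetch the rows, "process"
        // them on the client, and send each one back individually.
        $rows = $db->query('SELECT id, price FROM products')->fetchAll();
        $upd  = $db->prepare('UPDATE products SET price = :p WHERE id = :id');
        foreach ($rows as $row) {
            $upd->execute(array(':p' => $row['price'] * 1.10, ':id' => $row['id']));
        }
        ?>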

    In a heavy-load, internet-based scenario with potentially thousands of clients, you are better off with a three-tier design, where the hard work is done in a middle tier, possibly distributed across a pool of servers sitting close to the database backend with high-speed access to it.
