[LINK] Bad government web sites

Rick Welykochy rick at praxis.com.au
Mon Aug 20 10:04:31 AEST 2007


Ivan Trundle wrote:
> 
> On 20/08/2007, at 9:20 AM, Rick Welykochy wrote:
> 
>> Of course this problem should go away if XHTML became the lingua franca,
>> since XML should not be accepted by an application unless it is well formed.
>> That's the XML policy.
>>
>> Unfortunately web browsers do not enforce this policy and do accept
>> crap XHTML.
> 
> Now wouldn't *that* be an interesting dilemma?
> 
> If browser developers enforced the policy, then people would be quickly 
> annoyed by poor coding, but who would they blame? And would they just 
> switch to another browser that worked, instead?

I would hope that if XHTML were enforced by the browser, then a
website would never see the light of day until it passed testing.
Surely the web developer would be forced to fix broken XHTML
code *before* it sees the light of the Internet.
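The XML policy mentioned above is easy to demonstrate: a strict XML parser must reject a document that is not well formed, where a tag-soup HTML parser would quietly cope. A minimal sketch in Python using the standard library (the fragments below are made-up examples, not taken from any real site):

```python
import xml.etree.ElementTree as ET

# A well-formed XHTML fragment parses without complaint.
good = "<p>Hello, <em>world</em></p>"
ET.fromstring(good)

# The same fragment with the closing </em> dropped is not well
# formed, so a strict XML parser is required to reject it --
# exactly the enforcement web browsers decline to apply.
bad = "<p>Hello, <em>world</p>"
try:
    ET.fromstring(bad)
    print("accepted")
except ET.ParseError as err:
    print("rejected:", err)
```

Run against the broken fragment, the parser reports a mismatched tag instead of guessing at the author's intent, which is the behaviour the thread wishes browsers had.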

Of course, it is apparent to me that many sites are not even 
proofed properly and a lot of bugs get through, especially
in CGI applications.

It's not rocket surgery, but in the main it's not done properly.

There is one area in which computers are completely and utterly
unforgiving and the user does get annoyed: addressing errors and
other terminal exceptions that occur because of bad software coding,
e.g. the dreaded Blue Screen of Death on Windows, or the total
freeze on *nix (Mac, Linux, Unix) systems. Yes, the latter do
happen.

'Twould be nice if the same applied to crap-coded websites ;)

cheers
rickw





More information about the Link mailing list