Andrew Pollack's Blog

Technology, Family, Entertainment, Politics, and Random Noise

A legacy of bad standards

By Andrew Pollack on 08/28/2003 at 03:33 PM EDT

Here's a long rant. I KNOW I'm going to get flamed for some of this -- particularly the anti-Java stuff. Oh well. I've been told I'm an idiot before, and I didn't melt then. I guess I'll live through it again.

In the weeks leading up to the opening of this blog, I've worked to come up to speed on some of the more 'web oriented' protocols and 'languages'. I use quotes there because many of these are barely well-defined enough to be called languages. Below are my impressions of some of these key new technologies and why I think we're all doomed if we can't do better going forward. Overall, the first standards were exactly what they were supposed to be: simple methods for the grass-roots user community to share development and information. Gopher, HTML, CGI, MP3 -- all of these work the way they were meant to work if you stay within the limits for which they were intended. Then along comes the dot-com hyper-evolution with its influx of more cash than was healthy for the 'ecosystem' of grass-roots development. I'll talk more about that in another rant, but for now let's just look at the result.

Suddenly we had everyone in agreement that standards would be the fuel for massive new integration and development, but big money declaring that standards were only good if you could also 'protect your own investment' and 'synergistically leverage the standard to maximize your market share'.

Translation: If you're a little company with a neat idea, standards let you reach the mass market; but if you're a success, the first thing you have to do is find a way to be proprietary or everyone else will eat lunch at your table. Into that mix, toss publishers who make more money where there are competing standards (it was often joked that the best way to get press for your standard was to use another name and publish a competing standard) and platform religious wars pitting the 'everything must be free' zealots against the 'capitalism is good' zealots. What we're left with is a mass of barely workable, poorly defined, self-important, over-engineered technology definitions that are so badly supported as to be nearly useless for anything significant. Of course, there are some gems in that mix too.

The Web Browser as a Standard

The promise of the browser was the 'universal interface'. No more would software have to be distributed to desktops! No more would millions of man-hours have to be spent programming the dialog boxes and panel screens for applications. Any user could sit at any desk and interface with anything! It could almost be like one of those movies, where the archetypal 'hacker' sits down to a DOS prompt and in 15 minutes has accessed every computer on the planet and gotten or changed all the information related to everything anywhere. Wow. Cool stuff. I'd settle for being able to access Oracle without a 45-minute install process -- or DB2 without 3 days figuring out which CD or download to install in the first place.

The reality is that the browser is great for things like a blog, advertising, or support (remember getting drivers before the mid-90's?) -- but for an actual application, web browsers stink. They're a lousy interface. You can't develop really compelling applications with them. As a result, the things we call compelling applications have drastically changed. The technology has driven the business processes -- never a good thing.

Of the new standards, or so-called standards, I'll touch on just a few that I've had to work with lately....

XML - 6 points out of 10. Cool, but not revolutionary by any stretch. On the plus side, it handles nested data better than most previous methods and it can handle different character sets and strongly typed data. On the minus side, it's bulky and obnoxious to work with, creates huge files that must be read sequentially, and is grossly over-engineered. Virtually every real-world XML use I've seen ends up using less than 1% of the capabilities "designed in" -- and it's always the same 1%. Of course, you still need this magic "DTD" -- a definition, specific to your application, of what may appear in the XML -- though of course, there are some published DTDs out there.
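To make that "same 1%" concrete, here's a minimal sketch (Python, with a made-up invoice document) of the subset most real-world XML use actually touches: elements, attributes, nesting, and text.

```python
import xml.etree.ElementTree as ET

# A made-up invoice document -- nested data that would be awkward to
# represent as flat delimited records.
doc = """
<invoice id="1001">
  <customer>Acme Corp</customer>
  <lines>
    <line sku="A-1" qty="2">Widget</line>
    <line sku="B-7" qty="1">Sprocket</line>
  </lines>
</invoice>
"""

root = ET.fromstring(doc)
invoice_id = root.get("id")                       # attribute access
customer = root.findtext("customer")              # child element text
rows = [(ln.get("sku"), ln.get("qty"), ln.text)   # walk the nesting
        for ln in root.iter("line")]
```

Namespaces, schemas, entities, processing instructions, and the rest of the spec go unused here -- which is the point.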

RSS:XML - 9 points out of 10. A solid, simple (as simple as anything in XML can be) standard that does exactly what it set out to do. Bravo.
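For what it's worth, the reason RSS scores well is visible in how little of it there is. A hypothetical two-element feed (names and URLs invented for illustration), pulled apart with the same standard-library tools:

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal RSS 2.0 feed: channel metadata plus one item.
feed = """
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://example.com/</link>
    <item>
      <title>A legacy of bad standards</title>
      <link>http://example.com/posts/1</link>
    </item>
  </channel>
</rss>
"""

channel = ET.fromstring(feed).find("channel")
feed_title = channel.findtext("title")
item_titles = [item.findtext("title") for item in channel.iter("item")]
```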

Soap:XML - 7 points out of 10. Interesting, great idea. This is what XML was supposed to do. By adding a WSDL file, Soap:XML becomes the self-identifying communications tool we were promised. In practice, it is both grossly over-engineered -- a result of hubris, in my opinion -- and under-defined. It's great for passing a quick function call to get the city and state that match a zip code, to find out how many Pepsis are left in the machine, or to check the weather; but just try to use it to pass a complex object with multiple arrays and get back a useable result with multiple responses consisting of a complex type. Oh, it's capable. It can be done. In fact, reading the spec, I could see at least a dozen ways it could be done. Which is right? Well, that depends entirely on an agreement between the two sides of the transaction on what WSDL file definition will actually match. Ick. In practice, you create the object in your language first, then let that language (admit it, Visual Studio.NET) create the WSDL and sample XML for you, then you work backwards.
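For the zip-code case above, the whole exchange reduces to one small envelope. This sketch invents the operation name, namespace, and field (a real service's WSDL would dictate all three); only the SOAP 1.1 envelope namespace is standard:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # standard SOAP 1.1 namespace

# Hypothetical request: GetCityState and urn:example:zipcodes are invented
# for illustration -- the real names come from whatever WSDL both sides agree on.
envelope = f"""
<soap:Envelope xmlns:soap="{SOAP_NS}">
  <soap:Body>
    <GetCityState xmlns="urn:example:zipcodes">
      <zip>04101</zip>
    </GetCityState>
  </soap:Body>
</soap:Envelope>
"""

body = ET.fromstring(envelope).find(f"{{{SOAP_NS}}}Body")
call = body.find("{urn:example:zipcodes}GetCityState")
zip_code = call.findtext("{urn:example:zipcodes}zip")
```

Simple calls like this are painless; the trouble starts when `zip` becomes a nested complex type and every one of the dozen encodings the spec allows looks equally plausible.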

JavaScript - 8 points out of 10. Why? It does just what it was supposed to do. It provides a lightweight, fully flexible scripting language to manipulate objects on the fly. It would get more points if the implementers could agree on what objects it should have, and how those objects should react to it, but then that wouldn't be "protecting the investment."

CSS - 4 points out of 10. Only 4? Yeah. It was supposed to separate the data from the layout. It doesn't -- at least not much. It does have value, as it fills a great many gaps in HTML itself, but that's really it. Its genuinely powerful features are so badly supported across browser versions that they're unusable. Oh, and it is a moronic failure of forethought to come up with a language that won't allow the definition of constants or macros at the top of the code page. Hell, even XML can do that -- though it's remarkably difficult and nobody does (it's in the 99% that nobody uses).
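The "even XML can do that" aside refers to entities declared in an internal DTD subset -- the closest XML gets to a constant defined at the top of the file. A sketch (entity name and values invented; Python's expat-backed parser expands internal entities automatically):

```python
import xml.etree.ElementTree as ET

# One entity declared up top, referenced twice below -- change the
# declaration and every reference follows. CSS offered nothing comparable.
doc = """<?xml version="1.0"?>
<!DOCTYPE page [ <!ENTITY company "Example Widgets Inc."> ]>
<page>
  <header>&company;</header>
  <footer>Copyright 2003 &company;</footer>
</page>
"""

page = ET.fromstring(doc)
header = page.findtext("header")
footer = page.findtext("footer")
```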

That brings me to Java. The rant there is long enough, that I've moved it to a new topic.



re: A legacy of bad standards
By Ben Langhinrichs on 06/23/2004 at 11:21 AM EDT
Wow. I agree with almost everything you wrote, except that I think CSS is
higher up on the usability scale than you do, and SOAP is lower. Heck, even
XML is probably lower, but I'll not quibble (for once). Nice rant!

------------------------------------------------------------------------------
in response to
------------------------------------------------------------------------------
[...] That brings me to Java. This is a favorite topic of mine,
and will be the subject of a whole different blog at some point. To quote Dennis Miller, "I don't want to get off on a rant here, but......" Suffice it to say that this is proof that the idea of a universal byte code which requires the user to maintain on his local machine the specific and "correct" version of the virtual run time environment simply does not work.

The promise of cross-platform functionality was never truly realized. Well-meaning engineers spent countless hours in the vain attempt to use Java to turn the application-hostile web environment into a functional entity and ended up with slow, unreliable, barely functional workstation applications that required more memory and processor than anything before them. They succeeded in beating Microsoft -- but only in the race to see who could demand more memory and processor time for the smallest of applications. MS Word was the undefeated champion for a long time, but your basic "Hello World" in Java comes darn close, and anything with a real U.I. becomes nearly unworkable on the average desktop.

To introduce "state" and "continuity" they came up with CORBA. Wow. Ever try to build a CORBA-based application that would be used by a set of users over whom you do not have total environmental control? No? Don't. It won't work. The last time I tried (using prepackaged CORBA code that had been fully QA'ed), 75% of developers' workstations -- running what was at the time state-of-the-art hardware and all the latest software -- failed. Just imagine your average cable modem subscriber. BTW, when it did work, we finally got almost as much functionality out of the web browser as we had in 1983 with the IBM 3270 color terminals.

So, if it's so bad, why is it so popular? Simple. It isn't Microsoft. It is true that if you build the back end of your application with Java, and run it with an operating system on top of an operating system (a J2EE server), you've escaped Microsoft. Laudable as that may be, it isn't a business model. On the other hand, the countless hours and much-counted dollars spent to develop the J2EE servers (the last refuge of Java) have in fact produced something interesting: a caching server that doesn't just cache what it wants, but instead lets you write code that stays resident, so you can cache your own data and share it directly between many connections without first writing it to disk somewhere. Yes, that's all a servlet is. Does a Domino agent, a Perl script, or a CGI executable handle many users? Sure it does. Can you pass data from one user to another? Sure you can, but you've got to write it somewhere first. Servlets don't. Big deal. Write it somewhere fast (like a RAM disk) and you've done the same thing without buying into a whole operating system (a J2EE server) to sit on your operating system (Linux, BSD, AIX, Solaris, or yes, Windows).

"But Andrew, Java is cross platform", says the dweeb reading this as he gets increasingly disturbed. No, it's not. Don't be silly. The byte code is in fact cross platform, but the servers and the development tools are not. They have to be developed for each platform. The byte code needs someplace to run -- a JVM or a J2EE server. Those aren't cross platform. They have to be built and installed for each machine or operating system on which they'll run. Taken by that standard, a Notes application is way more cross platform. I know that if I build a Notes application, I can deploy it to any Domino server and run it on any Notes workstation. It must be cross platform! Of course it is, just as much as Java.

Now, whole sectors of the industry are out there generating their own momentum for this "language" while camps of anti-Microsoft zealots valiantly struggle with poorly implemented, overweight constructs and bizarre rules and symbols just to keep the last hope of Java (again, the J2EE server) competitive. In the long run, it won't be. Some future thing based on it may very well be. Some cool rapid application development platform and a server on which it can run may well grow from all the vast brainpower and money being tossed at Java right now. IBM alone is spending more than the collective healthcare budget of the entire third world to make that happen. They're spending so much money on it that I think the Pentagon is going to get jealous and buy a few more planes, just to remind us all where the REAL money goes. They (the J2EE crowd, not the Pentagon) may one day be successful at creating this Holy Grail of development using entirely non-Microsoft stuff. If they are, though, it will look a lot different than it does today. Oh well -- let the flames begin. I feel better just getting that off my chest.
re: re: A legacy of bad standards
By Andrew Pollack on 06/23/2004 at 11:21 AM EDT
I've certainly run into companies recently that are using raw XML where SOAP
would be great -- simply because SOAP is so hard to use.

re: A legacy of bad standards
By Carl on 06/23/2004 at 11:21 AM EDT
I totally agree on the Java client stuff; it is too unreliable. It's one of the
reasons I think Lotus Workplace will follow eSuite Workplace -- another
write-once-run-anywhere product that would only run on a specific version of IE
and a single IBM NC.

By the same standards, with the .NET CLI it will run anywhere :-)

------------------------------------------------------------------------------
in response to
------------------------------------------------------------------------------
The Web Browser as a Standard

The promise of the browser was the 'universal interface'.
No more would software have to be distributed to desktops! No more would
millions of man-hours have to be spent programming the dialog boxes and panel
screens for applications. Any user could sit at any desk and interface with
anything! It could almost be like one of those movies, where the archetypal
'hacker' sits down at a DOS prompt and in 15 minutes has accessed every
computer on the planet and gotten or changed all the information related to
everything anywhere. Wow. Cool stuff. I'd settle for being able to access
Oracle without a 45-minute install process -- or DB2 without three days spent
figuring out which CD or download to install in the first place.

The reality is that the browser is great for things like a blog, advertising,
or support (remember getting drivers before the mid-90s?) -- but for an actual
application, web browsers stink. They're a lousy interface. You can't develop
really compelling applications with them. As a result, the things we call
compelling applications have drastically changed. The technology has driven
the business processes -- never a good thing.

Of the new standards, or so-called standards, I'll touch on just a few that
I've had to work with lately....

XML - 6 points out of 10.
Cool, but not revolutionary by any stretch. On the plus side, it handles
nested data better than most previous methods and it can handle different
character sets and strongly typed data. On the minus side, it's bulky and
obnoxious to work with, creates huge files that must be read sequentially, and
is grossly over-engineered. Virtually every real-world XML use I've seen ends
up using less than 1% of the capabilities "designed in" -- and it's always the
same 1%. Of course, you still need this magic "DTD" defining the data to be
stored in the XML -- effectively a proprietary format, though of course there
are some published DTDs out there.

RSS:XML - 9 points out of 10.

A solid, simple (as simple as anything in XML can be) standard that does
exactly what it set out to do. Bravo.

Soap:XML - 7 points out of 10.

Interesting, great idea. This is what XML was supposed to do. By adding a WSDL
file, Soap:XML becomes the self-identifying communications tool we were
promised. In practice, it is both grossly over-engineered -- a result of
hubris, in my opinion -- and under-defined. It's great for passing a quick
function call to get the city and state that match a zip code, to find out how
many Pepsis are left in the machine, or to check the weather; but just try to
use it to pass a complex object with multiple arrays and get back a usable
result with multiple responses consisting of a complex type. Oh, it's capable.
It can be done. In fact, reading the spec, I could see at least a dozen ways
it could be done. Which is right? Well, that depends entirely on an agreement
between the two sides of the transaction on what WSDL file definition will
actually match. Ick. In practice, you create the object in your language
first, then let that language (admit it, Visual Studio.NET) create the WSDL
and sample XML for you, then you work backwards.
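The zip-code lookup really is about as easy as SOAP gets. Here's a minimal sketch in Python of the kind of envelope involved -- the service namespace and the `GetCityState` element names are hypothetical, invented for illustration; in real use they'd come from whatever WSDL the two sides agreed on:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace -- in practice this comes from the agreed WSDL.
SVC_NS = "urn:example:zipservice"

def build_zip_request(zip_code):
    """Build a minimal SOAP 1.1 envelope asking for the city/state of a zip."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{SVC_NS}}}GetCityState")
    ET.SubElement(call, f"{{{SVC_NS}}}zip").text = zip_code
    return ET.tostring(env, encoding="unicode")

def parse_zip_response(xml_text):
    """Pull the city and state strings back out of a response envelope."""
    root = ET.fromstring(xml_text)
    city = root.find(f".//{{{SVC_NS}}}city").text
    state = root.find(f".//{{{SVC_NS}}}state").text
    return city, state

# A canned response of the shape such a service might return (no network here).
response = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}" xmlns:z="{SVC_NS}">'
    "<soap:Body><z:GetCityStateResponse>"
    "<z:city>Portland</z:city><z:state>ME</z:state>"
    "</z:GetCityStateResponse></soap:Body></soap:Envelope>"
)
print(parse_zip_response(response))  # ('Portland', 'ME')
```

Flat calls like this are trivial. The trouble starts when the payload is an array of complex types containing other complex types: SOAP's spec permits several legal ways to encode that, and nothing but out-of-band agreement on the WSDL tells you which one the other side expects.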
JavaScript - 8 points out of 10.

Why? It does just what it was supposed to do. It provides a lightweight, fully
flexible scripting language to manipulate objects on the fly. It would get
more points if the implementers could agree on what objects it should have,
and how those objects should react to it, but then that wouldn't be
"protecting the investment."

CSS - 4 points out of 10.

Only 4? Yeah. It was supposed to separate the data from the layout. It
doesn't -- at least not much. It does have value, as it fills a great many
gaps in the HTML itself, but that's really it. Its really powerful features
are so badly supported across browser versions that they're unusable. Oh, and
it is a moronic failure of forethought to come up with a language that won't
allow the definition of constants or macros at the top of the code page. Hell,
even XML can do that -- though it's remarkably difficult and nobody does (it's
in the 99% that nobody uses).

That brings me to Java. This is a favorite topic of mine,
and will be the subject of a whole different blog at some point. To quote
Dennis Miller, "I don't want to get off on a rant here, but......" Suffice it
to say that this is proof that the idea of a universal byte code which
requires the user to maintain on his local machine the specific and "correct"
version of the virtual runtime environment simply does not work. The promise
of cross-platform functionality was never truly realized.

Well-meaning engineers spent countless hours in the vain attempt to use Java
to turn the application-hostile web environment into a functional entity, and
ended up with slow, unreliable, barely functional workstation applications
that required more memory and processor than any ever had. They succeeded in
beating Microsoft -- but only in the race to see who could demand more memory
and processor time for the smallest of applications. MS Word was the
undefeated champion for a long time, but your basic "Hello World" in Java
comes darn close, and anything with a real UI becomes nearly unworkable on the
average desktop.

To introduce "state" and "continuity" they came up with CORBA. Wow. Ever try
to build a CORBA-based application that would be used by a set of users over
whom you do not have total environmental control? No? Don't. It won't work.
The last time I tried (using prepackaged CORBA code that had been fully
QA'ed), it failed on 75% of developers' workstations running what was at the
time state-of-the-art hardware and all the latest software. Just imagine your
average cable modem subscriber. By the way, when it did work, we finally got
almost as much functionality out of the web browser as we had in 1983 with the
IBM 3270 color terminals.

So, if it's so
bad, why is it so popular? Simple. It isn't Microsoft. It is true that if you
build the back end of your application with Java, and run it with an operating
system on top of an operating system (a J2EE server), you've escaped
Microsoft. Laudable as that may be, it isn't a business model.

On the other hand, the countless hours and much-counted dollars spent to
develop the J2EE servers (the last refuge of Java) have in fact produced
something interesting: a caching server that doesn't just cache what it wants,
but instead lets you write code that stays resident, so you can cache your own
data and share it directly between many connections without first writing it
to disk somewhere. Yes, that's all a servlet is. Does a Domino agent, a Perl
script, or a CGI executable handle many users? Sure it does. Can you pass data
from one user to another? Sure you can, but you've got to write it somewhere
first. Servlets don't. Big deal. Write it somewhere fast (like a RAM disk) and
you've done the same thing without buying into a whole operating system (a
J2EE server) to sit on your operating system (Linux, BSD, AIX, Solaris, or
yes, Windows).

"But Andrew, Java is cross platform," says the dweeb reading this as he gets
increasingly disturbed. No, it's not. Don't be silly. The byte code is in fact
cross platform, but the servers and the development tools are not. They have
to be developed for each platform. The byte code needs someplace to run -- a
JVM or a J2EE server -- and those aren't cross platform. They have to be built
and installed for each machine or operating system on which they'll run. Taken
by that standard, a Notes application is way more cross platform. I know that
if I build a Notes application, I can deploy it to any Domino server and run
it on any Notes workstation. It must be cross platform! Of course it is, just
as much as Java.
Now, whole sectors of the industry are out there generating their own momentum
for this "language" while camps of anti-Microsoft zealots valiantly struggle
with poorly implemented, overweight constructs and bizarre rules and symbols
just to keep the last hope of Java (again, the J2EE server) competitive. In
the long run, it won't be. Some future thing based on it may very well be.
Some cool rapid application development platform, and a server on which it can
run, may well grow from all the vast brainpower and money being tossed at Java
right now.

IBM alone is spending more than the collective healthcare budget of the entire
third world to make that happen. They're spending so much money on it that I
think the Pentagon is going to get jealous and buy a few more planes, just to
remind us all where the REAL money goes. They (the J2EE crowd, not the
Pentagon) may one day be successful at creating this Holy Grail of development
using entirely non-Microsoft stuff. If they are, though, it will look a lot
different than it does today.

Oh well -- let the flames begin. I feel better just getting that off my chest.



