[JDEV] Why XML?
Jon A. Cruz
joncruz at geocities.com
Wed Jan 13 11:26:24 CST 1999
Sedat Kapanoglu wrote:
> >It's probably important to measure things before guessing, otherwise you
> >end up doing the equivalent of optimizing the system idle loop.
> I meant if we can make things faster even a bit without giving out the
> flexibility away why won't we? :)
> As I know in ICQ v2 protocol an online alert (packet sent from server to
> client) is a UDP packet only 10 bytes long (with the advantage of using UINs
> of course). In Jabber I can hardly guess but probably would be much longer
> than that. Think you have a list of four thousand people in your roster then
> measure it :)
Well, the main point is to determine what needs to be done, and how to achieve
it. Then we can look at the details.
For example, how many people will actually have 4000 entries in their roster? And if
they do, they deserve what they get ;-)
Aside from the bytes being transmitted, you need to look at the entire
transaction in context.
Bottom line is: will a binary format make the actual use significantly faster?
That 'if' is what is important.
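To make the trade-off concrete, here is a rough sketch comparing a 10-byte binary alert against a hypothetical XML equivalent (the XML stanza shown is invented for illustration; it is not an actual Jabber packet):

```java
// Hypothetical size comparison: a fixed 10-byte binary online alert
// versus an invented XML form carrying the same information.
public class SizeCompare {
    public static void main(String[] args) throws Exception {
        byte[] binary = new byte[10]; // e.g. packet type + UIN + status
        String xml = "<presence from='12345678' type='available'/>";
        int xmlLen = xml.getBytes("UTF-8").length;
        System.out.println("binary: " + binary.length
                + " bytes, xml: " + xmlLen + " bytes");
        // The raw per-packet difference is real, but it has to be weighed
        // against how often such packets are sent and what the rest of the
        // transaction (connection setup, lookups, etc.) costs.
    }
}
```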
> >Yes. No. Yes, the parsing might be a little slower, but no, that will not
> >to make coding harder. You just grab a standard parsing library and hook it
> >Look into the SAX interface for Java for an example.
> What if I'm coding in Delphi? :) (I'm currently coding in Delphi in fact
> ehhee). And are the standard libs suitable for network-kind of parsing?
I'm not sure about Delphi, but... MS has XML support via MSIE 5.0, and
professors have been assigning XML parsers as class exercises. They're not that big
or too hard (also, one of the Java implementations is a 26k class file).
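To show how little parsing code you actually write, here is a minimal sketch using the SAX interface that ships with the JDK: you register callbacks and the parser hands you element names as they arrive, so no hand-rolled tokenizer is needed.

```java
// Minimal SAX sketch: the standard JDK parser does the tokenizing;
// application code only sees startElement callbacks.
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxDemo {
    // Returns the element names encountered, in document order.
    public static List<String> elementNames(String xml) throws Exception {
        final List<String> names = new ArrayList<>();
        DefaultHandler handler = new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName,
                                     Attributes attrs) {
                names.add(qName);
            }
        };
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")), handler);
        return names;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(
            elementNames("<message to='user'><body>hello</body></message>"));
    }
}
```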
> >Binary??? Ouch. Directly use structs?? Ouch.
> That worked for ICQ, why won't it work for us? :)
> >Which platform, compiler, packing options??
> Packing options what an IP header has :)
> >Intel byte-order or Motorolla?
> Say Intel. What about ntohs, htons ? :)
Oops. That's Motorola. You guessed wrong. Once again things are complicated.
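The byte-order trap is easy to demonstrate: the same four bytes decode to different integers depending on which order you assume. Network byte order (what ntohs/htons convert to and from in C) is big-endian, i.e. the "Motorola" order.

```java
// The same wire bytes read as different integers under different
// byte orders; network byte order is big-endian.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteOrderDemo {
    public static int decode(byte[] wire, ByteOrder order) {
        return ByteBuffer.wrap(wire).order(order).getInt();
    }

    public static void main(String[] args) {
        byte[] wire = {0x00, 0x00, 0x00, 0x01};
        System.out.println("big-endian:    "
                + decode(wire, ByteOrder.BIG_ENDIAN));
        System.out.println("little-endian: "
                + decode(wire, ByteOrder.LITTLE_ENDIAN));
    }
}
```

A struct-overlay reader on an Intel box that skips the ntohs/htons step silently gets the little-endian interpretation.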
> >8,16, 32 or 64 bit alignment???
> Why align?
Well, if you're directly using structs, your compiler will be doing alignment.
Ask some of the programmers who've transitioned from Win31 to Win32. Also, lots
of people got bitten when int stopped being 16 bits. Etc., etc.
> >What if you change your C/C++ code? Break the protocol???
> No. Allow the protocol itself to tolerate the changes. That's not that much
Then you have to do a translation anyway from the wire format to your internal
format, so you've just lost some of the anticipated gain from going binary.
> >( ever seen the .TGA file format? )
> Nope but now I wonder heheh
One of the problems was that in part of the header they had a byte, a short, and
three bytes. That worked fine for 8-bit programs, but as soon as people compiled
for other platforms (e.g. Windows), struct packing inserted an empty padding
byte, and everyone who had been just reading and writing the struct directly got
burned.
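The safe alternative to overlaying a struct is reading the header field by field, which makes the wire layout explicit and sidesteps compiler padding entirely. A sketch of that approach (the field names are invented for illustration, not the actual .TGA layout):

```java
// Reading a packed header (a byte, a short, three bytes) field by field
// instead of overlaying a struct: no compiler padding can sneak in.
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class PackedHeader {
    // Returns the five header fields in wire order. DataInputStream reads
    // the short as big-endian; a real reader must match the format's order.
    public static int[] read(byte[] wire) throws IOException {
        DataInputStream in =
            new DataInputStream(new ByteArrayInputStream(wire));
        int flag = in.readUnsignedByte();      // 1 byte
        int offset = in.readUnsignedShort();   // 2 bytes, no padding before
        int a = in.readUnsignedByte();         // 3 trailing bytes
        int b = in.readUnsignedByte();
        int c = in.readUnsignedByte();
        return new int[]{flag, offset, a, b, c};
    }

    public static void main(String[] args) throws IOException {
        int[] fields = read(new byte[]{1, 0, 2, 3, 4, 5});
        for (int f : fields) System.out.println(f);
    }
}
```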
> >Yes, but XML makes the functional aspect of the protocol so much stronger.
> >Noticing a missing '>' is far easier than noticing a missing bit. Besides,
> >gives you a wealth of tools for viewing the data exchanged, debugging, etc.
> I have to admit but if a protocol which makes sense going to be used, then
> why a markup language selected for that? A straight text based protocol
> (such as POP3, SMTP) could be used. (Then a new code parsing code would be
> needed you're right about that one I'm afraid ehhehe)
And I haven't even gotten into Unicode and internationalization.
In general, I was just trying to point out that there's a lot more to doing a
binary protocol than some people think. Also, the W3C has pretty much said that
all new Internet protocols should be XML-based.