
Any way of compressing XML when downloading from server to client?

Hi all,
I have a major problem, as I deal with very large + long tables.
The resulting XML can be as big as 4MB. So, very long to download.
I wonder if anything exists giving the chance to compress/decompress XML prior to transmission / prior to usage.

Any hint would be highly appreciated.

Ciao,

Fab
Fab
Friday, March 9, 2007
You never want to bring that much data back to the client; it should first be filtered on the server side / in the database. If you do bring that much data to the client, the PC's CPU will be pegged at 100%.
Kurt
Wednesday, March 28, 2007
Use virtual mode and retrieve smaller blocks of data from the server as the grid needs it while scrolling.
Randall Severy
Thursday, March 29, 2007
Hi Randall,

since I use the XML data representation model, does virtual mode affect in any way the need to transmit the FULL XML file from the server to the client?
From the tests I've done, it looks like it doesn't: it actually makes no difference whether I set virtual mode ON or OFF in terms of transmission times...

Kurt: if that's all the data I need to send to the client (coming out of very complex DB queries), what can I do?

Thanks for your help
Fab/Diabolik
Wednesday, April 18, 2007
I don't have much experience setting up mod_gzip with Apache, and I have no experience with your execution environment, but I do know that most (if not all) browsers accept gzipped data and will decompress it on the fly. I don't see why dynamic output can't be compressed before delivery.
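The content-negotiation idea described above can be sketched as follows. Python is used purely for illustration (the thread's stack is LAMP, where mod_gzip or PHP would do this); the function name `maybe_gzip` is hypothetical:

```python
import gzip

def maybe_gzip(body: bytes, accept_encoding: str):
    """Compress the response body only if the client advertises gzip support.

    Sketch only: real servers (mod_gzip, mod_deflate) do this transparently.
    `accept_encoding` is the raw Accept-Encoding request header value.
    """
    if "gzip" in accept_encoding.lower():
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    # Client didn't ask for gzip: send the body unchanged, no extra header
    return body, {}
```

Tag-heavy XML is highly repetitive, so the compressed body is typically a small fraction of the original.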

What percentage of your XML is data, whitespace, and markup?
J.D. Pace
Wednesday, April 18, 2007
Hi Fab/Diabolik,

Randall Severy might have been referring to /javascript.forum.13524.7/when-will-ajax-paging-be.html
So, when the user scrolls the grid, the rows that need to be displayed are fetched from the server using an AJAX call and then displayed.
The user doesn't need to download the entire 4 MB.
Some changes will be required in your server-side code to respond to the AJAX call with JSON data (or with XML data - change Alex's code given in the URL to accept XML).
If the amount of data is expected to increase in future, it might be a good idea to implement this instead of compression.
Otherwise, there might come a point when, despite compression, the data size is simply too large.
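The paging approach above can be sketched as a small server-side handler. This is an illustrative Python sketch (the real server would be PHP, and `rows` would come from the DB query itself via LIMIT/OFFSET rather than being sliced in memory):

```python
import json

def rows_page(rows, offset, limit):
    """Return one page of rows as JSON for an AJAX paging call.

    Hypothetical handler: the grid asks for (offset, limit) as the user
    scrolls, so only the visible slice crosses the wire, never the full 4 MB.
    """
    page = rows[offset:offset + limit]
    return json.dumps({"total": len(rows), "offset": offset, "rows": page})
```

The `total` field lets the client size its scrollbar without downloading everything.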

If you don't want to implement the above and just want compression, mod_gzip on Apache might be the best option, as suggested by J.D. Pace.
Plus, as a bonus, it will compress all pages it serves (it detects whether the browser supports compression), not just the XML data :)
If you would like to see how much it will compress, just gzip your data and check the size.
It will compress by approximately the same amount and you should be able to decide if it is worth the effort.

If you are using PHP, there's another alternative to mod_gzip.
Check out http://au.php.net/ob_gzhandler
If you are using a language other than PHP, it probably has an equivalent mechanism.

Ankur
Ankur Motreja
Wednesday, April 18, 2007
Hi guys,

I really have to thank you all for your support, highly appreciated, really.
I have now an important development thread to investigate, and will come back to you with the results.

Grazie!!

Fab
Fab/Diabolik
Thursday, April 19, 2007
Sorry, I forgot a few answers:

1 - Amount of XML-specific markup vs. real payload: ENORMOUS. More than 80% can be tags...
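That 80% figure is easy to verify. A minimal sketch, assuming row data shaped like a typical grid (the sample XML below is hypothetical, not Fab's actual data):

```python
import re

# Hypothetical tag-heavy XML mimicking a grid with short cell values
xml = "<rows>" + "".join(
    f"<row><id>{i}</id><name>item</name><qty>3</qty></row>" for i in range(1000)
) + "</rows>"

payload = re.sub(r"<[^>]+>", "", xml)          # strip all tags, keep text only
markup_fraction = 1 - len(payload) / len(xml)
print(f"markup overhead: {markup_fraction:.0%}")
```

When short values sit inside long tag names, markup easily dominates the byte count, which is exactly why such XML both compresses well and benefits from paging.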

2 - My environment is a classic LAMP stack

Thanks again

Fab
Fab/Diabolik
Thursday, April 19, 2007



This topic is archived.
