The Space It Takes - XML, Binary and Raw Mesh Formats



March 20th, 2021 by Diana Coman

While Eulora's communication protocol can easily handle the transfer of any data files that the client might need, the size of such files still matters, mainly from a user perspective [1] - large files take time to transfer and nobody wants to wait more than absolutely necessary. Which is all nice and well, except totally at odds with what the rasterised approach to computer graphics requires, namely huge listings of everything [2]. Despite this obvious problem with size for graphics files, the legacy formats of Crystal Space (CS) and Cal3D bring with them the very dubious design decision of using XML everywhere - basically adding even more to the size of the already large files (since every value needs an XML tag and then every tag needs a closing tag) for the theoretical gain of making huge listings of numbers (floating point for the most part, too) human-readable, as if that somehow makes or made sense for someone, somewhere, sometime. Don't you want to check by hand the positions, normals and triangles of something even modest, say a couple of hundred vertices in 3D space with their 3D normal vectors and about a thousand triangles perhaps? No? Why not, since now you totally can, hooray for XML!
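
To put a rough number on that tag overhead: in an XML listing every single value travels wrapped in markup, so a vertex plus a triangle entry of the general shape below (an illustration of the style only, not necessarily the exact CS syntax) spends several dozen octets on element and attribute names alone, on top of the digits themselves; multiplied by a million vertices, the markup by itself accounts for tens of megabytes.

    <v x="12.345678" y="0.456789" z="-7.891011" />
    <n x="0.000000" y="1.000000" z="0.000000" />
    <t v1="1024" v2="1025" v3="2049" />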

As my hopefuls quite efficiently pack a lot into relatively small surfaces, the above XML-inflation of files didn't hurt *too* much until now - although it was certainly showing early signs of trouble, since the full generation of a model took less time than the chunking or sending of the resulting files, to give just one example. Nevertheless, the place where the size really went through the roof and basically can't be ignored anymore is the terrain - first the heightmap itself turned out rather too big in the initial uncompressed .png format. Then, as this was addressed by switching to the .dds format (hence the heightmap went down to 1.1M while still full size, 1024x1024) and therefore adding that as well to my gfx pipeline, the mesh itself exploded in size, with the resulting CS mesh file (XML, of course) weighing in at 190M. At which point the XML became indeed so unbearable as to require a look around for *any* other more reasonable option, from existing binary formats to raw dumpings of the data, if need be.

Looking around at existing binary formats didn't turn out very inspiring - while there are tons of formats for graphics files (so that one can spend as much time as one has sifting through it all), a closer look is usually quite disappointing. My take on it so far is that the binary formats which might perhaps be of some use are in fact very close to raw dumpings, without any tailored packing that might bring some additional gains for the narrow case of mesh definition - this seems to me to be mainly because none of the formats can quite commit to such a narrow scope as literally focusing on just vertices, normals and triangles. Even the simplest (such as the PLY polygon file format, which does seem to keep at least as simple and straightforward as possible) still consider it mandatory to allow the definition of additional parts and therefore not to attempt any task-tailored packing. This being said, I had only a quick look at the field and, given the huge number of formats out there, I don't claim to have looked at them all by any means (and some, such as .obj, are proprietary, so their binary form is undocumented too), so if you know of some very efficient format of this sort, please let me know in the comments box below, thank you.
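
For a concrete idea of just how close such formats stay to a raw dump, here is roughly what a minimal binary PLY header could look like for the terrain mesh measured below - illustrative only, since PLY does not standardize the texture coordinate property names ("s"/"t" being merely one common convention):

    ply
    format binary_little_endian 1.0
    comment positions, normals and texture coordinates per vertex, then triangles
    element vertex 1050625
    property float x
    property float y
    property float z
    property float nx
    property float ny
    property float nz
    property float s
    property float t
    element face 2097152
    property list uchar int vertex_indices
    end_header

Note that even with every single face being a triangle, the "list" property still costs one extra octet per face just to store the list length - exactly the sort of enforced generality that rules out any tighter, task-tailored packing.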

Given the above sad state of affairs regarding other formats, I took a bit more time to gather at least some concrete measurements of existing options with what I already have, so that there is at least some basis for any decision moving forward. For this, I added to my gfx pipeline a few more writers, to have a way of looking for instance at exactly how much those xml tags add (hence writing a file simply stripped of the xml tags) and at what the most straightforward sort of "raw" writing would occupy anyway. On this last point, it's worth noting perhaps that one can of course calculate the lower and upper bounds of the file size, simply based on the map's size and the known representation of the data types involved. Considering the map of 1025x1025 vertices and the corresponding 2*1024x1024 triangles, the "raw" overall size still ends up with a lower bound of approximately 80M (using 4 octets for each vertex index used to define triangles and 8 octets for the floating point values used to represent the vertices' positions, normals and texture coordinates). Keep in mind though that this raw format is not compressed and a simple zip of it can reduce the size significantly.
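
As a quick back-of-the-envelope sketch of that arithmetic (my own illustration, not part of the gfx pipeline): with 8 octets per floating point value and 4 octets per vertex index, the per-list sizes come out as below; depending on exactly which per-vertex attributes one counts, the total lands roughly between 75M and 92M, hence the ~80M ballpark.

    // Back-of-the-envelope size of a raw dump of the 1025x1025 terrain mesh.
    // Assumptions (mine, matching the figures in the text): 8 octets per floating
    // point value, 4 octets per vertex index, 3 position + 3 normal + 2 texture
    // coordinate values per vertex, 3 indices per triangle.
    #include <cstdint>
    #include <cstdio>

    int main() {
        const uint64_t side      = 1025;                        // vertices per side of the grid
        const uint64_t vertices  = side * side;                 // 1`050`625
        const uint64_t triangles = 2 * (side - 1) * (side - 1); // 2`097`152, two per grid cell
        const uint64_t fp = 8, idx = 4;                         // octets per float / per index

        const uint64_t pos = vertices * 3 * fp;   // positions
        const uint64_t nrm = vertices * 3 * fp;   // normals
        const uint64_t tex = vertices * 2 * fp;   // texture coordinates
        const uint64_t tri = triangles * 3 * idx; // triangle indices

        printf("positions %.1fM, normals %.1fM, texcoords %.1fM, triangles %.1fM\n",
               pos / 1e6, nrm / 1e6, tex / 1e6, tri / 1e6);
        printf("total %.1fM (without texture coordinates: %.1fM)\n",
               (pos + nrm + tex + tri) / 1e6, (pos + nrm + tri) / 1e6);
        return 0;
    }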

The tables below show the concrete measurements of actual mesh files: first for the terrain meshes, which are the largest (1`050`625 vertices and 2`097`152 triangles) and use the CS format, and then for one of the larger meshes (654 vertices and 1300 triangles) used as a limb of a character. While the Cal3D meshes are relatively small, they carry some additional data required for animation and they use the Cal3D format, which is even more verbose in its plain-text XML version:

Format                                                     | Plain File | Deflation (as reported by zip) | Zipped File
CS mesh (XML)                                              | 190M       | 81%                            | 36M
CS mesh stripped of xml tags                               | 112M       | 73%                            | 31M
Raw with guaranteed 16-digit precision for floating point  | 113M       | 51%                            | 55M
Raw with Float (typically 8 octets)                        | 81M        | 62%                            | 31M
Cal3D XML mesh (xmf)                                       | 272M       | 85%                            | 41M
Cal3D Binary mesh (cmf)                                    | 69M        | 57%                            | 30M

As for one of the meshes that serve as limbs for my hopefuls, the size is anyway significantly smaller simply because the mesh has only 654 vertices and 1300 triangles in total:

Format                   | Plain File | Deflation (as reported by zip) | Zipped File
Cal3D XML mesh (xmf)     | 210K       | 83%                            | 36K
Cal3D Binary mesh (cmf)  | 54K        | 48%                            | 29K

Overall it seems to me that the rasterised approach is really the core and unavoidable trouble, leading to a de-facto limit on the total number of vertices and triangles if one wants to keep the size of the mesh file below any specific threshold - there isn't much way around it. As zip is quite effective at reducing the size of all the above formats, it's unclear to me if there's much to gain in designing a very packed, mesh-only format as such. Arguably, if the terrain is the only mesh with too many vertices and triangles, it might make more sense to simply send over just the heightmap plus the materials for the terrain and let the client polygonize and texturize it as it wants - all the rest adds no further information anyway, since it's all contained in the heightmap really. Alternatively, one can simply make a less detailed polygonization of the terrain - this is probably exactly the reason why the legacy version used such a relatively small cell size, namely to avoid having that many vertices and triangles. For all I know, it might even be the case that CS won't handle that many vertices for a single mesh all that well, after all.
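
To make the heightmap-only alternative concrete, here is a minimal sketch (my own, with names and layout chosen for illustration rather than taken from the actual client code) of how a client could polygonize a square heightmap by itself; normals and texture coordinates are left out since they too can be derived from the grid. The step parameter gives the "less detailed polygonization" option: on a 1025-side grid, a step of 4 cuts both the vertex count and the triangle count by a factor of about 16.

    // Client-side polygonization of a square heightmap: the vertex grid and the
    // triangle list are fully determined by the heights, so only the heightmap
    // itself needs to travel over the wire. A step > 1 yields a coarser mesh.
    #include <cstdint>
    #include <vector>

    struct Vertex   { double x, y, z; };
    struct Triangle { uint32_t a, b, c; };
    struct Mesh     { std::vector<Vertex> vertices; std::vector<Triangle> triangles; };

    // heights: row-major, side*side samples; cell: world size of one grid cell.
    Mesh polygonize(const std::vector<double>& heights, uint32_t side,
                    double cell, uint32_t step) {
        Mesh m;
        const uint32_t n = (side - 1) / step + 1;   // vertices per side after decimation
        m.vertices.reserve((size_t)n * n);
        for (uint32_t r = 0; r < n; ++r)
            for (uint32_t c = 0; c < n; ++c)
                m.vertices.push_back({ c * step * cell,                             // x
                                       heights[(size_t)r * step * side + c * step], // height
                                       r * step * cell });                          // z
        m.triangles.reserve((size_t)2 * (n - 1) * (n - 1));
        for (uint32_t r = 0; r + 1 < n; ++r)
            for (uint32_t c = 0; c + 1 < n; ++c) {
                const uint32_t i = r * n + c;       // top-left corner of this grid cell
                m.triangles.push_back({ i, i + n, i + 1 });         // two triangles per cell
                m.triangles.push_back({ i + 1, i + n, i + n + 1 });
            }
        return m;
    }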

Nevertheless, the above workarounds don't solve in any way the wider problem of rasterised files rapidly exploding in size as the number of vertices and polygons increases, so I guess the question remains whether it's worth struggling to find some clever, specifically tailored representation of all those lists of vertices and triangles - and how much could even be gained at a maximum anyway, especially beyond what the generic zip compression already achieves otherwise.


  1. Otherwise, from a purely technical perspective, the protocol allows the transfer of files of up to about 140GB and at least so far I never got anywhere near that with any of the files I needed. Then again, I'm sure that more XML use *could* get me there in no time. 

  2. Vertices and triangles but also normals at each vertex, texture coordinates at each vertex, bone "influences" and weights and so on and so forth. Basically there is no upper limit to the number of lists or to the length of such lists since "the more detailed the better" in rasterised terms. 
