You make some good points. And yes the memory footprint would increase because you are no longer sharing libraries. But again, how big would a statically-compiled Ubuntu be? How much more RAM would a base boot require? How much slower would it be?
-------------------------------------
Oooh Shiny: PopularPages
Unumquodque potest reparantur. Patientia sit virtus.
Well, "how static"?
If you statically compile every GTK+ app against GTK+, you'll probably end up with absurd memory requirements. Do you really want to do that? If so ... WHY? When a library is multiple megabytes, that's a lot to compile statically into a single app.
It's worth noting that Arch/Fedora doesn't statically compile everything. Windows apps dynamically link to the Win32 API (largely implemented as shared libraries); otherwise every app might pull in hundreds of megabytes of libraries.
Arch bundles the -dev files into the package, which are mostly the header files for the library (required for compiling against it). That is not the same thing as statically compiling everything. I mean, Arch is regularly respected for its speed!
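The distinction is easy to check directly: headers are only consumed at compile time, while `ldd` shows which shared libraries a binary actually loads at runtime, i.e. the code that static linking would duplicate into every program. A sketch, assuming a typical glibc-based system with `/bin/ls` present:

```shell
# Show the shared libraries /bin/ls depends on at runtime.
ldd /bin/ls

# Total the on-disk size of those libraries: roughly the code a
# statically linked build would have to (partially) duplicate in
# every program that uses them.
ldd /bin/ls | awk '/=> \// {print $3}' | xargs du -chL | tail -n 1
```

With dynamic linking that code is loaded once and shared between all running processes; with static linking each program carries (and maps) its own copy.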
Last edited by phrostbyte; January 25th, 2010 at 01:44 AM.
Proud GNU/Linux zealot and lover of penguins
"Value your freedom or you will lose it, teaches history." --Richard Stallman
True. The header files aren't huge. Yes, GTK+ has a large footprint. So there seems to be a tradeoff between package size, number of packages, and installed footprint.
Look at DSL, Tiny Core, or SliTaz for how small a desktop Linux can be.