The only really valid criticism I see here is the binary blob problem-- clearly this is a poor way to store user configuration data
I disagree. Enlightenment has been doing this from the start because it is faster to read and write than a specially formatted ASCII file and because the user is expected to do all configuration through the graphical interface.
antithetical to the kinds of philosophies that undergird Linux
This is somewhat true, but I believe editing text files by hand is exactly the sort of thing Ubuntu is trying to move away from.
ESR is an old hand who has been using Linux since way back in the day and is set in his ways. I am too, but he should know Ubuntu isn't for him. They are focused on ease of use for new users, not on accommodating all the idiosyncrasies of the old ones.
I've got plenty of text files I don't regularly edit by hand. However, when the time comes:
- I've got a multitude of tools to choose from to evaluate and modify the files (editors, search tools, scripted modification, etc.).
- I can see with my own eyes what the contents of the file are and, generally, what is wrong. GNOME went the wrong way on this a long time ago by allowing non-intelligible values within gconf, à la Microsoft's Registry.
- I can make atomic edits and deletes to/from the file (add/remove a line). I can carry those atomic edits around with me (configuration bag of tricks).
- I CAN COMMENT THE MOTHERLOVIN FILE TO EXPLAIN WHY THE FUCK I PUT (OR REMOVED) A SPECIFIC VALUE. Configuration files are NOT there for the convenience of the computer / interpreter / system they are managing, they are there for the administrators to understand what is configured and why.
Removing the ability to comment files, or to make atomic updates in a commented system (e.g: git) removes a core value of configuration management.
Yes, GNOME jumped the shark, but it happened years and years ago.
The issue I'm trying to understand is exactly how the fuck it is so fucking slow to parse 56 fucking kilobytes of text at session login time that it justifies encoding my fucking desktop settings in a fucking binary configuration file.
I thought maybe the default configuration had ten megabytes of shit in it or something, which could conceivably sort of explain it, although then that would be another fucking stupid decision that would require explanation. But 56 fucking kilobytes? Which gzip suggests would be 35 kilobytes with a human-oriented syntax?
The issue is not so much parsing as reading from a lot of different files.
As someone else put it in another thread, sequential reading is fast. Random reading from 50+ files, which involves seeking the disk, is not.
Of course the gconf configuration may have had other, smarter ways of doing things, and the gsettings authors are just optimizing where it is easy to do so.
And by the way, the gconf stuff is still necessary:
sudo apt-get remove gconf2
shows a whole lot of packages that still depend on it, so by the time the transition is complete there will be more than "56 fucking kilobytes of text".
Sticking all the stuff you need to read at login time into one file is at least plausibly reasonable (although! ldd /usr/bin/gnome-terminal | wc -l alone still gives me 52 libraries, so config files are not close to the only problem here!) but turning that one file into an opaque binary file is not.
> it is faster to read and write than a specially formatted ASCII file
How many milliseconds does it take to parse YAML, XML, or any other file format for which a parser written in C is widely available?
> the user is expected to do all configuration through the graphical interface.
Changing configurations using the GUI is exactly what Unity and GNOME 3 don't allow the user to do. Try changing anything other than the desktop background.
> How many milliseconds does it take to parse YAML, XML, or any other file format for which a parser written in C is widely available?
YAML and XML may be slower than this, but Python 2.6's built-in 'json' module averages less than 20 milliseconds to parse a simple 1000-element "object"/dict on my system.
This isn't even the highest-performance JSON parser available for Python, and parsing into less dynamic C or C++ data structures would probably be faster still.
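The claim above is easy to sanity-check. Here's a minimal sketch (Python 3 rather than the 2.6 of the comment; the 1000-key object is a made-up stand-in for a settings file, and timings will vary by machine):

```python
import json
import time

# Hypothetical stand-in for a desktop-settings file: a 1000-key JSON object.
settings = {"org.example.desktop/key-%d" % i: i for i in range(1000)}
text = json.dumps(settings)

start = time.perf_counter()
parsed = json.loads(text)
elapsed_ms = (time.perf_counter() - start) * 1000.0

assert parsed == settings
print("parsed %d keys in %.3f ms" % (len(parsed), elapsed_ms))
```

On any remotely modern machine this lands well under the 20 ms figure quoted above, which is the point: parse time for a file this size is noise next to disk seeks.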
Actually it does, you just have to tilt your head right. :) See [1].
Either way, the principle stands. If a desktop environment wanted to standardize on JSON + C/C++-style comments, they could extend their standard parser easily enough without meaningful performance impact.
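A minimal sketch of what "extend the parser" could mean in practice (this is a hypothetical helper, not any desktop project's actual code): strip // and /* */ comments outside of string literals, then hand the result to a stock JSON parser.

```python
import json

def strip_json_comments(text):
    """Remove // and /* */ comments, leaving string literals intact."""
    out = []
    i, n = 0, len(text)
    in_string = False
    while i < n:
        c = text[i]
        if in_string:
            out.append(c)
            if c == "\\" and i + 1 < n:   # keep escaped chars, e.g. \"
                out.append(text[i + 1])
                i += 1
            elif c == '"':
                in_string = False
        elif c == '"':
            in_string = True
            out.append(c)
        elif c == "/" and i + 1 < n and text[i + 1] == "/":
            while i < n and text[i] != "\n":  # skip to end of line
                i += 1
            continue
        elif c == "/" and i + 1 < n and text[i + 1] == "*":
            i += 2
            while i + 1 < n and not (text[i] == "*" and text[i + 1] == "/"):
                i += 1
            i += 2                            # skip the closing */
            continue
        else:
            out.append(c)
        i += 1
    return "".join(out)

config = strip_json_comments('''
{
    // why this value: workaround for a hypothetical rendering bug
    "font-size": 11,
    "theme": "dark" /* admin prefers this */
}
''')
print(json.loads(config))
```

The administrator keeps the ability to annotate the file, and the parser pays a single linear pre-pass, which is negligible at these file sizes.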
Even so... Doing yet-another-incompatible-binary-format?
There are a number of standardized binary formats that support all sorts of data configurations. Hell, use sqlite for all I care, but at least use a format that can easily be examined and manipulated manually when the UI inevitably breaks it.
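To make the sqlite suggestion concrete, here's a hedged sketch (the schema and key names are invented; dconf's real on-disk format is not this): the store stays binary and fast to read, but the `sqlite3` command-line tool or any language binding can inspect and repair it when the UI mangles a value.

```python
import sqlite3

# Hypothetical single-table settings store, nothing like dconf's actual layout.
db = sqlite3.connect(":memory:")  # a real app would use a file path
db.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")
db.execute("INSERT INTO settings VALUES (?, ?)",
           ("org.example.desktop/background", "aubergine.png"))
db.commit()

# When the UI breaks a value, it can be examined and fixed by hand:
row = db.execute("SELECT value FROM settings WHERE key = ?",
                 ("org.example.desktop/background",)).fetchone()
print("current value:", row[0])
db.execute("UPDATE settings SET value = ? WHERE key = ?",
           ("fixed.png", "org.example.desktop/background"))
db.commit()
```

You lose inline comments, but you keep a documented, universally tooled format instead of yet another bespoke blob.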