gnomeuser
May 18th, 2009, 05:45 PM
This atrocity of lies and half-truths (http://linuxfonts.narod.ru/why.linux.is.not.ready.for.the.desktop.html) hit Slashdot earlier.
0. Let's grant this premise, even if it's both bleak and unlikely where software patents are concerned. It also forgets that one can be both commercial and open source; many companies are.
1. Learn to love PulseAudio; while not yet perfect, it is paving the way towards a solution. Not labeling this as in progress is extremely dishonest.
1.1 See 1.; once PA is in place we have a very sane set of volume controls available to us.
1.2 See 1. and 1.1.
1.3 See 1.-1.2; also, file bugs.
2. Supposed X problems
2.1 An outright lie: GTK has been API-stable for years, and the plan for GTK 3 and beyond still sets API stability as an important goal.
2.2 Not citing measurements and methodology is dishonest
2.3 Pass (I don't know enough about the internals of toolkits to give a full and accurate answer for this)
2.4 fontconfig bitching
2.4.1 Funny; when I change the hinting defaults in GNOME they are applied instantly. I'll call this bogus, or at least demand additional data here.
2.4.2 True, it only works in every modern toolkit, but applications written using those make up every preinstalled application on most distros (I'm not citing data here; I welcome people to poke holes in this assumption).
2.4.3 Ugly is subjective.
2.4.3.1 I assume he means disabling sub-pixel hinting by default and not shipping the patented parts of FreeType/Cairo enabled. The first is untrue for Ubuntu (though Fedora does ship with it disabled); the second is grounded in legal reasons.
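For anyone wanting to poke at the settings discussed above themselves: fontconfig reads a per-user configuration where hinting and sub-pixel options live. Here is a minimal sketch that writes out such a fragment; the values and the target path are illustrative (not any distro's defaults), and normally the file would go in ~/.fonts.conf or ~/.config/fontconfig/fonts.conf.

```python
# A minimal per-user fontconfig fragment forcing hinting on and picking a
# hint style and sub-pixel order. Values are illustrative, not distro defaults.
FRAGMENT = """<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <edit name="hinting" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
  </match>
</fontconfig>
"""

def write_fragment(path):
    # Written to an arbitrary path so the sketch stays self-contained;
    # in real use you'd point this at your fontconfig user config file.
    with open(path, "w") as f:
        f.write(FRAGMENT)
    return path
```

Toolkits that honor fontconfig pick such changes up, which is exactly why the "settings don't apply" complaint above rings hollow.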
2.5 Pass (I don't know enough about the internals of toolkits to give a full and accurate answer for this)
3. The downfall-through-proliferation fallacy
3.1 See Augeas (http://augeas.net/). Not labeling this as in progress is also dishonest. Aside from that, most users need no configuration at all; where we can, we autodetect correct settings or pick sane defaults.
3.2 Meet PackageKit (http://packagekit.org/): it works on all major distros and is getting very mature. This item is solely about installation and package management.
3.3 Packaging itself is another problem; however, that can be left up to distributions, which have shown themselves quite capable of packaging pretty much everything. Smaller distros will not have the manpower to package everything, but most distros today base themselves on distros with a large package base: openSUSE, Fedora and Ubuntu all have tens of thousands of packages, as well as an eager community to do this work for you. More unification would be nice, though it is not likely in the near future.
3.4 If you absolutely must, the LSB defines a set of standards you can rely on for this kind of development, as does FreeDesktop.org for higher-level desktop tasks. I agree more standardization is good; however, I do not believe that a 100% frozen API for all time is a good idea.
4. I'll grant him this, with the modification that for many users most things today can be set without opening a terminal. Work is in progress, and there are also long-standing tools like YaST to fill this hole.
5. Whining over market share and misunderstandings of FLOSS.
5.1 For highly specialized applications this is true, but for a surprising number of common real-world tasks it is not only false; you are in fact offered a choice among a range of applications. This will always be the case, but as more long-tail software gets written using .NET, porting will be a minimal effort (Novell, e.g., does good business helping vendors move to Linux this way).
5.1.1 I believe this to be a minor subset of users compared to the moms and pops who do word processing, browsing, photo editing, chatting and email; I believe that group makes up a larger share than the group the author mentions in this item. As those users move over, a larger market is created for vendors to port these applications. For 3D animation, e.g., it is also untrue: while 3D Studio Max might not be here, major Hollywood companies do all their work on Linux desktops (Madagascar, e.g., was done entirely in a GNOME environment running on Linux, using specialized applications).
5.2 There are lots of games for Linux. Sure, we might not have World of Warcraft, but we have many games and the selection of titles is growing. Steam is even said to be coming to Linux soon, giving us a deployment channel for both major players and indie game producers.
5.3 Drivers are a problem, but out of the box no OS supports more hardware than Linux. We are closing the gap on video drivers and plugging holes in ALSA as I type this. Not labeling this as in progress is utterly dishonest.
5.3.1 Pass (Every printer I have owned worked under Linux, but I don't feel confident enough to give an answer on this one).
5.3.2 True, but with hefty qualifications. Recently the uvcvideo driver set was merged into Linux, and this brought a lot of webcam support with it. We are expanding every day; plus, the Linux Driver Project will happily write drivers for vendors for free, under NDA if need be. All we need is specifications. In progress; not labeling it as such is also dishonest.
5.4 True, but ffmpeg, amongst other projects, has pledged to add Blu-ray support. I doubt this one will stand for long.
5.5 Totally bogus (with reservations). The legal status of codecs is not questionable: they are illegal to install where software patents apply, due to license violations. However, more and more vendors are moving towards open standards, and until then you can buy supported, legal codecs from companies like Fluendo (http://www.fluendo.com).
6. Regression testing is performed on every kernel; distros like Ubuntu have automated crash reporting as well as hardware testing, and there are tools to test suspend and resume. We can do better, but regression testing on a scale as massive as every hardware configuration out there will be lacking regardless of how much effort you put into it. We are getting better, the tools are there, and distros do consider it a priority. Tools also exist today, such as Smolt, that show us which setups are the most used, helping us target regression testing. True, but the goal is set unrealistically.
7. We have bugs; all software has bugs. There will unfortunately be a few that never get worked on, but of those filed a decade ago, I wonder how many are still valid or relevant given the pace of development. True, but the goal is unrealistic: no project will ever reach zero bugs.
8. Examples?
9. True: performance testing for desktop use could definitely improve. However, using OpenOffice as the example is picking the worst possible case, as it uses a non-native toolkit, reinvents a lot of wheels, and so on. Still, it has gotten better over time; Michael Meeks, e.g., has done some amazing optimizations. I will grant that we should, where possible, focus more on proper tracing and performance regression testing. Tracing is in progress, many projects (e.g. Banshee) have regression testing now, and when major problems occur users tend to complain, followed by fixes being committed.
9.1 Lacks data, and is largely solved already (http://google-opensource.blogspot.com/2008/04/gold-google-releases-new-and-improved.html). I assume he means a GCC linker, but we have many linkers; Mono, e.g., has a linker as well, which performs quite nicely as I understand it. The problem is poorly defined.
9.2 Parallel boot has not been shown to decrease boot time; e.g., Moblin's impressive 5-second boot from bootloader to usable GUI was specifically done without parallelization, for performance reasons. Aside from that, most major distros today ship a parallel-capable boot system (Upstart, e.g., is parallel-capable and deployed in Ubuntu as well as Fedora, just to mention two major distros). Untrue, and a false assumption.
9.3 Lacks data, but tends to be true in my experience; work to be done. As suspend becomes more reliable, one could question whether shutdown will remain as large a problem space as it is now. Still worth fixing, and being worked on (Fedora, e.g., has ongoing work to fix this).
10. It depends; you really don't want users to see all manner of errors. Varnish has a good approach: it just displays a "guru meditation" error with a code that admins can look up in their logs. An error message needs to be translated and might not always make sense to users. Instead, we could build better capture tools for reporting bugs. I am unconvinced that heaping on more and more error dialogs is the best way to solve this.
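The Varnish-style approach is easy to sketch: show the user a short reference code, and keep the full technical detail in a log that an admin or a bug-capture tool can search. The function and message wording below are my own invention, not any existing toolkit's API.

```python
import logging
import uuid

log = logging.getLogger("app")

def report_error(exc):
    # Generate a short reference code the user can quote in a bug report,
    # log the full technical detail for admins, and hand the UI a short,
    # translatable message instead of a raw traceback.
    code = uuid.uuid4().hex[:8]
    log.error("error %s: %r", code, exc)
    return "Something went wrong (error code %s)." % code
```

The user-facing string stays short and translatable, while the log line carries the detail a capture tool could attach to a bug report automatically.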
11. True, at least for many projects, though this is a good entry point for users to get involved; the Fedora documentation project has good experiences with this. I believe such efforts could do with more PR, and yes, we could do with better documentation at all levels. True, but also true for most non-Linux deployments.
12. SELinux protects against many attack vectors, as does AppArmor, and each of the three major distros deploys one of these solutions by default. Linux has a very strong security framework along with proactive security enhancements. I feel unqualified to speak to specific threats, but in general we have very strong security available to us, along with a much better patch rate and a shorter patch cycle than Windows (Mark Cox's blog is a good reference for this data: http://www.awe.com/mark/blog/).
13. API/ABI complaints
13.1 GNOME 2.x has been API-compatible from June 2002 until now, and we are only just breaking API. There are good reasons not to keep certain APIs stable, such as in-kernel APIs, but aside from that, the major desktop environments have strict API-stability requirements for the duration of each major cycle. There are reasons why we change APIs, and it is not done lightly; there are also many projects where keeping APIs stable is a primary concern. To say that Linux as a whole doesn't care about these things, or about engineering forward-looking solutions, is false.
13.2 True; see 13.1. Not all APIs will or should remain stable at all times. Aside from that, compatibility libraries are an option, as is static compilation. I assume we are talking about situations where source is not available, in which case static compilation is the safest option (if licenses are a problem for this solution, politely asking for an exception or a license change very often works).
13.3 Companies like the one referenced are most welcome to file bugs. I don't see the blog referencing any of the bugs they hit; I am sure they are there, as no software is flawless. I am unsure what the actual complaint is here: that libraries are software and thus have bugs?
(Please note that we are at item 13.3, near the end, and this is the first reference given in the entire article.)
14. Enterprise woes
14.1 Unsure what he even means by this. Companies like Red Hat specifically offer products with strong API/ABI policies and up to a decade of support. The complaint needs to be specified with an example.
14.2 Every enterprise distro has its distribution channel: Red Hat has Red Hat Network, Novell has its equivalent, and Canonical has Landscape. If you want to deploy a single application, the option with the least support burden for you is often a software appliance; whether this is a problem depends on your needs. Yes, it would be nice if we had the one true distribution method (and in part we do: source code), but work is being done on such management (see Spacewalk (http://www.redhat.com/spacewalk/); there is no reason why it could not be extended to support multiple distributions). If the complaint is that everyone uses their own packaging format and package manager, then yes, it is a problem, but often the solution is as simple as asking someone to do the hard work of packaging, trusting them to do good work, and accepting bugs. The way into RHEL, e.g., goes through Fedora.