I just saw this on /.
http://linux.slashdot.org/article.pl...31240&from=rss
Personally, I think we should fork, starting where Con Kolivas left off.
<plexr> do you know std c++ ?
<plexr> or is a weak understanding of VB your only strength
<ahorse_> oohhhhh he just said the equivalent of *yo momma*
www.acgla.net <--- My webpage : )
Can you qualify that a little? Not trying to be a jerk, but I would like to know why we should consider it; let's turn this into a proper debate.
I personally don't know enough about schedulers etc. to really contribute to such a debate, but I'm sure there are people in the UF community who do.
Forking it would be stupid. Having a desktop branch might be a good idea, but no to forks and spoons. Too much FUD.
Ubuntu: A Free Software Operating System
Forking would just slow things down. Just because someone's patch didn't get accepted into the mainline kernel doesn't mean the whole thing is corrupt and evil.
Still, if someone feels a fork is needed, hey, it's all GPL in the end.
HOME BUILT SYSTEM! http://brainstorm.ubuntu.com/idea/22804/ Please vote up!
remember kiddies: sudo rm -rf = BAD! If someone tells you to do this, please ignore them unless YOU WANT YOUR SYSTEM WIPED
Nice interview with Con here: http://apcmag.com/6735/interview_con_kolivas
A desktop branch could eliminate the need for a fork, though, and keeping things separate would help organize the work.
Well, it would lead to duplication of effort.
My biggest question, as echoed on Slashdot, is what defines a desktop and what defines a server.
Slow, scalable algorithms are used rather than lean but limited ones.
If this is true, it is actually a good idea. Today's personal computers have a lot in common with high-end machines from 10 years ago.
Multiple processors? Check.
Gigabytes of RAM? Check.
Hard disks with hundreds of gigabytes? Check.
And I guess the trend will continue, so what belongs in the big iron of today will be fine for tomorrow's personal computers.
Coming from a Windows environment, I like the idea of having separate desktop and server versions of the kernel. But I also agree with the articles that the current open source development model does not lend itself well to maintaining two different kernels. Then again, isn't the point of open source that you can freely obtain the code and make whatever changes you see fit to satisfy your personal needs? That is what large corporations like Novell, IBM and Red Hat do, and they are the ones who supply a lot of the money for Linux development. Also, forking the kernel would hopefully lead to more home-use software being written for Linux.
Forking a monolithic kernel isn't a lot of fun, nor usually productive. If people are that concerned, then simply compile your own: remove the modules you don't want, add the ones you need, etc. Forking the kernel may not even be legal; I'm not sure. Linux may be GPL, but the kernel is owned by one Linus Torvalds. You'd put the desktop back years by forking. All new agreements for drivers (Intel, anyone?), propagation of said drivers under what, a new GPL, a BSD-type license, GPL 3? Too many headaches.
If anyone is so concerned with what is in their generic kernel, feel free to recompile and gain the 2% performance you might get. That is, if you don't kill off something you need or break your system.
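For anyone who wants to try the recompile route mentioned above, here is a rough sketch of a custom kernel build the Debian/Ubuntu way. Package names and the source directory are typical for current releases but may differ on yours; adjust the version to match what you have installed.

```shell
# Sketch only: trim and rebuild your own kernel on Ubuntu.
# Assumes the kernel source package for your release is available.
sudo apt-get install build-essential libncurses5-dev kernel-package
sudo apt-get install linux-source
cd /usr/src
tar xjf linux-source-*.tar.bz2      # unpack; directory name varies by version
cd linux-source-*/
cp /boot/config-$(uname -r) .config # start from your running kernel's config
make menuconfig                     # deselect drivers and features you don't need
make-kpkg --initrd kernel_image     # build a .deb of the trimmed kernel
sudo dpkg -i ../linux-image-*.deb   # install it, then reboot into the new kernel
```

Keep your old kernel installed in GRUB so you can boot back into it if the trimmed one kills off something you need.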
Beau D.