
View Full Version : "RAM is cheap so why complain?"



FoolsGold_MKII
July 9th, 2007, 05:37 AM
I was reading a thread elsewhere by a guy who figured Adobe Reader 8 in Windows was rubbish. Now there was some debate as to whether this was so (not the point of this thread), but the argument came up that the reader was using a fair amount of memory when running, which he figured excessive for a program designed to simply view PDF files. Allow me to reproduce a quote I read from someone's response:


Well, that does happen with newer software releases. For example, Windows Vista uses more RAM than Windows XP does.

And Office 2007 uses more RAM than, say, Office XP.

If the computer has sufficient RAM then it shouldn't bother it. RAM is cheap enough these days that there shouldn't be a reason not to have enough RAM.

My question is this: given how cheap (most) RAM is these days, is that any excuse for unnecessarily bloated software? Perhaps more pointedly, does the availability of RAM excuse sloppy optimization? Have we reached the point where developers feel they don't HAVE to trim the fat off their software, because hey, everyone's got plenty of RAM?

It was only after I returned to Linux that I realized how well a modern system can run with limited resources. Using RAM to enable extra functionality is one thing, but this is just a PDF reader; it's not even the full version.

maniacmusician
July 9th, 2007, 05:45 AM
I was reading a thread elsewhere by a guy who figured Adobe Reader 8 in Windows was rubbish. Now there was some debate as to whether this was so (not the point of this thread), but the argument came up that the reader was using a fair amount of memory when running, which he figured excessive for a program designed to simply view PDF files. Allow me to reproduce a quote I read from someone's response:



My question is this: given how cheap (most) RAM is these days, is that any excuse for unnecessarily bloated software? Perhaps more pointedly, does the availability of RAM excuse sloppy optimization? Have we reached the point where developers feel they don't HAVE to trim the fat off their software, because hey, everyone's got plenty of RAM?

It was only after I returned to Linux that I realized how well a modern system can run with limited resources. Using RAM to enable extra functionality is one thing, but this is just a PDF reader; it's not even the full version.
yes, developers do think that way nowadays; at least the bad ones. And yes, thankfully, we have higher standards in our communities...

@trophy
July 9th, 2007, 05:46 AM
yes, developers do think that way nowadays; at least the bad ones. And yes, thankfully, we have higher standards in our communities...

LOL with the possible exception of Firefox... "All your RAM are belong to me."

jrusso2
July 9th, 2007, 05:58 AM
It's amazing how fast things were when I only had 512KB of RAM. Seems like computers keep getting faster, but software gobbles up any speed increase.

FoolsGold_MKII
July 9th, 2007, 06:03 AM
It's amazing how fast things were when I only had 512KB of RAM. Seems like computers keep getting faster, but software gobbles up any speed increase.
Modern computers are also capable of doing a lot more than they could with just 512KB. Whether they actually use the extra resources for useful functionality or fluff is another thing though. They couldn't afford fluff with only 1/2 meg. :KS

init1
July 9th, 2007, 06:30 AM
LOL with the possible exception of Firefox... "All your RAM are belong to me."
http://xkcd.com/c286.html

Atomic Dog
July 9th, 2007, 06:46 AM
yes, developers do think that way nowadays; at least the bad ones. And yes, thankfully, we have higher standards in our communities...

I took a software engineering/programming class and not one single thing was EVER mentioned about conserving memory or efficient code. I kept pressing the instructor to talk about these things, but there was no interest.

Times have changed indeed. When you only had a few KB of RAM you learned how to write code efficiently. Now the comment is "well, get more memory." Sad.

kerry_s
July 9th, 2007, 07:51 AM
Yeah, I think it's a shame some things can only be upgraded so far before they top out.
When I got this Vaio PCG-F430 it had 64MB of RAM and a 6GB hard drive. It took me a while to find PC100 RAM locally and top it out at 256MB, the max supported. The hard drive was the really hard part: it's internal and there is barely any info on upgrading it, and most things I found said it could not be upgraded. I finally said screw it, I'll just take it apart and look inside. Well, once I opened her up and took a look around, it looked pretty standard to me, so I found a 20GB notebook hard drive and swapped it in.

Well, now I have a fairly decent laptop to use while I take my time rebuilding my main system; everything kept failing because I kept fixing it with used parts to save money and get it back running as fast as possible.
I understand there are certain things we use every day, and thank god there are people out there who give us a choice. For people like me, whose priorities are the bills, food, and getting that kid through college, more RAM and a bigger hard drive is not even near the top of the list.

Sorry, I ramble. :(

Tomosaur
July 9th, 2007, 01:58 PM
It's a fairly pathetic argument - it's like saying 'we've got great painkillers, so you should just go ahead and bang your head against that wall for an hour'. Over-using RAM creates problems beyond simply running out of it: RAM fragmentation, general slow-down of EVERYTHING in memory, and so on. Software should be optimised as much as possible.

Sunflower1970
July 9th, 2007, 02:07 PM
RAM's cheap...?

Guess it depends on the type of RAM one's buying. The RAM for my Dell is, sadly, quite expensive, still.

smoker
July 9th, 2007, 02:58 PM
You just have to use an OS like Puppy Linux or DSL, which give you a full desktop operating system and various apps in way less than 100MB total, to see how absolutely bloated most stuff has become! Cheap memory is no excuse for bloat; besides, memory may not always be so cheap!

lisati
July 9th, 2007, 02:59 PM
Modern computers are also capable of doing a lot more than they could with just 512KB. Whether they actually use the extra resources for useful functionality or fluff is another thing though. They couldn't afford fluff with only 1/2 meg. :KS

What about 640k?

bigken
July 9th, 2007, 03:01 PM
RAM's cheap...?

Guess it depends on the type of RAM one's buying. The RAM for my Dell is, sadly, quite expensive, still.


You don't have to buy it from Dell?

lisati
July 9th, 2007, 03:05 PM
The first computer I owned back in the early 1980s (if you could call it a computer - it was little more than a programmable calculator that used a dialect of BASIC) had a massive 1KB of user RAM, and that was after an expansion pack was installed. Times have changed.

forrestcupp
July 9th, 2007, 03:08 PM
"Most RAM is cheap" is relative. If I only have $50 to eat on for 2 weeks, no RAM is cheap.

I remember back in the Commodore days when programmers had 64K of RAM to work with, and that is only if they could write over the kernel. There was no such thing as bloat then. Programmers had to conserve to the max and use techniques to fool people into perceiving things as better than what they actually were.

argie
July 9th, 2007, 03:22 PM
Lies! RAM is not cheap. Hence the rest of the argument is invalid.

jkeyes0
July 9th, 2007, 03:36 PM
RAM's cheap...?

Guess it depends on the type of RAM one's buying. The RAM for my Dell is, sadly, quite expensive, still.

Btw, Sunflower, most likely the reason your RAM is so expensive is that it's RDRAM, which has always been expensive. There have always been so many "gotchas" with RDRAM that I never thought it was worth the effort.

Sunflower1970
July 9th, 2007, 03:49 PM
Btw, Sunflower, most likely the reason your RAM is so expensive is that it's RDRAM, which has always been expensive. There have always been so many "gotchas" with RDRAM that I never thought it was worth the effort.

Yeah. When I bought this computer back in 2001 (or was it '02? Anyway), I didn't know the difference between RAM types, or much of anything else. If I had known the cost of the RAM would not drop, I would not have bought this computer. I assumed the RAM would get cheaper, as it had for my previous computer. Ah well. I still :heart: that computer. :)


You don't have to buy it from Dell?

I don't, actually. I choke when I see the prices at Dell for the RAM (these have to be bought in pairs). I could just buy a new computer for the cost of the RAM there. I can usually find a deal on eBay... but still, the prices are high.

dca
July 9th, 2007, 04:09 PM
Current RAM is cheap... DDR 4300 & DDR 5200, because that's what is currently shipping in new systems. The price goes up when companies like Crucial & Kingston have to keep manufacturing (supporting) older speeds (PC2700, etc.) for people only now getting around to upgrading their memory. Indeed, it's no excuse, but that's the vicious circle of (tech) life...

bigken
July 9th, 2007, 04:13 PM
Yeah. When I bought this computer back in 2001 (or was it '02? Anyway), I didn't know the difference between RAM types, or much of anything else. If I had known the cost of the RAM would not drop, I would not have bought this computer. I assumed the RAM would get cheaper, as it had for my previous computer. Ah well. I still :heart: that computer. :)



I don't, actually. I choke when I see the prices at Dell for the RAM (these have to be bought in pairs). I could just buy a new computer for the cost of the RAM there. I can usually find a deal on eBay... but still, the prices are high.

Yes, good old RDRAM, I remember it well. Bloody expensive, but like you say, eBay is the only way for this stuff. I actually gave 2 sticks away a few weeks back; they were only 128MB.

Spr0k3t
July 9th, 2007, 04:53 PM
Give it another year, when the norm is closer to 4GB+, and you will hear this exact same argument. The history of the "bedroom coder" is all but forgotten: the days when programmers did everything they could to cut memory usage down. Some of the instructors I've studied under did not focus enough on the big O of n. There are several reasons all programmers should be concerned about their apps' memory usage. A great example of this is the new commenting system over at Digg. It's efficient... but not so much that it works fluidly with more than 50 comments and the tree fully expanded. I saw the code and noticed how they could improve upon it. Same deal with games: push the envelope as far as possible without going overboard, all the while making the graphics more stunning than before. Eventually we will start to see applications which take multiple gigs of memory to run... but I hope that doesn't come too soon.

beefcurry
July 9th, 2007, 04:58 PM
New RAM is dirt cheap, but RAM for old computers is now gold-plated expensive ;). Not everyone has a fancy new DDR2-capable machine.

argie
July 9th, 2007, 05:35 PM
About the Digg comment system: it sucks, I hate it, and it makes Firefox stall on my computer for a couple of seconds each time another tab loads its comments.

whayong
July 9th, 2007, 06:16 PM
Fry's prices this past weekend for RAM as an example:

Crucial 1GB DDR PC3200 Memory: $59.99 ($44.99 after rebate)
Crucial 1GB DDR2 PC5300 Memory: $39.99 ($19.99 after rebate)

I spent quite a bit more than this to get my 2GB of DDR a few months ago, back when it was $80 for 1GB (brand name).

cobrn1
July 9th, 2007, 10:02 PM
RAM is cheap, eh? That really depends. I can get 2GB of good-quality 667MHz RAM for £50. If I need the next frequency up (800MHz), it's £100, and above that it gets ridiculous. Also, it depends on what type of RAM you're using; some is just much more expensive than the rest (sucks to be you in that case).

Also, more RAM is no excuse for not optimising. It's an excuse for more features, more powerful features, and more flexibility, but it should not encourage bloat! Optimisation is an essential part of coding, especially if you missed a memory leak in the code - that's just no fun no matter how much RAM you have, because it's just a countdown until you have to restart.
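
For instance, a classic leak in C (a minimal sketch; render_page is a hypothetical function, purely for illustration):

    #include <stdlib.h>

    /* A classic leak: a scratch buffer is allocated on every call but
       never freed, so a long-running program grows without bound no
       matter how much RAM the machine has. */
    void render_page(void)
    {
        char *buf = malloc(1024 * 1024);   /* 1MB scratch buffer */
        if (buf == NULL)
            return;
        /* ... render into buf ... */
        /* missing free(buf); -> 1MB leaked on every call */
    }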

The XP kernel uses about 120MB of RAM - about right if you ask me. Vista uses 500MB. Why??? No extra features, just bloat and lack of optimisation. (BTW, archaic things were actually removed from the kernel; look up "removed from Vista" on Wikipedia for a list of things removed since XP. Still the requirements go up. That's just bad coding + bloatware.)

FoolsGold_MKII
July 10th, 2007, 04:32 AM
I knew people would run with the prices of RAM, which is why I put the word "most" in brackets in the original post. Never generalise anything with computers. :)

macogw
July 10th, 2007, 06:27 AM
No, it's no excuse. All it means is the programmer sucks at programming. Some people need some schoolin' in the meaning of "big O runtime."

And uh, RAM is cheap? Excuse me? 256MB of PC100 RAM in-store is $70! Online it's $35. That ain't cheap! The best bet is $20 at Goodwill's computer store (like the Salvation Army: a charity where you donate old junk, they sell it, and the proceeds go to the homeless). Three out of five of the computers here use PC66-PC133 (I'm guessing 66MHz would be called PC66), so I put PC100 in them, as it's the easiest to find and they handle it fine.

FoolsGold_MKII
July 10th, 2007, 06:50 AM
And uh, RAM is cheap? Excuse me?
I was referring to the typical desktop machine with the most commonly available RAM chips. Besides, other people tell ME it's cheap; I haven't needed to buy RAM for nearly 10 months. People say it's cheap, so I'll take their word on the price; but they then argue it's cheap enough that there's no reason to trim a program's resource usage, which is obvious tosh.

Motoxrdude
July 10th, 2007, 07:11 AM
As far as I know you can only go up to 3GB with Windows XP 32-bit edition. Now I'm not sure if you are limited by the 32-bit OS or by the fact that it is Windows XP, but still, that's a weak-sauce statement. Why not spend a little extra time and polish the program a bit?
"You have to pay $$$ for our laziness" is basically what it is telling me.

c4taclysmicPr0posal
July 10th, 2007, 07:59 AM
I appreciate the high standard of coding that comes with Linux. Once I master C, I hope to get good at this optimization. It's ridiculous that Vista uses half a gig; it's XP with flashier graphics...

macogw
July 10th, 2007, 08:10 AM
I was referring to the typical desktop machine with the most commonly available RAM chips. Besides, other people tell ME it's cheap; I haven't needed to buy RAM for nearly 10 months. People say it's cheap, so I'll take their word on the price; but they then argue it's cheap enough that there's no reason to trim a program's resource usage, which is obvious tosh.

Okay, my laptop's RAM is $120/GB, which is better than the desktop's old RAM, but that still sounds rather high to me.

steven8
July 10th, 2007, 08:16 AM
The RAM for my comp at 1GB is between $89 and $199, depending on whether it's ECC, the manufacturer, etc. $89 to a guy trying to take care of a family of 5 is a lot of money. If I were still 18 and living at home, where my whole paycheck was mine to do with as I wished... then I'd buy 4 gigs of RAM, no sweat!!

FoolsGold_MKII
July 10th, 2007, 08:34 AM
You guys have made your point - RAM isn't as cheap as I thought, particularly for laptops and older PCs.

All this means is that

(a) the people who bring out the "RAM is cheap" argument have more money than sense
(b) their position is actually made worse because RAM isn't as cheap as they assume

Conclusion: yay for Linux?

@trophy
July 10th, 2007, 07:19 PM
Conclusion: yay for Linux?

Conclusion: Linux has the ability to not fall into the bloatware trap. It would behoove us to reverse the trend and start pumping out lots and lots of properly optimized code.

forrestcupp
July 10th, 2007, 09:01 PM
Yeah, but some people's bloat is other people's features. I don't necessarily want a system that's so slimmed down that it is featureless. I like being able to do more with my computer.

tszanon
July 10th, 2007, 09:16 PM
Don't forget about those low-latency ones. DDR2 800MHz...these are expensive...

@trophy
July 10th, 2007, 09:33 PM
Yeah, but some people's bloat is other people's features. I don't necessarily want a system that's so slimmed down that it is featureless. I like being able to do more with my computer.

Right, but if those features aren't implemented well, then it's bloat. I was referring more to memory leaks, storing byte values in an int, storing boolean values in an int... some of you may laugh, but I've seen all of the above and more in production code.
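
To make the byte-in-an-int point concrete, here's a minimal C sketch (the struct and field names are made up, and the exact sizes depend on the platform's padding rules):

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Wasteful: every field gets a full int, even for 1-bit or 1-byte data. */
    struct settings_bloated {
        int is_enabled;   /* a boolean stored in an int: 4 bytes for 1 bit */
        int volume;       /* a 0-255 value in an int: 4 bytes for 1 byte */
        int is_muted;
    };

    /* Right-sized: the smallest type that fits each value. */
    struct settings_trim {
        bool    is_enabled;
        uint8_t volume;
        bool    is_muted;
    };

    int main(void)
    {
        /* Typically prints 12 vs 3 (padding may round the totals up). */
        printf("bloated: %zu bytes, trim: %zu bytes\n",
               sizeof(struct settings_bloated), sizeof(struct settings_trim));
        return 0;
    }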

We need to get back to teaching CS students "use only what memory you need... and think about what the best algorithm would be for whatever you're trying to do."

insane_alien
July 10th, 2007, 10:13 PM
RAM is only cheap to those who can afford it.

If your application needs 50MB to read a 1MB PDF, then something is very, very, very wrong.

igknighted
July 10th, 2007, 10:45 PM
Yeah, but some people's bloat is other people's features. I don't necessarily want a system that's so slimmed down that it is featureless. I like being able to do more with my computer.

This is why I love KDE... it has the features (or, as some say, bloat) available, but it can all be turned off so it becomes a nice, trim system. On my higher-end boxes I have all the features on, and on slower ones (<512MB RAM) I start turning stuff off so the system stays responsive. It's too bad more apps don't follow this philosophy. I was all excited about Firefox because I thought the idea was bare-bones out of the box, with features added via extensions... but FF is a massive beast even with no extensions, sadly. The same could be said of OpenOffice; they could modularize it much better and keep the features as options for those who want them. Granted, none of these apps are as bad as, say, iTunes or Acrobat Reader, but still...

EDIT: For those who claim DDR2-800MHz RAM is expensive: two weeks ago I bought 2GB of the stuff for $70... that hardly seems expensive to me. I remember paying $40 for a 512MB stick of PC3200 not too long ago. Do your research before you buy; there are many deals out there.

PatrickMay16
July 10th, 2007, 11:30 PM
Bloated and unoptimised code is inexcusable. You might say "DUH! Give up your ANCIENT AthlonXP 1800+! Get into the 21st century!", but ask yourself this: should you really need at least a Pentium 3 to comfortably run a web browser? And if a program is slow like this, it doesn't matter that your machine is fast enough to run it comfortably if it's a mobile machine running on a battery; it will use more CPU and cause your battery to deplete faster. That'll wipe the smile from your face. Heh heh heh, and then all the women will taunt you for it. Hey, look, it's SLICK JOHNNY with the small battery. Hah hah hah, you won't hold up under the shame. Come on man yeeeeeeeeeahhhhahhah DENIS DENIS DENIS. OH MAN.

macogw
July 11th, 2007, 12:55 AM
Yeah, but some people's bloat is other people's features. I don't necessarily want a system that's so slimmed down that it is featureless. I like being able to do more with my computer.
Bad, inexperienced, or unknowledgeable programmers are more likely the cause of software bloat in programs that don't do much. There's a lot to pay attention to in which algorithm the programmer decides to use and how they implement it. The easiest thing to think of or implement and the fastest-running program are often not the same thing.

Say you need to keep a bunch of values in order. You don't know how many values, just that there's a bunch. You could:

A) Create an array of 250 values. When you have 250 in there and need to add more, but it's full, create a new array that can hold 500 values. Copy each of the 250 values over one by one. Then add that 251st value. Arrays require one continuous block of memory. First you took one chunk and filled it; then it wasn't enough, so you took another chunk twice as big. There are now 2 chunks of memory in use. Well, that chunk has to be continuous, so if there isn't a continuous block of memory in your RAM large enough to be allocated for the 2nd chunk (for instance, if a lot of other programs are using enough of your RAM that there's only space for <=499 of those values), it goes to swap, and swap is slow. Then, regardless of whether it's in RAM or swap, you have to write to the 2nd array 251 times. Hopefully, the programmer remembers to release the first chunk of memory; many will forget. So, now some space was freed up in RAM if they released that memory, but the current array won't move into the first chunk because 1) it's too big and 2) it's already got a spot. That first chunk will go unused (to waste) unless the user starts up another program which requires an amount of memory smaller than that first chunk, in which case that program will take it.

B) Create a linked list. Linked lists can grow and shrink as they are used. A continuous chunk of memory is not required; the list will just take whatever small chunks of memory are available, even if they're scattered all over the place. When a node is deleted from the list, that memory is released immediately. When one is added, any small chunk of memory (provided it is as large as or larger than the data about to be written) can be used.
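
Roughly, in C (a minimal sketch of the two approaches; error checking trimmed for space):

    #include <stdlib.h>

    /* A) Growable array: doubles its capacity when full, which means
       allocating a new continuous block and copying every element over. */
    typedef struct {
        int *data;
        size_t len, cap;
    } int_vec;

    void vec_push(int_vec *v, int x)
    {
        if (v->len == v->cap) {
            v->cap = v->cap ? v->cap * 2 : 8;   /* 8, 16, 32, ... */
            v->data = realloc(v->data, v->cap * sizeof *v->data); /* may copy all */
        }
        v->data[v->len++] = x;
    }

    /* B) Singly linked list: each push grabs one small chunk of memory,
       wherever one happens to be free; nothing is ever copied wholesale. */
    typedef struct node {
        int value;
        struct node *next;
    } node;

    node *list_push(node *head, int x)
    {
        node *n = malloc(sizeof *n);
        n->value = x;
        n->next = head;
        return n;   /* new head of the list */
    }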

In cases where the values do not need to be accessed randomly (only in order), the linked list is better. Beginning programmers, however, are only taught arrays. Linked lists are things found in data structures classes. Programmers who didn't go to school for computer programming and never bothered to learn about data structures are likely to use arrays for everything of this sort too. Don't get me wrong, arrays have their uses. Like I said, if access to random values is unneeded, the linked list is better. Example:

A) All of the data was put into an array. Now you need to get the 50th and 198th entries. Easy. array[49] (numbers start at 0) and array[197].

B) All of the data was put into a linked list. Now you need to get the 50th and 198th entries. Still easy, but it takes longer, because linked lists must be walked: the list must be walked 50 nodes (probably in a loop) to get to the first value, and 198 nodes for the 2nd one. Compare 50 and 198 instructions to just 1 each for the array.
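
Continuing the sketch above, the access-cost difference looks like this:

    /* Random access, using the node type from the sketch above. */
    int array_nth(const int *a, size_t i)
    {
        return a[i];                /* one step, wherever i points */
    }

    int list_nth(const node *head, size_t i)
    {
        while (i--)                 /* i steps: must walk node by node */
            head = head->next;
        return head->value;
    }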

Then, there's sorting. You could have an array of values, pick the first value, compare to a whole bunch of other ones, and copy it to its spot in the new array....over and over and over until they're in the right order. Bubble sort, selection sort, heap sort, merge sort....

Linked lists will also require picking one value, comparing it to everything, and placing it into the new linked list in order, one at a time.

Binary trees are better than both of those for sorting. One item is put in, then the second goes to one side or the other based on whether it's greater than or less than the first item. The third item put in is compared to the top one; it goes left or right depending on how it compares, and if it hits the 2nd item it compares to that too and then becomes a branch off of it. These are automatically sorted as they are formed. Then (of course) there are different types of binary trees. If the first item wasn't somewhere in the middle of the range, everything is going to branch off to one side like a list, and that's going to take longer to search. Some searches will be done really fast if they're close to that first item; some will take a long time if they're farther away. It's unpredictable if the tree isn't balanced. A really unbalanced tree could have to go through every item individually to reach the one with the highest value. Red-black trees are auto-balancing. They shift which item is at the top (trees go down, unlike the kind of trees that have chlorophyll) if one side is longer than the other, so all of the leaves are roughly the same depth from the root. Because it is always balanced, the worst-case (longest possible) runtime is O(log n): it never has to do its thing more than the log of however many items are in it. So if you have 1024 items, it'll have to go through only 10 times (because 2^10=1024), whereas if the tree was really horribly unbalanced, it'd be 1024 times to reach the last item (though more likely the item you're searching for will be somewhere in the middle, so ~500 times).
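
A minimal unbalanced binary-search-tree insert in C looks like this (just a sketch; a red-black tree adds the recoloring and rotation bookkeeping on top of it, which is what keeps the tree balanced):

    #include <stdlib.h>

    typedef struct tnode {
        int key;
        struct tnode *left, *right;
    } tnode;

    /* Insert keeps the tree ordered: smaller keys go left, larger go
       right. If the keys arrive already sorted, every node goes to the
       same side and the "tree" degenerates into a list with O(n)
       lookups; a red-black tree rebalances on insert to keep O(log n). */
    tnode *bst_insert(tnode *root, int key)
    {
        if (root == NULL) {
            tnode *n = calloc(1, sizeof *n);   /* left = right = NULL */
            n->key = key;
            return n;
        }
        if (key < root->key)
            root->left = bst_insert(root->left, key);
        else
            root->right = bst_insert(root->right, key);
        return root;
    }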

Quick example of why big-O runtime matters:


The importance of this measure can be seen in trying to decide whether an algorithm is adequate, but may just need a better implementation, or the algorithm will always be too slow on a big enough input. For instance, quicksort, which is O(n log n) on average, running on a small desktop computer can beat bubble sort, which is O(n²), running on a supercomputer if there are a lot of numbers to sort. To sort 1,000,000 numbers, the quicksort takes 20,000,000 steps on average, while the bubble sort takes 1,000,000,000,000 steps!

http://www.nist.gov/dads/HTML/bigOnotation.html
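
You can see the same effect on any desktop with a quick C test, comparing the C library's qsort (O(n log n) on average) against a handwritten bubble sort (O(n²)); the exact times will vary by machine, but the gap is dramatic:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    static void bubble_sort(int *a, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            for (size_t j = 0; j + 1 < n - i; j++)
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                }
    }

    int main(void)
    {
        size_t n = 100000;                  /* enough to feel the O(n^2) pain */
        int *a = malloc(n * sizeof *a);
        int *b = malloc(n * sizeof *b);
        for (size_t i = 0; i < n; i++)
            a[i] = b[i] = rand();

        clock_t t0 = clock();
        qsort(a, n, sizeof *a, cmp_int);    /* O(n log n) on average */
        clock_t t1 = clock();
        bubble_sort(b, n);                  /* O(n^2) */
        clock_t t2 = clock();

        printf("qsort:  %.2f s\nbubble: %.2f s\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC);
        free(a);
        free(b);
        return 0;
    }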

ynnhoj
July 11th, 2007, 01:01 AM
macogw: that post brings back many memories of the data structures & algorithms class I took in college! :) I just skimmed, but I'll give it a more thorough read later.

macogw
July 11th, 2007, 01:26 AM
macogw: that post brings back many memories of the data structures & algorithms class I took in college! :) I just skimmed, but I'll give it a more thorough read later.

Fresh in my mind. I took it last semester. Had an evil (j/k) kernel hacker for a boyfriend and he said I had to implement a doubly linked list (or he'd dump me) by the time I got back from Winter Break... a week *before* that class started! At Spring Break he said I should do a red-black tree, but Spring Break involved me in the hospital one night plus no internet on which to look up info on red-black trees, so that didn't happen. They were barely skimmed over at the end of the year, but I did end up writing about half the code I'd need for it the week before we got to binary trees.

Andrewie
July 11th, 2007, 03:31 AM
RAM's cheap...?

Guess it depends on the type of RAM one's buying. The RAM for my Dell is, sadly, quite expensive, still.

Have you seen Apple's RAM? ...Be thankful. For the price you're paying for that RAM, I would expect it to be gold-plated.

Compucore
July 11th, 2007, 04:39 AM
I would agree with you on this one, Atomic Dog. When I was learning programming in the early 90s we had no choice but to learn efficient programming, back when you had only 256-640K of RAM on your home computer, or on any minicomputer on up to the mainframe or supercomputer for that matter. (I didn't graduate in it back then, but I will admit I learned it very well, even by today's standards.) Then, like you, I argued about it with my recent prof in 2002/03 as well. The thing is, it's not our fault either; it's simply the lazy man's way of not doing the job right. I am sure that if they had kept the golden rules of proper programming and efficiency from back then, most of the software under the Windows environment (I'm just using Windows as a classic example) would be speedy on any system from a Pentium 1 or equivalent on up.

:)

Compucore


I took a software engineering/programming class and not one single thing was EVER mentioned about conserving memory or efficient code. I kept pressing the instructor to talk about these things, but there was no interest.

Times have changed indeed. When you only had a few KB of RAM you learned how to write code efficiently. Now the comment is "well, get more memory." Sad.

forrestcupp
July 12th, 2007, 10:56 PM
Right, but if those features aren't implemented well, then it's bloat. I was referring more to memory leaks, storing byte values in an int, storing boolean values in an int... some of you may laugh, but I've seen all of the above and more in production code.

We need to get back to teaching CS students "use only what memory you need... and think about what the best algorithm would be for whatever you're trying to do."

I totally agree with that. Poor programming techniques need to be addressed. But a lot of people ignorantly call unwanted features bloat. The only thing that makes it bloat is if it was programmed poorly, not the fact that it is there.

jpkotta
September 15th, 2007, 07:05 AM
I would suspect that many times when a program is eating a lot of RAM, it is trading space for time. The class of people who don't care what RAM is but do care that program A is faster than program B (even though it ties up the whole computer) is huge. The trouble is that when you trade away too much space, you get a lot of cache misses or you start swapping, and it ends up slower anyway.
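
A toy C sketch of that space-for-time trade (the function names are made up; only the shape of the trade matters):

    #include <math.h>

    /* Trading space for time: cache results so repeat queries are instant.
       The cache is pure RAM overhead; grow it past what the machine can
       hold and swapping makes the "fast" version slower, as noted above. */
    #define CACHE_SLOTS 4096

    static double cache[CACHE_SLOTS];
    static char   cached[CACHE_SLOTS];

    static double expensive(int n)          /* stand-in for real work */
    {
        double x = 0.0;
        for (int i = 1; i <= 1000000; i++)
            x += sin((double)n / i);
        return x;
    }

    double expensive_memoized(int n)
    {
        if (n >= 0 && n < CACHE_SLOTS) {
            if (!cached[n]) {
                cache[n] = expensive(n);    /* first query: pay full price */
                cached[n] = 1;
            }
            return cache[n];                /* repeat query: instant */
        }
        return expensive(n);                /* outside the cache: compute */
    }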

Not to defend sloppy programming, but there are also things like deadlines, bugs to fix, and demanding clients. Sometimes you want to make it better, but there's just no time.

Finally, I have a conspiracy theory. Large software companies collude with hardware manufacturers to constantly increase demands for computing power. Some things just need more power, like numerical simulations. But viewing a document?