PC Hardware


Why all the hype for installing lots of RAM?

by Lee Koo (ADMIN) CNET staff/forum admin / October 23, 2009 6:56 AM PDT
Question:

Why all the hype for installing lots of RAM when the system doesn't utilize it?


I have always followed the advice of this community, and its suggestions have always been great. I have one question. I have 3GB of RAM on my current Dell 531S, and my new Dell 546, which I will be receiving, will have 4GB. I have two RAM monitors on my systems that keep an eye on RAM usage. They never go above 1GB, and most of the time usage sits between 225MB and 500MB. So why all the hype for lots of RAM? I periodically do film editing, and even that uses only a little RAM. Is there something I am not doing to use more RAM? Is there a way to get the computer to utilize more, a setting I should redo or set? Suggestions and comments appreciated.

--Submitted by Jhampa S.

Here are some featured member answers to get you started, but please read all the advice and suggestions that our members have contributed to this question.

Why the hype --Submitted by scleung
http://forums.cnet.com/5208-7591_102-0.html?messageID=3158186#3158186

Misinformation --Submitted by yourpcmedic
http://forums.cnet.com/5208-7813_102-0.html?messageID=3158397#3158397

Peak performance is key --Submitted by say592
http://forums.cnet.com/5208-7591_102-0.html?messageID=3158191#3158191

Always max your allowable RAM --Submitted by High Desert Charlie
http://forums.cnet.com/5208-7591_102-0.html?messageID=3157033#3157033

Memory utilization --Submitted by GEO2003
http://forums.cnet.com/5208-7591_102-0.html?messageID=3157027#3157027

If you have additional opinions or advice for Jhampa, please click on the reply link and post it. If you are providing troubleshooting advice, please be as detailed as possible when submitting your solution. Thanks!
Lots of RAM
by mjb5406 / October 23, 2009 9:39 AM PDT

Funny you should ask this, since it was the subject of a question at my Windows 7 Launch Party last night. The answer is both simple and complicated. If you are using a 32-bit operating system, you are limited to 4GB of memory, and you can't even access all of that. When the PC was originally designed, nobody thought anyone would ever need more than 4GB of memory, and the highest amount that can be addressed by a 32-bit number is 4,294,967,296 bytes (4GB). But you can't use it all: the hardware adapters in your PC (video, network, etc.) claim part of the top end of that 4GB address space, so, in essence, you only have access to about 3.25GB. When Vista was first released, it reported the real accessible memory; people screamed because they had 4GB installed and it only showed 3.25GB, so in a later patch Microsoft started reporting it as 4GB (technically incorrect, but great marketing).
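For anyone who wants to check that arithmetic, here is a minimal C sketch; the 0.75GB reserved for memory-mapped hardware is an illustrative figure, since the actual reservation varies by chipset and video card:

/* 32-bit address-space arithmetic: 2^32 bytes total, minus the chunk
   claimed by memory-mapped hardware (video, PCI, etc.). */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t address_space = 1ULL << 32;      /* 2^32 = 4,294,967,296 bytes */
    uint64_t mmio_reserved = 768ULL << 20;    /* ~0.75GB, illustrative only */
    uint64_t usable = address_space - mmio_reserved;

    printf("32-bit address space: %llu bytes (%.2f GB)\n",
           (unsigned long long)address_space, address_space / 1073741824.0);
    printf("RAM left for programs: %.2f GB\n", usable / 1073741824.0);
    return 0;
}

Compiled and run, this prints 4.00 GB and 3.25 GB, matching the figures above.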

64-bit operating systems can in theory address exabytes of memory, but are usually limited (by design) to something like 256GB, which nobody can afford anyhow.

In either case, though, the system will work better with more memory because it can keep the OS and running programs in memory rather than swapping them out to (slower) disk, or "virtual memory" as they call it. Most OSes (Windows, Linux, UNIX, Mac OS X) use this "swap space" to let multiple programs run simultaneously, even when there is less physical memory than is needed.
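As a side note for the original poster, a RAM monitor like the ones mentioned in the question is doing nothing more exotic than asking the OS for these numbers. Here is a minimal Linux-only sketch using the sysinfo(2) call (Windows monitors use a different API, so this is just to show the idea):

/* Sample physical RAM and swap usage the way a simple RAM monitor might. */
#include <stdio.h>
#include <sys/sysinfo.h>

int main(void) {
    struct sysinfo si;
    if (sysinfo(&si) != 0) {
        perror("sysinfo");
        return 1;
    }
    double unit = si.mem_unit;  /* bytes per unit, as reported by the kernel */
    printf("RAM:  %.0f MB total, %.0f MB free\n",
           si.totalram * unit / 1048576.0, si.freeram * unit / 1048576.0);
    printf("Swap: %.0f MB total, %.0f MB free\n",
           si.totalswap * unit / 1048576.0, si.freeswap * unit / 1048576.0);
    return 0;
}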

Original design
by ejsecco / October 24, 2009 4:18 AM PDT
In reply to: Lots of RAM

mjb5406, you had a good post, but your history is wrong about the original design of the PC. The original PC used Intel's 8088 8/16-bit processor, which could only address 1MB of memory using a 20-bit addressing scheme. At the time, Microsoft decided that only 640K would be accessible to programs because, as the story goes, Bill Gates didn't think anyone would ever need more than that. The remaining 384K was reserved for the video card, etc.
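For context, that 20-bit scheme worked by combining two 16-bit values; here is a small C sketch of the segment:offset calculation (the 0xA000 segment really is where the reserved video area began, just past 640K):

/* Real-mode 8088 addressing: physical address = segment * 16 + offset,
   which yields 20 bits and therefore a 1MB ceiling. */
#include <stdio.h>
#include <stdint.h>

static uint32_t linear(uint16_t segment, uint16_t offset) {
    return ((uint32_t)segment << 4) + offset;
}

int main(void) {
    uint32_t video = linear(0xA000, 0x0000);  /* start of reserved video memory */
    printf("0xA000:0000 -> 0x%05X = %u bytes = the 640K boundary\n",
           (unsigned)video, (unsigned)video);
    printf("Highest address: 0xFFFF:FFFF -> 0x%05X (just past 1MB)\n",
           (unsigned)linear(0xFFFF, 0xFFFF));
    return 0;
}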

You're Correct
by Hforman / October 25, 2009 11:37 AM PDT
In reply to: Original design

I guess some people here have been reading and believing Wikipedia, or are too young to remember that the first commercially available personal computer was the TRS-80 from Radio Shack, or to remember anything from Sinclair. 640K wasn't even the first available memory configuration; it was much smaller than that, and the only available storage was cassette tape. That was before the eight-inch floppies got popular.

Old Computer
by scleung / October 25, 2009 12:07 PM PDT
In reply to: You're Correct

My old Trash-80 had 48K with a tape drive to store the programs I typed in. I was the most high-tech person on the block.

You had a big TRS-80 Model 1
by quantum_force / October 30, 2009 11:03 AM PDT
In reply to: Old Computer

48K was luxury. My still-running TRS-80 had 4K of ROM and 16K of RAM (later more); getting to 64K required an external $299 expansion box, plus about the same cost again in RAM to populate it.

Commodore 64
by blauder / October 30, 2009 1:55 PM PDT

I was lucky: my old Commodore 64 had 64K of RAM plus 20K of ROM, AND it still works! Couldn't play Spore on it, though. Ha!

Well, when I was a kid...
by real village idiot / October 31, 2009 10:05 AM PDT
In reply to: Commodore 64

Hah! My VIC-20 only had 5k! And only let me use 3.5k as I recall. Talk about uphill both ways!

MITS 680b
by donaldjj / November 6, 2009 1:50 AM PST

My first computer had only 512 bytes of RAM, 24 toggle switches, and 24 LEDs for I/O. I later installed a Tarbell cassette interface, which did not work well.
Later I upgraded to a KIM-1 and later still to a Commodore PET. I disagree that the TRS-80 was the first home computer, as it didn't arrive until two years after my MITS 680b.
Don

MITS
by Walter L. Johnson / November 6, 2009 9:08 PM PST
In reply to: MITS 680b

It sounds like you and I started with the same computer company, except I think my model was earlier than yours: the MITS Altair 8800, where you had to read the LED lights as binary and convert them manually to base-10 numbers for results, unless you fleshed out the computer with $1,700 of optional components, which I didn't see as cost effective. Math that way was hard, especially using toggle switches to make entries as binary numbers.

My boss and I had assembled it from a kit, partly just for the novelty of doing so, and I did the soldering. I had to take it back for repair, though, because the assembly instructions didn't do a good job describing how to ground everything. At the repair shop I saw the first Commodore PET for sale, and I sold the MITS on consignment rather than waiting for something even better or cheaper to come out. I kind of wish I still had that computer as a display novelty, but I sold it when the Commodore PET came out, obviously a much better value for the home user. The Commodore PET, then the VIC-20 and Commodore 64, were much more user friendly, and I think I still have a VIC-20 in storage. Radio Shack (Tandy) popularized the TRS-80, but I never bought one, although I think it made some inroads in schools, and I know a co-worker who bought one.

The only good thing I can say about the early computers is that programming was more fun and less work in those days, but having so little memory and unreliable tape storage was not fun at all. Any pocket calculator today can do math more easily, without worrying about the number of bits in a register or trying to put too many into one.

The "Good Old Days" are usually exaggerated in nostalgia, but I can honestly say I would not voluntarily go back to them.

According to http://oldcomputers.net/altair.html, the Datapoint 2200 was the first personal computer, although it was actually a programmable terminal dating back to 1970. The Radio Shack TRS-80 came out in 1977, the same year as the Commodore PET, while the MITS Altair 8800 came out in 1975. What Radio Shack and Commodore did was popularize home computers, while the IBM PC in 1981 legitimized small computers for business and home use in general.

The good "OLD" days
by donaldjj / November 7, 2009 2:31 AM PST
In reply to: MITS

Yes, the 8080 was about 4-6 months older than the 680b. And I still have my PET, C=64, and associated hardware peripherals, including the 1.05MB floppy drive that came years before anyone else had one. The interesting thing about that drive was that it would not work with the special high-density floppies that the other drives required.
I still enjoy playing the old games, such as M.U.L.E. with its very catchy tune that you either love or hate. Fortunately, the Commodore emulators work nicely on the new machines.
Oh yes: a memory stick can be trained to emulate a Commodore drive and WILL work on a C=64. It takes recoding the serial port routines, though.
Don

JUNK code is mainly the problem.
by TreknologyNet / October 31, 2009 7:09 PM PDT

[SOAPBOX]
I started in domestic computing when 4K was 'standard', 16K was 'luxury', and 64K was a 'programmer's dream'. Code had to be tight and specific.

When the basic IBM PC/XT was released, it frequently had only 256K installed because, as quoted elsewhere, Bill Gates said, "No one would ever use 640K of memory."

It would seem that from that very moment on, MS set out to fill up as much memory as possible. When a new driver was installed in MS-DOS, it did not REPLACE the original; it was loaded ABOVE the original, which still sat in (now wasted) memory.

I had a simple self-written BASIC word processor, which ran on my Trash-80 quite nicely at a blazing 1.7MHz. Porting that program to the PC's GW-BASIC, it could not even keep up with the keyboard (and this on a machine running FOUR times faster). Compiling it improved the speed, but at a huge memory sacrifice: all the JUNK code embedded in the compiled version that is never going to be used.

Take a DOS Word document and load it into any version of WIN Word, then look at the huge increase in file size when you save it. Most of that extra information is unnecessary JUNK.

The argument for JUNK code is, "Well, by the time I've trimmed it down, there's a CPU that runs it faster anyway." This philosophy is now rampant in the programming community, with 90% of programmers relying on users having extra memory, hard disc space, and CPU speed rather than honing their own skills to keep resource usage to a minimum. For those who do trim their code, the compiler of choice probably produces a sloppy translation anyway.

With multiple layers (BIOS, OS, GUI, compiler/translator, and proprietary hardware drivers that have to negotiate several of these layers simultaneously), and with every one of them containing at least 50% junk code, it all conspires to chew up your computer's resources at an exponential rate.

You will also notice this phenomenon on Web pages. Ever-increasing graphic content aside, it can take 25K of data to transfer a fairly simple page from a server to your browser that a competent HTML coder could reduce to about 2K.

Therefore, when it's no longer worth your own time to write a program and you can purchase someone else's, I always recommend the maximum affordable memory, up to the capacity of the motherboard/CPU/OS combination (there's no point installing unaddressable memory).
[/SOAPBOX]

As a programmer
by wasyed / November 1, 2009 2:39 AM PST

As a programmer, I have to agree that a lot of programs could be trimmed down by A LOT! Efficiency should be a focus when writing a program, but I have to agree that most programmers don't think about it. I'm one of the few who actually try to be as efficient as possible. It does take a long time to trim code; I'm doing exactly that right now and have been working for days trying to get the code slimmed down so it will load faster.
I'm also working on a webpage on the side and am stuck with the same issue. I was supposed to release the page about two weeks ago, but it's awfully heavy in code, especially in the PHP and JavaScript portions of the pages. Well, back to work!

Hardware cost vs Labor cost
by scleung / November 1, 2009 5:47 AM PST
In reply to: As a programmer

I can't speak for other platforms, but for PCs, as long as labor cost exceeds hardware cost, on average you will not see software efficiency. I've been working with PCs for more than 25 years, and as hardware prices go down, software efficiency goes down with them.

PS: Some of you may still remember Turbo Pascal; the entire compiler and editor fit on a 360K floppy. I don't think we will ever see that kind of efficient application again in our lifetime.

What about 8K BASIC or Tiny BASIC
by donaldjj / January 14, 2010 8:47 AM PST

I can remember when 360K was overkill and bloat. The Tiny BASIC and 8K BASIC that I started with were not perfect by a long way, but they worked and were almost free.
BASIC was, and still is, a good language, especially when proper compilers are used. Just stay away from the bloated Microsoft versions whenever possible.
Don

heat spreaders
by goldilocks20 / January 22, 2010 2:18 PM PST

There was even hype about RAM with heat spreaders. But heat spreaders do work. Memory with heat spreaders is intended for gaming, ultra-high memory usage, and overclocking (making a PC run faster than its original design, which generates a great deal of heat).
For the normal PC user who surfs the web, plays an occasional game, and works on spreadsheets and the like, heat spreaders are not necessary. If they are installed, they are just an added benefit that you won't notice under normal usage.

No Way
by msgale / November 1, 2009 12:16 PM PST

I have been trying to respond to your soapbox, and instead of a point-by-point rebuttal I have decided to use a series of questions.
1. Define "junk code."
2. Assuming that "junk code" is code that never executes, prove its existence.
3. "The compiler of choice probably results in a sloppy translation anyway." Name the bad compilers that produce "sloppy code."
4. I seriously doubt that any word processor written for a Trash-80 had any of the capabilities of current word processors: WYSIWYG display, multiple fonts, color, table generation, footnotes and endnotes, spell checking, charting, graphics, mail merge, equation editing.

I don't see the answers to the legitimate questions
by desirawson / November 5, 2009 3:32 AM PST
In reply to: No Way

There seems to be a huge problem with techs who have been doing this since the invention of computers. They don't want to grow with the technology, nor do they want to learn about it so it can be utilized to its best possible functionality for all.

Twenty years from now, when they are all retired, maybe we'll get somewhere with the new generation that expects fast or they don't touch it, and will find out how to get it. There will be no cost issue or size issue involved.

I became certified because every single technician I took my computer to would do something else to it that wasn't there before and didn't help a thing. Most people don't want to learn about computers; they just want them to work.

As for RAM, there seems to be a huge discrepancy over whether 32-bit will use any more than 3GB - it doesn't. 64-bit uses 4GB.

You haven't read all the comments if you see no answers
by Walter L. Johnson / November 5, 2009 9:33 AM PST

Other threads have very clearly stated the RAM limitations for different operating systems. 32-bit versions of Windows cannot directly address memory greater than about 3.2GB, but no one sells that specific quantity as far as I know, which means you need to buy 4GB. However, 64-bit versions of Windows can address far more than the 4GB you stated. Basically, the addressing limit is 2 raised to the power of the number of bits, minus 1, which is why the first-generation PC could use only kilobytes of memory until 16-bit chips with a new version of DOS came out. Programmers developed workarounds that broke programs into segments of acceptable size, which paged in and out of addressable memory as needed, but 32-bit chips eliminated most of the need for that and made programs much more efficient. The catch is that less knowledgeable programmers got careless about using memory and other resources, which helped drive the development of 64-bit chips, the same register size used in the very old mainframe 6400 computer.

The worst thing about your remark is that you did not restate the actual question you say wasn't answered, nor did you propose a question that you wanted answered. The original question was essentially how much RAM to put in a new PC, and everything else flowed from that simple question.

As to old programmers and computer staff, I was one 20 years ago and observed many retiring. My impression is that most stayed very interested in keeping up with their field until the last year or two before retirement. You had to, or you became obsolete quickly.

The difference between young and old programmers is largely that the old programmers were more disciplined employees who were loyal to both their employer and their profession. The young ones often have not yet made the serious mistakes that cause companies losses and cost overruns, and when they do, they will just move to a different company instead of taking the heat. You will find as you get older that those who get ahead do so primarily on either extreme skill, like I had (but have mostly lost to time and non-use), or a combination of adequate technical skills and good people skills.

I have shared your experience with computer repair techs, although not with network techs, who want the latest gear to make their jobs easier. Even the techs trained by software and hardware companies put you through a first line of support staff who mostly pick answers from a manual before you get to talk to anyone with meaningful technical knowledge. By the third level you have actually reached someone who almost certainly can fix your problem, even all the way to India by phone. That is why I have always built and maintained the family PCs in my retirement years, even though it costs more initially than buying a PC off the shelf.

However, you have to be fair with the techs at repair shops too. Very rarely are components worth fixing rather than replacing, and it doesn't take much skill to swap components until you find the one that fixes the problem. What you lack at home is merely a supply of spare parts to plug into your PC.

Many years ago, when Windows came on floppy disks, one PC repair shop introduced me to a serious virus infection simply because they got careless and did not make their install discs read-only. They picked up a virus from a customer who had caught it on the internet and brought the machine in for a hardware repair. The shop then gave it to others, including at least one business customer, and went promptly out of business.

In the 20-year future you forecast so casually, I frankly doubt that anyone will have a PC. Some multifunction device on your person and at home will likely combine electronics control, energy management, some home lighting, clocks, phones, and computing, with tasks voice controlled, at least for households with the latest and greatest technology. Right now doing these things is not standard and is very costly, but it won't be in 20 years. At work, your monitor will probably be sewn onto your cubicle partition wall.

Fast comes with Moore's law and is thus a function of elapsed time, so everything electronic will be fast in 20 years. Moore's law on the doubling of processor speed will one day break down when we reach quantum-state computing, but it won't be in the next ten to fifteen years. It is, for example, already possible to make clothing that is solar powered and displays whatever the user wants; it just is not cheap or fashionable yet, and cleaning it is more challenging than simply using the washing machine and dryer.
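For what it's worth, the "everything will be fast" claim is easy to put numbers on. Assuming the commonly quoted doubling period of about two years (an assumption, not a figure from this thread), a small C calculation shows why 20 years changes everything:

/* Compound doubling: 20 years at one doubling per ~2 years is 2^10. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double years = 20.0;
    double doubling_period = 2.0;  /* years per doubling, assumed */
    double factor = pow(2.0, years / doubling_period);
    printf("Over %.0f years: about %.0fx improvement\n", years, factor);
    return 0;
}

That comes out to roughly 1000x, which is why speed complaints tend to age badly.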

The questions I was referring to were addressed to SOAPBOX
by desirawson / November 5, 2009 10:13 AM PST

The questions that weren't answered were in the thread addressed to SOAPBOX from "msgale."

Mr. Johnson: I appreciate your knowledge, but you have admitted, unwillingly for some reason, that most tech professionals close to retirement do not want to learn anything new and want to deal with everything "the way they used to" (something I now know personally from taking the certification courses in order to build my own). I live in a small town where the median age is 68, so that ought to give you a clue.

I am not disagreeing with anything you said, except that "msgale" had a few valid questions, but no one agrees on the answers, which is why he is so confused. If you've read the thread, I am sure you would agree. I am also running 4GB on a 32-bit system, and I do NO gaming at all, but my system resources are taxed because of the number of programs I need open at one time and the reality that the full 4GB is not available to me.

I do not miss working in I.T. anymore. As a matter of fact, most employers have no knowledge of what is needed, and I.T. is the bane of their existence (they don't want to spend any money, but everything must run at maximum speed/level/quality - and NOW).

For those who are trying to get answers on the fly on this website, there are so many conflicting answers, and most of them are over their heads; they are just trying to get a firm answer to questions that (as you well know), if you do not do this for a living or have not attended the expensive certification classes, can be mind-boggling. They just want their computers to work, and work right, without maintenance, updates, or additions (not possible).

Computers are not magic, but they are advertised as such. The worst of it is that the very programs that protect novice users from viruses and from accidentally ruining their computers are the same programs that slow their beautiful new fast machines down to a crawl (McAfee being the biggest legal virus and, until recently - hopefully due to consumer education - costing a fortune).

Actually not!
by msgale / November 5, 2009 12:48 PM PST

The maximum amount of memory on a computer is 2 raised to the power of the number of address bits. The address range is from zero to 2 raised to the power of the number of bits, minus one. For example, the maximum addressable memory of a 32-bit machine is 4,294,967,296 bytes, and the address range is 0 to 4,294,967,295. Since a PC uses memory-mapped I/O, not all of those addresses are usable for RAM.
I am assuming your reference to a 6400 computer is a CDC 6400, the baby brother of the CDC 6600. It had a 60-bit data word, not 64 bits; however, the address register was 17 bits, which meant it had a maximum memory size of 131,072 words. The 6400 was word addressable, not byte addressable. Note too that a character was six bits, not eight.
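That arithmetic is easy to verify for the bus widths mentioned in this thread. Here is a small C check (the widths chosen are just the 16-, 20-, and 32-bit examples discussed above):

/* Capacity is 2^bits addresses, numbered 0 through 2^bits - 1. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    int widths[] = {16, 20, 32};
    for (int i = 0; i < 3; i++) {
        uint64_t capacity = 1ULL << widths[i];
        printf("%2d-bit: %llu addresses, range 0 to %llu\n",
               widths[i], (unsigned long long)capacity,
               (unsigned long long)(capacity - 1));
    }
    return 0;
}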

I got careless
by Walter L. Johnson / November 8, 2009 6:55 PM PST
In reply to: Actually not!

I got careless and accepted the number given elsewhere in the discussion without calculating the value, which is actually 2 to the power of 31, presumably with one bit still kept for the sign of the number.

As to the CDC, it was just to make a point. You are right about the 60 bits, but the choice of six or eight bits per character depends on the use of a 64-character set or Extended ASCII (256 characters). Today there are even more choices, but the CDC 6400 ran with the 64-character set.

What was most annoying about the CDC 6400 (and 6500 and 6600) was that they were so slow to print a graphic that I wrote it to a monitor instead, and the monitor was filmed on microfilm, which could then be printed. There was a CalComp plotter, but it would take half an hour per graph. Back then, the earlier IBM 1401 was used as a card sorter at the computer center I used.

My boss had an even rougher introduction to computers: hard-wiring computer programs with plug-in cables.

You want to see JUNK code?
by TreknologyNet / November 8, 2009 9:57 AM PST
In reply to: No Way

Write the simplest BASIC program:

10 CLS
20 PRINT "Hello world!"

Now run it through a compiler and just look at how much JUNK there is. Why are there DOS messages about disk errors? Why is virtually every error message that DOS can ever generate for itself buried inside your compiled program?

For such a simple program, DOS loads the file and should handle all disk errors itself. Proper execution requires only video access, and then the program terminates. So why is the compiled version something like 50K?

Sorry, but you are wrong
by msgale / November 8, 2009 11:41 AM PST

An error message for any possible error has to be included. If an error can happen, proper design requires handling it; how else would the user know what to do if the program fails?

An error message is "possible errors"
by wasyed / November 8, 2009 12:24 PM PST

Sorry, but I think you might be wrong, TreknologyNet. When you get an error during programming (in any language), it usually gives you a number of possibilities, and of course they are often tied together: one error can generate another simply because the second error depends on the first, and so on. Many times when you get a bunch of errors, the tool is throwing possibilities at you; it is letting you know the cause could be any number of things. Error messages are there to help a programmer determine what's wrong.
Sorry, but this is not what junk code means.

sorry
by wasyed / November 8, 2009 12:34 PM PST

Sorry, I just realized I repeated what msgale said above me... Well, at least this way I know someone agrees.

The OS is already dealing with those errors.
by TreknologyNet / November 8, 2009 12:51 PM PST

If I want to deal with those errors inside my program, I don't get messages; I have to look for error "codes" and process them accordingly.
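For readers following this exchange, the modern division of labor looks something like the C sketch below: the program tests a numeric code, and the OS's C library supplies the message text, so no message catalog needs to be baked into the executable:

/* The OS returns an error *code*; strerror() maps it to message text. */
#include <stdio.h>
#include <string.h>
#include <errno.h>

int main(void) {
    FILE *f = fopen("no_such_file.txt", "r");
    if (f == NULL) {
        fprintf(stderr, "open failed (errno %d): %s\n", errno, strerror(errno));
        return 1;
    }
    fclose(f);
    return 0;
}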

I think IBM had great error manuals
by Walter L. Johnson / November 8, 2009 7:05 PM PST

I never liked getting errors in programming, but it seems harder now to diagnose errors with Windows than it was to diagnose mainframe errors. IBM described basically every possible error code, in enough detail that no one had to guess which system was involved. I once even diagnosed a floating-point processor error that was due to a hardware failure the computer center didn't know about, because only a few users used that component.

Now it seems that unless Microsoft has experienced an error and added it to the Knowledge Base, decoding errors is a guessing game. Let me know if you know of a comprehensive error-code manual like IBM had.

Even today you always have to start with the first error and work your way through the others, which aren't obvious in retrospect, one at a time until all are gone, as wasyed says. Each programming language is prone to different flavors of basically similar errors. The logic errors are by far the hardest to fix and are rare among great programmers.

Errors
by msgale / November 8, 2009 10:18 PM PST

I believe it was called "Error Messages and Codes." DEC had a similar manual for the VAX computer. UNIX was a little different:

"Dennis Ritchie, designer of the C programming language, revealed his design for a new model of car today. Instead of the multiple confusing gauges on the dashboard, there is a single light that lights up with a '?'.

"'The experienced user,' Ritchie says, 'will usually know what's wrong.'"

for info on "old" computers
by jonah jones / November 13, 2009 6:55 PM PST
In reply to: Old Computer
TRS-80 = NOT the first commercially available computer
by Alain Martel1 / October 31, 2009 8:16 AM PDT
In reply to: You're Correct

When I bought my Apple ][+ with a full 48K of RAM, the TRS-80 was not available; it only came along about two years later.

At the time, there were MANY different computers on the market: Atari, Texas Instruments, Commodore PET and CBM, and many, many others. Most had less than 48K of RAM, sometimes only 2 to 8K. Every brand used different media for program and data storage and a different version of the BASIC language, and they were essentially incompatible with one another.

At the time, computers were 8-bit machines that could address at most 64K of total memory through a 16-bit address bus.
