155 total posts
(Page 1 of 6)
The integrated graphics chip in the new PC will be far faster than what you have in your old PC.
If you are unhappy with the performance of the new PC, you can always add a graphics card later.
If the integrated graphics uses 256 MB of the PC's memory, make sure the new PC has plenty of RAM, preferably 4 GB or more.
Ideally the motherboard should allow you to upgrade to at least 8 GB.
re: Buying a new PC, will I regret the integrated graphics
Integrated graphics chips are not for gamers. They will handle the basic Windows GUI and can play games, though only older ones that aren't as graphically intensive as new titles.
If you do want to play new, modern games (from 2007 onwards), you will need to buy a graphics card. Most, if not all, of them go into a PCI Express slot on your motherboard and handle all graphics on the computer.
Think of having two hands. Two hands can type a document on a computer with ease. If you tell those two hands to type on a computer and draw a picture, it will be able to do both, but much slower and with sub-par results. Think of getting a graphics card as getting an extra five pairs of hands. The two original ones will still be typing a document, but the ten extra ones can be drawing you a magnificent picture.
re: Buying a new PC, will I regret the integrated graphics
Nice comparison, and you are correct.
Integrated Graphics Cards
What a wonderful analogy by wkw427! As a teacher, I will use it when explaining the merits of separate graphics cards to staff. You're absolutely right, by the way!
Not so with all humans (see under creatures)
That two hands can type with ease certainly doesn't apply to me; a one-finger dolt be I. And yet I can type and draw at the same time, which must mean I don't need a graphics card at all; it's integrated in my tiny brain.
If it is a high-end mobo; even SLI integrated graphics are possibly built in. Just a quick look over at the online stores will confirm my claim.
The beauty of doing that is many of them are SLI bridge capable also! So you can put another compatible SLI card in an available express slot and voilà! You've got dual card capability, with one of them already built in!
I would only do this on a 64bit system so the bus would be properly used to stream the data in an efficient way. But then, I'm not a gamer (YET), so I would only use 64bit games anyway, (so far).
JCitizen, read what you wrote again
SLI is 2 or even 3 dedicated graphics cards. You cannot bridge the onboard graphic chipset with an add-on graphics card for dual support. In fact, you would disable the onboard graphics chip in order to use a dedicated video card. Also, most motherboards in store-bought systems are not high-end, but usually entry-level motherboards. An important consideration to look at before buying one of these systems is whether it even has room to add a video card in ... some don't.
Not always true...
A limited number of compatible motherboard chipsets and GPUs using Hybrid SLI / GeForce Boost can indeed combine the power of the onboard graphics and a dedicated GPU.
There is a lot more to consider besides the amount of RAM dedicated to integrated graphics. Most onboard GPUs would not make good gaming machines. Be sure that the machine you buy has a dedicated slot for an add-on card. If you find the onboard graphics are not up to snuff, it is a simple matter of adding a dedicated card and disabling the onboard graphics. Most newer motherboards will detect the new card; if not, you may have to go into the BIOS and disable the onboard graphics there. I would recommend at least 512 MB of RAM for a dedicated card, and doing a bit of research. The biggest makers are ATI and Nvidia. Both have good lower-priced cards, but Nvidia's drivers seem to support games developed with OpenGL better than ATI's, so this is something to consider.
Integrated Graphics Card
First of all, you asked two separate questions. The answer to the first is that shared memory means the onboard graphics card shares memory with the system RAM. For example, if you have 2 GB of RAM and the graphics card uses 256 MB of shared memory, then 256 MB of the system RAM is used for the graphics card, and the usable system RAM for applications is consequently reduced to 1.75 GB. The second question asks whether the integrated graphics card is "up to snuff". This depends on several factors, including the onboard graphics chipset and the quality of the drivers. For most applications, the onboard graphics card will perform well. However, for very graphics-intensive applications such as video editing and high-performance gaming, you would probably want to upgrade to a dedicated video card with a better graphics processor and drivers, and possibly more memory, such as those from ATI and Nvidia. This will also free up the 256 MB of shared memory for applications to use.
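The shared-memory arithmetic above can be sketched in a few lines. This is a toy illustration only; the 2 GB / 256 MB figures are just the example numbers from this post, and real systems may reserve a different amount:

```python
# Toy illustration of "shared" (integrated) graphics memory:
# the video chip borrows a fixed slice of system RAM.

TOTAL_RAM_MB = 2048      # 2 GB installed, per the example above
SHARED_VIDEO_MB = 256    # reserved for the integrated graphics

usable_mb = TOTAL_RAM_MB - SHARED_VIDEO_MB
usable_gb = usable_mb / 1024

print(f"Usable system RAM: {usable_mb} MB ({usable_gb:.2f} GB)")
# 2048 - 256 = 1792 MB, i.e. 1.75 GB left for applications
```

Swapping in a dedicated card with its own memory gives that slice back to applications, which is the "free up the 256 MB" point made above.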
Games = Dedicated graphics card
If you plan on playing light or moderately taxing games (such as Flash or web games), then you can probably get away with an integrated graphics card on today's CPUs.
If however you are looking to play more graphically demanding games (such as Call of Duty, Assassin's Creed, World of Warcraft, Crysis, etc.) you definitely need a dedicated graphics card.
The reason is that integrated graphics relies more heavily on the CPU to do all the graphical calculations and rendering, which can bog down the computer's overall performance. Its memory is also usually shared with main memory, which again can affect performance.
A dedicated card has its own GPU and memory to do all the video calculations and rendering, which in turn frees up the CPU for other tasks. This creates a much more efficient system and gives a huge boost in overall performance, especially with the more graphically demanding games.
Also, if you do go with dedicated, do it right! Don't skimp on the quality of the dedicated graphics card's chipset. A low-end or older dedicated graphics card can be almost as bad as the newer integrated graphics chipsets.
PC from Costco
I have built three PCs. I always thought I should buy the best graphics card I could find. Then I decided to buy a complete PC from Costco (quad core with an integrated card). I did add more memory and use additional external hard drives. I can't tell any difference. I don't play high-graphics games. I am a retired grandmother; my kids and grandkids can't tell any difference either.
integrated graphics card question
The bad news: it won't run those graphics-intensive games.
It will most likely handle all the general tasks like email, web surfing, etc.
The good news: as long as the motherboard in the system also has a PCI slot, or even better a PCI Express slot (or better still, an available PCI Express 2.0 slot), you can upgrade to a good graphics card in the future. A decent video card that costs 200-300 dollars today will cost 60-125 dollars a year from now!
Integrated Graphics Cards
I bought a new HP computer with the latest integrated graphics a couple of years ago. It worked great until the graphics demands of new games got bigger. I had a very fast CPU, but the new graphics were beyond the capacity of the integrated graphics card.
No problem, I thought... I will just buy a new graphics card and turn off the integrated chip.
There was no slot on the motherboard to fit a new graphics card inside the computer!
I had no option other than buying a brand new computer.
A computer I was sure could be expanded for future demands.
It Apparently Depends on the HP Model
First of all, all HP computers have integrated graphics that use system RAM instead of having their own memory. Also, if you install a graphics card, you cannot run both the integrated graphics and the PCI card. After installing the PCI card, you have to go into the Device Manager and disable the integrated graphics, otherwise all sorts of weird things will happen. After installing the card and its drivers, and before disabling the integrated graphics, you will have to shut down the computer, disconnect the monitor from the integrated graphics' connector, and connect it to the connector on the new card. Then boot into Windows and disable the integrated graphics in the Device Manager and reboot.
Regarding AlanG59's post, I have an HP computer, too; but I didn't encounter the problems that AlanG59 encountered except the need for a more powerful graphics card with its own GPU and memory.
I do a lot of graphics work, including video editing and compilations, so the integrated graphics were inadequate. It's true that HP only uses integrated graphics that meet the needs of the software and hardware that come with the machine. If you need more graphics rendering capability, then you will have to add a graphics card. To solve my problem, I bought a new Nvidia graphics card. When I opened the computer's case, I found that I had two empty PCI slots. It was a simple matter of installing the drivers and graphics card, connecting the monitor to the new card, and disabling the integrated graphics. The only problem I have had related to the new graphics card was when I installed Linux in a dual boot configuration. I could not get the Nvidia card to work with Linux, no matter what I did. Nvidia did not have Linux drivers for my card, and the application that is supposed to allow Linux to use the Windows drivers did not work. All I got was a black screen. It was either uninstall Linux or uninstall the Nvidia graphics card. I opted to uninstall Linux rather than give up the improved graphics that I had gained from the new graphics card.
So, if you buy an HP computer and you are going to be playing graphics-intensive games, then you may as well include the price of a good graphics card.
The user manual should give you the specs for the computer, including the total number of PCI slots and the number of PCI slots that are free (as in not being used). Also, a glance at the back of the tower will be an indicator. The number of blank PCI card knock-outs will usually be the same as the number of available PCI slots.
You can also go to HP's Web site before you buy the machine and look up the specs for the computer you are thinking of buying.
Get a computer with a separate graphic card
I have an HP with an integrated graphics card. It was slow and used system memory. I plugged in a new graphics card and got much cleaner images, faster rebuilds, etc. But... you can't depend on the Device Manager to keep the old integrated card disabled, and you can't trust a machine whose nature has been preset to work with an integrated card. You upgrade this, change that, load a new game, etc. Preferences get changed automatically... and the old integrated card is activated, stealing memory, slowing things down, etc., or the machine out of habit supports the old card as well as the new. If you know you're going to run a faster separate card eventually, get a machine running a separate card already. It'll be in its nature, and you may avoid the lovely conflicts that arise among plug-in cards.
All you had to do was buy a newer and better motherboard and transfer everything to it.
It depends, laptop or PC? Enough space for a good card?
The answer to your question if you're buying a laptop is pretty easy: yes, you're pretty much stuck with whatever graphics capability comes with it, whether it is integrated or a separate chipset. Laptops have very limited expandability.
For a desktop, the answer is more interesting. First off, no, you aren't limited to integrated graphics as long as you have one or two PCI-e slots and a decent power supply (PSU). You can add a gaming graphics card, assuming that you have a roomy enough case (take a look at the dimensions of an ATI Radeon 5850, for example; good graphics cards generally only use one PCI-e slot, but are double-width). Chances are when you do this you'll also have to upgrade the standard PSU... they're usually only in the 300 W range.
For example, when I upgraded the graphics card in my Dell desktop PC, I also upgraded the PSU to 650 W, since the stock PSU was only 375 W and a lot of graphics cards require about 500 W. I also needed to make sure that I had space in the case for the graphics card I was going to buy. Cooling can also be a problem, but most OEM systems should be adequately cooled, and the graphics cards all have built-in fans as well.
So, within the above parameters, yes, you'll be able to upgrade your graphics capabilities if you get a decent PC with enough extra space to begin with.
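As a rough back-of-the-envelope version of the PSU question above, the check could be sketched like this. The base-system and card wattages and the 20% headroom margin are illustrative assumptions, not vendor figures; always check the card maker's actual PSU recommendation:

```python
# Rough sanity check for PSU headroom when adding a graphics card.
# All wattage figures below are illustrative assumptions.

def psu_adequate(psu_watts, system_base_watts, gpu_watts, headroom=0.2):
    """True if the PSU covers the estimated draw plus a safety margin."""
    estimated_draw = system_base_watts + gpu_watts
    return psu_watts >= estimated_draw * (1 + headroom)

# A stock 375 W PSU vs. the upgraded 650 W unit from the example,
# assuming ~300 W base system draw and a ~150 W card:
print(psu_adequate(375, 300, 150))   # False: no headroom left
print(psu_adequate(650, 300, 150))   # True after the upgrade
```

This mirrors the poster's experience: the 375 W stock supply left no margin, while 650 W covers the card comfortably.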
Integrated vs. dedicated.
With regard to graphics processing units (GPUs), there are generally two different types. One type, like that in the computer you're considering, is an integrated GPU. A useful way of thinking of integrated video is with the word 'shared': an integrated card shares some of your computer's memory. So, say that you have 2 GB of RAM in your new PC; 256 MB of this is portioned out to the video card.
The other type is a dedicated GPU. These are video cards designed to have their own memory, some these days reaching 1 GB. The advantage is that, as their name suggests, they have their own dedicated memory, and they are generally designed to be much more powerful.
It's difficult to give you full advice without knowing specifically which integrated chipset the PC uses. What I can say is that an integrated card will never give you the same graphics processing power as a dedicated one. Because dedicated cards have their own memory, they are not sapping memory from your PC. Consequently, games will run much more efficiently on a dedicated card. If you want to play more recent, graphically intensive games, an integrated card will only suffice if you turn all the settings down, somewhat defeating the point of playing a recent, graphically intensive game.
Remember that when you plug your monitor into the tower unit, the connection is made at your graphics card. These things are not just for powering games; if you're going to be watching high-quality movies, video- or photo-editing, a more powerful dedicated graphics card will always put you in better stead, and let you do more.
The pure joy of a desktop is that, generally speaking, you're free to upgrade it whenever you want, and upgrading RAM, sound cards, graphics cards and the like is remarkably straightforward (although note that some manufacturers consider user upgrades a breach of the warranty). A common misconception is that graphics cards cost the earth. Yes, the newest and most powerful ones do, but plump for one of the better models from last year and you'll get great performance with plenty of bucks left over. Before you do this, you do need to check what type of motherboard you have and match it up to the cards available. But no, plumping for the cheaper option now does not mean you'll be stuck playing Solitaire forever!
Dedicated vs. integrated
Hoping to upgrade? Well now, let's be a little careful.
I once had an ACER computer that for the price gave me everything and more than I needed.
One problem: most or all of the components are hard-wired in. You cannot, yes, I repeat, you cannot replace or upgrade them without some major surgery (except the memory).
I'm not sure that it's true with all ACER computers, and I don't know why they do it that way. It seems to me that it would be easier just to assemble parts, and let it go at that.
In the final analysis, make sure you contact the COMPUTER MANUFACTURER (not always the easiest thing to do) and find out whether the computer you are interested in is upgradeable, and which components are upgradeable.
Now Acer does have a good customer service website, so I've not had a problem adding on some components that hang on outside the computer.
After five years it still works fine, but it does have its problems occasionally, like not wanting to turn on at times. (You wouldn't believe what I have to do to get it started.)
re: ACER ASE500-U-P9300
Integrated graphics doesn't mean that one cannot simply (in most cases) add a second graphics card in an open PCI slot and disable the graphics that came attached to the motherboard. Then you just need to know which VGA port to attach your monitor to.
Integrated vs. Dedicated
I said it does not matter because it is something of a silly question for the vast majority of cases. For all practical purposes, they all come with integrated graphics. If you buy an upgraded computer with a dedicated card from most of the "big boys", they stick a card in an available slot, disable the integrated one, and if you are lucky they put caps over the integrated connectors so you don't plug the monitor into the wrong hole. If you did not upgrade or buy a higher-end model, the integrated card stays enabled, you paid less money, and later you can stick the card of your choice in an available slot (and, if you are good, cap off the unused holes). Try to order a custom upgraded computer from HP or Dell etc. and see if you can get them to remove the integrated one!
Dedicated all the way
Even if you're on a tight budget, you can purchase dedicated cards for $50 and under that will give you everything you need and more. Also, you're stuck with a VGA monitor connection on almost all integrated GPUs. A DVI (digital) connection, found on almost all dedicated GPUs, is by far the way to go with today's great picture-quality options. Remember, you do not have to remove your integrated setup to add a dedicated card at any time; that's what I did, and if my GPU ever breaks I can still go back to the onboard graphics.
PCI Express Graphic Card
I would not go with integrated graphics unless absolutely necessary. Many of the newest games require 1 GB of graphics memory. If you think you are going to want to play games, go with a graphics card that has 1 GB or better. Some avid gamers run two or four cards in the same computer! You don't have to have the latest and greatest card either. It should not cost you a whole lot more to get a desktop with a PCI Express graphics card.
I would almost certainly avoid an integrated graphics card
When I got my HP PC back in 2004, it shipped with an integrated graphics card. I didn't know what that meant. Out of my own curiosity, I came to realize that a standalone graphics card is always better than an integrated one in terms of performance. With a standalone card, the burden on the CPU and the amount of RAM consumed are both reduced: the GPU takes over the graphics processing, and there is dedicated memory for graphics, so your RAM can be used for other tasks. If you don't use your PC for resource-intensive jobs, an integrated graphics card will do just fine. Before running software, it's good practice to check its system requirements. I'm not a hardcore gamer, but I do use AutoCAD, Adobe Flash and Photoshop, and they do consume a lot of resources.
The major limitation of integrated video is not the 256 MB of memory but the processing power, which will definitely not handle the latest games, nor a good many older ones. If the computer has an empty PCIe slot, then you can add the video card of your choice at any time. There are just two things to check: (1) is there room for the particular card you want? (some are quite long); (2) some high-end cards draw a lot of power, so you should check that your power supply unit is adequate.
I bought my computer with integrated graphics because the price was good. I used it for everything but games, no problem. Several months later, I added an Nvidia 9800GT card. This has been good for all the games I've thrown at it so far.
Integrated graphics: Lee is correct
It has been mentioned numerous times that memory is NOT the key component for graphics; the graphics chip, or GPU, is the key component. Now, I will also agree that your question is "mute": most PCs come with integrated graphics, and most motherboards have a PCI-e slot or regular PCI slot where you can install your own graphics card... so search the net for that PC and find out whether the motherboard has an extra slot for graphics. And yes, a mid-level graphics card a few years old, between $80-150 (w/rebate), will handle all your needs... unless you are a gamer, and gamers are never satisfied; they will always upgrade.
When I bought my eMachines PC (now retired) back in 2003, the integrated graphics handled everything I threw at it and played all my son's games, which were a few years old. In six months I purchased a separate card so he could play newer games. And my current system... I ordered it with a graphics card so he could play all his games. So find out if it has a slot and don't worry. When you buy a card, to play it safe buy one with 512 MB of RAM, and if affordable buy 1 GB to ease your worries.
Moot, not mute! :-)
Not trying to be one of those PIA grammar-police posters, but since you put it in quotes, I was compelled to correct it. I think you meant MOOT (as in no longer applicable or relevant) rather than mute. It's an otherwise intelligent post! I know I appreciate it when someone points out that I've used a word wrong; I don't appreciate it when it's just a typo, as we all do that when our fingers can't keep up with our brains!
see...I did it myself <holding head in shame!>