Windows Vista forum

General discussion

How bad is it to continue to use your pc during deep defragm

defragmentation* I am using System Mechanic to do a deep defrag (this takes days). The last time I did it and re-analysed the system afterwards, I found it was just as fragmented, if not more so.

Discussion is locked
Two things

Two things:

1: Defragmenting really doesn't give any kind of a performance boost in probably 99% of common computer uses of the average person, and really is just unnecessary stress on the drive

2: If for some reason you're doing a lot of something that falls into that 1% exception area, then the more you use your system during a defragmentation session, the more likely files are going to be locked when the defragmenter wants to access them, because some other program is using them.

So basically, if you're not running a large database cluster, a high availability web server, or doing a lot of high end A/V editing, you're better off not even bothering with defragmenting. It will only offer meager and fleeting performance gains to programs where the performance is based on how fast the hard drive can retrieve data. As I said, that rules out at least 99% of all tasks commonly performed by your average user. For the overwhelming majority of us, it's a waste of time, effort, energy, and will send our hard drives to an early grave needlessly.

Of course I have an open challenge to anyone to find evidence to the contrary. So, if you can find a study done by some company that doesn't themselves SELL a defragmenting program, or that wasn't PAID for by a company that sells a defragmenting program... AND this study compares relative performance to things like web browsing, word processing, reading email, and listening to music files... I'd be interested in seeing it. I'm rather skeptical that any such studies even exist. I suppose it's possible a few were commissioned, but when no one could produce the results defragmenter peddlers wanted, they just buried them.

I slightly differ

In reply to: Two things

"1: Defragmenting really doesn't give any kind of a performance boost in probably 99% of common computer uses of the average person, and really is just unnecessary stress on the drive"

99% is a bit of an exaggeration. While I agree that running a defrag utility isn't going to bring huge across-the-board performance benefits, I would bet that in a non-contrived situation one would notice a difference more than 1% of the time. Due to the nature of BitTorrent, virtually anything downloaded with a BitTorrent client will be fragmented. Some files will be in hundreds of fragments, to a dramatic enough degree that playback of video files will actually skip. If you think that downloading video files off a torrent site is a contrived example, then you don't realize how popular BitTorrent traffic really is online.

"So basically, if you're not running a large database cluster, a high availability web server, or doing a lot of high end A/V editing, you're better off not even bothering with defragmenting. It will only offer meager and fleeting performance gains to programs where the performance is based on how fast the hard drive can retrieve data. As I said, that rules out at least 99% of all tasks commonly performed by your average user. For the overwhelming majority of us, it's a waste of time, effort, energy, and will send our hard drives to an early grave needlessly."

With modern NTFS-partitioned drives, fragmentation isn't nearly as much of an issue as it was on older FAT16- or FAT32-formatted drives, but an occasional defrag isn't likely to send hard drives to an early grave. I have drives that are 8+ years old and still working, and I have defragged them frequently for years.

Even common sense tells you that occasional defrags shouldn't harm drive life much, even if they aren't hugely beneficial. Most of the wear and tear on a drive comes from seeking. Fragmentation increases seek time, which means the drive has to work longer before it can spin down, and every second less the drive is spinning is one less second the motor is moving toward the grave. While frivolous use of defrag utilities no doubt shortens the life of a drive, most users will replace the drive long before it even approaches its MTBF. I encounter a *lot* of average users who have HDDs that are 5-8 years old and still work because their computer is hardly ever on. Most of the HDDs I have seen fail prematurely didn't belong to people who were defragging the drive every other day; most belonged to people who had BitTorrent clients downloading data nearly around the clock, so the drive almost never spun down. If defrag kills so many drives, where are all the articles about people killing HDDs left and right by defragging often? The evidence for this claim is sorely lacking.

"Of course I have an open challenge to anyone to find evidence to the contrary. So, if you can find a study done by some company that doesn't themselves SELL a defragmenting program, or that wasn't PAID for by a company that sells a defragmenting program... AND this study compares relative performance to things like web browsing, word processing, reading email, and listening to music files... I'd be interested in seeing it. I'm rather skeptical that any such studies even exist. I suppose it's possible a few were commissioned, but when no one could produce the results defragmenter peddlers wanted, they just buried them."

The notion that a study is flawed because of who funded it is dubious logic. Some of the dumbest computer advice I have ever heard has come from people who are on no vendor's payroll, directly or indirectly. Attack the message, not the messenger. Furthermore, any study would also have to consider playback of video downloaded off the internet, where defragging the file being played back can certainly be noticed, even in a double-blind test.

Furthermore, why would Microsoft spend money to license a defrag utility if 99% of their users wouldn't benefit from it? Even if the licensing fee were only a few cents per Windows license, that would still easily amount to millions of dollars in lost profit.

Defragging one's computer generally won't result in dramatic performance gains, but I think you are marginalizing this oft-repeated tip a bit more than is justified.


In reply to: I slightly differ

Interesting, a well reasoned and intelligent response. Usually all I get are a bunch of emotional knee-jerk reactions where try as you might, you can't figure out what the person's point was, assuming there was one. So, for that alone, I thank you.

Okay, you're right, 99% may be a bit of an exaggeration, but not by much. And honestly, any decent BitTorrent client will allocate the space for the file before it even begins downloading. I know Azureus/Vuze and uTorrent both do this.
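As a rough illustration of what such preallocation looks like (a sketch, not any client's actual code; `preallocate` is my own name, and real clients typically ask the OS to truly reserve blocks, e.g. via fallocate, since a plain truncate just creates a sparse file on most filesystems):

```python
import os
import tempfile

def preallocate(path: str, size: int) -> None:
    """Create the file at its final size before any data arrives.
    NOTE: truncate() makes a sparse file on most filesystems; real
    clients use fallocate()/posix_fallocate() so the blocks are
    actually reserved (and thus more likely to be contiguous)."""
    with open(path, "wb") as f:
        f.truncate(size)

# Usage: reserve 10 MB for a download before the first piece arrives.
path = os.path.join(tempfile.mkdtemp(), "download.part")
preallocate(path, 10 * 1024 * 1024)
print(os.path.getsize(path))  # 10485760
```

The point is simply that the full extent is claimed up front, so pieces arriving out of order don't each grab whatever free block happens to be nearest.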

As for my comment about sending drives to an early grave: most of the people who would defragment way too often are people who wouldn't really know how to diagnose that it was their excessive defragmenting that helped kill the drive. Pretty much every staff reporter for CNET, and almost the entire rest of the major computing rag outlets, clearly needs a giant post-it note to remind them where the power button is on their computer. They probably have to use Wikipedia to figure out what a hard drive is.

I still personally wish Microsoft would "innovate" an idea that's been in place on Linux for several years now. Instead of using a simple FIFO queue for writing files, the operating system looks for the first available chunk of space that will fit the ENTIRE file. Failing that, it breaks the file up into as few pieces as possible. The result is an effectively self-defragmenting filesystem. On any Linux system I've ever run, even when I had large amounts of source files sitting around because I would compile absolute bleeding-edge stuff (up to the hour) and use it, I never had fragmentation levels over 3%. One would think that such a "highly innovative" company like Microsoft, with its vast stockpiles of cash, could "innovate" a feature like this. Then again, "innovate" seems to have been co-opted by Microsoft to mean "copied" or "stolen", just as they've completely bastardized the word "genuine".
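A toy sketch of that allocation policy (my own illustration, not any real filesystem's allocator): try to find one free gap big enough for the whole file, and only if that fails, split into as few extents as possible.

```python
def allocate(free_gaps, size):
    """Pick extents for a file of `size` blocks from `free_gaps`,
    a list of (start, length) runs of free blocks sorted by start.
    First-fit on the whole file; if no single gap is big enough,
    fall back to the fewest pieces by taking the largest gaps first."""
    # First pass: the first gap that fits the entire file.
    for start, length in free_gaps:
        if length >= size:
            return [(start, size)]          # one contiguous extent
    # Fallback: greedily consume the largest gaps to minimise fragments.
    extents = []
    for start, length in sorted(free_gaps, key=lambda g: -g[1]):
        if size == 0:
            break
        take = min(length, size)
        extents.append((start, take))
        size -= take
    if size:
        raise OSError("disk full")
    return extents

# A 5-block file fits whole in the 8-block gap:
print(allocate([(0, 3), (10, 8), (30, 4)], 5))   # [(10, 5)]
# A 14-block file cannot fit in any one gap, so it is split into three:
print(allocate([(0, 3), (10, 8), (30, 4)], 14))  # [(10, 8), (30, 4), (0, 2)]
```

Files written this way start life unfragmented whenever a large-enough hole exists, which is why such filesystems rarely need an explicit defrag pass.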


I think we can all agree that bad advice isn't hard to come by no matter where you look. The reason I gave the restriction of it not being paid for by some company that sells a disk defragmenter, is because if they're paying for it, they're going to want specific results. Namely that defragmenting is this absolutely wonderful performance enhancing thing that everyone should do. If they don't get those results, they'll likely shop around to look for a testing company that WILL give them the results they want, and all the other studies that show little to no gain will be buried and never see the light of day.

Microsoft's various anti-Linux studies are clear proof of this. And if it seems like I'm picking only on Microsoft, you can look at how nVidia was gaming its drivers to detect benchmarking programs and then apply special settings that would give better results. A more recent example: if you run a memory benchmark on Via's Nano CPU, you get significantly different results depending on whether the CPU is identifying itself as a Via chip or an Intel chip.

And honestly, who in their right mind would put up the results of a study that show some product you sell to be of marginal usefulness at best?

As to why Microsoft licenses a defrag utility: because they're hamstrung by their own commitment to backwards compatibility. It's also a good chunk of why Windows is still so incredibly insecure and tends to fall over at the slightest nudge, and why the Vista developers were bailing water like crazy, throwing overboard everything that wasn't nailed down, just trying to keep the ship afloat. People expect there to be a defragmenter, so even though that era has long since passed, it sticks around. When Microsoft finally puts out a completely rewritten version of Windows, or even just retires Windows for something new, I wouldn't be at all surprised if the defragmentation tool goes away.

I'll just leave it with this little nugget... My old computer died after severe manhandling by UPS when I moved, so I wound up with a new computer running Vista. Poking around looking for tips on improving performance, just to kill time, I came across a suggestion to replace the Windows defragmenter with one from AusLogic or something like that. They also claimed to have a registry defragmenter, and I was intrigued by the notion. Anyway, on the defragmenter page they mention how all computer experts agree that defragmenting, among other things, prevents program crashes. But you don't need to take my word for it; here's a direct quote:

"Disk fragmentation leads to system slowdowns, PC crashes, slow startups and shutdowns."

Now personally, I'd like to know how shuffling files around on the physical surface of the drive will do ANYTHING for "PC crashes".

There's also the bit a ways down on that page where they start off with something true, about hard drives being the slowest part of the computer, but then out of nowhere claim it's because they have moving parts, and that read/write head movements are what cause system freezes. Apparently they're banking on no one remembering that the defragmenting process itself involves significant amounts of read/write head movement. So, if we follow their line of reasoning, the process of defragmenting should be causing system freezes, but strangely it doesn't.

Anyway, I'd like to again thank you for the intelligent and well reasoned response. A very refreshing change. You get so used to the people who think that because they've mastered how to turn their computer on and launch 2-3 applications, it makes them some kind of expert, and that their opinion is the only one that should matter, you almost forget what it's like to have an intelligent discussion.


In reply to: Interesting

I have read all these postings with interest, and I just wanted to mention that I am using both of Auslogics' disk and registry programs. I have Vista on a dual-core AMD Athlon 4000+ processor, and there is a noted performance difference after their use. I do a lot of music file downloads and deletions, so I use the disk defrag program frequently. I highly recommend both programs; they have been completely problem-free.

Here's the real question

In reply to: Defragging

Here's the real question, though: have you actually taken a stopwatch and timed various operations performed once after a reboot, repeating the process several times to develop an adequate set of data points to draw a statistical conclusion? Or did you just expect things to be faster, so that whether or not they actually were, you convinced yourself that they were?

Because without some hard data -- collected in a way that eliminates the effects of OS caching, which would skew the results -- all you really have is, "It seems faster." The problem is, you don't really know whether it is actually faster, or whether you just expected it to be faster, paid closer attention, and decided it must have taken longer before. Whether or not it did is impossible to say, because no one ever has any empirical data to back it up.
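A minimal timing harness along those lines might look like this (a sketch with names of my own choosing; dropping filesystem caches between runs is platform-specific and omitted here):

```python
import statistics
import tempfile
import time

def bench(op, runs=10):
    """Run `op` several times and return (median, stdev) of the wall
    times, so a claim rests on data points instead of impressions."""
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        op()
        samples.append(time.perf_counter() - t0)
    return statistics.median(samples), statistics.stdev(samples)

# Usage: time a sequential read of a 4 MB scratch file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\0" * (4 << 20))
    scratch = f.name

def read_scratch():
    with open(scratch, "rb") as fh:
        while fh.read(1 << 20):  # read in 1 MB chunks
            pass

median, spread = bench(read_scratch)
print(f"median {median:.6f}s over 10 runs")
```

Run something like this before and after a defrag, on the same files, and you have numbers to compare instead of a feeling.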

And yes, ask any psychologist who researches cognitive functions, and they should be able to tell you without hesitation that the human mind's perception of time is extremely malleable. You don't even really need a psychologist for that matter. You've got little phrases in the lexicon like, "Time flies when you're having fun," to further back it up.

So what we have is some guy/woman who's just short of spamming these forums to promote his/her employer's product, and a couple of people who think things seem faster, but don't really have anything to back it up but a perception.

When you get down to it, more RAM and a bigger buffer on the hard drive will make worlds more difference than all the defragging in the world. With platter sizes remaining fixed and only the density of the stored data changing, fragmentation levels become less and less of an issue: there's a maximum distance the read/write heads have to travel, and the more densely packed the data, the less likely they will ever have to travel that entire distance.

Now, if there were a defragmenter that also took into account the physics of the spinning platter, placing the most commonly used files on the outer tracks, where the surface passes under the head fastest, THAT might be worth having and using. Of course, that would require deep inspection of every file on the drive and would take forever and a day to complete, not to mention it would likely be a real pain to write such a program.
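For what it's worth, at a constant spindle speed the outer tracks do move fastest under the head, which is why placement there would matter. A quick check, using rough, assumed radii for a 3.5-inch platter:

```python
import math

RPM = 7200                           # common desktop spindle speed
INNER_R_MM, OUTER_R_MM = 20.0, 46.0  # rough 3.5-inch platter radii (assumed)

def track_speed_mm_s(radius_mm, rpm=RPM):
    """At constant RPM the platter surface passes under the head at
    v = 2*pi*r*(rpm/60): linear speed grows with radius."""
    return 2 * math.pi * radius_mm * rpm / 60

inner = track_speed_mm_s(INNER_R_MM)
outer = track_speed_mm_s(OUTER_R_MM)
print(f"outer/inner speed ratio: {outer / inner:.2f}")  # 2.30
```

With those assumed radii, data on the outermost track streams past the head over twice as fast as data on the innermost one, so where a file sits can matter more than whether it is contiguous.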


In reply to: Here's the real question

Jimmy, your detailed critique of my comments I thought was worthy of a reply. You are quite correct that I do not have any hard facts to support my claim of a "performance improvement". However, even without the use of a stopwatch, one can gauge the length of time it takes for a page to load in Explorer. It has been my experience that, following a defrag, the page will load in a flash rather than a pane at a time. This gives me a sense of improved performance, and of crossing the finish line first. Happy


In reply to: Defragging

To Jimmy:

As someone who has been in the PC/Network administration field for over 25 years, I will second your comments 100%.

To Ken:

Judging the results of a hard disk defrag by browser loading times is inaccurate at best. A web page's data (unless cached) is coming from outside your computer. Even a supercomputer can load web pages slowly if there is enough traffic slowing down a host's upload speed.

The problem is

In reply to: Defragging

The problem is that the human mind is very susceptible to suggestion, even when you're the one doing the suggesting.

It's basically a case of the self-fulfilling prophecy. You go into something with a set of expectations, good or bad, and then you will shape your impressions around those expectations. So, if you go in thinking that defragmenting will provide all these wonderful performance benefits, your mind will be prone to fabricating these results in their empirical absence.

Yes, this DOES happen, and quite a bit more often than you'd think. Human memory is NOT as accurate as we would like to believe, so you can't even really trust your perception of how long things took before defragmenting. If you choose not to believe me on that, you're free to go find out for yourself. Call up someone in the psychology department of a nearby community college or university. It may not be their area of expertise, but they should be able to confirm that there are numerous studies that all show this to be true. And unless I can somehow manipulate the answer you would receive from every psychologist in academia, they would have absolutely no reason to lie to you, and every reason to confirm what I have said.

And, as pointed out, a web page is a horrible test for defragmenting performance gains. For one, you're downloading data which makes it immune to the immediate effects of fragmentation, and then there's a host of networking related reasons why loading a page two different times can have two very different results.

So, short of some empirical evidence in the form of timing specific events, no one's impression is worth squat. Not yours, not mine, not the Pope's, no one's. You can choose to believe it has some kind of benefit, but the truth is, you don't really know. Maybe it does, and maybe you're just deluding yourself into thinking it does.

Defrag Benchmarks Are Available

In reply to: Defragging

A little Googling finds quite a few benchmark references for you to compare. Although most benchmarking is done by third-party companies promoting a product, they generally compare against "no defrag" or the Windows Disk Defrag tool, plus another competitor's product, and generally it still indicates that a badly fragmented hard drive does get a performance boost from defragging.

Clicking on this one opens a PDF whitepaper regarding defrag stats on Windows Server 2003.

Hope this helps.


(NT) Grif, as always you have the "source"

In reply to: Defrag Benchmarks Are Available

All three are invalid

In reply to: Defrag Benchmarks Are Available

All three of those are invalid, and here's why.

The first one gets a knock because it's a company trying to promote its own product. While they did provide far more details about how the testing was conducted and results collected, it's still a study done by a company looking to promote its own product. They're never going to publish some paper that says that the product they're trying to sell offers no real benefit. So that one's disqualified for conflict of interest.

The second one is simply invalid because they give no details on how the study was conducted or the results were collected.

The third is in the same boat as the second: no information on how the testing was done and the results were collected.

And before people start getting their panties all bunched, these aren't my arbitrary rules designed to create some impossible standard that no study could pass. These are all standard rules used by statisticians all over the world. How you conduct your testing and collect your results is usually far more important than the results themselves. I mean think about it. I could just claim that a survey of 10,000 people found that 95% hate Coca Cola products. Sounds pretty impressive doesn't it? However, how impressive does it sound when I then say that this survey was conducted at Pepsi headquarters, and only Pepsi employees were interviewed? What if I say that the Pepsi CEO sent out a memo before I arrived to conduct my survey, saying that anyone who said they don't hate Coca Cola products will be fired, and the CEO would be accompanying me personally while I did my interviews? Do these not seem like very important details? Details that might very well skew the results of the study.

Despite what it may seem like, my burden of proof is not really that high. There can't be an obvious conflict of interest (i.e. a company trying to promote its own product), there has to be information given about how the tests were done and the data was collected, and it has to relate to common desktop functions (i.e. web browsing, email, word processing, music playback). Now, assuming you can find a study where they actually bother to include info about how the tests were conducted and results collected, then we can at least say it passed the most basic level of scrutiny. Next will come digging into the details of how the testing was conducted and results collected, to look for any obvious flaws in the methodology.


In reply to: All three are invalid

I only looked through the second one, but one thing that struck me was that the computer they were defragging had just short of 1 million fragmented files. This basically takes it out of the real world, as most people don't have half that number of files on their whole computer, much less that many fragmented ones.

You may have a hard time getting through. I have customers to this day who refuse to turn off their computers ever and who still want to run defrag once a week.

I read an impartial study on defragging once, but for the life of me I can't find it again, so this is just hearsay. The study showed that a PC with 10% fragmentation could expect a 1-2% performance increase from defragging, and it takes a while to get up to 10%. I haven't defragged my PC in over a year and it's at 13%. Just upping your PC from 512MB to 1 or 2GB of RAM will give you way more performance increase than that.

I usually tell my customers that once a year is fine. Twice if they are heavy users and never is fine too if they don't want to.

I'd buy those results

In reply to: reply

I'd buy those results as being credible on the face of it. Just the usual disclaimer about that being subject to change should I ever actually see the study for myself.

And it is sad, I know. I was recently at a job interview where the person was asking me what I would do if someone complained of a slow system. After I went through a fairly exhaustive list, they asked if I'd consider defragmenting, and I told them that I've never found it to be of any real help, but if the user really wanted it, I'd run it for them. I didn't get the job, but more because I later found out they changed the job description to something completely different from what I applied for, and just neglected to tell me. I have no real qualms about mentioning who it was I interviewed with; I could even give specific names, but I'll just leave it at this: I was left rather unimpressed with their operations, and a little annoyed that I spent quite a bit of money to fly in specifically for that interview just to be jerked around like that. Fortunately I had a friend in the general area whose couch I could crash on.

ANYWAY, I think that's kind of the key here... While the hard drive is generally a performance bottleneck, the better solution really is to increase the amount of RAM available for caching frequently used data from the drive, as well as increasing the buffer size on drives. When the first 8MB buffer drives started coming out, you could see a noticeable improvement in disk throughput over older 2MB buffer drives. I don't think it's any real coincidence that there are now 16MB and even 32MB buffer drives.

I keep hoping solid state drives will really come of age soon. They're inching ever closer, but still not quite there yet. Then maybe we can create a new hard drive interface that can keep pace with the FSB a little better.

Roland, Interesting.. Different Figures Here..

In reply to: reply

Not that I disagree with your overall premise because many folks defrag way too often. This is certainly not a scientific test, but I just did a quick check on four XP PRO SP3 machines here by running the standard XP Defrag program in "Analyze" mode.. Since I perform the maintenance on all our computers and I KNOW that I performed a defrag on all 50+ machines last July 9 and 10, it is interesting to note that 3 of 4 have more than your 13% indicated, at least as it relates to "File Fragmentation". (Fragmenting "free space" has always been an interesting issue but I'll leave that issue aside.) It's only been about 4 weeks since a defrag was performed... See the results below:

Computer 1:
Total Fragmentation: 10%
File Fragmentation: 19 %

Computer 2:
Total Fragmentation: 13%
File Fragmentation: 26%

Computer 3:
Total Fragmentation: 1%
File Fragmentation: 2%

Computer 4:
Total Fragmentation: 10%
File Fragmentation: 20%

I defrag every couple of months on our work machines, and about the same on my home computer, although the fragmentation percentage is generally higher on my home machine because of the things I do with it. And sorry, but your RAM-addition solution doesn't really apply. Hard drive performance still degrades from fragmentation whether or not you add RAM. Besides, extra RAM costs money; defrag doesn't.

Hope this helps.



In reply to: Roland, Interesting.. Different Figures Here..

It would be interesting to see a benchmark test run before and after your next defrag of the PCs. In fact, I think I may do it myself on mine, since it's been longer. I'll defrag overnight and post some results.

Well, Since This IS A Vista Forum...

In reply to: reply

I'm not sure any of our current tests, or this discussion, have anything to do with anything. Vista is a little more difficult to run such tests on unless the background/automatic defrag is completely disabled. And unfortunately, I've got no Vista machines to run tests on. How 'bout you?

Hope this helps.


Ahhh.. "standard rules used by statisticians"

In reply to: All three are invalid

There is no such thing. Unfortunately, "statisticians" can be just as biased as anyone else. What's that old saying? "Figures never lie, but liars figure."

And surely you know that most testing of private products is done by the producer of the product. That doesn't invalidate the testing, despite your contention that it does. Makers of food, drugs, pesticides, and most other products requiring approval are required to pay for and provide their own testing results for certification. It's the certification body that decides whether the testing was done correctly, based on the validity of the test, NOT on who ran it.

Oh, and another couple of minutes on Google comes up with another such test. As I stated in my previous post, "Defrag Benchmarks ARE Available"; all you really need to do is look. The link below isn't a bad set of testing results either. Although it is posted on the Diskeeper site, the tests were conducted by an individual not associated with the company itself. I find it an interesting method for running the test, a little different than most, and it really doesn't promote a particular product. It basically reports the impact of fragmentation on file and program access; no specific defragmenting program is even listed:

Of course, all statistics can be rebutted, but make sure YOU use those same "standard rules used by statisticians" when making your own decisions. It's your choice whether to accept the preponderance of the evidence or not, as it is our decision to do the same. I find very few publications, and none valid by your own standards, which would support your statement below, even if you did exaggerate a bit:

"Defragmenting really doesn't give any kind of a performance boost in probably 99% of common computer uses of the average person, and really is just unnecessary stress on the drive"

Hope this helps.



In reply to: Ahhh.. "standard rules used by statisticians"

True, statisticians are biased, and that is precisely why they are expected to provide the methodology. It's part of a peer review process they adopted from the scientific method used by people in the "hard" sciences. Of course most of the time it's the marketing and PR people who will take the results of some study and figure out some way to make it work in their favor. One of my favorites was the RIAA funding a study on music piracy, which found 40% of higher education students pirate music. Of course you flip that around, and it becomes 60% of higher education students -- more than half -- DON'T pirate music. The numbers are just numbers, it's the conclusions you draw from those numbers that are significant.

And the kind of testing you describe is COMPLETELY different from the kind of testing done to promote a product. One is showing that there's little risk to the general public, the other is trying to hawk their wares. You don't see the obvious difference there?

It's one thing if I'm conducting a study to show that my product won't cause your computer to blow up. It's another when I'm conducting a study claiming that my product will improve the performance of your computer by 1000%.

So, since Diskeeper sells a defragmenting tool, and even seems to have someone from Diskeeper practically whoring themselves out on these forums trying to promote it... Nope, sorry, it's invalid due to conflict of interest. They're never going to publish, on their website no less, some study that basically states that their program is a waste of money, offers no real benefit, etc. It doesn't take someone with an MBA to figure that one out.

You really think that Microsoft is going to publish the results of a study that shows Linux having a substantial edge over Windows? You really think they're going to put that up on their Windows Server site? Make a big headline out of it, and pepper the entire website with links to that study. Can you honestly say you think that would happen? That no matter what the results, they would publish them unaltered?

What's more likely to happen is that Microsoft will simply bury the study and find another lab to run the tests. They'll keep doing this until they find one that will produce the results they want. Or, in an equally plausible scenario, the higher-ups at the testing firm won't want to lose a big account like Microsoft, so they will tell the testers to give Microsoft whatever results it wants.

That sort of thing does go on all the time, so don't go thinking you can dismiss it as some paranoid fantasy.

In any case, there are some standard rules used by statisticians, who tend to take the blame for the actions of marketing and PR flacks who take their data and contort it to whatever they desire. Statisticians are really just a form of math nerd. They usually don't care about the results, they care more about their credibility within the field, so will try very hard to get accurate results.

So again, I have to say that my burden of proof isn't really all that high -- more like a speed bump than a wall -- yet so far the 4 links you've posted have failed to overcome even that.

And don't go trying to turn the conversation around onto something I said. You find me even a SINGLE study that meets my incredibly simple guidelines and THEN we can talk about what I said. You already took up the challenge to find such a thing, so you forfeited your right to try and counter my claim until you've defended your own claim that defragmenting is useful. So either keep googling or admit that you cannot successfully back up your claim with any hard evidence. Don't go trying to pull stunts like shifting the focus of the conversation. I lived with a social worker for many many years, so I know all the tricks people use, and you're going to have to try a LOT harder if you want to pull one over on me.

So, again: Either you keep googling until you do find some study that can overcome the shallow speed bump that is my requirements, or you admit that you cannot back up your assertion with any hard data.

LOL, Right Back 'Atcha....

In reply to: True

Unfortunately, it seems like you're exaggerating again.. No one here ever suggested that a defrag would "improve the performance of your computer by 1000%"

From my point of view, I HAVE provided sufficient evidence for all but those who are unwilling to see the light. As already posted, the evidence is there if YOU choose to become enlightened. And I shouldn't have to do any more searching for the answer.. It satisfies me and most others.. Now it's your turn.. We have forfeited nothing.. If the folks you disagree with are required to provide evidence that fits your narrow view of "proof", we're waiting for you to provide supporting evidence for your contentions as well. So far, you've provided nothing more than your opinion.

So I'll quote right back 'atcha:

"Either you keep googling until you do find some study that can overcome the shallow speed bump that is my requirements, or you admit that you cannot back up your assertion with any hard data."

There's no difference in the tests.. If you don't think Monsanto, Novartis, and DuPont want to "hawk their wares" when they are required by the government to perform safety tests on their insecticides and herbicides, you clearly don't know the subject. They spend millions to make billions, and they would provide the data for "clean" safety tests whether they were required to do so or not.. It's a selling point..

Living with a social worker.. hmmmm

Hope this helps.


A few other points of disagreement

In reply to: Two things

You make plenty of valid points, but in this day and age I don't think we can consider the "average user" the sort who uses their computer to check email and write the occasional Word document, so "average use" scenarios aren't what they used to be. Plenty of folks are authoring videos, ripping DVDs, downloading music, and installing games whose sizes measure in the gigabytes. These are all activities that can lead to quite a bit of fragmentation.

In addition, more and more business applications run databases on the back end. Within the vocational technical school I work for full time, I can count at least 15 different desktop applications running SQL Server Express on the back end, and I run Microsoft Small Business Accounting for my part-time consulting gig, which uses SQL Server Express as well. So in regard to the comment about databases, it's not as though they are limited to large-scale business environments anymore.

In regard to defragmentation sending our disks to an early grave, I wonder how true this actually is. I would love to see tests done to prove one way or the other whether defragmenting files once, or constant access of fragmented files, is harder on a drive. Personally, I would think that if files were rarely changing (more reads than writes), it would be easier on your drive to defragment once and then read the files back from a single location, because the hard drive wouldn't cover as much distance to pick up all the file's pieces. But if the files are constantly changing, a defragmentation operation would basically be undone as soon as they change. Again, I'm just speculating about this.
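
That speculation about seek distance can be sketched with a toy model. This is purely illustrative (the track numbers are made up, and real drives complicate things with caching and request reordering), but it shows why gathering a file's pieces from all over the platter costs more head travel than reading them from one spot:

```python
import random

def total_seek_distance(fragment_positions):
    """Sum the head movement needed to visit each fragment in order,
    starting from track 0. Positions are abstract track numbers."""
    head = 0
    distance = 0
    for pos in fragment_positions:
        distance += abs(pos - head)
        head = pos
    return distance

random.seed(1)
# A file stored contiguously: 100 fragments on consecutive tracks.
contiguous = list(range(5000, 5100))
# The same file scattered across a 10,000-track disk.
scattered = [random.randrange(10_000) for _ in range(100)]

print(total_seek_distance(contiguous))  # one long seek, then short hops
print(total_seek_distance(scattered))   # the head criss-crosses the platter
```

On any random scatter the second number dwarfs the first, which is the whole argument for defragmenting read-mostly files, and also why the benefit evaporates if the file is rewritten (and re-fragmented) right afterward.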

However, one common use case where defragmentation shines is when all of the files on a drive need to be scanned. One obvious scenario is antivirus and spyware scanning. Another might be searching for a file. Diskeeper published a white paper with the results of running various antivirus applications both before and after a defrag. They did the tests using a variety of defragmentation programs, not just their own. All tests showed that defragmenting prior to a virus scan sped up the scan by a significant amount, regardless of the defrag software and the antivirus software used.


In reply to: How bad is it to continue to use your pc during deep defragm

Your system should be as "idle" as possible to help reduce interference with the defrag. After all, defrag is a serious task and it should be given all the time it needs. Disable or shut down any background tasks/options you deem necessary to help in this regard. I've found MS's own defrag feature as capable of doing this task as any other program out there. Beforehand, I suggest running CCleaner, if possible, to reduce clutter and clear out junk files and temp files that are useless anyway, or use MS's own Disk Cleanup feature under System Tools. Every bit helps, IMHO.

tada -----Willy

Defragmentation and PC usage

Hi Rammstein420,

I cannot comment on other defragmenters, but with Diskeeper 2008, you can safely use your PC when defrag is in progress.

Diskeeper 2008 is designed to be a fully automatic defrag solution that runs in the background and defragments the disks in real time using only a portion of the unutilized system resources. This resource-utilization technology is called InvisiTasking, and it makes the defrag process completely transparent to the user because all other computing processes are given higher priority. Defragmentation will automatically initiate, pause/suspend, and resume according to CPU activity and the disk I/O environment.
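
The general idea of yielding to foreground work can be sketched in a few lines. To be clear, this is not Diskeeper's actual InvisiTasking code (which is proprietary); it is just a generic illustration of idle-time throttling, and `cpu_is_busy` is a hypothetical stand-in for whatever OS load query a real tool would use:

```python
import time

def run_when_idle(work_chunks, cpu_is_busy, pause=0.01):
    """Run each small unit of work only while the system is idle.

    work_chunks: callables, each doing one small unit of defrag-style work.
    cpu_is_busy: callable returning True while foreground work has priority.
    Returns how many chunks had to wait at least once before running.
    """
    deferred = 0
    for chunk in work_chunks:
        waited = False
        while cpu_is_busy():
            waited = True
            time.sleep(pause)  # back off until the system quiets down
        chunk()
        if waited:
            deferred += 1
    return deferred
```

A real implementation would also watch disk I/O rather than just CPU, and would checkpoint its progress so a suspended pass can resume safely.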

If you wish to know more about InvisiTasking, Diskeeper, or defragmentation, please visit the White Papers section of the Diskeeper website, which has free downloadable PDFs.

You can also download Diskeeper 2008 and evaluate it for free (30 days) to see if it meets your requirements.

Best Regards
Diskeeper Corporation


In reply to: How bad is it to continue to use your pc during deep defragm

DEFRAG runtime is a function of the number of sectors on your hard drive that are in use. The reason the system might need a DEFRAG is that read-head seek time becomes excessive as the head moves back and forth across the disk cylinders searching for the next fragment of the file you are accessing, which degrades computer performance. If I feel I just can't help myself, I typically run DISKCLEANUP before the DEFRAG. DISKCLEANUP will show you what files it will delete for you. In addition, if you choose "MORE OPTIONS" you can remove System Restore checkpoints and backup images, which can use up quite a bit of your hard drive. This will speed up the DEFRAG. To find these programs, press the Microsoft "flag" (Windows) key, click "All Programs", click "Accessories", then click "System Tools". There you will see the cleanup and defrag programs. As far as I know, these are Microsoft programs. Hope this helps.

Deep defrag

In reply to: How bad is it to continue to use your pc during deep defragm

I have no idea what a "deep" defrag is, but a simple defrag as done by the MS defrag should be all you need, and it'll only take hours, not days.

It's okay to use your computer while MS's defrag is running, though it'll slow it down and it might not do as good a job. I find it runs faster if I turn off my screen saver first.

FAT32 and NTFS.

In reply to: How bad is it to continue to use your pc during deep defragm

If you're using the FAT32 filesystem, then it's a really bad idea: if the system crashed mid-defrag, your filesystem would become inconsistent, and there would be a chance you'd lose data because of it, even a chance of catastrophic data loss. I wouldn't do it.

You're probably safer with NTFS, but you'll suffer reduced performance during the defrag. I believe it's possible to stop a defrag midway and then start again later, which could be of use to you. Ideally, you should be using as few files as possible while you're defragmenting, because open files will be locked and the defragger won't be able to move them at that time.

The best way of defragging is to back up all your data to an external drive, reinstall Windows, and put your data back.

defrag the drive

In reply to: How bad is it to continue to use your pc during deep defragm

I would not suggest using the computer or running programs in the background while defragging it. They can interrupt the process and make it take even longer. You may want to check out Auslogics Disk Defrag. It's free and it does a really good job of defragging the hard drive.

No real problem

In reply to: How bad is it to continue to use your pc during deep defragm

If you just use the internet while it does its thing, then a little slowing down of your computer is normal. As to which school you're in, the weekly-defrag school, which holds that the quicker access actually improves reliability, or the school that says you wear your drive down with that much defragging: if you use an excellent free defragger like IObit's, which is on all the time, defragging on the fly, and then open Windows' own tool and check whether a defrag is needed, it will state "do not bother", which shows how good it is. It's also much faster than the Windows defragger.

Don't bother

In reply to: How bad is it to continue to use your pc during deep defragm

I do PC support for over 1,000 PCs and have been doing so for the 23 years I have been at my company. We NEVER defrag servers, and rarely defrag PCs. Only the old, slow ones get defragged, and only then if the analysis is solid red. New ones running Vista could hardly be at a performance risk from fragmentation. The OS buffers reads and writes, uses look-ahead algorithms, and goes to great lengths to make sure speed is top notch. The disk would have to be massively fragmented for you to notice anything. So the "deep defrag" every week or so is a waste of time. Do it once a year and you might notice something, but I doubt it.


In reply to: Don't bother

Here are the results of my unscientific test. I just went online and downloaded the first free defrag utility I could find. I ran this under Vista Business 32-bit on a secondary 150 GB SATA drive that I don't use for anything anymore. It used to be the primary drive of my XP build, and it's been installed in this machine since before I installed Vista on a different drive. Whether or not it's been auto-defragging I don't know. It may have been, but regardless, analyzing the drive reported 18.05% fragmentation.

Analyzed with Smart Defrag version 6: 18.05% fragmented

Ran Passmark HD tests. Final result: 66.6

Ran Vista defrag
Ran Smart defrag and optimize

Analyzed with Smart Defrag again: 12.89% fragmented

Ran Passmark HD tests. Final result: 70.5

Total performance increase? 0.944%

So my result showed that on my system, reducing the fragmentation by just over 5% increased performance less than 1%.

I know this isn't the be-all and end-all of what I could accomplish, but I was just simulating Ma and Pa Kettle clicking "defrag now".

This is consistent with what I have seen over the years, which is why I don't recommend defragging very often.

I also have screenshots of all the tests, which I may post if I get to it.


In reply to: Results

..for running the tests on your Vista machine.. I've run similar tests on our XP machines and come up with results that were clearly based on the amount of fragmented files on the drive... I find it interesting that your after-defrag "Analyze" data from Smart Defrag showed Vista was STILL 12.89% fragmented.. So either Smart Defrag was analyzing the data incorrectly, or neither of the defraggers was able to defrag much of the file structure. "Locked files"? In the XP test I did below, the defrag tools were able to bring the fragmented files down to 0%.

Just for grins, I used those same tools on this Windows XP SP3 machine, 1.8 GHz dual core, 80 GB IDE HD.. As to the performance difference, my results are similar to yours, although this one started out with very few fragmented files..

Analyzed with Smart Defrag version 6: 4.85% fragmented
Analyzed with Windows XP Defragger: 7% overall; 15% of files fragmented
(Interesting difference in percentages analyzed by each defragger.)

Ran Passmark version 6 "Disc" tests. Final result: 65.7

Ran Smart Defrag and Optimize
Ran Windows XP Defrag

Analyzed with Smart Defrag again: 0.0% fragmented
Analyzed with Windows XP Defrag again: 0.0% fragmented

Ran Passmark version 6 "Disc" tests. Final result: 66.3

Total performance increase? 0.990%

Similar to your results, a drop of 5% fragmentation showed about a 1% increase in performance..

Hope this helps.

