Fragmentation is an interesting subject, and here are some random thoughts since I have time on my hands and nothing to do.
I have heard about heavy file fragmentation being the cause of system slowdowns and freezes. From my understanding, it seems to be due to the physically non-contiguous placement of files on the drive platter. When a file is modified or deleted, it can leave behind gaps, or end up split into more than one fragment, depending on whether the modified file is smaller or larger than the original. The problem compounds from this point onward: future files may be broken up and slotted into whatever non-contiguous spaces are available, and so on.
So now, when the hard drive head has to read or write a file, it may not be able to do so sequentially; it has to seek all over the drive, collect the pieces, and stitch them together, or scatter small pieces of a file into many free slots. Either way, the time for the read/write operation goes up.
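To make that mechanism concrete, here is a toy sketch (not how any real filesystem actually works) of a naive first-fit allocator over a block bitmap. Deleting a file leaves a hole; a later, larger file fills the hole and spills past its neighbor, ending up in two fragments. All names and sizes here are made up for illustration:

```python
DISK_BLOCKS = 20

def first_fit(free, size):
    """Grab the first `size` free blocks, contiguous or not.
    Falling back on scattered blocks is what fragments a file."""
    chosen = [i for i, is_free in enumerate(free) if is_free][:size]
    if len(chosen) < size:
        raise RuntimeError("disk full")
    for i in chosen:
        free[i] = False
    return chosen

def fragments(blocks):
    """Count contiguous runs in a file's block list."""
    runs = 1
    for a, b in zip(blocks, blocks[1:]):
        if b != a + 1:
            runs += 1
    return runs

free = [True] * DISK_BLOCKS
file_a = first_fit(free, 5)   # file A occupies blocks 0-4
file_b = first_fit(free, 5)   # file B occupies blocks 5-9
for i in file_a:              # delete file A, leaving a 5-block hole
    free[i] = True
file_c = first_fit(free, 8)   # file C needs 8 blocks: hole + spillover
print(file_c, "->", fragments(file_c), "fragments")
# -> [0, 1, 2, 3, 4, 10, 11, 12] -> 2 fragments
```

Each extra fragment in the real world means an extra head seek, which is where the read/write slowdown comes from.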
The hard disk is already the performance-limiting factor in most desktops, given the blazing speed of CPUs and RAM these days, so you have a condition where an already weak link is further aggravated, resulting in poor system performance.
Of course, there are many skeptics who claim that fragmentation has no measurable performance impact on a system. There was even an article I read in one of the online mags (PCworld or PCmag or something like that) on 15 common myths, and reduced performance due to fragmentation was listed as one of them.
I personally feel it would depend on your system usage. If you generate, modify, or delete large files or large numbers of files, you are bound to notice fragmentation-related slowdowns. These days, with John and Jane Q. Public playing around with huge audio and video files on their home PCs, editing, creating, and deleting audio/video content, browsing the internet, and filling up and clearing caches, my gut instinct tells me they run into fragmentation-related hiccups without even knowing it. Then they wonder why their shiny still-new Dell or HP is crawling along instead of being the zippy performance machine the ads promised (and it was, when brand new), and blame it on 'stupid' Dell or 'lousy' HP.
Apparently, NTFS was designed to minimize the fragmentation headaches that plague FAT, but it looks like it doesn't work that well, considering Microsoft itself warns against fragmentation and includes a built-in defrag utility. IIRC, they particularly warn against fragmentation of the Master File Table. Many IT admins in corporate environments seem to be terrified of fragmentation on their managed workstations and servers, probably because of the productivity loss when cumulative effects across a large number of users and computers on a network are considered. And considering that these workstations see heavy-duty I/O during workdays, generating and modifying a large number of files, significant file fragmentation would make sense.
Personally, I defrag every 2-3 days on my home system, but I play a lot and do a bit of torrenting, so my fragmentation levels are understandably not the best.
Sorry if, for a first post, I wrote a long, boring story and merely repeated something obvious... just had some spare time on my hands.