Compress drive / hard disk: yes or no?

Bogeyman
Hi, Windows has this option to compress drives. Any opinions on whether it should be turned on or off? Is there maybe even someone who has compared how it affects performance, negatively or positively?
 
Back in the days of Windows 95/98/ME it was a real performance killer. Today it only makes a small difference. I still advise against it, though, since it hardly gains you anything anyway.
 
That's far too sweeping a statement!
If I run a VM on a compressed disk I have a problem, because everything becomes far too slow.

For some files, though, compression helps a great deal!
 
If I run a VM on a compressed disk I have a problem, because everything becomes far too slow.

I don't know exactly how Windows works here, but there certainly are "clever" algorithms like LZ4 that only compress the data where it actually pays off, so the performance penalty on incompressible data is practically negligible.
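
To illustrate the point, here's a minimal Python sketch of that behavior using the third-party lz4 package (pip install lz4). It only demonstrates the general principle; it says nothing about what Windows itself does (classic NTFS compression uses LZNT1, not LZ4):

```python
# Minimal sketch (not Windows' own code path): how an LZ4-style compressor
# behaves on compressible vs. incompressible input. Requires the third-party
# "lz4" package.
import os
import time

import lz4.frame

compressible = b"Lorem ipsum dolor sit amet. " * 40_000  # highly redundant
incompressible = os.urandom(len(compressible))           # random, ~incompressible

for name, data in (("text", compressible), ("random", incompressible)):
    start = time.perf_counter()
    packed = lz4.frame.compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name:6s}: {len(data)} -> {len(packed)} bytes "
          f"({len(packed) / len(data):.0%}) in {elapsed * 1000:.1f} ms")
```

On the random input LZ4 finds no matches and quickly falls back to storing the block (almost) verbatim, which is why the time penalty stays small.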
 
With the storage capacities available today, I say: it's not worth it. Even if you own an SSD and compress the data on it. The overhead may not be noticeable, but it still isn't worth it. Leave it alone.

Regards, Andy
 
On the other hand, SandForce SSDs compress too; I have several Samsung 830/840 drives, which don't.
 
Leave it. Not worth it. Only text files really compress, and those are normally small anyway. Big things like movies are already compressed out of the box. And because the compression used has to be fairly fast and light on resources, there is almost no benefit for such data :)
 
Up to about Windows 7 I kept it disabled out of habit, because in the past it was supposedly "bad" (a performance killer?);

experience and tests nowadays say otherwise:

http://superuser.com/questions/411720/how-does-ntfs-compression-affect-performance


Activating NTFS compression is worthwhile on high-end computers with fast multi-core processors because you can squeeze more storage space from an SSD without compromising the system performance significantly. Our comparison to powerful compression tools like 7-Zip shows that NTFS compression isn't very aggressive and it excludes important Windows system files. Thus, it's not designed to extract every last bit of capacity that might otherwise be available.

But it's precisely this approach that ensures the processor is not too heavily loaded. Users with a modern dual- or quad-core CPU should not notice the additional load incurred by enabling compression. Due to the generally lower performance available from notebooks, the same wouldn’t necessarily hold true on mobile systems. Depending on the hardware, compression could both impact speed and take away battery life as a result of a higher processing load.

...

Despite moderate compression rates, NTFS compression does indeed conjure up some much needed free SSD space. On our test system, it gave us back an impressive 12.5 GB. This is a benefit to small SSDs, in particular. If you're only talking about a 60 or 100 GB drive, reclaiming more than 10 GB is huge. Owners of large SSDs should keep NTFS compression in mind too, though. So long as you have a fast-enough processor, there's no reason to not consider enabling it.
http://www.tomshardware.com/reviews/ssd-ntfs-compression,3073-11.html


<-- from 2011, includes SSDs in the considerations
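
If you'd rather flip the switch per file from a script than via the Explorer checkbox or compact.exe, here's a Windows-only Python sketch using the documented FSCTL_SET_COMPRESSION control code (the path at the bottom is just a hypothetical example):

```python
# Windows-only sketch: toggle NTFS compression on one file via the documented
# FSCTL_SET_COMPRESSION control code -- the same thing the Explorer checkbox
# and "compact /c <file>" do.
import ctypes
from ctypes import wintypes

GENERIC_READ = 0x80000000
GENERIC_WRITE = 0x40000000
OPEN_EXISTING = 3
FSCTL_SET_COMPRESSION = 0x0009C040
COMPRESSION_FORMAT_NONE = 0
COMPRESSION_FORMAT_DEFAULT = 1  # volume default (LZNT1 on classic NTFS)

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.CreateFileW.argtypes = [wintypes.LPCWSTR, wintypes.DWORD,
                                 wintypes.DWORD, wintypes.LPVOID,
                                 wintypes.DWORD, wintypes.DWORD,
                                 wintypes.HANDLE]
kernel32.CreateFileW.restype = wintypes.HANDLE
kernel32.DeviceIoControl.argtypes = [wintypes.HANDLE, wintypes.DWORD,
                                     wintypes.LPVOID, wintypes.DWORD,
                                     wintypes.LPVOID, wintypes.DWORD,
                                     ctypes.POINTER(wintypes.DWORD),
                                     wintypes.LPVOID]
kernel32.CloseHandle.argtypes = [wintypes.HANDLE]

def set_ntfs_compression(path: str, enable: bool = True) -> None:
    handle = kernel32.CreateFileW(path, GENERIC_READ | GENERIC_WRITE,
                                  0, None, OPEN_EXISTING, 0, None)
    if handle == wintypes.HANDLE(-1).value:  # INVALID_HANDLE_VALUE
        raise ctypes.WinError(ctypes.get_last_error())
    try:
        fmt = wintypes.USHORT(COMPRESSION_FORMAT_DEFAULT if enable
                              else COMPRESSION_FORMAT_NONE)
        returned = wintypes.DWORD(0)
        if not kernel32.DeviceIoControl(handle, FSCTL_SET_COMPRESSION,
                                        ctypes.byref(fmt), ctypes.sizeof(fmt),
                                        None, 0, ctypes.byref(returned), None):
            raise ctypes.WinError(ctypes.get_last_error())
    finally:
        kernel32.CloseHandle(handle)

set_ntfs_compression(r"C:\temp\example.log")  # hypothetical path
```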


Conclusion

Compression algorithms that're carefully tuned for a particular task will always do a better job at that task than will general-purpose compression schemes. Especially if the general-purpose compression is very computationally fast, which NTFS compression is.

It's hardly surprising that PNG, a format specially designed to losslessly compress image data by the greatest amount possible, greatly outperforms NTFS compression for that task. Heck, even quite speedy PNG (as opposed to the super-intensive form used by PNGOUT) takes several thousand times as long to compress an image as NTFS. You'd bleeding well want it to work well.

Likewise, ZIP and RAR and 7z and pretty much all of the umpteen other compressed archive formats also outperform NTFS compression, at the cost of far higher computational demands and non-transparent operation.

If you're running Windows and, for one reason or another, you have to deal with a lot of airy data on the fly, or if you just need to claw back some free space on an NTFS storage device that you can't trade for a bigger one, NTFS compression is safe, simple, and only a few clicks away.

And yes, fellow old-school tinkerers: You can even use it on a floppy.
http://www.dansdata.com/ntfscompression.htm

<-- from 2008, updated 2011
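
The trade-off Dan describes is easy to reproduce with Python's standard library. In this sketch, zlib at level 1 merely stands in for "fast and unambitious" (NTFS' own LZNT1 isn't exposed to Python), while lzma plays the role of the aggressive archiver:

```python
# Sketch of the speed/ratio trade-off: a fast, weak general-purpose compressor
# vs. a slow, aggressive one on the same input. Both are standard library.
import lzma
import time
import zlib

data = ("The quick brown fox jumps over the lazy dog. " * 50_000).encode()

for name, compress in (("zlib level 1 (fast)", lambda d: zlib.compress(d, 1)),
                       ("lzma (aggressive)  ", lzma.compress)):
    start = time.perf_counter()
    packed = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(packed):>9} bytes ({len(packed) / len(data):.1%}) "
          f"in {elapsed * 1000:7.1f} ms")
```

The aggressive compressor wins on size by a wide margin, but costs orders of magnitude more CPU time, which is exactly the budget a transparent, always-on file system feature doesn't have.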


http://directedge.us/content/to-compress-or-not-to-compress-that-is-the-question

Conclusion

This test was by no means exhaustive, but I think I was able to capture a good cross-section of the files people have on their systems and what sort of performance to expect. As for answering the question: "Should I use NTFS compression?", the answer is a resounding: "It depends."

The results above should at least help to answer the question for a particular application. Disk space is cheap, and if you have enough of it, no compression at all is the way to go. On laptops where disks are often smaller, compression can benefit Program Files where it can save some of that valuable disk space (for me it's about 1.5GiB). Document files may be compressed more, but the performance impact is big enough to be noticeable, and documents that might benefit from compression are likely to occupy a very small portion of the disk. Even the high compression ratios may save only a few megabytes.

As for the previously-run benchmarks, these new results just don't sync up. I believe the results here are more indicative of true performance, since benchmarks don't take into account the differences in real-world data, and don't reflect real usage scenarios. Using files gives a much better idea of the performance to expect, and also eliminates some of the issues that Sandra probably ran into. My theory on the Sandra results is that the data that was used may have been uniform, which would make it easy to cache for the compression driver.

In any case, these tests should provide more data, and move the question further down the road to resolution.
http://directedge.us/content/to-compress-or-not-to-compress-part-ii
http://directedge.us/content/to-compress-or-not-part-iii

<-- from 2008
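
If you want to know what compression actually reclaims on your own data rather than on someone's test system, a Windows-only sketch like the following walks a tree and compares each file's logical size with its size on disk. GetCompressedFileSizeW reports the allocated size for compressed and sparse files; the target directory is just an example:

```python
# Windows-only sketch: sum up how much space NTFS compression reclaims in a
# directory tree (logical size vs. size on disk).
import ctypes
import os
from ctypes import wintypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.GetCompressedFileSizeW.argtypes = [wintypes.LPCWSTR,
                                            ctypes.POINTER(wintypes.DWORD)]
kernel32.GetCompressedFileSizeW.restype = wintypes.DWORD

def size_on_disk(path: str) -> int:
    """Allocated size for compressed/sparse files, the plain size otherwise."""
    high = wintypes.DWORD(0)
    ctypes.set_last_error(0)
    low = kernel32.GetCompressedFileSizeW(path, ctypes.byref(high))
    if low == 0xFFFFFFFF and ctypes.get_last_error() != 0:  # INVALID_FILE_SIZE
        raise ctypes.WinError(ctypes.get_last_error())
    return (high.value << 32) | low

logical = physical = 0
for root, _dirs, files in os.walk(r"C:\SomeFolder"):  # hypothetical target
    for name in files:
        path = os.path.join(root, name)
        try:
            logical += os.path.getsize(path)
            physical += size_on_disk(path)
        except OSError:
            pass  # skip locked or inaccessible files
print(f"logical {logical / 2**30:.2f} GiB, on disk {physical / 2**30:.2f} GiB, "
      f"saved {(logical - physical) / 2**30:.2f} GiB")
```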

NTFS Compression…

Now that we have discussed sparse files, we will move on to conventional NTFS compression.

NTFS compresses files by dividing the data stream into CU’s (this is similar to how sparse files work). When the stream contents are created or changed, each CU in the data stream is compressed individually. If the compression results in a reduction by one or more clusters, the compressed unit will be written to disk in its compressed format. Then a sparse VCN range is tacked to the end of the compressed VCN range for alignment purposes (as shown in the example below). If the data does not compress enough to reduce the size by one cluster, then the entire CU is written to disk in its uncompressed form.

This design makes random access very fast since only one CU needs to be decompressed in order to access any single VCN in the file. Unfortunately, large sequential access will be relatively slower since decompression of many CU’s is required to do sequential operations (such as backups).

In the example below, the compressed file consists of six sets of mapping pairs (encoded file extents). Three allocated ranges co-exist with three sparse ranges. The purpose of the sparse ranges is to maintain VCN alignment on compression unit boundaries. This prevents NTFS from having to decompress the entire file if a user wants to read a small byte range within the file. The first compression unit (CU0) is compressed by 12.5% (which makes the allocated range smaller by 2 VCNs). An additional free VCN range is added to the file extents to act as a placeholder for the freed LCNs at the tail of the CU. The second allocated compression unit (CU1) is similar to the first except that the CU compressed by roughly 50%.

NTFS was unable to compress CU2 and CU3, but part of CU4 was compressible by 69%. For this reason, CU2 & CU3 are left uncompressed while CU4 is compressed from VCNs 0x40 to 0x44. Thus, CU2, CU3, and CU4 are a single run, but the run contains a mixture of compressed & uncompressed VCNs.
http://blogs.msdn.com/b/ntdebugging/archive/2008/05/20/understanding-ntfs-compression.aspx

<-- "Understanding NTFS Compression", auf MSDN Blogs von 2008

technical background on how the compression works
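
The "compress each unit, keep it only if it saves at least one cluster" rule from the article is easy to mimic in a few lines of Python. Here zlib stands in for LZNT1, and the 64 KiB compression-unit size assumes the usual 4 KiB clusters:

```python
# Pure-Python sketch of the per-compression-unit decision described above:
# split a stream into 64 KiB CUs (16 clusters of 4 KiB), compress each one
# independently, and keep the compressed form only if it frees at least one
# whole cluster.
import os
import zlib

CLUSTER = 4096
CU_SIZE = 16 * CLUSTER  # one compression unit = 16 clusters = 64 KiB

def clusters(n_bytes: int) -> int:
    """Round a byte count up to whole clusters."""
    return -(-n_bytes // CLUSTER)

def store_stream(data: bytes):
    """Yield (cu_index, clusters_used, stored_compressed) per compression unit."""
    for offset in range(0, len(data), CU_SIZE):
        cu = data[offset:offset + CU_SIZE]
        packed = zlib.compress(cu, 1)  # stand-in for LZNT1
        if clusters(len(packed)) < clusters(len(cu)):
            # Saves at least one whole cluster: store the CU compressed.
            yield offset // CU_SIZE, clusters(len(packed)), True
        else:
            # Not worth it: store the CU raw, so reads pay no decompression cost.
            yield offset // CU_SIZE, clusters(len(cu)), False

# Mixed stream: one redundant CU (compresses well), then one random CU.
stream = b"A" * CU_SIZE + os.urandom(CU_SIZE)
for index, used, compressed in store_stream(stream):
    print(f"CU{index}: {used:2d}/16 clusters, "
          f"{'compressed' if compressed else 'stored raw'}")
```

With that layout, random access stays cheap because a read only ever has to decompress the one CU it touches, exactly as the article says; a sequential pass over the whole file, by contrast, has to decompress every compressed CU along the way.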
 