

We’ve all been there. You need to duplicate a 20GB database dump, a 50GB virtual machine image, or a folder containing 500,000 small website files. You type the trusty old command:

cp -r /source /destination

Then you wait. And wait. Your terminal freezes. Your disk thrashes. You realize you have no idea if it’s working or if it has silently crashed.
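One quick sanity check while a copy churns (a sketch, with illustrative paths): watch the destination's size from a second terminal. If the number keeps growing, cp is still alive.

```shell
# Run this in another terminal while the copy runs (path is illustrative);
# a growing size means the copy is still making progress.
du -sh /destination
```

Wrap it in `watch -n 5` to re-run it every five seconds instead of pressing Up-Enter yourself.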

One old-school trick for a single large file is to stream the bytes yourself:

cat /source/hugefile.mkv > /destination/hugefile.mkv

But cat is just as silent as cp. If you want to know your exact speed (MB/s) and ETA, pipe through pv (Pipe Viewer).

pv /source/50gb.vmdk > /destination/50gb.vmdk

This gives you a beautiful, real-time progress bar:

[==>        ] 45% ETA 0:02:15 1.2GB/s

The worst scenario for cp -r is a folder with millions of tiny files (like a node_modules folder or a cache directory). cp has to open, stat, create, and close every single file, and that per-file metadata overhead, not the data itself, is what eats the time. The fix is to stream the whole tree as a single archive.
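A minimal sketch of that single-archive stream, assuming tar (and optionally pv) is installed; the paths are illustrative. The first tar packs the tree into one continuous stream on stdout, and the second unpacks it on the other side, so the filesystem sees one big sequential write instead of millions of tiny ones.

```shell
# Pack the whole source tree into one tar stream and unpack it at the
# destination; -C changes directory first, -f - means stdout/stdin.
tar -C /source -cf - . | tar -C /destination -xf -

# Same idea with a live throughput readout wedged in the middle:
tar -C /source -cf - . | pv | tar -C /destination -xf -
```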