Problems Decompressing Large tar.gz Files

Forum: Linux
Total Replies: 9
dcparris

Nov 14, 2006
9:57 PM EDT
I used KDE's Konserve utility in SUSE 10.0 to create *.tar.gz backups of my data. Then my HDD crashed. The good news is, I got a new one. The bad news is, I can't seem to open/decompress the huge backups. Konserve doesn't offer to restore your system, just to compress your chosen dirs into a file.

The problem seems to be that Konserve managed to compress my dirs into roughly 3GB files, but even running tar from bash won't uncompress the files. Can anyone offer some assistance with this? Or did I just land myself in Backup Hell?
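In case it matters, what I've been trying is basically this (backup.tar.gz is just a stand-in for the actual file name):

tar -xzvf backup.tar.gz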

TIA!
Scott_Ruecker

Nov 15, 2006
3:23 AM EDT
Can you see the files using a live distro like Knoppix or DSL? If you can, try moving them to an external HD and then opening them from somewhere else.
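Something roughly like this once you're booted into the live CD (the device names and paths are just examples; yours will differ):

mkdir -p /mnt/usb
mount /dev/sdb1 /mnt/usb
cp /mnt/hda2/home/me/*.tar.gz /mnt/usb/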

Hope I helped.
Sander_Marechal

Nov 15, 2006
6:10 AM EDT
A live distro probably won't do you any good. There isn't enough RAM to handle files that size on the RAM drive.
dek

Nov 15, 2006
7:08 AM EDT
Maybe you've already tried this, but in case you haven't, try gunzipping them first, then try untarring them.
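Something like this, with backup.tar.gz standing in for your actual file name:

gunzip backup.tar.gz
tar -xvf backup.tar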

A thought.
dcparris

Nov 15, 2006
9:19 AM EDT
Actually, I can't say I have tried that yet. I'll try that.
helios

Nov 15, 2006
3:02 PM EDT
I have had the same problem with large compressed files from day one in Linux. Recently, I discovered that if I make sure they are in my home directory, then shut down EVERY unneeded process possible, I will get a clean extraction. In the past I googled the problem and it is widespread, but no one seemed to have the answer. Even though I like karchiver, I have found ark and unrar to be the most reliable. Besides, if I were to ever lose my one and only Roger Ramjet collection to a corrupt compression algorithm, I would consider one final lewd act with a bus exhaust system.

Oh, another thing, I found that zipping or compressing the folder that holds the files has a greater success rate at decompression than does a slew-load of individual files.
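In other words, something roughly like this instead of pointing it at a pile of individual files (the path is just an example):

tar -czvf ramjet.tar.gz /home/helios/ramjet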

h
joel

Nov 15, 2006
7:44 PM EDT
D.C., I have successfully accomplished that on smaller files using a live CD with mc. Slax-5.1.4 works fairly well. Just highlight the zipped file in mc, then hit "enter". That will open the archive, after which you should be able to copy the files wherever you desire. In your case it may not manage it because of the large size of your files, but it's at least worth a try.

I had a similar problem when I backed up my entire Documents folder, which nearly filled a 700MB CD. The next time, I only tarbz2'd my Family_History, General_Interest, & Linux directories (roughly as shown below), as they contain primarily text files, which yield better compression. That worked MUCH better. My Space folder, which contains numerous jpg files, otoh compresses very little while consuming considerable time, so I don't bother with it anymore.

If all else fails, rest assured that I'll remember your problem & its speedy resolution in my prayers. :)
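Roughly what I ran for those, one archive per directory:

tar -cjvf Family_History.tar.bz2 Family_History
tar -cjvf General_Interest.tar.bz2 General_Interest
tar -cjvf Linux.tar.bz2 Linux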

Helios, thanks for pointing that out. Until your post, I had no idea that large zipped archives were a problem.
dcparris

Nov 15, 2006
10:14 PM EDT
Well, I did compress my /home/me directory. It is mostly docs/text, various types of media files, & personal settings, etc.

I just attempted to gunzip the tar file to no avail. I get as far as 11%, then it stops. I thought about doing init 1 and then trying it, but I'm not sure that would accomplish much. Maybe I'll raise the issue on the local LUG list and see if anyone has access to a box with 4GB RAM. ;-) Or would that still not be effective?

I do know that I will no longer try a mass backup approach. I will focus on smaller filesets. 200-250MB should be reasonable. Sheesh!
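Probably something along these lines, one archive per subdirectory instead of one big blob (the directory names are just examples):

cd /home/me
for d in Documents Music Pictures; do tar -czvf "backup-$d.tar.gz" "$d"; done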
jdixon

Nov 16, 2006
3:11 AM EDT
> just attempted to gunzip the tar file to no avail. I get as far as 11%. Then it stops.

Purely a guess, but it sounds like you're running out of memory or some such. Try monitoring your system's memory, CPU, and disk status while it runs and see if you can tell what's happening. You might also want to run gunzip --test on the file to see if it checks out as good (assuming even that will run).
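For example, with backup.tar.gz standing in for the real file name, something like:

gunzip --test backup.tar.gz   # in one terminal
vmstat 5                      # in another, to watch memory, swap, and disk activity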
dinotrac

Nov 16, 2006
5:45 AM EDT
Rev -

I don't think I've ever made a 3 GB tgz, but I have made 2 GB files. I run 1 GB of memory. If you can't up your memory, you might try upping your virtual memory (swap) to some silly amount to see if you get a different result.
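Something roughly like this (as root) would add a 2 GB swap file; the path and size are just examples:

dd if=/dev/zero of=/swapfile bs=1M count=2048
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile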
