Text files do compress very well, but I question whether you really have 800 terabytes of space to unzip this to.
One of the largest disk drives today is only 8TB ( https://www.google.com/#q=8tb+hdd ), so few people would have a server or cloud farm with the 100 or more such drives needed to hold a file that big.
Can you tell more about this?
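In the meantime, a quick way to sanity-check that 800 TB figure without extracting anything is to read the sizes the archive itself declares in its headers. A minimal sketch in Python's standard `zipfile` module ("bigfile.zip" below is a placeholder for the real path):

```python
import zipfile

def report_zip_size(path):
    """Sum the compressed and declared-uncompressed sizes of every
    member in the archive, without extracting anything."""
    with zipfile.ZipFile(path) as zf:
        infos = zf.infolist()
    compressed = sum(i.compress_size for i in infos)
    uncompressed = sum(i.file_size for i in infos)
    ratio = uncompressed / compressed if compressed else float("inf")
    print(f"compressed:   {compressed:,} bytes")
    print(f"uncompressed: {uncompressed:,} bytes (as declared in the headers)")
    print(f"ratio:        {ratio:,.1f}:1")
    return compressed, uncompressed, ratio

# report_zip_size("bigfile.zip")   # hypothetical path
```

For what it's worth, plain text rarely compresses better than roughly 10:1 to 20:1, and a single DEFLATE stream tops out near 1032:1, so 557 MB claiming ~800 TB (over a million to one) would suggest nested archives or forged headers, both classic zip-bomb signatures. Keep in mind the declared sizes come from the same headers, so they can themselves be faked.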
I have a large file that is 557 MB zipped and seems to be about 800 terabytes unzipped. I can't tell whether that's a zip bomb or not, but I ran virus scans last night and updated my virus checkers to be sure they're current, and I'm going to run the scans again tonight. It's a public-record data file, so it's possible the size is legitimate.

I'm wondering whether Windows can handle such a file if I do unzip it. Right now I'm on Windows 7; I really don't like Windows 10 in general. I'm also wondering whether there's an editor that can handle a file of this size once I get a server set up that can hold it. Do you have suggestions?

I hope I'm using the right forum and am not posting something that's been discussed many times, but this is very important to me, and I hope to get the data I want out of this into SQL Server Express or something similar soon. I'd appreciate any suggestions for dealing with files of this size, as I believe there will be more of them in the future. I'm also wondering if I can use an ETL tool or a text editor to break this up into something Windows and my hard drive can handle, since of course I don't have a machine that big right now.
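On the last point, breaking the file up without ever extracting it whole, Python's `zipfile` module can stream a member and write it out in fixed-size pieces. A sketch under assumptions (the archive name, member name, and chunk size below are all placeholders to adjust for the real file):

```python
import zipfile

def split_zip_member(archive_path, member, chunk_bytes=1_000_000_000):
    """Stream one member of a zip archive into numbered chunk files,
    never materializing the whole uncompressed file at once."""
    block = min(chunk_bytes, 1024 * 1024)   # read at most 1 MB per call
    parts = []
    with zipfile.ZipFile(archive_path) as zf, zf.open(member) as src:
        dst, written, part = None, 0, 0
        while True:
            buf = src.read(block)
            if not buf:
                break
            # Start a new chunk file when needed.
            if dst is None or written >= chunk_bytes:
                if dst is not None:
                    dst.close()
                name = f"{member}.part{part:05d}"
                dst = open(name, "wb")
                parts.append(name)
                part, written = part + 1, 0
            dst.write(buf)
            written += len(buf)
        if dst is not None:
            dst.close()
    return parts

# split_zip_member("bigfile.zip", "data.txt")   # hypothetical names
```

Note this splits at byte boundaries, not line boundaries; for a delimited file headed to SQL Server you'd want to wrap the stream in `io.TextIOWrapper` and iterate line by line instead, so each chunk ends on a record boundary.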