Most web sites are not downloadable because they are driven by a database, so nobody has to create 30,000 static pages by hand. Imagine having to update a site like that; you would die of old age keeping that many pages current.
So close to 100% of sites today have a database on the back end and scripts that build each page on the fly based on what the user clicks.
That means 30,000 clicks or more to generate some 30,000 pages, and then you still have to find a way to make them work like the old site.
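To illustrate the point, here is a minimal sketch of what "building a page on the fly" means. The paint codes, the data, and the function name are all made up for the example; the real site would have its own database and URL scheme.

```python
# Hypothetical sketch of a database-driven site: one script generates
# a page on demand instead of the site storing 30,000 static files.
# PAINT_DB stands in for the real back-end database.

PAINT_DB = {
    "NH-731P": "Crystal Black Pearl",
    "1G3": "Magnetic Gray Metallic",
}

def render_page(code):
    """Build the HTML for one paint code at request time."""
    name = PAINT_DB.get(code, "Unknown code")
    return f"<html><body><h1>{code}</h1><p>{name}</p></body></html>"

# The server runs this for every click, e.g. for /lookup?code=1G3:
print(render_page("1G3"))
```

Because the page only exists for the instant it is served, a downloader never sees a fixed set of files; it sees whatever pages it happens to request.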
Bow out fast.
So I work at an automotive shop, part of which paints vehicles. The painter gets all of his paint codes from an online database. The makers of this database have announced that they are switching over to their new database, and the original one will not be accessible by the end of the year. The painter, however, does not like the new database and wants to keep the original one.

All the information in the database is plain text on different HTML pages. He can either find and click links to get the information he wants, or type in paint codes or vehicle specifications to get what he needs.

My goal is to make the original database available to him offline, so that all the links work as they do now and all the search functions yield the same results offline as they do online. I know this is a big project since it is a huge database, but it needs to be done. I have Internet Download Manager if that helps, but even after downloading 30,000 HTML pages I don't know how to make them operate as the website does. Any advice would be great.
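For the link problem specifically (downloaded pages whose links still point at dynamic URLs), one piece of the job can be sketched: map each dynamic URL to a safe local filename, then rewrite the links inside every saved page to point at those files. The URL pattern below is hypothetical, and a real crawl would have to match the site's actual scheme; server-side search cannot be recovered this way at all.

```python
# Minimal sketch of the post-download link-rewriting step.
# Assumes the crawler saved each dynamic URL's output as a flat file;
# the 'lookup.php?code=...' pattern here is invented for illustration.
import re

def url_to_filename(url):
    """Flatten e.g. 'lookup.php?code=1G3' into a filesystem-safe .html name."""
    safe = re.sub(r"[^A-Za-z0-9._-]", "_", url)
    return safe if safe.endswith(".html") else safe + ".html"

def rewrite_links(html):
    """Point every href in a saved page at its flattened local file."""
    return re.sub(
        r'href="([^"]+)"',
        lambda m: f'href="{url_to_filename(m.group(1))}"',
        html,
    )

page = '<a href="lookup.php?code=1G3">1G3</a>'
print(rewrite_links(page))
# -> <a href="lookup.php_code_1G3.html">1G3</a>
```

Tools like `wget --mirror --convert-links --adjust-extension` automate this download-and-rewrite step for pages reachable by links, but nothing of this kind reproduces the type-in search boxes, because those run code on the server that the download never captures.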