Downloading an entire website.

So I work at an automotive shop, part of which paints vehicles. The painter gets all of his paint codes from an online database. The makers of this database have announced that they are switching over to their new database, and the original one will not be accessible by the end of the year. The painter, however, does not like the new database and wishes to keep the original one.

All the information in the database is plain text on different HTML pages. He can either find and click links to get the information he wants, or type in paint codes or vehicle specifications to get what he needs.

My goal is to make the original database available to him offline, so that all the links work as they do now and all the search functions yield the same results offline as they do online. I know this is a big project, as it's a huge database, but it needs to be done. I have Internet Download Manager if that helps, but even after downloading 30,000 HTML pages I don't know how to make them operate as the website does. Any advice would be great.
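For what it's worth, "making the links work offline" is the part that dedicated mirroring tools (wget with --convert-links, or HTTrack) handle for you: after saving each page they rewrite its links so they point at the saved local files instead of the live site. A minimal sketch of that one idea, with an invented domain and a deliberately simplified regex (a real tool handles far more cases):

```python
import re
from urllib.parse import urlparse

def localize_links(html: str, base_url: str) -> str:
    """Rewrite absolute href links under base_url to relative local paths,
    so saved pages link to each other instead of the live site.
    Illustrative only -- real mirroring tools cover many more link forms."""
    base = base_url.rstrip("/")

    def repl(match):
        url = match.group(2)
        if url.startswith(base):
            path = urlparse(url).path.lstrip("/")
            return match.group(1) + (path or "index.html") + match.group(3)
        return match.group(0)  # leave external links untouched

    return re.sub(r'(href=")([^"]+)(")', repl, html)
```

This is only useful for pages that exist as plain links; as the replies below explain, search results are a different problem entirely.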


Unlikely to ever happen.

Most websites are not downloadable, since they are driven by a database precisely so that nobody has to create 30,000 web pages by hand. Imagine if you had to update such a site. You would die of old age with that many pages to keep current.

So close to 100% of sites today have a database on the back end and scripts that create each page on the fly based on what the user clicks.

So that's 30,000 clicks or more to generate some 30,000 pages, which you would then have to find a way to make work like the old site.
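The "page on the fly" point can be sketched in a few lines: the site stores records in a database and renders HTML per request, so there are no 30,000 files sitting on the server to download. Everything below (table name, columns, the sample paint code) is made up purely for illustration:

```python
import sqlite3

# Toy stand-in for the site's back end: one table of paint records.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE paints (code TEXT, name TEXT, formula TEXT)")
db.execute("INSERT INTO paints VALUES "
           "('GM-8555', 'Arctic White', '3 parts base / 1 part toner')")

def render_page(code: str) -> str:
    """What the server does per request: query the DB, build HTML on the fly."""
    row = db.execute(
        "SELECT name, formula FROM paints WHERE code = ?", (code,)
    ).fetchone()
    if row is None:
        return "<h1>No such paint code</h1>"
    name, formula = row
    return f"<h1>{code}: {name}</h1><p>{formula}</p>"
```

Saving the HTML that `render_page` emits captures one result, but not the database or the script behind it, which is why a saved copy can't answer new searches.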

Bow out fast.

Allow me to elaborate

Thank you very much for your reply; it's very much appreciated. I do, however, believe that there is a way to achieve my goal.

This is what the site looks like.

This is an example of what the results from an inquiry would look like.

So what you're saying is that the results yielded from a search on the site are generated by a script/database implementation, and not just a simple HTML page I can save for offline usage?

That's what I say.

And now you know. In fact, you proved it yourself: the search page has you enter search values, then the website queries a database and creates a page on the fly.

All skill levels arrive here, so I take it you don't write code for web servers or such.

Now, a determined programmer might craft some web-scraping script, but that's not me. Done manually, I see this task taking you the better part of a year of full-time work. It certainly wasn't something the company that made the website put together in a few days.
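For the curious, the "determined programmer" route might look roughly like this: enumerate the paint codes you care about, fetch the result page for each, and save it to disk. The URL pattern and the example code below are invented; you would have to discover the real query URL from the site's search form, and the fetcher is injectable so the logic can be exercised without hitting a live site:

```python
import pathlib
from urllib.parse import quote
from urllib.request import urlopen

def fetch(url: str) -> str:
    # Real network fetcher; swap in a stub when testing.
    with urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def scrape_codes(codes, out_dir, fetcher=fetch,
                 url_pattern="http://paintdb.example.com/lookup?code={}"):
    """Save one HTML file per paint code. URL pattern is hypothetical."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for code in codes:
        html = fetcher(url_pattern.format(quote(code)))
        (out / f"{code}.html").write_text(html, encoding="utf-8")
```

Even with a script like this, you still need a complete list of valid codes to enumerate, and the saved files only reproduce searches you ran in advance.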

To give you a legitimate answer

Please contact the guy who has the database. He might sell you a copy. Then you can load the site on an internal web server on your local network.
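If he does get a copy of the site's files, serving static pages on the shop's LAN is the easy part; Python's built-in http.server is enough. The sketch below (all names and the sample page are illustrative) writes one file into a temporary directory standing in for the mirrored site, serves it from a background thread, and fetches it back; port 0 asks the OS for a free port:

```python
import threading
import urllib.request
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path
from tempfile import TemporaryDirectory

def serve_directory(root: str):
    """Serve `root` over HTTP on a free local port; returns (server, port)."""
    handler = partial(SimpleHTTPRequestHandler, directory=root)
    server = HTTPServer(("127.0.0.1", 0), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]

if __name__ == "__main__":
    with TemporaryDirectory() as root:  # stand-in for the mirrored site
        Path(root, "index.html").write_text("<h1>Paint DB (offline)</h1>")
        server, port = serve_directory(root)
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/index.html") as r:
            print(r.read().decode())
        server.shutdown()
```

Note this only serves static files; if he buys the database itself, reproducing the search functions would also mean running the site's server-side scripts (or rewriting them), which is a much bigger job.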

Post was last edited on November 21, 2019 3:12 PM PST

CNET Forums
