There is no simple process for copying an entire website, and it is not always possible to do so programmatically. Even after a website goes offline, you may be able to review its content via the cached links in Google (the most extensive), Yahoo, or Bing search results, but your best option is to ask the owner for a copy.
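If the site is still online and its terms of use allow it, a recursive downloader is the usual programmatic approach. The sketch below uses GNU wget (assumed to be installed); `https://example.com/` is a placeholder for the real site, and the flags shown are a common polite-mirroring combination, not a guaranteed recipe for every site:

```shell
# Minimal mirroring sketch (assumes wget is available; example.com is a placeholder).
# --mirror          : recursive download with timestamping
# --convert-links   : rewrite links so pages work when browsed offline
# --adjust-extension: save pages with .html extensions where appropriate
# --page-requisites : also fetch CSS, images, and scripts each page needs
# --wait=2          : pause between requests to avoid hammering the server
# --no-parent       : do not ascend above the starting directory
wget --mirror --convert-links --adjust-extension \
     --page-requisites --wait=2 --no-parent \
     https://example.com/
```

Note that sites behind a login or built heavily on JavaScript may not mirror cleanly this way; for a subscription site, asking the owner for an export remains the most reliable route.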
A website I use regularly and subscribe to is going offline soon because the owner is retiring. It contains hundreds, possibly thousands, of useful articles that I would like to capture and view offline. Can you recommend the best way to achieve this, please?