View Full Version : Downloading Websites

Billy T
27-07-2006, 11:09 AM
Hi Team

1) What is the best way to download a website complete with all pages for future viewing off-line?

2) Are there any programs designed to do this? I have tried the IE "Make available offline" option (or whatever it says) but that doesn't seem to work for some reason.

3) Is there any way to view the meta tags, keywords etc. to see what they used to get their site higher up the hit lists?

This is an academic interest at present but the answers will probably influence any future web site I may put up for my company.


Billy 8-{) :confused:

27-07-2006, 11:13 AM
Download an offline browser from download.com.

27-07-2006, 11:13 AM
I've used a free program called Teleport Pro in the past to do precisely this. It works quite well, and you can specify how deep it goes into links etc.

27-07-2006, 11:32 AM
File / Save As

Graham L
27-07-2006, 01:08 PM
(1) I've done this a few times with the "File/Save As" technique, but the results vary. If internal links use the full URLs, I have had to spend some time replacing the URLs with relative ones. It's easy enough, just tedious. In *nix I do it with sed. If the links are relative, there's no problem. (Of course any external links will try to go out to the Internet.)
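A minimal sketch of that sed fix-up, assuming the site lives at a placeholder address (www.example.com stands in for the real domain) and the pages are saved as local .html files:

```shell
# A saved page containing an absolute internal link
# (www.example.com is a placeholder for the real site)
printf '<a href="http://www.example.com/page2.html">next</a>\n' > page1.html

# Rewrite absolute internal URLs to site-relative ones (GNU sed, in-place)
sed -i 's|http://www\.example\.com/|/|g' page1.html

cat page1.html   # now: <a href="/page2.html">next</a>
```

Run over every saved page (e.g. in a `for f in *.html` loop) this makes the internal links work offline; external links, as noted, will still try to go out to the Internet.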

(2) Not at all difficult to do; I'm sure it has been done many times.

(3) Be aware that Google (at least) will make sure you get a very low (or no) ranking if you are obviously trying to improve your ranking with tricks. They know all the tricks, and probably find them with no human intervention needed. Their preference is for the rankings to depend on the merits of the site. I agree with that principle. :thumbs:
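On (3): the meta tags are plain text in the page source (View > Source in any browser). As a quick illustration, a grep one-liner can pull them out of a saved page; the file and its contents below are made up for the demo:

```shell
# A sample saved page with typical SEO meta tags (contents are made up)
cat > sample.html <<'EOF'
<html><head>
<meta name="keywords" content="widgets, cheap widgets">
<meta name="description" content="We sell widgets.">
</head><body></body></html>
EOF

# Show just the meta tags, case-insensitively
grep -i '<meta' sample.html
```

This only shows you what a site declares, not why it ranks where it does, for the reasons above.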

28-07-2006, 12:14 PM
HTTrack (http://www.httrack.com/) is a free website downloader. Not sure how well it deals with complex sites with Flash, JavaScript etc.

Teleport Pro is good, as is Offline Explorer.
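Beyond the GUI tools, the *nix wget utility can mirror a site from the command line and rewrite the links for offline use as it goes, which avoids the sed step above. A common invocation (www.example.com is a placeholder):

```shell
# Mirror a site for offline viewing:
#   --mirror           recursive download with timestamping
#   --convert-links    rewrite links so pages work locally
#   --page-requisites  also fetch images/CSS needed to render each page
#   --no-parent        don't climb above the starting directory
wget --mirror --convert-links --page-requisites --no-parent http://www.example.com/
```

This hits the live site, so it is shown here as a sketch rather than something to paste blindly; check the site's terms and be polite about bandwidth.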