Saturday, November 29, 2014

download - How do I neatly save a set of webpages?






I have an index webpage with just one level of links. I want to download all the linked webpages and have them neatly stored on my computer, so that clicking a link on the index opens the corresponding offline page. Is this possible?


Answer



The easiest tool to use is HTTrack: a free, fast, and easy-to-use website copier that is highly configurable. You can set the link depth and pretty much anything else you want.


It will store all the downloaded results in a folder and preserve the directory structure of the server.


You can then either browse offline by double-clicking on any file, or copy the folder to a web server; all links will be preserved either way.
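Assuming the command-line version of HTTrack is installed, the steps above can be sketched roughly as follows (the URL and output folder here are placeholders, not from the question):

```shell
# Mirror the index page plus one level of links into ./mirror.
# -r2 limits recursion to the index page and the pages it links to;
# HTTrack rewrites the links so the local copy browses offline.
httrack "http://example.com/index.html" -O ./mirror -r2

# A rough equivalent with GNU wget, if HTTrack is not available:
# -r -l1 recurses one level, -p fetches page requisites (images, CSS),
# -k converts links for local viewing.
wget -r -l1 -p -k "http://example.com/index.html"
```

Either way, opening the saved copy of `index.html` in a browser should follow the rewritten links to the locally stored pages.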


