Grab a website with all its links. First, get GNU wget if you don’t have it installed already.
I used this to grab a copy of my router’s GUI:
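(If you’re not sure whether wget is installed, something like the following should tell you; the apt-get line is just an assumption for Debian/Ubuntu-style systems, so use your own package manager otherwise.)

wget --version

sudo apt-get install wget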
wget --http-user="" --http-password="1234" --recursive --convert-links -e robots=off 192.168.1.1
Obviously, replace the username/password and URL as necessary. This grabs all pages and links that 192.168.1.1 leads to. For websites with external links it is highly advisable to cap the recursion depth to something sane with the option --level=NUMBER, where NUMBER is the depth of links you wish to follow, or you risk downloading the entire Internet ;-). Be careful with -e robots=off, as this tells wget to be a bad, bad spider, ignore the website’s wishes (its robots.txt), and trawl through pages and files it probably shouldn’t be in. Do not forget --convert-links if you wish to browse your freshly grabbed site offline, as this rewrites the links in the downloaded pages so they point to your local copies.
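As a sketch, the earlier command with a capped recursion depth would look something like this (a depth of 2 is just an assumption; pick whatever is sane for the site you’re grabbing):

wget --http-user="" --http-password="1234" --recursive --level=2 --convert-links -e robots=off 192.168.1.1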