Game Design Music and Art

Download the Web? – luke

luke

Member

Posts: 311
From: I use your computer as my second Linux box
Registered: 10-30-2005
In a while my internet/phone/TV will be taken offline since I am moving to a new home. I am wondering if it is possible to save large portions of a web site, e.g. all of GameDev.net's featured articles (without having to go to each article and save each of its pages sequentially). I already know how to save the current web page I am viewing (File -> Save As).

------------------
"Do not condemn others for their ignorance, use it against them."
Scott E. Roeben

buddboy

Member

Posts: 2220
From: New Albany, Indiana, U.S.
Registered: 10-08-2004
lol... that reminds me of a joke thing where you're supposedly downloading the web... it's a ridiculous filesize and a really really long wait time... it was funny... oops, didn't help you. hmm... that's a good question, perhaps you could ask the people at GameDev to give you a .zip of the articles? they probably have them all on the server PC.

------------------
WARNING:

RADIOACTIVE IE AHEAD!
--------------------
#include <spazz.h>

int name()
{
char name[] = {'B','u','d','d','B','o','y'};

Ereon

Member

Posts: 1018
From: Ohio, United States
Registered: 04-12-2005
I know Firefox has a save webpage feature, but it only works on one page at a time. Maybe google it and see what programs you can come up with.

------------------

Of course God knew what would happen if they used their freedom the wrong way: apparently He thought it worth the risk.
C.S. Lewis

Friendship is born at that moment when one person says to another: What! You too? I thought I was the only one.
C. S. Lewis

www.christiangaming.com

HanClinto

Administrator

Posts: 1828
From: Indiana
Registered: 10-11-2004
I'm pretty sure I've seen a Firefox extension that can do this. I can't find the one I'm thinking of at the moment, but these two might help:
DownThemAll!
FlashGot

... 10 minutes later...

Aaah! Here it is!
Slogger

quote:
Slogger creates a complete log of your browsing history. It can save every page using the same options as the "Save Page As" command as well as saving a customizable plain text history file.

So basically you can quickly browse through all of the articles that you want to save (for instance, middle-click on all of the links to open them up in separate tabs), and it will save them all locally.

I hope this helps!

--clint

Edit: Okay, found another one that might work better.
SpiderZilla

quote:
Spiderzilla is an easy-to-use website mirror utility, based on Httrack from www.httrack.com.

If the extension doesn't work for you, you might want to try HTTrack.
From the website:
quote:
HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.

It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
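If you go the command-line route, an invocation would look something like the sketch below. The GameDev.net URL, output directory, and depth limit here are just placeholder assumptions; see the HTTrack docs at www.httrack.com for the full option list.

```shell
# Hypothetical starting page -- swap in the real article index you want.
URL="https://www.gamedev.net/reference/"

# -O     : output directory where the mirror is written
# '+...' : filter keeping the crawl on the gamedev.net domain
# -r3    : limit recursion depth so you don't pull the entire site
CMD="httrack $URL -O ./gamedev-mirror '+*.gamedev.net/*' -r3"

# Shown with echo here; run the command directly once the URL is right.
echo "$CMD"
```

Once it finishes, open any page under ./gamedev-mirror in your browser and the links between saved pages should work offline.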


[This message has been edited by HanClinto (edited June 19, 2006).]

buddboy

Member

Posts: 2220
From: New Albany, Indiana, U.S.
Registered: 10-08-2004
my idea was easier to do. =P lol, j/k. even tho it might be, depending on whether or not they would give it to you =D

------------------
WARNING:

RADIOACTIVE IE AHEAD!
--------------------
#include <spazz.h>

int name()
{
char name[] = {'B','u','d','d','B','o','y'};