On Tuesday 29 July 2003 9:04 pm, Peter Clark wrote:
> This isn't a complicated matter, but since it's been a while since I've
> dabbled in sed, I'm not sure how best to do this, so I thought I would ask
> the experts. :)
> I ran wget on a large site that has all the links hardcoded. I'd like to
> remove all instances of, say, 'http://www.site.com/directory' so that I can
> view it offline and have all the links work locally. So, what would be the
> best way to recursively work through the files and remove the text?
> Thanks,
>     :Peter

Can you run the wget again? It will fix all these issues for you when it
does the mirror:

wget --convert-links --mirror http://somesite.com

-- 
Bret Baptist
Systems and Technical Support Specialist
bbaptist at iexposure.com
Internet Exposure, Inc.
http://www.iexposure.com
(612)676-1946 x17

Web Development-Web Marketing-ISP Services
------------------------------------------
Today is the tomorrow you worried about yesterday.

_______________________________________________
TCLUG Mailing List - Minneapolis/St. Paul, Minnesota
http://www.mn-linux.org tclug-list at mn-linux.org
https://mailman.real-time.com/mailman/listinfo/tclug-list
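[For reference, if re-running wget isn't an option, the sed approach Peter asked about might look like the sketch below. It assumes GNU sed (for the -i in-place flag) and uses 'http://www.site.com/directory' as the placeholder prefix from the original question:]

```shell
# Recursively strip the hardcoded URL prefix from all .html files
# under the current directory, editing them in place.
# Using | as sed's delimiter avoids having to escape the slashes in the URL;
# the dots are escaped so they match literally.
find . -type f -name '*.html' \
    -exec sed -i 's|http://www\.site\.com/directory||g' {} +
```

[Relative links like href="/page.html" will then resolve locally when browsing the mirrored tree.]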