Two hits from Google for companies that can help you:
http://www.scrape-it.nl/Web-scrapen (sorry, that page is in Dutch, but Google Translate will make it understandable)
A poor man's solution:
Save the web page to a text file, then use a program to extract the links from it into another file. The centerpiece of such a program would be a regular expression that matches text starting with http: or https: and ending at the first character not allowed in a URL (such as a space or a comma). Any second-year IT student should be able to write it.
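To make the regex idea above concrete, here is a minimal sketch in Python. The pattern and the stop-characters (whitespace, quotes, angle brackets, comma) are my own choices, not a definitive rule; adjust them for the pages you actually scrape.

```python
import re
import sys

# Matches http:// or https:// followed by everything up to a character
# that commonly ends a URL in page text: whitespace, quotes,
# angle brackets, or a comma.
URL_PATTERN = re.compile(r'https?://[^\s"\'<>,]+')

def extract_links(text):
    """Return the unique links in order of first appearance."""
    seen = set()
    links = []
    for match in URL_PATTERN.findall(text):
        if match not in seen:
            seen.add(match)
            links.append(match)
    return links

if __name__ == "__main__":
    # Usage: python extract_links.py saved_page.html > links.txt
    with open(sys.argv[1], encoding="utf-8", errors="replace") as f:
        for link in extract_links(f.read()):
            print(link)
```

Run it on the saved page and redirect the output to a file, and you have the one-shot link grab the question asks for, at least for links that appear literally in the HTML (it will miss links built by JavaScript after the page loads).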
We are looking for software to speed up the removal of pirated content from websites for our clients.
I'm not sure if "scraper" is the right term for what I'm looking for, so here's what I need.
I want to be able to go to a website and grab all the links from a search result.
Pick any site and run a search for a course. When I run that search, I might get 542 results, for example. I want to be able to grab all of those links at once rather than copying them one by one.
Is there a program that will do that?