Manual Crawling
07-03-2011, 05:07 PM (This post was last modified: 07-03-2011 05:40 PM by Abhi_M.)
One of the essential things in a web application security assessment is crawling.

Quote:A Web crawler is a computer program that browses the World Wide Web in a methodical, automated manner or in an orderly fashion.

A Web crawler is one type of bot, or software agent. In general, it starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks in the page and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are recursively visited according to a set of policies.
from Wikipedia
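To make the seed/frontier idea from that definition concrete, here is a minimal crawler sketch in Python. It is not part of Mantra or any extension; the seed URL and the page limit are illustrative assumptions, and a real crawler would also respect robots.txt and crawl policies.

Code:
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects href attributes from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seeds, max_pages=20):
    frontier = list(seeds)      # URLs still to visit (the "crawl frontier")
    visited = set()
    while frontier and len(visited) < max_pages:
        url = frontier.pop(0)
        if url in visited:
            continue
        visited.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except Exception:
            continue                # skip pages that fail to load
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)        # resolve relative links
            if urlparse(absolute).scheme in ("http", "https"):
                frontier.append(absolute)        # new links extend the frontier
    return visited

if __name__ == "__main__":
    print(crawl(["http://example.com/"]))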

It's true that Mantra lacks a good crawler as of now.
Navicrawler was a safe bet, but it's no longer updated.

So let's see how we can collect all the links on a particular web page.


We are going to use the Web Developer extension. You can activate it from the sidebar.

[Image: 5897892852_677c370fce.jpg]



You can see a new toolbar now. It has lots of functions, but for now we are concentrating only on collecting link information.

[Image: 5897326909_65500f3288.jpg]



A list of links can be collected by going to Information > View Link Information.

[Image: 5897893088_76cc32e2d4.jpg]



Now you have a nice list of all the links on the page you scanned.

[Image: 5897893390_122027b009.jpg]
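If you prefer to collect the same kind of list from a script instead of the toolbar, a rough Python equivalent (standard library only) might look like the sketch below. The target URL is just a placeholder for the page you are assessing.

Code:
from urllib.parse import urljoin
from urllib.request import urlopen
from html.parser import HTMLParser

class LinkLister(HTMLParser):
    """Prints-ready list of every <a href> on a single page."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base, href))  # make links absolute

url = "http://example.com/"   # placeholder target
html = urlopen(url).read().decode("utf-8", errors="replace")
parser = LinkLister(url)
parser.feed(html)
for link in parser.links:
    print(link)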



You can also use the Web Developer extension without activating the toolbar by going through the 'Ayudha' menu.

[Image: 5897894374_388f5db6c7.jpg]



Enjoy!

Use the Mantra forums.
Please do not PM/e-mail me directly with technical queries.