In my previous posts on Selenium and NUnit I described how to crawl your web application by following all the links on every page, and hashing the visited addresses.
My crawler can optionally reduce a page’s URL to a kind of signature consisting of the address and the names of its parameters. For example, the URL
http://myhost/app1/mypage.aspx?page=7&section=1
would be reduced to
mypage.aspx?page&section
If we want to add page-specific actions, the simplest approach is a big switch/case statement that selects the actions to perform based on the current address (that is, on the signature of the current URL). Let's define
delegate void PageTest();
and
// requires using System.Text.RegularExpressions;
string sUrlPattern = sUrl.Substring(sUrl.LastIndexOf("/") + 1);
if (sUrlPattern.Contains("?"))
{
    // strip the parameter values, keeping only the parameter names:
    // "mypage.aspx?page=7&section=1" becomes "mypage.aspx?page&section"
    sUrlPattern = Regex.Replace(sUrlPattern, "=.+?&", "&");  // non-greedy: all values but the last
    sUrlPattern = Regex.Replace(sUrlPattern, "=.+", "");     // the last value
}

List<PageTest> lifnTests = new List<PageTest>();
We can then add page tests to the list:
switch (sUrlPattern)
{
    case "mypage.aspx?page&section":
        lifnTests.Add(delegate() { TestMyPageWithPageAndSection(); });
        break;
    ...
}
The List<PageTest> now contains all the test functions that apply to the current page, given the parameters in its URL. We run each of them, reopening the page first so that every test starts from the same state:
foreach (PageTest pt in lifnTests)
{
    selenium.Open(sUrl);
    selenium.WaitForPageToLoad("60000");
    pt();
}
An action consists of the usual Selenium commands:
private void TestMyPageWithPageAndSection()
{
    selenium.Click("btnClickMe");
}
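In a real test, an action will usually do a bit more than a single click. As a rough sketch (the locators txtQuery and btnSearch and the expected result text are made up for illustration), such a test method might look like this:

private void TestMyPageWithPageAndSection()
{
    // hypothetical locators and values, for illustration only
    selenium.Type("txtQuery", "selenium");
    selenium.Click("btnSearch");
    selenium.WaitForPageToLoad("60000");
    Assert.IsTrue(selenium.IsTextPresent("results found"),
                  "Expected result text not found on " + selenium.GetLocation());
}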
Of course, the necessary try/catch blocks, logging, etc. still need to be added so that an NUnit run can work through the whole application test even when a single page action fails.
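As a minimal sketch of that error handling (the Log method and the bHadErrors flag are made-up placeholders; any logging facility will do), the loop above could be wrapped like this:

foreach (PageTest pt in lifnTests)
{
    try
    {
        selenium.Open(sUrl);
        selenium.WaitForPageToLoad("60000");
        pt();
    }
    catch (Exception ex)
    {
        // record the failing page and test, but keep crawling
        Log("Test " + pt.Method.Name + " failed on " + sUrl + ": " + ex.Message);
        bHadErrors = true;  // hypothetical flag, checked at the end of the run
    }
}

A single assertion on such a flag at the end of the crawl can then fail the NUnit test while still having visited every page.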