Trying to loop links with selenium in python

Question

I would like to loop over a set of links with Selenium in Python. I have tried to follow this explanation with no success. I keep getting the "stale element reference" error (I am trying to use WebDriverWait). My code is as follows:

import selenium.webdriver.support.ui as UI

list_of_links = mydriver.find_elements_by_xpath('//ul[@class="directory dir-col"]/li/a')
for link in list_of_links:
    UI.WebDriverWait(mydriver, 30).until(lambda mydriver: mydriver.find_element_by_xpath('//ul[@class="directory dir-col"]/li/a'))
    link.click()
    mydriver.back()

I did try placing the WebDriverWait call before and after the click and back commands, with no success. Any help would be highly appreciated.



Answers
The issue is that once you leave the page, the elements in list_of_links become stale.

This approach should work for you, assuming each link has different text:

list_of_links = mydriver.find_elements_by_xpath('//ul[@class="directory dir-col"]/li/a')
list_of_linktext = []
for link in list_of_links:
    list_of_linktext.append(link.text)

for linktext in list_of_linktext:
    mydriver.find_element_by_link_text(linktext).click()
    mydriver.back()
By : Richard
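
If the link text on the page is not guaranteed to be unique, a similar sketch (not part of Richard's answer) is to re-locate the links by index on every pass, assuming the list keeps the same order each time you navigate back:

links_xpath = '//ul[@class="directory dir-col"]/li/a'
link_count = len(mydriver.find_elements_by_xpath(links_xpath))
for i in range(link_count):
    # re-find the elements after every back(); the old references go stale
    mydriver.find_elements_by_xpath(links_xpath)[i].click()
    mydriver.back()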


Based on Richard's idea, I decided to loop over XPath values rather than link text (which can be identical). The code I ended up with is:

import lxml.html as lh
import urllib2
from selenium import webdriver

# start_url and mydriver are defined earlier in the script
htmlObject = lh.parse(urllib2.urlopen(start_url))
listOfPaths = htmlObject.xpath('//ul[@class="directory dir-col"]/li/a')
listOfLinkPathes = []
for link in listOfPaths:
    listOfLinkPathes.append(htmlObject.getpath(link))

for linkPath in listOfLinkPathes:
    mydriver.find_element_by_xpath(linkPath).click()
    mydriver.back()
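
If the page is slow to rebuild after back(), the explicit wait the question mentions can still be combined with this loop. A minimal sketch using expected_conditions (assuming a Selenium binding that ships selenium.webdriver.support.expected_conditions):

from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By

for linkPath in listOfLinkPathes:
    # wait until the element at this path is clickable before using it
    WebDriverWait(mydriver, 30).until(
        EC.element_to_be_clickable((By.XPATH, linkPath))
    ).click()
    mydriver.back()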


For such tasks, I suggest you use the iRobotSoft web scraper. This video shows how to do it: http://irobotsoft.com/help/record%20robot.swf

By : seagulf

