Kreatus (Ubot Ninja), December 6, 2010:
How can I click a random internal page of a website? Please see the attached random click.ubot. That bot only clicks the same link every time; I want it to click a random page on each loop. Thanks!
Net66, December 6, 2010:
I think you need to take a different approach: scrape all occurrences that match the wildcard into a list, pick a random number between 0 and the list total minus 1, and navigate to the URL at that list position. You could still do the navigating in a loop so you visit multiple links from the list, and you could even remove the list items along the way so you don't go to the same link more than once.
Andy
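Andy's steps can be sketched in Python (illustrative only; UBot's own list and navigation commands would differ, and the function name is made up for this sketch):

```python
import random

def visit_random_links(urls, visits):
    """Pick and remove random entries from a scraped URL list,
    so no link is visited more than once."""
    remaining = list(urls)  # work on a copy of the scraped list
    visited = []
    for _ in range(min(visits, len(remaining))):
        index = random.randint(0, len(remaining) - 1)  # 0 .. list total - 1
        visited.append(remaining.pop(index))           # pop so it can't repeat
    return visited
```

Each returned URL would then be handed to a navigate step inside the loop.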
Kreatus (Ubot Ninja), December 6, 2010:
Hi Andy, thanks for the response. That's what I did earlier: scrape all the inner pages, then navigate to one at random. But it scrapes images, XML, and CSS files too, and I'm not sure yet how to remove those from the list so I scrape internal pages only. This bot will visit all kinds of websites with different extensions (.html, .php, .asp, a trailing /, and so on). Thanks
crazyflx, December 6, 2010:
Give this a go (it works for me): random click modified.ubot
UBotBuddy, December 6, 2010:
You can also use Mouse Move and build in different coordinates, possibly combined with $spin.
Buddy
UBotBuddy, December 6, 2010:
I got to playing. Yes, it works using $spin. I could have fun with this. LOL http://screencast.com/t/CiCxyTJ2
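For readers unfamiliar with $spin: it picks one option at random from spintax like {a|b|c}, which is how Buddy randomizes the mouse coordinates. A loose Python imitation of that behavior (an assumption about $spin's exact semantics, not UBot's implementation):

```python
import random
import re

def spin(text):
    """Expand {a|b|c} spintax by picking a random option from each
    group, innermost groups first, loosely mimicking UBot's $spin."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while pattern.search(text):
        text = pattern.sub(lambda m: random.choice(m.group(1).split("|")),
                           text, count=1)
    return text
```

So spin("{100|200|300}") would yield one of the three coordinate values at random.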
Kreatus (Ubot Ninja), December 6, 2010:
Hi crazyflx, I already tried that, but that approach won't work on random websites with different URL extensions. Nice method, BB, but that will steal the focus of my screen. Thanks for the advice, guys.
theninjamanz, December 6, 2010:
The problem here is telling UBot what to scrape. I'd do something like this: add a UI text box where the user can enter the extension of the page (.php, .html, .aspx), which gives you maximum flexibility. Then, when you scrape, keep only the elements that match that file extension. Having added all of these to a list, pick a random list item and then nav to or click the link. Or, if you want to go the route of making it easier, set up separate If commands to capture the most popular file extensions: if the scraped link ends in .html, add it to the list (don't set an else); if it ends in .php, add it to the list; and so on.
Ninjaman
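The extension-whitelist idea (and Kreatus's need to drop images, CSS, and XML) can be sketched in Python; the function name, extension lists, and the rule for extensionless paths are illustrative assumptions, not UBot code:

```python
def filter_page_links(urls, extensions=(".html", ".php", ".asp", ".aspx", "/")):
    """Keep only links that look like internal pages, dropping the
    images, CSS, and XML files that a blanket scrape picks up."""
    skip = (".css", ".xml", ".js", ".jpg", ".jpeg", ".png", ".gif", ".ico")
    kept = []
    for url in urls:
        path = url.lower().split("?")[0]   # ignore any query string
        if path.endswith(skip):
            continue                       # asset file, not a page
        last_segment = path.rsplit("/", 1)[-1]
        if path.endswith(extensions) or "." not in last_segment:
            kept.append(url)               # whitelisted extension or no extension
    return kept
```

The whitelist tuple plays the role of the UI text box: the user-supplied extension could simply be passed in as `extensions`.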
Kreatus (Ubot Ninja), December 6, 2010:
Hi Ninjaman, not trying to be a wise guy, but I already tried that. I was just wondering if there's an easier way to do this. If there's no easier option, then I'll do what you described above. Thanks