1nspire Posted January 7, 2010
Is there a way to loop through a list of, let's say, URLs, choosing each one at random but still visiting all of them? Currently a random list item is completely random, so a loop can visit the same URL more than once. Can I have UBot remove an item from the list after it has been used, or is there another method to achieve this?
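For reference, the behaviour being asked for here, visiting every URL exactly once in random order, is a shuffle. A minimal Python sketch (the URLs are made up; in UBot the list would come from a file or scrape):

```python
import random

urls = ["http://a.example", "http://b.example", "http://c.example"]

order = urls[:]        # copy so the original list is untouched
random.shuffle(order)  # random order, each entry appears exactly once

for url in order:
    pass  # stand-in for navigating to url
```

This sidesteps the remove-from-list problem entirely: nothing is removed, the iteration order is just randomized up front.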
wacek Posted January 7, 2010
This would be very easy if we could use a 'remove from list' command, but we can't. Let's assume you load a %rows list (with URLs) from a CSV file; you can add an extra 'status' field there, set to '1' on load. Then build the URL-related data into lists so you have $url and $urlStatus. From there, do what you want: select random list items, check that $urlStatus = 1, and once you've done whatever you need with the $url, set $urlStatus to 0. I think you get the idea.
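wacek's status-flag idea can be sketched in ordinary code. This is a Python illustration, not UBot syntax, and the URL list is hypothetical; the parallel `status` list plays the role of the $urlStatus field:

```python
import random

# Hypothetical URL list; in UBot this would come from the CSV rows.
urls = ["http://a.example", "http://b.example", "http://c.example"]
# Parallel status list: 1 = not yet visited, 0 = already visited.
status = [1] * len(urls)

visited = []
while any(status):               # stop once every flag is 0
    i = random.randrange(len(urls))
    if status[i] == 1:           # only act on unvisited entries
        visited.append(urls[i])  # stand-in for "navigate to $url"
        status[i] = 0            # mark as used so it is never repeated
```

Note the trade-off: nothing is ever removed, so as the list fills with zeros the random picks increasingly land on already-used entries and get skipped, which is slow for large lists.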
pr0m Posted January 7, 2010
Remove from list could help in this case. +1
turbolapp Posted January 7, 2010
I was wondering if it did that. I have a list of 50k keywords that I loop through randomly, and I really want none of them repeated, but I never bothered to check for that. Just pondering here, but could you do some sort of pull from the random list inside a list total?
thewebsitegurus Posted February 7, 2010
I just ran into a situation where I needed a "remove from list" function. Since it doesn't exist, I had to build a fairly redundant process to get the same result. Here is the idea, in case anyone else needs to remove from a list.

Create two identical lists. At the top of your loop over the first list, create a variable: #currentItem -> next list item. Each time the script loops through your list, #currentItem will contain the current item. At the bottom of that loop, start a second loop over your second list, and at its top do the same as before: #otherItems -> next list item. The KEY part is an "if statement" inside the second loop: if (NOT (#currentItem = #otherItems)) then (add to list %firstlist (#otherItems)). That way, each time you loop through the main list, the current item is left out of the rebuilt list and won't be part of it on the next rotation. I hope that was clear enough.
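The two-list rebuild trick above, sketched in Python (the keyword list is illustrative; the inner loop mirrors the second list's "if statement"):

```python
import random

first_list = ["kw1", "kw2", "kw3", "kw4"]

processed = []
while first_list:
    current_item = random.choice(first_list)  # "next list item" stand-in
    processed.append(current_item)            # do the real work here
    # Second pass: copy every item EXCEPT the current one into a new
    # list, which replaces the first list on the next rotation.
    rebuilt = []
    for other_item in first_list:
        if not current_item == other_item:    # the key "if" from the post
            rebuilt.append(other_item)
    first_list = rebuilt
```

One caveat inherent to this recipe: if the list contains duplicate entries, the equality check drops all copies of the current item in a single pass, not just one.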
bluegoat Posted February 9, 2010
Here's a bot I threw together that randomly chooses a URL from a list, navigates to it, removes it from the list, and continues until all URLs are gone. It could also work on keywords; you'd just need to change a few things. You'll first need to populate the URLs.txt file in your documents folder. You can do that from within the bot, using a few search engines, by running the "Populate URLs.txt file" script; then run the "Remove Random" script and watch it go. You can pause or stop the script at any time and check URLs.txt to see that the already-visited URLs have been removed. When you restart, it will continue where it left off until all URLs have been used.

Remove Random.ubot
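The pick-visit-remove-rewrite cycle bluegoat describes (including the restart-where-you-left-off behaviour, which comes from writing the shrunken list back to the file each pass) can be sketched like this in Python. The file path and its contents are stand-ins for the bot's URLs.txt:

```python
import os
import random
import tempfile

# Hypothetical stand-in for the bot's URLs.txt in the documents folder.
path = os.path.join(tempfile.gettempdir(), "URLs.txt")
with open(path, "w") as f:
    f.write("http://a.example\nhttp://b.example\nhttp://c.example\n")

visited = []
while True:
    with open(path) as f:                     # re-read each pass, so a
        urls = [ln.strip() for ln in f if ln.strip()]  # restart resumes
    if not urls:
        break                                 # all URLs used up
    url = random.choice(urls)                 # pick a random entry
    visited.append(url)                       # stand-in for navigating
    urls.remove(url)                          # drop it: never repeated
    with open(path, "w") as f:                # persist the shrunken list
        f.write("\n".join(urls))
```

Because the remaining URLs are flushed to disk after every pick, stopping the script mid-run and restarting it simply resumes with whatever is left in the file.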
Gogetta Posted February 15, 2010
Hey bluegoat, reps man. You saved me the time of having to figure this out myself. Thanks! +1 Remove from list
turbolapp Posted February 15, 2010
I came looking for this thread with exactly the same problem. You got a +rep from me, bluegoat; that bot just made my day a lot simpler.