Kurt — Posted July 10, 2010
How would you protect a list of URLs, other than just including them in a local text file? Is there something like "$insert list from URL" instead of "$insert list from file"? Would you add 100 lines (one for each URL) as a string to a list? Thanks...
meter — Posted July 11, 2010
What do you mean by "protecting" a URL?
Digital101 — Posted July 11, 2010
You can use $list from text from the variables/constants list. You will have to put all the URLs on one line with a delimiter, though, as it will not read line breaks:

urlone|urltwo|urlthree|urlfour

You can even add the login info to each entry if you need to.

Richard
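As an aside, the split-on-a-delimiter idea above looks like this in Python (the URLs and credentials are made up for illustration; UBot's $list from text does the equivalent split internally):

```python
# One line of pipe-delimited URLs, as it would sit in the bot's text constant
raw = "http://site-one.example|http://site-two.example|http://site-three.example"

# Split on the delimiter to recover the list of URLs
urls = raw.split("|")

# Login info can ride along inside each entry, e.g. url,user,pass
raw_with_logins = "http://site-one.example,alice,s3cret|http://site-two.example,bob,hunter2"
entries = [item.split(",") for item in raw_with_logins.split("|")]
```

Each element of `entries` is then a small `[url, user, password]` list the bot can work through.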
marketermac — Posted July 11, 2010
You could just scrape the information from a page you set up. I assume by "protect" you mean you don't want the people you give the bot to having the complete list of URLs? If that's the case, and it were me, I would make a page with white text on a white background, then scrape the URLs from that page into a list. People will navigate to your page, see nothing, and assume the bot is doing whatever, none the wiser about what your URL list is, short of writing the URLs down as the bot visits them.

No idea if that's what you're after?
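A minimal sketch of that scrape-from-a-hidden-page idea in Python (the page markup and URLs here are invented; the regex just pulls out anything that looks like an http(s) link):

```python
import re

def extract_urls(html):
    """Pull every http(s) URL out of a blob of HTML text."""
    return re.findall(r'https?://[^\s"<]+', html)

# What the hidden page might look like: white text on a white background
sample_html = (
    '<body style="background:#fff">'
    '<p style="color:#fff">http://target-one.example http://target-two.example</p>'
    '</body>'
)

urls = extract_urls(sample_html)
# In the real bot you would fetch your own page first, e.g. with urllib:
#   html = urllib.request.urlopen("http://yoursite.example/list.html").read().decode()
```

Visitors see a blank page, while the bot gets a fresh URL list every run, so you can update the list without redistributing the bot.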