UBot Underground

webautomationlab

Fellow UBotter
  • Content Count: 407
  • Joined
  • Last visited
  • Days Won: 6

Everything posted by webautomationlab

  1. The file recovery message every time I open UBot ...
  2. I haven't figured out how to save things yet.
  3. Thanks guys. I realize the squeaky wheel gets the grease, and in this case, solving these issues will allow UBot to scale to a larger customer base, which is good for everyone.
  4. That's a workaround, but I don't think we should have to work around the installed version of UBot. The upgrade system (with forced upgrades) needs a serious rethink. Please provide the download link. Without it, I can't fix the problem. Thanks.
  5. The updater keeps telling me: Access to the path 'C:\Program Files\UBot\UBot\UBotDevTool.exe' is denied.
  6. Could use a "use config file" checkbox that would import the existing list, let you add to it, and then overwrite the old list.
  7. You gotta sign in with your username, not your email address (or vice versa). I spent a week trying to figure that out.
  8. Yup. Delay = 2400. Of course, I would make it delay = 2400 minus the delays in the script and a few seconds for execution and page loads.
  9. I have a bot running once daily; I set delay = (seconds in a day) - (time it takes the bot to run). So it does a loop, waits a day, does a loop.
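The delay arithmetic in the post above can be sketched in Python (a sketch, not UBot script; `daily_delay` and `run_dadaily` naming is mine for illustration):

```python
import time

SECONDS_PER_DAY = 24 * 60 * 60  # 86400

def daily_delay(elapsed_seconds, day_length=SECONDS_PER_DAY):
    """Seconds to wait after a run: a full day minus the bot's runtime.

    Clamped at zero so an unusually long run never produces a negative delay.
    """
    return max(0, day_length - elapsed_seconds)

def run_daily(job):
    """Run `job`, then sleep out the rest of the day, forever."""
    while True:
        start = time.monotonic()
        job()
        time.sleep(daily_delay(time.monotonic() - start))
```

Subtracting the runtime keeps the start times anchored to roughly the same time of day instead of drifting later with every loop.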
  10. Can you share your bot with us?
  11. It helps a little. It just seems like a lot of extra coding work to do what should be a reasonable expectation out of the box. We should be able to nav across a variety of pages (from the top 100 in the google serps no less) with a slight delay between each, without freezing up, and without requiring nodes and nodes of error checking.
  12. When I asked for a date function, I wanted something closer to the PHP date function. Maybe we can twist Seth's arm to get us some more date constants.
  13. Right, but when I'm hitting a list scraped from Google, there is nothing consistent to wait for. I could do an IF / EITHER with a bunch of WAIT FORs, I suppose. I don't know if it is UBot or IE, but surfing is not robust. I can't feed it 500 sites, go out for the night, and reasonably expect it to complete even 250 without locking up.
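One way to approximate the robustness being asked for, sketched in plain Python rather than UBot script (`fetch_with_retry` and `crawl` are hypothetical helpers), is a hard per-URL timeout plus a retry, skipping any site that still fails so one bad page can't hang the whole run:

```python
from urllib.request import urlopen

def fetch_with_retry(url, timeout=15, retries=2):
    """Fetch one URL with a hard timeout; return bytes, or None if it keeps failing."""
    for _ in range(retries + 1):
        try:
            with urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError:  # covers URLError, socket timeouts, connection resets
            continue
    return None

def crawl(urls):
    """Visit every URL, collecting successes and skipping failures instead of stalling."""
    results = {}
    for url in urls:
        page = fetch_with_retry(url)
        if page is not None:
            results[url] = page
    return results
```

The key design point is that failure is an expected outcome per URL, not an error that stops the batch.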
  14. What do you wait for in those instances?
  15. I'm going to use your bot and see if it will do my job. Thanks. A lot. +Rep. One note: if you scrape PDF links in Google, they will throw nasty errors when you try to get the meta information. I manually removed the PDF links between bot 1 and bot 2. Some sort of URL checking would need to be added if something like this were used heavily.
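The URL check mentioned above could be as simple as dropping `.pdf` links between the two bots; a Python sketch (the `drop_pdf_links` helper is hypothetical):

```python
def drop_pdf_links(urls):
    """Remove links whose path ends in .pdf, case-insensitively, ignoring query strings."""
    kept = []
    for url in urls:
        path = url.split("?", 1)[0].split("#", 1)[0]  # strip query and fragment
        if not path.lower().endswith(".pdf"):
            kept.append(url)
    return kept
```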
  16. It would be handy to be able to dial down the playback speed. It's one of the few features I miss from iMacros.
  17. This is exactly what I have been working on. Right now I have two bots. One scrapes the SERP. The other scrapes the head attributes and URL as you want. The issue is, the second scraper for kws, description, title, and URL doesn't nav consistently over a large block of URLs. Until that is resolved, you will be limited in how large a sample you can scrape. I'm considering lowering mine to 20 results because doing 100 is not stable.
  18. TT, try Set > #hyphen >> "-". Then do the eval of #current_title against #hyphen.
  19. I borrowed Aaron's idea, and I have a skeleton file I use as the starting template for every bot. It has 5 base subs, all called with run sub commands: Initialize, User Interface, Startup, The Loop, Shutdown. It is a lot easier to delete things you don't need than to create them over and over in each new bot.
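The five-sub skeleton described above translates roughly to this shape in Python (a sketch; the function names mirror the subs listed in the post, and `run_bot` stands in for the run sub commands):

```python
def initialize(state):
    """Reset variables and clear lists before anything else runs."""
    state["log"].append("Initialize")

def user_interface(state):
    """Define the controls the bot exposes to the user."""
    state["log"].append("User Interface")

def startup(state):
    """One-time setup: load config, log in, open the first page."""
    state["log"].append("Startup")

def the_loop(state):
    """The main repeated work of the bot."""
    state["log"].append("The Loop")

def shutdown(state):
    """Save results and clean up."""
    state["log"].append("Shutdown")

def run_bot():
    """Call the five base subs in order, like the template's run sub commands."""
    state = {"log": []}
    for sub in (initialize, user_interface, startup, the_loop, shutdown):
        sub(state)
    return state["log"]
```

Starting every bot from the same fixed call order is what makes the template easy to prune rather than rebuild.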
  20. Off the top of my head, you would want something like:

      Loop (List Total(OriginalList))
      > Set CurrentItem (Var) = Next List Item(OriginalList)
      > If
      >> Eval
      >>> CurrentItem = $nothing
      >> Set CurrentItem
      >>> "n/a"
      > Add to list AlteredList
      >> CurrentItem

      After you exit the loop, you could set OriginalList to AlteredList if you need to maintain the original list name. Something like this, maybe.
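The loop sketched above, replacing empty items with "n/a", looks like this in Python (`fill_empty_items` is a hypothetical name for illustration):

```python
def fill_empty_items(original_list, placeholder="n/a"):
    """Return a copy of the list with empty items ($nothing) replaced by a placeholder."""
    altered_list = []
    for current_item in original_list:
        altered_list.append(current_item if current_item else placeholder)
    return altered_list

# As in the post, reassign if you need to keep the original list name:
# original_list = fill_empty_items(original_list)
```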
  21. Looks like imageshack.us is blocked on your network. Here it is, attached. Should help you.