UBot Underground

greencat

Fellow UBotter
  • Content Count: 82
  • Joined
  • Last visited
  • Days Won: 6

Everything posted by greencat

  1. OK, it's been far too long since I played around with UBot (and I'm a tad rusty), but I thought you guys might like this: a Google Plus Bot. What it does: the bot will plus one (+1) all of the search results for a given query - provided you are logged into your Google Plus account. For example, it will +1 all of the pages on a given site if you enter the query site:yourdomain.com. You can set a limit on the number of results to +1. This is handy because Google has a hard limit of around 300 +1s (a day?) and then it stops accepting your +1s. Terms of Use: The bot is free to do what you
  2. Sorry guys & gals I haven't been on here for months (other projects). Happy to take another look at it - if there's demand.
  3. It's been a few months since I properly used Ubot (busy on other projects). Found out today my company wants to close their Facebook group (we are moving to a page), but it turns out that closing a Facebook group means deleting all of the members one by one - and there are 1500 of them. Cripes. It took literally 5 minutes to hack out a Ubot to do it - and 20 minutes later it has deleted 500 of the members already. Some poor soul in my team was going to have to do this manually next week - but hopefully it will all be done before I go home tonight. Magic!
  4. If the page is loading and displaying the iframe - then the url has been generated. The only thing you need to work out is what javascript variable(s) are being used to build it. You can then do something like: set variable = eval (javascript variable(s)); If you don't know javascript (and even if you do, if the coder has tried to obscure it) - it's going to be fairly tricky to work out what's going on. If you post the page - I may be able to take a quick look. You may even be able to get the url of the iframe via javascript directly (the src property is probably the one to go for): htt
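     Here's a minimal sketch of pulling an iframe's src out via javascript - it assumes the iframe you want is the first one on the page (adjust the index, or look it up by id, if it isn't):

         // grab the first iframe on the page and read its src property
         var frame = document.getElementsByTagName("iframe")[0];
         var frameUrl = frame ? frame.src : "";
         frameUrl;   // the last expression is the value the eval approach above would pick up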
  5. Then you'll probably have to reverse engineer whichever code creates the url - and create your own url before navigating to it.
  6. Find the url of the iframe and then navigate to it directly.
  7. That's excellent Joe118. Look forward to seeing the finished bot. Eval and Javascript combined are extremely powerful. I've used them to solve reliability problems in the past instead of using Ubot's tools to select and scrape values (you can even build in error handling). If you need to scrape or change values 1000s of times a day - javascript has the edge over Ubot's built-in tools. Good luck!
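     To illustrate the error-handling point, here's a rough sketch of a scrape wrapped in try/catch, so a missing element hands back a flag instead of killing the run (the element id here is just a made-up example):

         // try to read a value off the page; return a marker if the element isn't there
         var result;
         try {
             result = document.getElementById("price").innerText;   // hypothetical element id
         } catch (e) {
             result = "SCRAPE_FAILED";
         }
         result;   // read this back into a Ubot variable and branch on it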
  8. OK - I think I understand this. You have a UI list which corresponds to variable #catandtag1. And you want this variable to be used to select the corresponding pull-down menu 1 on the website. All you need to do is: run javascript: mainCategorySelector("#catandtag1") I'll try and put together a quick example. Updated: Check the attached bot. Note this won't actually change the pulldown menu on the screen - but as far as Etsy is concerned it has. etsy.ubot
  9. Can you not just run javascript: mainCategorySelector("accessories") if you want to select accessories? Or have I misunderstood the problem? this.value simply refers to the value attribute of whichever menu item the user selects.
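     If a site doesn't expose a handy function like mainCategorySelector, the general pattern is to set the select element's value yourself and then fire its change handler so the page reacts. A rough sketch (the element name "category" is a guess - check the page source for the real one):

         // pick an option in a dropdown and run the page's own change handler
         var menu = document.getElementsByName("category")[0];   // guessed element name
         if (menu) {
             menu.value = "accessories";
             if (menu.onchange) {
                 menu.onchange();   // mimic the user changing the selection
             }
         }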
  10. Grab yourself a copy of Firefox and Firebug and go browsing to Amazon and do a bit of investigating. Turn on inspect element. This is a great tool for finding those tricky elements that dynamically appear. I suspect the divs the search words appear in are dynamically generated by JavaScript. The enclosing div's id is srch_sggst. I reckon you should be able to scrape the contents of this layer as innertext (although you'll need to junk the first line as that just says Search suggestions). Update: I had a quick go at this and used JavaScript to scrape the innertext of srch_sggst to a file. Se
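     A minimal sketch of that scrape - read the innerText of the srch_sggst layer, split it into lines and throw the first one away (it assumes the div id is still srch_sggst):

         // scrape the search-suggestion layer, dropping the "Search suggestions" header line
         var layer = document.getElementById("srch_sggst");
         var suggestions = [];
         if (layer) {
             suggestions = layer.innerText.split(/\r?\n/).slice(1);
         }
         suggestions.join("\n");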
  11. The site is here: http://tk.koramgame.com/
  12. To pull in data - eg a list of keywords - do "add to list", then set the contents of the list to "list from file". You can then process them via a loop and the replace function, and finally save the list to a different file.
  13. Copy and paste nodes via keyboard shortcuts - eg select the node, then Control-C for Copy and Control-V for Paste. You need $replace in a loop to do the above. I've attached an example script: string-replace.ubot
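     The same loop-plus-replace idea, sketched in javascript for illustration (the list and the search/replace terms are just placeholders):

         // run a replace over every item in a list
         var keywords = ["red widgets", "blue widgets", "green widgets"];
         var replaced = [];
         for (var i = 0; i < keywords.length; i++) {
             replaced.push(keywords[i].replace("widgets", "gadgets"));
         }
         replaced.join("\n");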
  14. One way you could probably do this is to save out as Excel XML rather than CSV. This tutorial might provide a starting point: http://technet.microsoft.com/en-us/magazine/2006.01.blogtales.aspx
  15. Is this any good to you? http://ubotstudio.com/forum/index.php?/topic/2691-gmail-activation-link-clicker/ - if you want those extra mods done (logging in and Show me local) drop me a line (I have emailed you as well).
  16. I did this for a client. I set up two bots for them: one running every 5 mins, the other running once an hour making sure the first is running. I used Windows task scheduler to manage the jobs. There are two approaches you can look at: 1. Compiled bots. If you use this - you'll need a way of starting and quitting the bots automatically. AutoIT can do this. 2. Using Ubot Studio itself. Ubot Studio can run bots automatically via the command line using the /auto switch. Error trapping is key - otherwise you can end up with a bunch of bots which have generated pop-up errors and somehow not
  17. Maybe via regedit: http://www.computerhope.com/issues/ch000848.htm although this potentially opens up a massive security hole if your bot is untrusted.
  18. This isn't really a Ubot problem I reckon. Any automated script might have done the same thing. Here are a couple of things you can do that might help avoid it: a) Turn off images in IE prior to running the script. Images are major suckers of bandwidth - and therefore server capacity. If you're generating just one page view for each deleted page (and each page is 50K) - well, that's a lot of bandwidth. b) Stick some delays in various parts of your script. This will be kinder to your server. One thing you could consider is cloning your WordPress setup on your own computer - running the script
  19. First up I want to say this is a very cool idea and I think the interface controls are great. Nice and minimalistic. It's regularly dying at the email creation stage for me. It didn't recognise that I was already logged into AOL (so I logged out manually). At the cleaner screen I'm getting a "Xenocode Postbuild 2008" error which says "This application has encountered an error (0xD0000002)" - I'm presuming when it tries to run CCleaner. I would hazard a guess that this might have something to do with running CCleaner at the command line on Vista and the permissions CCleaner needs. When I ru
  20. Would running it through FeedBurner work for you? eg: http://feeds.feedburner.com/UbotStudioBlog Alternatively, try something like FeedAgg.
  21. Copying and pasting subs from other scripts works OK for me. You do need to hit refresh though. I can't help wondering if we all shouldn't be sharing more of our subs as I wouldn't mind betting that many of us are trying to do similar things.
  22. Tools > Options... > Content > Enable JavaScript
  23. It's hard to say for definite without seeing the sites - but Ubot can execute javascript and is fairly well integrated with it. You can set a Ubot variable using the result of a javascript function, or pass a Ubot variable into a javascript function, for example. One thing worth saying - Ubot doesn't record things like iMacros does. You take it through the process you want to automate - programming each step as you go. This probably takes longer - but it does give you much greater control and scope for doing things that aren't just running through a bunch of steps.
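     As a rough illustration of that round trip - a javascript function that tidies up a scraped value; in practice the argument would come from a Ubot variable, and the last expression is what you would read back into one (the function name and sample value are made up):

         // strip everything except digits and the decimal point from a scraped price
         function cleanPrice(raw) {
             return raw.replace(/[^0-9.]/g, "");
         }
         cleanPrice(" $19.99 ");   // gives "19.99"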
  24. One simple way of doing this might be to refine your Google search, eg: "blog -google" would find blogs that don't mention google; "blog -intitle:google" finds blogs that don't have google in the title; "blog -site:google.com -intitle:google" finds blogs which aren't google owned / on the google domain.
  25. You would need to use $replace - if you watch the tutorial on how to build your own Google keywords bot it explains $replace. Alternatively - you could try the Strings library on this forum. That also has a string replace function.