UBot Underground

thewebsitegurus

Members
  • Content Count: 42
  • Joined
  • Last visited

Community Reputation

0 Neutral

About thewebsitegurus

  • Rank: Advanced Member


  1. lol, 40,000 PR requests each with a 3-second delay = 33.3 hours. For most SERP scraping bots, a 3-second delay just takes up too much time. If you use just a handful of proxies taking turns at random, you can get that time down to about 1.5 hours for 40k.
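The arithmetic in the post above can be sketched quickly. The post only says "a handful" of proxies; the 22-proxy figure below is my own assumption chosen to land near the 1.5-hour number, not something the author stated.

```python
# Back-of-the-envelope timing: 40,000 PR requests with a fixed
# 3-second delay, versus spreading the same load across proxies
# that each keep their own 3-second spacing. Illustrative only.

def total_hours(requests, delay_seconds, proxies=1):
    """Wall-clock hours if `proxies` workers share the load,
    each still waiting `delay_seconds` between its own requests."""
    return requests * delay_seconds / proxies / 3600

print(total_hours(40_000, 3))              # single IP: ~33.3 hours
print(total_hours(40_000, 3, proxies=22))  # ~22 proxies: ~1.5 hours
```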
  2. I own a backlink service, so I perform a large number of PR checks. I've found that if you make back-to-back PR requests, Google will ban you after about 700 requests. Adding a random delay function will likely double this. I'm currently building a custom app that will be pulling PR non-stop all day. Thanks man, but I ended up just creating this as a PHP app; uBot was just becoming too cumbersome.
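A random-delay helper like the one described could look like this in Python. The 2-6 second window is an illustrative assumption, not a tested ban threshold, and `random_delay` is my own name for it.

```python
import random
import time

def random_delay(min_s=2.0, max_s=6.0):
    """Sleep a random interval so request timing looks less robotic.
    Returns the delay actually used, which makes it easy to log.
    The 2-6 second default window is an assumption for illustration."""
    delay = random.uniform(min_s, max_s)
    time.sleep(delay)
    return delay

# Usage between requests:
# for url in urls:
#     check_pagerank(url)   # hypothetical request function
#     random_delay()
```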
  3. That's a good way to get your server IP blocked by Google; I've already been down that road. I was trying to use uBot because of the private proxy support. There should be a way to do this with JavaScript; I'm pretty sure that's how all of the Firefox/Chrome extensions are doing it.
  4. I'm sorry, but you have clearly missed the entire point of this thread. It's nice that you are trying to defend uBot, but statements like this are of no help.
  5. Yeah, that was the only reference I could find as well. I can't get it to work, though. I also can't find any PR-checking sites that don't require a captcha. I'm using this for SERP scraping, so it needs to be automated.
  6. Does anyone know how to get the current site's PageRank with uBot? Maybe using JavaScript?
  7. This is a huge problem. It is very common to have a bot that has been working perfectly just stop working for no reason. I have spent hours trying to troubleshoot a bot only to find that restarting uBot fixes everything. This was a HUGE time waster until I got in the habit of simply restarting uBot every time I ran into an issue. I think this statement should be at the top of the uBot manual: "If at any time you can't figure out why your bot isn't working as you planned, restart uBot and functionality may return." Anyone who can't relate to this statement hasn't made enough b
  8. Each video is around 30 minutes long... There are about 12 videos now? Whether the videos are helpful or not is not the issue; the issue is documentation. If someone needed to understand the param function, the documentation would not help, so they would need to sit through one or more 30-minute videos just to understand that one simple function. If this is true, then I doubt you are working on anything as complex as the 40-website social bookmarking bot I currently am. I can tell you that there is nothing more frustrating than trying to troubleshoot why your bot crashes on the 39th website
  9. You can indeed run a sub inside of another sub. Try to isolate the problem by running the captcha sub by itself to see if it is just a choose issue. With captchas, you often have to try a few different methods to find the right one for the current site.
  10. This is probably the WORST problem with uBot: in most cases the error messages are of very little help to the user. I have been actively posting in the support and bug forums. Now, if this were my first post, I would have to agree with you; all of the issues that I have had with uBot can be found in other threads. I completely agree that being able to create your own bot that you own for life is better than paying for SENuke. However, there's no denying that SENuke just plain works. I would argue that mimicking SENuke perfectly with uBot is near impossible (within reasonable a
  11. I'm going to be a big dick right now and say that uBot should only cost $50 and have a big "beta" label slapped on it. This program has AMAZING potential, but I feel like I'm spending most of my time finding creative ways to work around its limitations. If uBot were cheaper I would just deal with it, but considering that I paid $200, I'm starting to get extremely frustrated.
  12. Yes, I agree... it SHOULD be here, but it currently isn't. I've only been using uBot for a short time and I already have an army of bots that I could easily sell. I'm sure the same applies to a lot of people here. SO, why don't we do something about it? Who should I contact? Who's interested?
  13. I'm a web developer and recently became addicted to uBot. I find myself creating bots for the most useless tasks simply because I can. Anyway, I've noticed a HUGE lack of available bots, paid or free. I'm a member of all of the main SEO forums and there is always a huge demand for automation scripts. So I was wondering if there would be any interest in some sort of uBot repository? I was thinking of different methods to keep the content quality high, such as: users must submit a bot to gain access to other bots; paid membership (profit sharing with bot authors); pay per bot (author of the bot
  14. It happens for every search term. I usually use Scrapebox for my URL harvesting, but I thought it would be nice to integrate something like this into a few of my bots.
  15. I just ran into a situation where I needed a "remove from list" function... I guess it doesn't exist, so I had to create a fairly redundant process to achieve the same goal. So here is the idea in case anyone else needs to remove an item from a list. Create two identical lists. At the top of your loop over the first list, create a variable #currentItem -> next list item; every time the script loops through your list, #currentItem will contain... the current item. At the bottom of your loop over the first list, start a second loop using your second list. At the top of this list do the sam
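The two-list workaround described above can be sketched in Python for clarity. The function name `remove_from_list` and the variable names are mine; uBot's own commands and the `#currentItem` syntax differ, and Python of course has a built-in `list.remove` — this is only meant to illustrate the idea.

```python
# Sketch of the workaround: keep a second, identical copy of the
# list and rebuild it while skipping the item you want dropped,
# since the tool in question lacks a built-in "remove from list".

def remove_from_list(items, unwanted):
    working_copy = list(items)        # the "second, identical list"
    kept = []
    for current_item in working_copy: # plays the role of #currentItem
        if current_item != unwanted:
            kept.append(current_item)
    return kept

urls = ["a.com", "b.com", "c.com"]
print(remove_from_list(urls, "b.com"))  # ['a.com', 'c.com']
```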