UBot Underground

Can we reduce the processing weight of a bot as we go along?



Something I'd like to ask, based on an observation of mine.

 

When I'm using a big bot which runs through lots of different sites (WordPress, Blogger, Tumblr, etc.) the bot starts to munch resources. I've seen UBot using 400 MB of memory, even though overall resource usage hasn't gone much above 50%.

 

Still, having said that, what I find happens frequently is that UBot gets a blockage or indigestion - akin to what happens if you eat a Big Mac at McDonald's too quickly (you know that feeling when you're about to die and a litre of Coke is the only drain cleaner to hand). If you hit Stop and then Start again, it recovers and continues. However, if I did not hit Stop and Start, it would just sit there indefinitely.

 

This might be at a login screen or something similar.

 

While the latest version respects Wait For much better - i.e. it literally will not move on until the Wait For command is satisfied - clearly, based on the above, it will also wait until the end of time.

 

I'm not too sure of the best way to solve this.

 

a) Use some kind of monster delay - such as Delay 15 - then use Search Page to look for a word like "Login"; if true, run the script again.

 

But the problem here is that you actually need to stop and restart the script. So a command such as Restart Script [name] to simulate that action might be useful.
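In a general-purpose language, the "monster delay then re-check" idea looks roughly like this. This is just a Python sketch of the approach, not UBot syntax, and `wait_for_login` / `page_text_fn` are hypothetical names standing in for whatever returns the current page source:

```python
import time

def wait_for_login(page_text_fn, max_retries=10, delay_secs=15):
    """Poll the page for the word "Login" instead of waiting forever.

    page_text_fn is a hypothetical callable that returns the page text.
    After max_retries failed checks we give up, so the caller can
    restart the script rather than sit there indefinitely.
    """
    for _ in range(max_retries):
        if "Login" in page_text_fn():
            return True            # the login page finally loaded
        time.sleep(delay_secs)     # the "monster delay" between checks
    return False                   # caller decides: restart or abort
```

The key difference from a bare Wait For is the retry cap: it turns "wait until the end of time" into "give up after a known worst-case delay".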

 

The other thing I was wondering is how UBot stores scripts once they have run. Is there anything I can do to, if you like, flush the already-run scripts from UBot, and so optimize my own bots as they are running?

 

Thanks


Running the scripts shouldn't be that heavy. Do you notice this in simple scripts - as in, scripts that don't have any subs? There are a few scripting possibilities that can increase the memory burden in that way.


I would say that simple scripts would reduce the processing load.

 

Code like this is rather bad for the load:

 

run sub1

sub1
    if search page xyz
    then
        run sub1
    else
        blablabla

 

Since eventually you're going to hit a stack overflow if the search page node is always true.


 

Cheers. I do some of those, but generally I try to do:

 

sub: Click Register
sub: Check for Decaptcher
sub: Manual Decaptcher
sub: Auto Decaptcher
sub: Click Submit

 

With If's in between.

 

Still, I'll bear this in mind. Basically, re-running a sub from an If inside that same sub is a no.

