UBot Underground

Help with browser crashing I think



Okay, my bot is supposed to load a list of members. It's a popup that shows 11 members at a time; you click "load more" and it shows another 11. There are over 10,000 members, and when it gets to about 900 UBot starts lagging badly. Can anyone help me with this?


Can you post some code? That always helps.

 

This is the code; I'm working alongside nick.

 

define Scrape Users {
    click(<class="user-list-link subs ud-popup">, "Left Click", "No")
    set(#start, $true, "Global")
    comment("keep clicking the load-more button for as long as it exists")
    loop while($exists($element offset(<class="btn">, 1))) {
        click($element offset(<class="btn">, 1), "Left Click", "No")
        wait for browser event("Everything Loaded", "")
        wait(5)
        comment("scrape the message links for every member currently shown in the popup")
        add list to list(%urls, $scrape attribute(<href=w"https://www.*.com/message/?action=send-new&userId=*">, "href"), "Delete", "Global")
        wait(20)
    }
}


If the bot just sits on a site and constantly collects data, my guess would be one of two things. Either the list gets too large, in which case you can save it off to a file when it reaches a certain size (a few thousand items). Or browser.exe itself gets too large, in which case you can scrape inside "in new browser" and, after so many loops, try "close page" (which will close that browser.exe instance), then go back and scrape more.
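Something along these lines might illustrate the "in new browser" idea, assuming UBot Studio 4 syntax. The member-page URL, the file path, and both loop counts below are placeholders, and the selectors are just copied from the code posted above, so treat this as a sketch rather than a working script:

    set(#pass, 1, "Global")
    loop(20) {
        comment("each pass runs in its own browser.exe, so its memory is released when it closes")
        in new browser {
            navigate("https://www.example.com/members", "Wait")
            click(<class="user-list-link subs ud-popup">, "Left Click", "No")
            comment("only click load more a limited number of times per browser instance")
            loop(50) {
                click($element offset(<class="btn">, 1), "Left Click", "No")
                wait for browser event("Everything Loaded", "")
                add list to list(%urls, $scrape attribute(<href=w"https://www.*.com/message/?action=send-new&userId=*">, "href"), "Delete", "Global")
            }
            comment("save what was collected so far, then empty the list so it never grows unbounded")
            save to file("C:\scrapes\urls_{#pass}.txt", %urls)
            clear list(%urls)
            increment(#pass)
            close page
        }
    }

One caveat, which comes up later in the thread: the popup resets to the first batch whenever the browser is reopened, so each new pass starts from the top of the member list again unless the script re-clicks "load more" back to where it left off.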


Yes, it's browser.exe getting too large, but if we close the browser and reopen it, the list starts back at 12 and then it has to load more again before it can scrape more.

 

What do you mean it starts back at 12? Sorry, I'm not quite sure what site this works on, so I'm not quite sure what's going on.


On the site, we are scraping members to message.

When we scrape members, it's a JavaScript popup on their website that shows 12 users, with a "load more" button at the bottom.

We scrape the 12, the program presses "load more", and it shows the next 12 to be scraped.

It gets to about 800 scraped and then freezes up as the memory usage of browser.exe skyrockets.

But if you close that window and reopen it, it shows the first 12 again, so all progress is lost.



 

Okay, I can see how that would be a problem. I can't think of anything off the top of my head, since this seems to be an issue with UBot (hopefully they fix it in UBot 5).


I noticed that this happens when browser.exe's in-memory size reaches a bit more than 1 GB.

However, are the "members" that you already scraped still displayed on the site / contained in its HTML?

If so, you should remove them with the "change attribute" command and that way reduce the HTML (size) that needs to be rendered by the UBot browser. This should reduce browser.exe RAM usage and prevent the browser from crashing.
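As a rough illustration of that idea: the container selector below is hypothetical (on the real page you would target whatever element holds the rows that have already been scraped), and the attribute name accepted by "change attribute" should be verified in UBot before relying on it.

    comment("after scraping a batch, blank out the already-processed rows so the DOM stays small")
    change attribute(<class="user-list-container">, "innerhtml", "")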


It's a JavaScript popup; there is no way to remove them. The browser is crashing due to the high load, and it's not 1 GB, it's 8 GB of memory being used at 900 loaded. I was hoping someone would know JavaScript and be able to tell me how to get around the browser problem; it's not a UBot problem but a browser one.

 




They are not contained within the HTML; it's a JavaScript popup window, so removing the members after scraping won't help.


In each iteration, just append the list to a file; there's no need to keep it in memory until the whole scrape is finished.

EDIT:

Well, each iteration you have to clear the list so it won't get bigger; just append the results to the temp file.
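A minimal sketch of that pattern, applied to the loop posted earlier and assuming UBot Studio 4's "append to file" command (the temp-file path is made up, and the exact parameter list of "append to file" should be double-checked in the toolbox):

    loop while($exists($element offset(<class="btn">, 1))) {
        click($element offset(<class="btn">, 1), "Left Click", "No")
        wait for browser event("Everything Loaded", "")
        add list to list(%urls, $scrape attribute(<href=w"https://www.*.com/message/?action=send-new&userId=*">, "href"), "Delete", "Global")
        comment("write each scraped URL to the temp file, then empty the list so it never grows")
        loop($list total(%urls)) {
            append to file("C:\scrapes\urls.txt", "{$next list item(%urls)}{$new line}", "End")
        }
        clear list(%urls)
    }

Note that if the popup keeps every previously loaded member in the page, the same URLs will be re-scraped and re-written on every pass, so the temp file would need de-duplicating afterwards.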

