98quinnk — Posted October 6, 2013

Need some help. My program scrapes URLs, then it lags and eventually crashes:

    define Scrape Users {
        click(<class="user-list-link subs ud-popup">, "Left Click", "No")
        set(#start, $true, "Global")
        loop while($exists($element offset(<class="btn">, 1))) {
            click($element offset(<class="btn">, 1), "Left Click", "No")
            wait for browser event("Everything Loaded", "")
            wait(5)
            add list to list(%urls, $scrape attribute(<href=w"https://www.*.com/message/?action=send-new&userId=*">, "href"), "Delete", "Global")
            wait(20)
        }
    }

Is there a way I can stop this loop? I've tried putting a set at the beginning, and a set on a button which makes the variable false, but it doesn't stop the loop.

Kind regards,
Kevin
k1lv9h — Posted October 6, 2013

Hi Kevin,

Here is a "loop while" with stop sample: sample-loop-while-001.ubot
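The attached .ubot sample isn't visible in the thread, but the usual pattern it demonstrates is to make the stop flag part of the loop condition, so every iteration re-checks it. A rough sketch of that control flow in Python (all names here are illustrative, not taken from the sample):

```python
def scrape_until_stopped(get_next_batch, should_stop):
    """Collect items until the source runs dry or a stop flag is raised.

    get_next_batch: returns a list of items, or an empty list when done.
    should_stop:    callable checked at the top of EVERY iteration, so a
                    UI button can flip a flag and end the loop promptly.
    """
    results = []
    while not should_stop():
        batch = get_next_batch()
        if not batch:  # nothing left to load
            break
        results.extend(batch)
    return results
```

A button handler only needs to make `should_stop()` return true; because the flag sits in the loop condition itself, the loop exits on the next iteration. Setting a variable that the condition never reads (as in the original script, which only tests `$exists(...)`) cannot stop the loop.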
98quinnk (Author) — Posted October 6, 2013

It scraped 900 URLs and then went all laggy. I don't understand why this is happening.
the_way — Posted October 6, 2013

It's because of the list. Append the URLs to a file one at a time and you won't have this problem.
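The idea, sketched in Python (the file path, helper names, and duplicate handling are made up for illustration, not part of the original script): write each scraped URL straight to disk instead of growing one large in-memory list.

```python
def append_urls(path, urls, seen=None):
    """Append URLs to a file one at a time, skipping duplicates,
    so memory use stays flat no matter how many URLs are scraped."""
    seen = set() if seen is None else seen
    written = 0
    with open(path, "a") as f:  # append mode: earlier writes are kept
        for url in urls:
            if url in seen:
                continue  # mimics uBot's "Delete" duplicate option
            seen.add(url)
            f.write(url + "\n")
            written += 1
    return written
```

Each batch is flushed as soon as it is scraped, so the only thing held in memory between batches is the (small) set of already-seen URLs.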
98quinnk (Author) — Posted October 6, 2013

It's not the list. Here's what's happening: I scrape users to message, but the window that comes up is driven by JavaScript. It displays 12 users, so I scrape them; then you have to press "load more" to display another 12, so I scrape those, and now 24 are displayed. As the program continues like this, by the time it has scraped about 700 users it lags really badly, and browsers.exe uses up all the memory on the computer. If I close the window, it goes back to showing just 12 users. There can be over 5,000 users in the list I need to scrape. Any help please; it's the only thing standing in the way of completing my program, everything else works fine.
UBotDev — Posted October 7, 2013

That's happening because the UBot browser uses more and more memory as more and more content is loaded with JS. Furthermore, the same thing happens in all other browsers (Chrome, Firefox, IE, ...); it's just that the limits differ from browser to browser. However, I think you could try the approach I recommended here:
http://www.ubotstudio.com/forum/index.php?/topic/15047-help-with-browser-crashing-i-think/&do=findComment&comment=85213
brusacco — Posted October 8, 2013

After you scrape the users, can you, for example, clear the DIV that contains the results? Then hit "load more", and there will only ever be 12 users loaded.
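That "scrape, clear, then load more" idea keeps only one batch of users on the page at a time. A rough Python sketch of the control flow (the page-interaction callables are stand-ins for the real uBot clicks and JS, not actual uBot commands):

```python
def scrape_in_batches(scrape_batch, clear_results, load_more, sink):
    """Scrape the currently visible batch, flush it to `sink`, then clear
    the on-page results before loading more, so the page never holds more
    than one batch and memory use stays roughly constant."""
    total = 0
    while True:
        batch = scrape_batch()   # e.g. the 12 currently visible users
        for url in batch:
            sink(url)            # e.g. append the URL to a file
        total += len(batch)
        clear_results()          # e.g. empty the results DIV via JS
        if not load_more():      # click "load more"; False when exhausted
            break
    return total
```

The key design point is ordering: clearing happens after the batch is safely written out but before the next "load more" click, so the DOM list is rebuilt from empty each round instead of growing toward 5,000 entries.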