DjProg · Posted April 9, 2016

Hello guys,

Another problem for no apparent reason: a System.OutOfMemoryException was thrown when doing a Save to File on a table of at most 50K lines! Clearly this shouldn't happen, and when I check the system monitor I can confirm there is a TON of memory available. That's the system monitor while the node is running: http://screencast.com/t/UEXhuP58

Needless to say, there is plenty of room to save a d*mn text file. Any ideas? I lost 3+ hours of scraping to this bug. I'm trying not to sound too upset, but I can tell you I AM!

Cheers,
DjProg (author) · Posted April 9, 2016

Well, looking at the debugger, it's not even that much: 28,000 rows and 8 columns. What's wrong with UBot this time? => Is there a way to SAVE MY DEBUGGER DATA? Because clearly the data is still there...
luis carlos · Posted April 9, 2016

Use this: http://network.ubotstudio.com/forum/index.php/topic/16308-free-plugin-large-data/
HelloInsomnia · Posted April 9, 2016

It's not the system memory that matters, but how much memory UBot itself is using: UBot is not a 64-bit application, so its address space is limited no matter how much RAM the machine has.
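To put rough numbers on that point: a default 32-bit Windows process is capped at about 2 GB of address space (roughly 4 GB if the executable is large-address-aware), regardless of installed RAM, and an OutOfMemoryException fires when an allocation can't be satisfied within that cap. A back-of-the-envelope sketch in Python (the per-cell size and the cap are illustrative assumptions, not UBot internals):

```python
import sys

# Illustrative only: address-space cap for a default 32-bit Windows process.
# The real ceiling varies (~4 GB if large-address-aware, minus fragmentation).
ADDRESS_SPACE_LIMIT = 2 * 1024**3  # ~2 GiB

rows, cols = 28_000, 8
avg_cell = sys.getsizeof("a typical scraped cell value")  # rough per-cell cost

table_bytes = rows * cols * avg_cell
print(f"raw cell objects: ~{table_bytes / 1024**2:.0f} MiB")
print(f"under the 32-bit cap: {table_bytes < ADDRESS_SPACE_LIMIT}")
```

The raw table is tiny, which is why the crash feels baffling: in a 32-bit process it is typically per-object overhead, temporary copies made while serializing, and heap fragmentation that actually exhaust the address space, not the data itself.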
jason · Posted April 10, 2016

Sorry to hear there was an issue and you lost your data! That can be frustrating. Been there, for sure.

As far as scraping best practices: why grow the table inside your bot's memory to 28K x 8 before saving? Why go 3+ hours without writing the data to your hard drive?

1) We recommend saving much more frequently. This is true whether you're using UBot or Microsoft Word. Build saving into your script, at least as a safeguard.

2) Growing memory inside UBot like this is not the best way to use the software. You wouldn't use Photoshop for 3 hours without saving, and likewise we don't recommend doing that with UBot. Clearing your data after saving it to the hard drive is a good way to keep things clean and functioning.

Hope this helps!
DjProg (author) · Posted April 11, 2016

Well, now I'm dumping my table to CSV on each loop iteration. This way, if UBot goes crazy, I'll still know at which loop it crashed and still have all the results from before the crash.
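The save-per-loop approach above can be sketched like this (Python stands in for the bot script; the file name and column names are made up for illustration):

```python
import csv
import os

# Append each scraped row to disk as soon as it is captured, instead of
# growing an in-memory table for hours. A crash then costs at most one row,
# and the last line of the file shows which loop iteration died.
def append_row(path, row):
    write_header = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["loop", "url", "title"])  # hypothetical columns
        writer.writerow(row)

for i in range(3):  # stand-in for the real scrape loop
    append_row("results.csv", [i, f"http://example.com/{i}", f"page {i}"])
```

Appending (mode `"a"`) rather than rewriting the whole table each pass keeps the per-loop cost constant, so saving every iteration stays cheap even over a multi-hour scrape.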