MrGeezer · Posted March 18, 2014
Hi guys! I am scraping about 5,000 rows of data into a table, then writing that information out to a CSV file. However, I am noticing performance issues once I get to around 2,000 rows, and sometimes it fails to store the data at all. What is the best practice for storing large amounts of data locally? From what I have read on the forums, tables are not ideal. Cheers
pash · Posted March 18, 2014
Store the rows in batches of 1,000 and use append to file.
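A minimal sketch of that batching pattern in Python (the file name, batch size, and the `scrape` stand-in are illustrative, not part of the original advice):

```python
import csv

BATCH_SIZE = 1000  # flush to disk every 1,000 rows

def flush_batch(path, rows):
    # Append all buffered rows to the CSV in a single write, then clear the buffer.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerows(rows)
    rows.clear()

buffer = []
for i in range(5000):
    buffer.append([i, f"item-{i}"])  # stand-in for a scraped row
    if len(buffer) >= BATCH_SIZE:
        flush_batch("output.csv", buffer)
if buffer:  # flush any remaining rows at the end
    flush_batch("output.csv", buffer)
```

Appending in batches means one file open/write per thousand rows instead of one per row, which is where the performance win comes from.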
the_way · Posted March 18, 2014
Make sure that you regularly save the file locally as a .csv.
MrGeezer (Author) · Posted March 19, 2014
Ah, that makes sense, thank you for your help! Do you think it is better, performance-wise, to append each line to the CSV file individually, or to append every 500 rows or so?
UBotDev · Posted March 19, 2014
If your bot is fast and you are running multiple threads, you should save more than one row at a time, or your disk will catch fire.
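When several threads are scraping at once, a shared buffer protected by a lock keeps the number of disk writes down. A hedged sketch of that idea (class name, batch size, and the worker function are illustrative):

```python
import csv
import threading

class BufferedCsvWriter:
    """Collects rows from many threads and appends them to a CSV in batches."""

    def __init__(self, path, batch_size=500):
        self.path = path
        self.batch_size = batch_size
        self.rows = []
        self.lock = threading.Lock()

    def add(self, row):
        with self.lock:
            self.rows.append(row)
            if len(self.rows) >= self.batch_size:
                self._flush_locked()

    def _flush_locked(self):
        # Caller must already hold self.lock; one disk write per batch.
        with open(self.path, "a", newline="") as f:
            csv.writer(f).writerows(self.rows)
        self.rows.clear()

    def close(self):
        # Flush whatever is left in the buffer when scraping finishes.
        with self.lock:
            if self.rows:
                self._flush_locked()

writer = BufferedCsvWriter("results.csv", batch_size=500)

def worker(start, count):
    for i in range(start, start + count):
        writer.add([i, f"row-{i}"])  # stand-in for scraped data

threads = [threading.Thread(target=worker, args=(t * 1000, 1000)) for t in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
writer.close()
```

The lock serializes access to the buffer, so rows from different threads are never interleaved mid-write, and the disk still only sees one append per batch.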