kev123 Posted April 26, 2014
My latest free plugin, good for storing large table data. The commands and functions are:
http://robobest.com/wp-content/uploads/2014/04/bigdatacommands.png
http://robobest.com/wp-content/uploads/2014/04/bigdatafunctions.png
Please make sure to look at the example code to see how everything works. You can get the plugin here: http://robobest.com/2014/04/26/free-plugin-large-datatables/
If you have already downloaded one of my plugins you should have an email shortly.
blumi40 Posted April 26, 2014
Thx, nicely done! viagra.dll :)
Code Docta (Nick C.) Posted April 26, 2014
Really super kewl!! Thanks Kev
a2mateit Posted April 26, 2014
Very nice! Thanks for keeping it free
kev123 Posted April 26, 2014 (Author)
Thanks guys, this thing eats massive amounts of data. Going offline now; I'll explain any of the functions/commands tomorrow if people have questions, but the example code should explain them.
LoWrIdErTJ - BotGuru Posted April 26, 2014
Looks good, nice job bud. I subscribed and for some reason didn't get a confirmation email or one to download. Looking forward to taking a look. Nice job again.
Jeredoc Posted April 26, 2014
Really nice, thank you!
Aymen Posted April 26, 2014
Nice handy little plugin. Great job!
the_way Posted April 26, 2014
Much appreciated, Kev
Sanjeev Posted April 27, 2014
Thanks Kev!
dutchman Posted April 27, 2014
Thanks, really appreciated Kev!!!
jamesfar Posted April 27, 2014
nice
julius Posted April 27, 2014
Just what I needed.. hope this works.. thanks tsong!
kev123 Posted April 27, 2014 (Author)
TJ, check PM.

To explain some of the return commands to UBot a bit better:

list from large table
return large table csv
large table return

With the above commands you can specify a range, for example row 10 to row 100, or you can return the full large table by specifying 0 0 as row start and row finish.

Large Table bulk input: I created this command to enable multithreaded writes into a single main table.

```
plugin command("Bigtable.dll", "Large Table bulk input", "bulkinputtest", "1,2,3,4,5,6,7,8,9")
```

The above is not the best way to use it if you're adding more than one line to the large table in a thread. The following is:

```
plugin command("Bigtable.dll", "Large Table bulk input", "bulkinputtest", %somethingscraped)
```

So say, for example, you were scraping and then carrying out an inner scrape of that; you would add to a list like this:

```
add item to list(%somethingscraped, "scrape1,scrape2,scrape3,scrape4", "Don\'t Delete", "Local")
```

Each line in the local list is a comma-separated row of your scrapes. The local list is then added to the table in one call. Check the example code to see how it works.

Thanks to everyone else for the feedback. I'm going to do an update later in the week to improve the efficiency of a couple of the commands which were added at the last minute to prove the concept (most average users won't notice a difference, only the heavy scrapers). If you have any suggestions for other commands, let me know.
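The range returns described above might look like this in script form (a rough sketch only: the command name comes from the screenshots, and the parameter order of table name, row start and row finish is my assumption, so verify against the bundled example code):

```
comment("Sketch: copy rows 10 to 100 of the large table into a normal UBot list")
add list to list(%subset, $plugin function("Bigtable.dll", "list from large table", "bulkinputtest", 10, 100), "Delete", "Global")
comment("Passing 0 0 as row start and row finish would return the full large table instead")
```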
Sanjeev Posted April 27, 2014
Hey Kev, is it (thread) safe to use local lists for multithreading? I remember there was a mention on UbotDev's 'Thread counter' plugin thread saying that lists are not 'thread safe', and also that Aymen's local dictionary plugin was created because UBot's native 'local' lists/variables are not fully thread safe. Thanks.
kev123 Posted April 27, 2014 (Author)
It's the variables that can be, and that's why Aymen's local dictionary plugin was created. Local lists shouldn't be an issue; I believe Aymen added them so you can have large lists that are too big for UBot's lists. I may be wrong, but I haven't ever had issues with local lists as long as they're in a define.
Sanjeev Posted April 27, 2014
"I haven't ever had issues with local lists as long as they're in a define." Thanks, that's all I wanted to know.
rocket976 Posted April 27, 2014
Great stuff Kev, can't wait to try this. Thank you!
webpro Posted April 27, 2014
KEV MY MAN. viagra.dll INDEED
UBotDev Posted April 27, 2014
Thanks for the plugin. I think these days large data really is a problem, but UBot at its core doesn't handle it well (I was even told it was never meant to). However, I was wondering how exactly this plugin solves the large data problem: by loading only a subset of data at a time, that's it, right?

"Hey Kev, is it (thread) safe to use local lists for multithreading? ..." I think you got that wrong. There I was referring to global lists being accessed from threads, not local ones.

"It's the variables that can be ..." What's wrong with local variables? I never had problems with them, but lately noticed that people are using workarounds for them. Is that really needed? I didn't find a single example which would prove that local variables don't work.
Bot-Factory Posted April 27, 2014
"What's wrong with local variables? ..." I also played around with that recently and haven't had an issue, running 50+ threads during my test. Dan
kev123 Posted April 27, 2014 (Author)
"What's wrong with local variables? ..." It came about from the HTTP plugin. My theory is it's related to returns from the plugin, but I haven't done any serious tests to prove it. I know Aymen proved it with a video; he's probably the best one to ask on this.

"I was wondering how exactly this plugin solves the large data problem ..." It solves large data by replacing UBot tables altogether, hence you can have tables with considerable amounts of rows compared to what's possible with UBot tables. The loading of parts of the data is only for when you're returning stuff to UBot tables or lists, where you wouldn't want to return everything from the large table.
UBotDev Posted April 28, 2014
"It came about from the HTTP plugin. ..." I don't use the HTTP plugin, so maybe that's why I didn't experience it. I see, thanks for explaining. However, the only problem with large data (like loading 1 million rows) I had was that loading it into lists/tables takes too much time, and during that time the software is unresponsive... I hoped it would solve that; that's why I asked. What problems exactly do you have in mind?
kev123 Posted April 28, 2014 (Author)
It solves the problems with tables such as the time they take to load at size, which makes UBot unresponsive. Also, if you multithread writes into a UBot table with a lot of threads it will become unresponsive; this solves that as well. I loaded a half-million row CSV file in with no issues. I never tried a million, but it should be OK as it did half a million with ease, though I'm not sure you would have got a million into a normal UBot table anyway?
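The multithreaded write pattern described here would roughly look like this (a sketch under assumptions: the table name "mainresults", the define name, and the scraped values are placeholders of mine; only the "Large Table bulk input" call is taken from the post earlier in the thread):

```
define scrapepage(#url) {
    clear list(%rows)
    comment("Build one comma-separated row per item scraped from #url in the thread-local list")
    add item to list(%rows, "title1,link1,date1", "Don\'t Delete", "Local")
    add item to list(%rows, "title2,link2,date2", "Don\'t Delete", "Local")
    comment("One bulk call pushes the whole local list into the shared large table, so many threads can write without touching a UBot table")
    plugin command("Bigtable.dll", "Large Table bulk input", "mainresults", %rows)
}
```

Each thread builds its rows locally inside the define and hands them to the plugin in a single call, which is why the bulk input form is preferred over adding one line at a time.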
UBotDev Posted April 28, 2014
"It solves the problems with tables ..." Again, thanks for explaining; I will play with it for a bit. Yep, I actually loaded 1 million rows, but into a list, not a table... it only took around 5-10 minutes.