UBot Underground

[Free] Plugin Large Data



My latest free plugin, good for storing large table data.

 

The commands and functions are:

 

http://robobest.com/wp-content/uploads/2014/04/bigdatacommands.png

 

http://robobest.com/wp-content/uploads/2014/04/bigdatafunctions.png

 

Please make sure to look at the example code to see how everything works.

 

You can get the plugin here:

http://robobest.com/2014/04/26/free-plugin-large-datatables/

 

If you have already downloaded one of my plugins, you should receive an email shortly.

 



TJ check PM.

 

To explain some of the commands that return data to UBot a bit better:

 

list from large table

return large table csv

large table return

 

With the above commands you can specify a range, for example row 10 to row 100, or you can return the full large table by specifying 0 and 0 for row start and row finish.
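For example, something roughly like this (only a hypothetical sketch; the table name is made up and the exact parameter order may differ, so check the example code that ships with the plugin):

add list to list(%first rows, $plugin function("Bigtable.dll", "list from large table", "mytable", 10, 100), "Delete", "Global")

add list to list(%whole table, $plugin function("Bigtable.dll", "list from large table", "mytable", 0, 0), "Delete", "Global")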

 

Large table bulk input

 

I created this command so that multiple threads can store their data into a single main table.

 

plugin command("Bigtable.dll", "Large Table bulk input", "bulkinputtest", "1,2,3,4,5,6,7,8,9")

 

The above is not the best way to use this if you are adding more than one line to the large table in a thread; the following is better:

 

plugin command("Bigtable.dll", "Large Table bulk input", "bulkinputtest", %somethingscraped)

 

So say, for example, you were scraping and then carrying out an inner scrape of that; you would add to a list like this:

 

add item to list(%somethingscraped, "scrape1,scrape2,scrape3,scrape4", "Don\'t Delete", "Local")

 

So each line in the local list is a comma-separated row of your scrapes. The local list is then added to the table, as in the sketch below.
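Putting that together, the body of a thread would look roughly like this (a sketch only, with placeholder scrape values; the real example code is the reference):

add item to list(%somethingscraped, "scrape1,scrape2,scrape3,scrape4", "Don\'t Delete", "Local")

add item to list(%somethingscraped, "scrape5,scrape6,scrape7,scrape8", "Don\'t Delete", "Local")

plugin command("Bigtable.dll", "Large Table bulk input", "bulkinputtest", %somethingscraped)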

 

Check the example code to see how it works.
 

Thanks to everyone else for the feedback. I'm going to do an update later in the week to improve the efficiency of a couple of the commands which were added at the last minute to prove the concept (most average users won't notice a difference, only the heavy scrapers). If you have any suggestions for other commands, let me know.


Hey Kev, is it (thread) safe to use local lists for multithreading?

 

I remember there was a mention on UbotDev's 'Thread counter' plugin thread saying that lists are not 'thread safe', and also Aymen's local dictionary plugin was created because UBot's native 'local' lists/variables are not fully thread safe.

 

Thanks.


It's the variables that can be a problem; that's why Aymen's local dictionary plugin was created. Local lists shouldn't be an issue. I believe Aymen added them so you can have large lists that are too big for UBot's lists. I may be wrong, but I haven't ever had issues with local lists as long as they're in a define.
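For illustration, the pattern being described is roughly this (a hypothetical sketch; the define name, selector and table name are made up and this is not code from the plugin's example):

define scrape one page(#url) {
    navigate(#url, "Wait")
    add list to list(%page rows, $scrape attribute(<tagname="a">, "fullhref"), "Don\'t Delete", "Local")
    plugin command("Bigtable.dll", "Large Table bulk input", "bulkinputtest", %page rows)
}

Each thread runs the define with its own local %page rows list and only touches the shared large table through the bulk input command.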


Thanks for the plugin. I think these days large data really is a problem, but UBot at its core doesn't handle it well (I was even told it was never meant to).

 

However, I was wondering how exactly this plugin solves the large data problem; by loading only a subset of the data at a time, is that it?

 

Hey Kev, is it (thread) safe to use local lists for multithreading?

 

I remember there was a mention on UbotDev's 'Thread counter' plugin thread saying that lists are not 'thread safe', and also Aymen's local dictionary plugin was created because UBot's native 'local' lists/variables are not fully thread safe.

 

Thanks.

I think you got that wrong. There I was referring to global lists being accessed from threads, not local ones.

 

It's the variables that can be a problem; that's why Aymen's local dictionary plugin was created. Local lists shouldn't be an issue. I believe Aymen added them so you can have large lists that are too big for UBot's lists. I may be wrong, but I haven't ever had issues with local lists as long as they're in a define.

What's wrong with local variables? I never had problems with them, but lately I've noticed that people are using workarounds for them. Is that really needed? I haven't found a single example which would prove that local variables don't work.


What's wrong with local variables? I never had problems with them, but lately I've noticed that people are using workarounds for them. Is that really needed? I haven't found a single example which would prove that local variables don't work.

 

I also played around with that recently and haven't had an issue. Running 50+ threads during my test. 

 

Dan


What's wrong with local variables? I never had problems with them, but lately I've noticed that people are using workarounds for them. Is that really needed? I haven't found a single example which would prove that local variables don't work.

 

It came about from the HTTP plugin. My theory is that it's related to returning values from a plugin, but I haven't done any serious tests to prove it. I know Aymen proved it with a video; he's probably the best one to ask on this.

 

 

Thanks for the plugin. I think these days large data really is a problem, but UBot at its core doesn't handle it well (I was even told it was never meant to).

 

However, I was wondering how exactly this solves the large data problem; by loading only a subset of the data at a time, is that it?

 

 

It solves large data by replacing UBot tables altogether, hence you can have tables with considerably more rows than is possible with UBot tables.

 

The loading of parts of the data only applies when you are returning data to UBot tables or lists, where you wouldn't want to return everything from the large table.


It came about from the HTTP plugin. My theory is that it's related to returning values from a plugin, but I haven't done any serious tests to prove it. I know Aymen proved it with a video; he's probably the best one to ask on this.

 

 

 

It solves large data by replacing UBot tables altogether, hence you can have tables with considerably more rows than is possible with UBot tables.

 

The loading of parts of the data only applies when you are returning data to UBot tables or lists, where you wouldn't want to return everything from the large table.

I don't use the HTTP plugin, so maybe that's why I didn't experience it.

 

I see, thanks for explaining.

 

However, the only problem with large data (like loading 1 million rows) I had was that loading it into lists/tables takes too much time, and during that time the software is unresponsive... I hoped it would solve that; that's why I asked. What problems exactly do you have in mind?


It solves the problems with tables such as load time as they grow in size and making UBot unresponsive. Also, if you write into a UBot table from a lot of threads it will become unresponsive; this solves that as well. I loaded a half-million-row CSV file with no issues. I never tried a million, but it should be OK since it handled half a million with ease, though I'm not sure you would have got a million into a normal UBot table anyway.


It solves the problems with tables such as load time as they grow in size and making UBot unresponsive. Also, if you write into a UBot table from a lot of threads it will become unresponsive; this solves that as well. I loaded a half-million-row CSV file with no issues. I never tried a million, but it should be OK since it handled half a million with ease, though I'm not sure you would have got a million into a normal UBot table anyway.

Again, thanks for explaining; I will play with it for a bit. Yep, I actually loaded 1 million rows, though not into a table but a list... it only took around 5-10 minutes. :)

