Luke 18 Posted March 6, 2011

One big ol' list with lots of keywords. Let's say 10,000 words, comma separated. Many keywords are exact duplicates. To count each keyword's popularity, I'm trying to:

1. Count all occurrences of each unique keyword in the list.
2. Place that info in a separate list for the time being.
3. Remove the dupes.
4. Sew the two lists together into a table: uniques on the left, their counts on the right.

Naturally this is giving me a headache, because the unique list needs to be made before the counting can happen. Kind of a chicken-or-the-egg situation. Anyone got a better idea to get the same results?
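For anyone landing here from outside UBot: the four steps above collapse into a single pass if you count with a dictionary, so the unique list and the counts never have to be lined up by hand. A minimal Python sketch (the sample keywords are made up for illustration):

```python
from collections import Counter

# Stand-in for the real 10,000-word comma-separated keyword list
text = "seo, tools, seo, bots, tools, seo"

# Split on commas and trim whitespace to get the raw keyword list
keywords = [k.strip() for k in text.split(",")]

# Counter builds the "uniques on the left, counts on the right" table in one pass
counts = Counter(keywords)

# Print the table, most popular keyword first
for keyword, count in counts.most_common():
    print(f"{keyword}\t{count}")
```

`Counter` handles the dedupe and the counting together, which sidesteps the chicken-or-the-egg problem entirely.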
Praney Behl 314 Posted March 6, 2011

One easy way I can think of is to duplicate the original list, then delete the duplicates from the copy. Now you can calculate the difference using the $eval constant by taking the difference of the two list totals.

Praney
Luke 18 Posted March 6, 2011 (Author)

Hi Praney! Makes sense, but how can we get the counts to line up with the unique list? It's getting really fuzzy for me at this point...
UBotBuddy 331 Posted March 6, 2011

I am thinking about this. Give me a bit of time.
Luke 18 Posted March 6, 2011 (Author)

No problem, Buddy. Thanks for any time you've got.
UBotBuddy 331 Posted March 6, 2011

Here ya go! Just change the folders. Of course I still have some development fat in this bot, but I am sure you can eliminate that.

CountingKeywords.zip
Luke 18 Posted March 6, 2011 (Author)

You, sir, are a bot Wizard. Srsly, have Seth change your board title from "Advanced Member" to "Botting Wizard" - you deserve it. You nailed this problem: exactly what I wanted it to do, on the very first attempt! Thank you once again, I wouldn't get half as much done around here without your help every now and then.

Cheers,
Luke
UBotBuddy 331 Posted March 6, 2011

You are too kind. Thanks for the compliment. I am glad I could help.
positivity13 4 Posted August 11, 2013

Guys, I need this exact solution but I can't download it. Can anyone help?
positivity13 4 Posted August 12, 2013

Ignore that, I have worked it out now. I created a new list and let UBot delete the duplicates. I then run through this new list item by item and compare each item against my old list; every time it matches, I increment a counter, giving me a variable with the number of times it is duplicated. If anyone wants the code, give me a shout.
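The approach described above (dedupe first, then count by comparison) can be sketched outside UBot as two nested loops; here is a hedged Python illustration with made-up sample data:

```python
# Sample data standing in for the real keyword list
original = ["seo", "tools", "seo", "bots", "tools", "seo"]

# Step 1: "created a new list and let ubot delete the duplicates"
# dict.fromkeys dedupes while preserving first-seen order
unique = list(dict.fromkeys(original))

# Step 2: run through the new list item by item and compare against
# the old list, incrementing a counter on every match
for item in unique:
    dupes = 0
    for candidate in original:
        if candidate == item:
            dupes += 1
    print(item, dupes)
```

This is O(n * u) (list length times unique count), which is fine for 10,000 keywords but slower than a single-pass dictionary count on very large lists.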
LoWrIdErTJ - BotGuru 904 Posted August 12, 2013

add list to list (advanced, remove dupes: no)
add list to list, list from text (advanced, remove dupes: yes)

Total dupes is list 1 total minus list 2 total; list 2 doesn't have the dupes.
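Note this two-list trick gives the overall number of duplicates, not a per-keyword count. The arithmetic, sketched in Python with sample data:

```python
# List 1: original list, dupes kept
original = ["seo", "tools", "seo", "bots", "tools", "seo"]

# List 2: same list with dupes removed (what "remove dupes: yes" produces)
deduped = list(dict.fromkeys(original))

# Total dupes is list 1 total minus list 2 total
total_dupes = len(original) - len(deduped)
print(total_dupes)  # 3
```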
UBotDev 276 Posted August 12, 2013

I also tried to download the file but it didn't work, so I spent some time creating my own version. Here it is:

    clear list(%URLS Original)
    add list to list(%URLS Original, $list from text("google.com
    google.com
    amazon.com
    facebook.com
    facebook.com
    facebook.com", $new line), "Don\'t Delete", "Global")
    clear list(%URLS Unique)
    add list to list(%URLS Unique, %URLS Original, "Delete", "Global")
    clear table(&URLS Count)
    set(#URLS Count, 0, "Global")
    loop($list total(%URLS Unique)) {
        set table cell(&URLS Count, #URLS Count, 0, $COUNT DUPLICATES($list item(%URLS Unique, #URLS Count)))
        set table cell(&URLS Count, #URLS Count, 1, $list item(%URLS Unique, #URLS Count))
        increment(#URLS Count)
    }
    define $COUNT DUPLICATES(#DEF URL) {
        set(#URL Count, 0, "Global")
        set(#URLS Original, 0, "Global")
        loop($list total(%URLS Original)) {
            if($comparison($list item(%URLS Original, #URLS Original), "=", #DEF URL)) {
                then {
                    increment(#URL Count)
                }
                else {
                }
            }
            increment(#URLS Original)
        }
        return(#URL Count)
    }

Copy and paste the code and run the bot to see the result, which gets stored in a UBot table. The code will also work with other strings, not just URLs, so you could use it to find duplicated keywords, for example.