UBot Underground

mdc101

Fellow UBotter
  • Content Count

    119
  • Joined

  • Last visited

  • Days Won

    2

Everything posted by mdc101

  1. Thanks for the feedback. I guess this would be a great idea for a future update. Regards, Matt
  2. Just checked in to see if there had been any response. Any suggestions, guys?
  3. Hi Guys, I have just started playing with the database connector in UB5 dev. Awesome - it made me rethink the entire way I build bots. OK, so here are the questions: How do you set multiple variables directly from a single selected database row in MySQL? What is the most practical, fastest, correct way to do the above? This is what I am trying to ask suggestions or advice on... Currently the only option we have is this. plugin command("DatabaseCommands.dll", "connect to database", "mysql:server=196.128.0.1;uid=username; pwd=password; database=test_db; port=3306; pooling=false") { plu
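Outside of UBot's syntax, the general pattern for the question above is the same in most database client libraries: run one SELECT, fetch the row once, and unpack its columns into variables in a single step instead of issuing one query per column. A minimal Python sketch, using sqlite3 so it runs without a MySQL server (the table and column names here are hypothetical, not from the post's database):

```python
import sqlite3

# In-memory database standing in for the MySQL connection in the post.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (username TEXT, password TEXT, email TEXT)")
conn.execute("INSERT INTO accounts VALUES ('bob', 'secret', 'bob@example.com')")

# One SELECT returns the whole row; unpack it into separate variables at once.
username, password, email = conn.execute(
    "SELECT username, password, email FROM accounts LIMIT 1"
).fetchone()

print(username, email)  # bob bob@example.com
```

The same unpacking idea applies with any DB-API style driver: `fetchone()` hands back the whole row, so one round trip to the server fills every variable.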
  4. Hey guys, is anyone having issues with getting the change file field command to work? I have the following code, and when I run it, it does not work. I have hard-coded the file path and it does not change / set the file path. Is there a bug or am I doing something wrong? loop(#ImportLoopCount) { set(#FileLocation, $list item(%Krakken_keyword_paths, #ImportLoopItem), "Global") change file field(<id="local_file">, #FileLocation) wait(1) click(<name="submit">, "Left Click", "No") wait for element(<name="importType">, 15, "Appear") change dropdown(<name="importType">, "Kr
  5. Thank you John, will look at your code now
  6. Hey Guys, how would one go about calculating the average of a column in a table with UBot? At the moment I am doing this in Excel, but I would like my bot to do it, as it is a task that should be done automatically. Any suggestions would be appreciated.
  7. Hi Kreatus, thanks for the reply. I have a variable that holds all the URLs. I have applied the following: set(#Urls, $replace regular expression(#Urls, "\\/$", $nothing), "Global") This has not stripped the last /. In the #Urls variable there are lines that are blank, because I have made them blank by removing the domains I don't want. Ideally I want to strip the last / and then import the URLs into a list to remove the duplicates. #Urls: www.epa.gov/iaq/pubs/airclean.html yourhome.honeywell.com/home/Products/Air%2BCleaning/ www.aprilaire.com/index.php%3Fcategory%3Dcleaner%26znfA
  8. Hi guys, how do you remove the trailing slash from a string using regex? Examples: www.domain.com/cars/ www.domain.com/ How do you get www.domain.com/cars www.domain.com? The data is all in a variable and I want to remove all the trailing slashes before putting it into a list.
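The pattern `/$` (a slash anchored to the end) is the right idea; a sketch of it in Python, since the regex itself carries over to most engines. One caveat that likely matters for the multi-line variable in the reply above: by default `$` anchors only at the end of the whole string, so stripping a slash from the end of *each line* needs the multiline flag:

```python
import re

urls = [
    "www.domain.com/cars/",
    "www.domain.com/",
    "www.domain.com/cars",  # no trailing slash: left unchanged
]

# "/$" matches a slash at the very end of the string.
stripped = [re.sub(r"/$", "", u) for u in urls]
print(stripped)

# When many URLs live in one multi-line string, enable MULTILINE so "$"
# anchors at the end of every line, not just the end of the whole text.
text = "www.epa.gov/iaq/pubs/airclean.html\nyourhome.honeywell.com/home/\n"
multiline_result = re.sub(r"/$", "", text, flags=re.MULTILINE)
print(multiline_result)
```

In .NET-style engines (which UBot's regex resembles) the equivalent is the inline `(?m)` flag or the Multiline option; without it, a trailing-slash pattern applied to a whole multi-line variable only fires near the end of the string.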
  9. my skype: jbay-mdc Please add
  10. Hi Guys, here are two small videos demonstrating how the memory is just being eaten up. This URL shows UBot just standing still: http://screencast.com/t/3U29mS3Tzv8c No script, nothing running, but it is slowly using up memory. The vid is 1 minute long. Vid 2: I load a script and don't run it for the first 2 minutes. http://screencast.com/t/Dlpg589c For the next 3 minutes I run the script. You can clearly see that the memory usage goes up every second. Mem cleaner is a workaround, but to be honest we need a solid platform, as I personally do not feel comfortable selling the bots
  11. With further testing, what I have had to do is have each command open in a new browser - basically a new window for each procedure or process. If I run the entire process it still seems to grow massive. This seems to cut back the memory buildup as the browsers open and close. One thing to be aware of as well: you need to look for the bot's name as a process too and set that to a limit. So you should have UBtBrowser.exe, UBot Studio.exe (if running dev tests), and your bot name.exe. This way you manage the browsers plus the main application you are running.
  12. Hi Guys, I know that quite a few of the guys in the community have been having issues with memory being eaten up by the bots, which causes bot failure. I have experienced this as well. I have found a temporary solution that "manages the memory leak" until the dev team at UB can get the issues fixed. I have been testing this on my bots and have had better success and fewer memory failures. I have been using an app called CleanMem http://www.pcwintech.com/cleanmem The reason I used it is that it allowed me to track the memory usage per process, for example: UBtBrowser.exe UBot Studi
  13. Right, thanks for the feedback guys. So I guess it's the long way around - damn. Thanks
  14. Hi, I have a CSV file. I have added the file to a table. I want to remove the rows from the table whose value in column 1 is greater than or equal to 4. How do you remove rows from a table?
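Whatever the tool, the logic is a filter over the rows. A Python sketch with made-up CSV data (the threshold of 4 on column 1 matches the question; everything else here is illustrative):

```python
import csv
import io

# Hypothetical CSV standing in for the imported file; column 1 is numeric.
data = "3,keep\n4,drop\n7,drop\n1,keep\n"
rows = list(csv.reader(io.StringIO(data)))

# Build a fresh list of the rows to keep, rather than deleting from the
# list while looping over it - deleting mid-loop shifts the indices.
filtered = [r for r in rows if float(r[0]) < 4]

print(filtered)  # [['3', 'keep'], ['1', 'keep']]
```

If a tool only offers delete-row-by-index inside a loop, the usual trick is to iterate from the last row index down to the first, so removing a row never shifts the indices of the rows still to be checked.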
  15. Has anyone ever had issues with words that have dashes in them, like "keyword-phrase"? How have any of you cleaned lists of phrase-match keywords that have dashes in them? It seems no one has an answer.
  16. Hard-coding the search phrase in does not work either: $find regular expression($list item(%Urls, #FirststLevelQ), "\\b.*content-curation.*\\b(?i)")
  17. I have tried a few regexes that work in RegexBuddy and deliver the correct result of 13 items found. Ruby format & .NET: \b.*content-curation.*\b(?i) However, in UBot it fails to deliver the 13 required URLs: $find regular expression($list item(%Urls, #FirststLevelQ), "\\b.*{#SearchStringDashed}.*\\b(?i)") I wonder if this is a bug?
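Two things commonly trip up patterns like the one above, independent of the engine: a trailing inline flag such as `(?i)` is handled inconsistently across engines (several only honor inline flags at the start of the pattern, and some reject them elsewhere), and an unescaped keyword can smuggle regex metacharacters into the pattern. A Python sketch of the safer form, with a hypothetical URL list (not the poster's 13 results):

```python
import re

urls = [
    "/Content-Curation-Tools",
    "/Curation",
    "/content-curation-basics",
    "/National-Geographic-1",
]

keyword = "content-curation"
# Escape the keyword so the dash (and anything else special) is literal,
# and put the case-insensitivity flag at the START of the pattern.
pattern = re.compile(r"(?i)" + re.escape(keyword))

matches = [u for u in urls if pattern.search(u)]
print(matches)  # ['/Content-Curation-Tools', '/content-curation-basics']
```

Since a plain substring test is wanted here, the `\b.*...​.*\b` wrapping adds nothing: `search` already finds the phrase anywhere in the URL, and dropping the wrapper removes one more source of engine-specific surprises.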
  18. Hi John, there are still URLs that are not required. See results.txt on your desktop. The following URLs should be removed, as they do not contain the phrase "content-curation": /Curation /Digital-Curation /Social-Curation /Data-Curation /Store-Curation /Who-curates-the-curators /search?q=curation&context_type=&context_id= /Why-isnt-the-National-Museum-of-the-American-Indian-more-like-the-Holocaust-Museum /National-Geographic-1 /National-Football /United-Nations /Nations /The-National-band /Live-Nation /National-Public-Radio /Bling-Nation /Washington-Nationals /search?q=curation+nation
  19. Here is a sample of what I have done to dynamically insert the dash into the keyword... Would this work with your idea? comment("Take seed keywords and build up list of urls") set(#SearchString, "{$next list item(%Keywords)} ", "Global") set(#SearchString, $trim(#SearchString), "Global") set(#SearchString, $change text casing(#SearchString, "Lower Case"), "Global") set(#SearchStringDashed, $replace(#SearchString, " ", "-"), "Global") set(#SearchStringDashed, $trim(#SearchStringDashed), "Global") set(#
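The transformation in the snippet above (trim, lower-case, replace spaces with dashes) is straightforward to express in any language; a Python sketch for reference, using a hypothetical seed keyword in place of `$next list item(%Keywords)`:

```python
# Hypothetical seed keyword standing in for the next item of %Keywords.
keyword = "  Content Curation  "

search_string = keyword.strip().lower()            # $trim + $change text casing
search_string_dashed = search_string.replace(" ", "-")  # $replace " " -> "-"

print(search_string_dashed)  # content-curation
```

Doing the trim *before* the space-to-dash replacement, as the UBot snippet does, matters: otherwise leading or trailing spaces become stray leading or trailing dashes in the search phrase.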
  20. In the ideal situation the script will use a keyword in phrase match and return the URLs. As the URLs use a "-" between the words and we need to find our keyword phrase in the URL, I can create #SearchStringDashed and have the - placed where needed in the keyword being searched within the script, without the need for the text-box input. I did the example to demonstrate the weird results I was seeing. Could your regex use this variable #SearchStringDashed? All we want to do is keep only the URLs that have the keyword phrase "keyword-phrase" in them. The rest of the URLs can be removed.
  21. Sorry John, I never meant to rush anyone; I saw a lot of views and no comments, so I figured most are in the same boat as me!
  22. Is no one able to figure out or explain why I am getting all sorts of results in the URLs instead of the exact phrase match?
  23. Hi Guys, I have noticed that there are many questions in the forum that deal with 1) list exceeded errors, 2) removing words, URLs, and phrases from lists. I have personally struggled with these basic list-management challenges. So I have done an experiment and written a code block as a sample to start with, and I am hoping the experts can add their valued input to this thread to get a working example that does what it is supposed to, as my attempt failed dismally. 1) I wanted to manage the "list exceeded" challenge. 2) I wanted my phrase match to be accurate. The challenge here is to refine the c
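The two cleanup goals named above (dedupe a list without tripping duplicate errors, and keep only exact phrase matches) can be combined into a single pass. A Python sketch of that pass, with made-up URLs for illustration:

```python
import re

# Hypothetical raw list: blanks, a duplicate, and non-matching URLs mixed in.
urls = [
    "www.epa.gov/iaq/pubs/airclean.html",
    "",
    "/Content-Curation-Tools",
    "/Curation",
    "/Content-Curation-Tools",  # duplicate
]

keyword = "content-curation"
pat = re.compile("(?i)" + re.escape(keyword))

# One pass: drop blanks, drop duplicates (order preserved), keep matches.
seen = set()
cleaned = []
for u in urls:
    u = u.strip()
    if u and u not in seen and pat.search(u):
        seen.add(u)
        cleaned.append(u)

print(cleaned)  # ['/Content-Curation-Tools']
```

Filtering into a fresh list like this sidesteps "item already exists"-style errors entirely, because nothing is ever added twice and nothing is removed from a list while it is being looped over.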