UBot Underground

webpro

Fellow UBotter
  • Content Count: 775
  • Days Won: 7

Everything posted by webpro

  1. WTF lol..... It's working for you guys??? Geeee.... My usual luck then. I'll try to figure out what the hell is wrong on my end then. Thanks by the way.
  2. Can UB do this? ui open file("Go get your url file", #savedurlsondesktop) save to file("{$special folder("Desktop")}\\{#savedurlsondesktop}", %visitingmembersurl) I get: Given path's format is not supported. What I'm trying to do is get my scraped urls, which are saved on my desktop as whatever.txt, import them into UB using the UI OPEN FILE command, do an ADD TO LIST, and do whatever I need to do with each url. Then remove the used url from the %visitingmembersurl list and save the file again on the desktop. So I delete the url from the %visitingmembersurl list first, then save it.
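
     A likely cause of that error: ui open file already hands back the full path of the picked file, so prefixing it with the Desktop folder again produces an invalid path. A minimal sketch under that assumption:

     ui open file("Go get your url file", #savedurlsondesktop)
     comment("#savedurlsondesktop should already hold the complete path, so use it as-is")
     save to file(#savedurlsondesktop, %visitingmembersurl)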
  3. This is odd then? I do all the settings in the TOOLS menu but it's not changing anything in the browser when I run it, and like I said, it works fine using UITEXT boxes (which I'm using instead right now until it's sorted). Bug???
  4. Oh I see, then this could be a super neat command to add in the next version! I will ask for it in the right section. Ok, now I can load the saved file on my desktop, call back the urls (one by one) into UB, navigate to them and do my thing. Now I need to figure out how to remove the used url from the list (imported into UB) and then save the file again on the desktop. Thanks. EDITED: Got it! I thought I would share this in case someone gets stuck. Of course it won't be 100% the same as the project you are working on, but you will get the idea: ui block text("Personal messages
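
     Since the shared code above got cut off, here is a minimal sketch of the same remove-and-resave idea (not necessarily what the original snippet did), assuming the list file lives on the Desktop as whatever.txt:

     comment("use the first url, then drop it and rewrite the file")
     navigate($list item(%visitingmembersurl, 0), "Wait")
     remove from list(%visitingmembersurl, 0)
     save to file("{$special folder("Desktop")}\\whatever.txt", %visitingmembersurl)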
  5. Any ideas why this doesn't work? If I go into the TOOLS menu to adjust my settings, nothing gets passed along? Works great tho if I use UITEXT boxes instead? ui window("Select Country") { ui drop down("Country", "United States,Canada,United Kingdom,Australia,Afghanistan,Albania,Algeria,Andorra,Angola,Antigua and Barbuda,Argentina,Armenia,Australia,Austria,Azerbaijan,Bahamas,Bahrain,Bangladesh,Barbados,Belarus,Belgium,Belize,Benin,Bhutan,Bolivia,Bosnia and Herzegovina,Botswana,Brazil,Brunei,Bulgaria,Burkina Faso,Burma,Burundi,Cambodia,Cameroon,Canada,Cape Verde,Central African Republi
  6. Lol, I didn't even notice there was a LIKE button!!!! If you have an idea on how to remove a url from my scrappedurls.txt once I have used it, SHOOT! I haven't figured it out yet. Thanks
  7. No, not yet (MySQL). I will try both methods you talked about. I hope I'll catch at least one of them! Edited: Right now I managed to get the scrappedurls.txt file loaded into UB again like you said (ADD LIST TO LIST command along with the $list from file parameter) and navigate to the first url, then the second and so on, using NAVIGATE and $NEXT LIST ITEM in a loop, so I'm getting there hehehehehe. Thanks A LOT, really appreciated. Now I need to figure out how to remove the used urls from the file. That's another ball game!
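
     A minimal sketch of that load-and-navigate setup, assuming scrappedurls.txt sits on the Desktop:

     clear list(%urllist)
     add list to list(%urllist, $list from file("{$special folder("Desktop")}\\scrappedurls.txt"), "Delete", "Global")
     loop($list total(%urllist)) {
         navigate($next list item(%urllist), "Wait")
         comment("do the per-url work here")
     }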
  8. Thanks, yeah I thought that this would be a good way. Right now I am trying to figure out a way to get back all the scraped urls. It's all scraped into a .txt file on my desktop (used the "save to file" command). My goal is to grab a url from it, do what I have to do (the same as if I would use the navigate command along with $next list item without using a saved file) and remove it from the scrappedurls.txt file on my desktop after. But I'm lost here. Which command/parameter to pick to do this etc... Must have something to do with NAVIGATE along with $get file, right? Thanks
  9. Thanks guys. To be honest, I never worked with the define commands. No wonder I wasn't thinking about it. I guess I will start to use it. But this won't loop forever, right? You see, my goal is to first scrape every existing page (depending on my settings). Then do what I have to do for around 300 profiles, wait 14400 to 15400 secs (or so), then do what I have to do for the next 300 profiles (no need to scrape to get the urls/profiles again), and so on, until there are no urls/pages left. Thanks
  10. Ok, right now I have a LOOP WHILE scraping pages. In fact, it scrapes everything first (which has got to be thousands of pages?) and once I have all the urls I want in %urllist, I will contact each user. The problem is, there's a delay limit on how many people I can contact. So to bypass this, I need to wait 4 hours, so at least 14400 secs (well, it's what I think), before I can continue the NAVIGATE command with $next list item using the %urllist list and contact people again. So the big question is, how do I tell the bot to wait the minimum required time before going to work again? I'm trying to
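
     A minimal sketch of the batch-then-wait pattern discussed in the two posts above, assuming %urllist is already filled, 300 contacts per batch, and that it's fine to drop each used url from the list:

     loop while($comparison($list total(%urllist), ">", 0)) {
         set(#sent, 0, "Global")
         loop while($both($comparison(#sent, "<", 300), $comparison($list total(%urllist), ">", 0))) {
             navigate($list item(%urllist, 0), "Wait")
             comment("contact the profile here, then drop the used url")
             remove from list(%urllist, 0)
             increment(#sent)
         }
         comment("wait roughly 4 hours, with a little jitter, before the next batch")
         wait($rand(14400, 15400))
     }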
  11. Ahhhh thanks. I thought I was dumb LOL. UB guys, make sure you post a thread for each update you do (please) with basic explanations. Also, we will be able to discuss it and ask questions if any. It will be appreciated! Thanks
  12. Isn't there a thread on updates? Ex: 4.2.12. I can't find anything about what it does or where to discuss it. Where are the "update threads"? I must be dumb or blind (tho I would tend to pick the first option lol...) but I can't find any
  13. Yes it does. I have the above commands on ALL my stuff
  14. This is what I do: 1) Load a fresh copy of a UB browser. 2) Use this:

     define Clear ALL Cookies {
         shell("C:\\windows\\system32\\rundll32.exe InetCpl.cpl,ClearMyTracksByProcess 4351")
         shell("cmd.exe /c rmdir /s /q \"%APPDATA%\\Macromedia\\Flash Player\\#SharedObjects\\\"")
     }
     Clear ALL Cookies()

     3) Also run this from a shell command: C:\Program Files\CCleaner\ccleaner.exe /AUTO but you will need CCleaner (ccleaner.com) of course. Don't recall if Standard or Pro can run shell commands tho?
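
     For step 3, a short sketch of how that could be wired into the same cleanup, assuming the default CCleaner install path:

     comment("quotes around the path because of the space in Program Files")
     shell("\"C:\\Program Files\\CCleaner\\ccleaner.exe\" /AUTO")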
  15. Seth, can we post bugs about UB in the right section when we find one, or not? Man, I really don't want my balls to get fried hahahahaha. Plus, I need to make sure and read things twice, no THREE times, as my english sucks so... I know this ain't really good for biz (the bug section) but I think it's a good idea so that the software gets better and better? You know what? Remove it then. So no one will run into problems. I would...
  16. Guys, I will need your help on this one as it's too much for me and my UB knowledge right now. The bot is scraping profiles and it's saving the urls into a .txt file for further use. Why am I doing this? Cause I'm limited to 500 contacts a day. So I figured that once I got all the urls, I could call back the scraped urls and send 500 contact requests every 24hrs. How do I call back the urls, one by one, and make sure it does the job and gets removed from the .txt file, to make sure it doesn't contact the same profile twice when the bot runs again? In other words, it picks up where it
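
     A minimal sketch of one way to make the daily run resumable, assuming the list file sits on the Desktop as scrappedurls.txt; each used url is dropped and the file rewritten, so the next run picks up where this one left off:

     clear list(%urllist)
     add list to list(%urllist, $list from file("{$special folder("Desktop")}\\scrappedurls.txt"), "Delete", "Global")
     set(#sent, 0, "Global")
     loop while($both($comparison(#sent, "<", 500), $comparison($list total(%urllist), ">", 0))) {
         navigate($list item(%urllist, 0), "Wait")
         comment("send the contact request here")
         remove from list(%urllist, 0)
         comment("rewrite the file after every url so a crash can't repeat a profile")
         save to file("{$special folder("Desktop")}\\scrappedurls.txt", %urllist)
         increment(#sent)
     }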
  17. Guys, where do you host? Can you recommend something? I'm looking for a VPS. Right now I got some projects hosted by Arvixe. DON'T EVER EVER pick them. TRUST me on this one. Worst host ever. The list of problems is too big to put in here, it's a f*cking joke. In fact it's a nightmare!
  18. Well, we can't say that in here, too many lurkers hahahahahahaha. I will pm you. Right now it scrapes pages 1, 2, then jumps to 5 and then 7, 9, 11, 13, 15, 17 etc... (+2 increment, go figure). 3 secs wait time inserted and it's still jumping pages
  19. I see, thanks guys. EDITED: Just tried some of the tips. Still doing it. Man, this is odd? Maybe this thread should be placed in the bug section lol!
  20. I don't know why it's doing this now? If the bot clicks on Next>> to go to the next page, it always skips a page to scrape. Ex: 1, 3, 5, 7, 9, 11 etc... will get scraped and not 2, 4, 6, 8 etc... It wasn't doing this before and I ain't using the increment command? Odd? Has this ever happened to you? loop while($exists($element offset(<tagname="span">, 6))) { add list to list(%urllist, $scrape attribute(<outerhtml=w"<a href=\"/profile.html?view=mini&uid=*&src=obr\" class=\"user_img\" onclick=\"return tagged.search.results.fillParamForm(*, *, \'/profile.html
  21. Hummm, looks damn great! I just copied the code into UB and it looks good! Will give it a shot! So in a way, you just compare the "urls" against the list? So this could easily work out if I use $next list item. You probably just taught me 2 things I wanted to know lol. Thanks a lot! Damn, another person I will have to buy some beers. At this rate, might as well throw a big party and invite all my "UB friends" lol
  22. Damn, I got burned on something lol. It did the same job on the same url because I used $random list item. It picked up the same scraped url (let's say you're scraping urls) more than once. I thought it was working like $next list item, that once it has selected the url, it "throws it away" and doesn't pick it up again. Unless I'm not working it correctly? Can we make sure that $random list item selects UNIQUE urls for each scraped run? loop(#loopnumber) { clear list(%urllist) add list to list(%urllist, $scrape attribute(<outerhtml=w"<a href=\"*\" target=\"\" title=\"*\
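
     A minimal sketch of a pick-unique-at-random workaround, assuming it's fine to remove each url from %urllist as it gets used:

     comment("grab a random position, use the url there, then drop it so it can't come up again")
     set(#position, $rand(0, $subtract($list total(%urllist), 1)), "Global")
     navigate($list item(%urllist, #position), "Wait")
     remove from list(%urllist, #position)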
  23. ODD, it works fine as .ubot, but as .exe the UI stat monitor isn't updated after the first number shown? Ex: if it picks up 277 secs, it will stay at 277 secs. Now I ain't sure if the wait time works either? Looked like I was waiting 277 secs each time? Does this happen on your end, guys?
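
     A minimal countdown sketch that gives the stat monitor something to refresh every second (the random 240-300 sec range is just for illustration); whether the compiled .exe repaints it is the open question above:

     ui stat monitor("Seconds left", #secondsleft)
     set(#secondsleft, $rand(240, 300), "Global")
     loop while($comparison(#secondsleft, ">", 0)) {
         wait(1)
         decrement(#secondsleft)
     }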
  24. Ahhhhhhh, great mate! Exactly what I was looking for. Man, nothing beats those kinds of tutorials. You get the thing right in the face lol! Thanks again!
  25. How can I choose the various user agents from a UI WINDOW? I want to be able to select one from the various options, ex: CHROME, Explorer 10 etc..., from a little pull down menu in the UI WINDOW. By understanding this, I will probably be able to do almost anything else regarding this (well, I think?). I know it has to do with the UI WINDOW. Ex: If I want to offer various setting options from this little pull down menu: 1) Set visibility 2) Allow popups 3) Set user agent 4) Allow flash 5) Set referrer etc... I'm talking about the same place where you can add your deathbycaptcha info. The TOO
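
     A minimal sketch of the selection side, assuming a ui window holding a drop down. How the chosen string actually gets applied depends on what your UB version exposes; the set user agent call below is a placeholder, not a confirmed command:

     ui window("Settings") {
         ui drop down("User agent", "Chrome,Internet Explorer 10,Firefox", #chosenagent)
     }
     comment("placeholder: swap in whatever user-agent command your UB version or plugins provide")
     set user agent(#chosenagent)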