UBot Underground


Posts posted by crazyflx

  1. EDIT: I originally had this posted on another forum (I still do actually), but then realized that the most appropriate place for it would actually be here. I had this made for me BECAUSE of my constant use of uBot, and I figure that almost everybody else using uBot would be able to make good use of it as well.

     

    I had this software made because I have tons of bots (uBots) running all day every day making accounts at tons of different sites.

     

    Thing is, uBot is a HUGE pain to use for verifying those accounts…so I had a freelancer make me a REAL program to do it. It is a multi-threaded program, so it’s crazy fast when it comes to visiting all the verification links.

     

    I’m obviously selling this not to make a profit (I say obviously because of how cheap I’m selling it for), but rather to recoup the cost of what I paid to have it programmed and to provide real software that provides a real purpose at a really cheap price.

     

    Here are some pictures, and then beneath the pictures are the details on exactly what it does.

     

    Nimbus Theme:

    http://img836.imageshack.us/img836/461/acnimbustheme.jpg

     

    Metal Theme:

    http://img831.imageshack.us/img831/152/acmetaltheme.jpg

     

    CDE Theme:

    http://img64.imageshack.us/img64/4613/accdetheme.jpg

     

    DropDown Menu Shown:

    http://img137.imageshack.us/img137/4734/acdropdownshown.jpg

     

    Alright, so now you want to know what it does. Well, it performs a very simple yet (typically) monotonous task…it opens all the emails you get while creating accounts, scans them for verification links &, if it finds any at all, visits each and every one…thereby verifying your accounts for you.

     

    As you can see in the picture above, it supports:

     

    Gmail

    Hotmail

    AOL

    Yahoo

    And also any "custom" emails you may have (you'll simply need to know their POP settings).

     

    As for the others, you simply select the appropriate email provider from the dropdown. It will automatically use the correct POP settings for each provider.

     

    All you have to do is provide your username/password and then hit START.

     

    It will download all the emails on the server (it leaves a copy of every email behind on the server; it doesn't delete them).

     

    It will scan for links within each email.

     

    After it collects all the links, it visits them all…but it’s multi-threaded!

     

    That means SUPER FAST verification of emails. If you have it set to 50 threads and it has to visit 50 verification links, it will be visiting them all at once!
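    For anyone curious how a tool like this works under the hood, here is a rough Python sketch of the pipeline described above (download over POP3 without deleting, extract the links, visit them on a thread pool). This is my own illustration, not the actual program; the function names and the link regex are assumptions.

```python
import poplib
import re
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Naive pattern for http(s) links inside an email body (an assumption,
# not the actual program's pattern).
LINK_RE = re.compile(r'https?://[^\s<>"\']+')

def fetch_messages(host, user, password):
    """Download every message via POP3, leaving copies on the server."""
    conn = poplib.POP3_SSL(host)
    conn.user(user)
    conn.pass_(password)
    count = len(conn.list()[1])
    bodies = []
    for i in range(1, count + 1):
        # RETR downloads the message; we never call conn.dele(),
        # matching the "doesn't delete them" behavior described above.
        lines = conn.retr(i)[1]
        bodies.append(b"\n".join(lines).decode("utf-8", errors="replace"))
    conn.quit()
    return bodies

def extract_links(body):
    """Pull every http(s) link out of one email body."""
    return LINK_RE.findall(body)

def visit_all(links, threads=50):
    """Visit every verification link concurrently (the multi-threaded part)."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(lambda url: urlopen(url, timeout=30).read(), links))
```

    With 50 worker threads and 50 links, all of them are in flight at once, which is where the speed comes from.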

     

    You can minimize it to your taskbar if you've got so many emails that this thing will be running for a while.

     

    It's programmed in Java and is a "portable" .EXE file. That means you can run as many instances of it as you want, and there is also no "installation". It's simply a click-and-run application.

     

    I've already got the programmer who made this making some improvements & adding some more cool features. Any updates will be given, completely free of charge, to people who have already purchased.

     

    Also, it won’t appear as though the emails have been opened on your server & it won’t appear as though the links have been clicked, so don’t be alarmed. It has opened & visited each link, you can verify this by manually visiting & logging into accounts.

     

    I'm selling this for $20. Unfortunately, my "buy now" page was hosted with JustHost which I regret...a lot. So now, if you would like to buy this, please just PM me.

     

    I'm also offering a "ubotter" deal, in which you can buy an "unlimited" license for this software that allows you to "hand it out" to the buyers of your bots.

  2. Try this:

    http://img718.imageshack.us/img718/7444/proxycommand.jpg

     

    Obviously, the credentials will have to correspond to the selected proxy.

     

    You mentioned that you currently have your file containing your proxies in this format:

     

    ip:port:username:password

     

    Go ahead and check out this thread (it will show you how to separate that input file into individual variables you can use separately): http://ubotstudio.com/forum/index.php?/topic/3012-separating-lines-or-cells-of-data-for-use-individually/

     

    After that is done, you'll have this:

     

    List Item 0 = IP

    List Item 1 = Port

    List Item 2 = Username

    List Item 3 = Password

     

    So, you'll be able to do this in the sub window:

     

    Change Proxy:

    address = list item 0

    port = list item 1

     

    Set Proxy Credentials

    username = list item 2

    password = list item 3
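    Outside of uBot, the same split can be sketched in a few lines of Python (just an illustration; the sample proxy values are made up):

```python
def parse_proxy(line):
    """Split one 'ip:port:username:password' line into its four parts."""
    ip, port, username, password = line.strip().split(":")
    return ip, port, username, password

# The four values map to list items 0-3 described above.
ip, port, user, pw = parse_proxy("127.0.0.1:8080:myuser:mypass")
```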

  3. Not sure if this is the problem, but it is always worth a look as I made the following (very silly) mistake:

     

    alert('Please read this entire alert. If you're going to insert alerts, don't make this error')

     

    The above alert will not fire. Why? Because it has apostrophes in the words "you're" and "don't". There can only be two apostrophes for the "alert" to "fire": the ones before and after the text you want to appear. Either escape the inner apostrophes with a backslash (you\'re, don\'t) or wrap the whole string in double quotes instead.

     

    I made that mistake and couldn't for the life of me figure out why my alerts weren't firing...then, I noticed...and felt very very silly.

  4. very helpful, but because the list position is set to 0 every time it runs, it will start at that position, even if the increment value is in there at the end?

     

    If the "set to 0" were inside of the loop, then it would set it to 0 every time it ran. But because it is outside of the loop, it is first set to 0, then the loop begins so it will run through the whole list of usernames.

     

    It won't be set to 0 until the bot has stopped and restarted.
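    The same pattern in plain Python makes the point clear: the position is initialized once, before the loop, so it only goes back to 0 when the whole script is started again from the top (a sketch; the username list is made up):

```python
usernames = ["alice", "bob", "carol"]

position = 0          # set once, BEFORE the loop (like uBot's "set list position to 0")
processed = []
while position < len(usernames):
    processed.append(usernames[position])
    position += 1     # the increment at the end of each pass
# position only becomes 0 again if the whole script is restarted from the top
```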

  5. Can anyone tell me...how to use a bot with a proxy, in detail please...?

    Actually, I was scraping data from a site, and after a while a warning came up saying that data scraping is illegal, and the bot stopped...

     

    To give an appropriate example, we'd need to know what kind of proxies you planned on using (public or private, and if private do they require authentication).

  6. That "delay 30 seconds" was actually supposed to be a "stop script" command, as it was only set up to navigate to one of the listings. I basically just threw it together as an example.

     

    I've attached a fully functional version to this post, and there is a picture below of the changes.

     

    Here is a basic rundown of what the bot is doing:

     

    "Choose by attribute" -> That is selecting only the listing titles of the search results.

     

    "Scrape Page" -> This was actually supposed to be "scrape chosen attribute", but I ended up just using a "scrape page" command because I realized I needed to scrape javascript actions and not URLs.

     

    So, the "scrape page" command is scraping the javascript action for each listing title. Normally when you click on something, you're navigating to a URL. But these listings are all using "onclick" javascript commands, so you need to scrape all of those and then run a "javascript" of the "onclick" commands you've scraped.

     

    Here are the changes I made (compare to your current bot):

     

    http://img188.imageshack.us/img188/7434/javascript.gif

     

    The "set: #javascript" command is taking the next "onclick" javascript command that we scraped with the page scrape command above, and is setting it to a variable for use later.

     

    Then, "run: #javascript" is running that "onclick" command, which is essentially causing the browser to think you clicked on a listing.

     

    Then a "wait finish" command...waits until the page is finished loading.

     

    Then a "run javascript: history.go(-1)" <- That makes the browser go "back" 1 page (just as if you hit the back button on the browser). The reason we need to go back is that the next "run javascript: #javascript" command won't work unless we are at the page that we scraped that value from originally.

     

    Then another "wait finish" command.

     

    Then the whole thing loops all over again.
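    The scrape-then-run idea can be sketched outside of uBot like this (a rough Python illustration of the scraping half only; the HTML snippet and the showListing name are invented placeholders):

```python
import re

# Grab the javascript inside each onclick="..." attribute.
ONCLICK_RE = re.compile(r'onclick="([^"]+)"')

def scrape_onclick_actions(html):
    """Collect the onclick javascript action for every listing on the page."""
    return ONCLICK_RE.findall(html)

html = '''
<a class="listing" onclick="showListing(101); return false;">Tanya Electronics</a>
<a class="listing" onclick="showListing(102); return false;">Other Shop</a>
'''
actions = scrape_onclick_actions(html)
# Each entry can then be run in the browser (uBot's "run javascript: #javascript"),
# followed by history.go(-1) to return to the results page before the next one.
```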

     

    If you're looking to scrape the "comprehensive listings" page, you're going to need to insert all the nodes that scrape that listing right after the first "wait finish" command, and keep the last two nodes that are currently there as the final two nodes after whatever you insert.

  7. i downloaded this and put in a csv file and i still get errors...maybe i don't understand the instructions?

     

    It is probably because you don't have enough cells. That example currently tries to change the text field at "wordcounttool.com" to the content that is in the 3rd cell...so if you don't have a 3rd cell, it is going to throw an error.

     

    Try changing the 2 in the following node to a number that is relevant to the number of cells in your input file (the instructions are also on the picture below):

     

    http://img199.imageshack.us/img199/3057/nodetochange.gif
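    In plain Python terms, the fix amounts to guarding the cell index against rows that have too few cells (a sketch of the idea only; the sample data is made up):

```python
import csv
import io

def cell(row, index, default=""):
    """Return row[index], or a default when the row has too few cells."""
    return row[index] if index < len(row) else default

# The second row only has 2 cells, so asking for cell index 2 (the 3rd cell)
# would normally throw an error, just like the node described above.
data = io.StringIO("a,b,c\nx,y\n")
rows = list(csv.reader(data))
third_cells = [cell(row, 2) for row in rows]
```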

  8. Do the following:

     

     

    "Create Account" -> Male/Female

    Set -> #username -> Account Constants -> $Username

     

    Next time you want to insert the username that you want to use throughout your whole bot in any script, instead of inserting $username, you'll insert: #username

  9. What I'm about to say has only worked for my compiled bots, and only works when your .txt or .csv file is stored in the same folder (or subfolder) as your compiled .exe file.

     

    If you're looking to have the values loaded by uBot without you having to find the file every time, you could run this command:

     

    add to list

    list from file

    /accounts/accounts.csv

     

    It literally should look just like that. When you leave off everything before "/accounts/accounts.csv", the compiled bot just assumes that the file it is looking for is stored in the same folder as the .exe file itself.
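    A rough Python equivalent of that lookup, resolving a bare path against the program's own folder (just an illustration of the behavior described, not uBot's actual code; using sys.argv[0] as the program location is an assumption):

```python
import os
import sys

def resolve_data_file(relative_path):
    """Resolve a path like '/accounts/accounts.csv' against the program's own folder."""
    base = os.path.dirname(os.path.abspath(sys.argv[0]))
    # Strip any leading slashes so the path is treated as relative to the program.
    return os.path.join(base, relative_path.lstrip("/\\"))
```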

  10. I just took a look, and I'm not sure what it is you're trying to do.

     

    After searching "TV", I clicked on a listing. It didn't take me to any website, it just took me to a more comprehensive listing of the link I clicked on.

     

    Here is an example. The following was one of my search results:

     

    Tanya Electronics   1 review 
    
    Location - Y-348,C/2,Sec-12, Subzi Mandi, Noida, Noida - 201301  
    
    Call  - +(91)-(11)-66361349  
    
    Also See  - Tv Dealers, AC Dealers-LG, AC Repair & Services 

     

    After clicking it, this is what it showed me:

     

     Mr Sunil Chauhan  
    
     +(91)-(11)-66361349  
    
     +(91)-9310084459 
    
     Send Enquiry By Email  
    
     Y-348,C/2,Sec-12, Subzi Mandi, Noida, Noida - 201301 

     

    So what are you trying to get from this site? If you're looking to just get the search results themselves, you could scrape the names of the search results.

     

    If you're trying to get to each listing's "more comprehensive" listing, you can do this:

     

    (decided it was too difficult to explain. Just download the attached ubot.) Inside of the ubot, navigate to the page you mention in your original post, search for something, and then run the bot.

     

    What it does is scrape the "onclick" javascript commands for each listing, then navigate to each of them individually (in essence, the bot is "clicking" on each and every listing individually).

    searchjustdialcom.ubot

  11. Thanks for the captcha script. But how do I include this in my ubot for it to solve a captcha? I want it to show inside a sub window etc. and auto-solve the captcha, then proceed to the next site...???

     

    Well, you could do one of two things.

     

    Option A: Copy the entire bot I've attached, step by step, into its own sub (named "beatcaptcha" or something) inside of your existing bot. You would, however, delete the "choose by attribute" node for the captcha image and the "choose by attribute" for the solved captcha field.

     

    Then set up your bot like this:

     

    your bot is doing something

    your bot is doing something

     

    choose by attribute -> captcha image

    run sub "beatcaptcha"

     

    then choose by attribute -> solved captcha field

    change chosen attribute -> account constants -> $captcha

     

    Option B: You can run an "include" inside the bot, which runs a bot that is outside of the bot you're currently running.

  12. After looking this over pretty extensively, I'm afraid I couldn't come up with a way that you could make one script and have it work for all the sites.

     

    The good news is, that uBot doesn't have any problem filling these fields. You just have to make an individual script or sub for each site.

  13. Yeah, I'll poke Seth some and let's see his answer.

     

    Yes! Thanks :D

     

    While you're poking him, let him know that in a perfect world, it would be great if the "remove" command worked two ways:

     

    1 - In much the same way as "list item" in that it asks for the list position of the item you want to go to...or in this case, remove.

     

    or

     

    2 - remove current -> Would simply remove the current list item.

  14. thanks crazyflx...

    but I'm confused about two nodes...one is "send keys chosen" and the second "send keys field chosen"...why are we using them here...and what is this "s" in the "send keys field" node?

     

    Alright, the first "send keys chosen" is to activate the field so that the dropdown appears.

     

    If you just use "send keys field chosen" alone, nothing happens (no dropdown). Also, there will be no dropdown if you use "change chosen attributes".

     

    So, it goes like this:

     

    send keys chosen -> s (this activates the field so the dropdown appears)

     

    change chosen attributes -> $nothing (this empties the field)

     

    send keys field chosen -> "something " (it must be followed by two spaces. We use "send keys field chosen" here because Google is actually recognizing each "keystroke", but with a bit of "lag time", which is why you follow it with two spaces...so that Google is aware of all the keystrokes that took place)

    The "Wait Until Finished" command can sometimes actually be a pain, because some sites use AJAX to load new content for you after you hit the submit button.

     

    The "Wait Until Finished" command waits for a newly loaded page to finish loading, so if after hitting the submit button, it doesn't navigate to a new page, the bot is just going to assume that the page has finished loading and move on.

     

    What you can do is use a "Wait For" command and have it wait for a bit of chosen text to appear on the screen AFTER the submit button is pushed.

     

    So, if after pushing submit, the text "success" appears on the screen, right click on the word "success" then choose the "wait for" command.

     

    Now, it will wait until the word "success" appears before moving on.
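    Conceptually, "wait for" is just a polling loop. Here is a minimal Python sketch of the idea (my own illustration, not uBot's implementation; get_page_text stands in for whatever returns the current page's visible text):

```python
import time

def wait_for(get_page_text, marker, timeout=30, interval=0.5):
    """Poll the page until `marker` appears in its text, or give up after `timeout` seconds."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if marker in get_page_text():
            return True          # the text appeared; safe to move on
        time.sleep(interval)     # check again shortly
    return False                 # timed out without seeing the text
```

    Unlike "Wait Until Finished", this works even when the page content changes without a full navigation.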
