UBot Underground

smartquin

Fellow UBotter

Posts posted by smartquin

  1. Hmm, possible; however, I was more after something that will scrape the relative and absolute paths of whatever page of the site you are currently on. Essentially I just need something to exclude rss/xml feeds and anything that leads off-site. I'm not too good at regex though, so any help would be great :-D
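    A minimal way to express that exclusion, sketched in Python since the regex idea carries over to uBot's $find regular expression (the skip patterns below are assumptions; adjust them per site):

```python
import re

# Assumed skip-list: rss/xml feeds plus javascript:/mailto: pseudo-links.
SKIP = re.compile(r"(\.rss|\.xml|/rss|/feed)(/|$)|^(javascript:|mailto:)",
                  re.IGNORECASE)

def keep_link(href):
    """Return True for hrefs worth visiting, False for feeds and pseudo-links."""
    return not SKIP.search(href)
```

    Off-site links still need the page's own domain for comparison, so those are easier to filter after resolving each href to an absolute URL rather than with regex alone.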

  2. Hi Guys

     

    I made this bot that will go to hidemyass.com, scrape the proxy list, and then run simultaneous threads (based on TJ's script). The only thing it doesn't do is pick up bad/dead proxies. I thought scraping them straight from HMA would work, but it doesn't work too well; maybe a 50% failure rate. Any suggestions welcome :-)

     

    ui drop down("Thread Count", "2,3,4,5,6,7,8,9,10", #num threads)
    comment("set total number of runs, I just made it 24 for testing purposes")
    set(#number accounts, 24, "Global")
    set(#num created, 0, "Global")
    set(#used threads, 0, "Global")
    loop(1) {
        navigate("http://www.hidemyass.com/proxy-list/", "Wait")
        wait for browser event("Everything Loaded", "")
        change dropdown(<name="s">, "Response time")
        click(<id="updateresults">, "Left Click", "No")
        wait for browser event("Everything Loaded", 30)
        clear list(%paddress)
        clear list(%pport)
        clear list(%prox)
        add list to list(%paddress, $scrape attribute(<outerhtml=w"<span><style>*</span>">, "innertext"), "Don\'t Delete", "Global")
        add list to list(%pport, $scrape attribute(<outerhtml=w"<td>
    *</td>">, "innertext"), "Don\'t Delete", "Global")
        loop($list total(%paddress)) {
            if($comparison($list position(%paddress), "<", $list total(%paddress))) {
                then {
                    add item to list(%prox, "{$next list item(%paddress)}:{$next list item(%pport)}", "Delete", "Global")
                }
                else {
                }
            }
        }
    }
    loop(#number accounts) {
        loop while($comparison(#used threads, ">=", #num threads)) {
            wait(1)
        }
        loop process()
    }
    define loop process {
        increment(#used threads)
        increment(#num created)
        registration procedure()
    }
    define registration procedure {
        thread {
            in new browser {
                registration code here()
                decrement(#used threads)
            }
        }
    }
    define registration code here {
        if($comparison(#num created, "<", #number accounts)) {
            then {
                change proxy($random list item(%prox))
                navigate("http://findwhatismyipaddress.org/", "Wait")
                wait(20)
            }
            else {
            }
        }
    }
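    One way to catch the dead proxies before the threads use them is to test each one with a short timed request and keep only the survivors. A Python sketch of the idea (the function names and test URL are my own; the `check` parameter is injectable so the filter itself can be exercised without network access):

```python
import urllib.request

def is_alive(proxy, test_url="http://example.com/", timeout=5):
    """True if an HTTP request routed through 'proxy' (ip:port) succeeds in time."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": "http://" + proxy}))
    try:
        opener.open(test_url, timeout=timeout)
        return True
    except Exception:
        return False

def filter_alive(proxies, check=is_alive):
    # Run this over the scraped list before starting the threads;
    # slow and dead entries drop out.
    return [p for p in proxies if check(p)]
```

    In uBot terms, the equivalent is a loop that sets each proxy, navigates to a small known page with a short timeout, and only adds the proxy to the working list (%prox) when the navigation succeeds.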

  3. Hi Guys

     

    I'm trying to write a bot that will go to a webpage, scrape the <a> tags to get the hrefs, and visit each page (to simulate a real person). However, I don't want to be directed to the website's rss feed, to another domain (like their facebook/twitter page), to a javascript link, etc. This is what I have so far:

     

    set(#rootdomain, $find regular expression($url, "(?![/|/www.])[a-zA-Z0-9\\-\\.]+\\.[a-zA-Z]\{2,4\}(?=/)"), "Global")
    clear list(%urls)
    clear list(%cleanedurls)
    add list to list(%urls, $list from text($scrape attribute(<tagname="a">, "href"), $new line), "Delete", "Global")
    set list position(%urls, 0)
    loop($list total(%urls)) {
        set(#temp, $next list item(%urls), "Global")
        if($contains(#temp, #rootdomain)) {
            then {
                add item to list(%cleanedurls, $list item(%urls, $list position(%urls)), "Delete", "Global")
            }
            else {
            }
        }
    }
    add list to list(%cleanedurls, $scrape attribute(<(href=w"/*" OR href=w"..*")>, "href"), "Delete", "Global")
    loop($rand(0, $list total(%cleanedurls))) {
        click(<href=$random list item(%cleanedurls)>, "Left Click", "No")
        wait($rand(20, 180))
    }

     

    The problem I have is that some links could be relative and others absolute, and this bot still adds the addresses for googleads, xml feeds, etc.

     

    Any help would be greatly appreciated!
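    One approach that handles relative and absolute links uniformly is to resolve every href against the current page URL first, and only then compare hosts. A Python sketch of that logic (the feed/ad heuristics are illustrative, not exhaustive):

```python
from urllib.parse import urljoin, urlparse

def host(url):
    """Hostname with any leading 'www.' stripped, for domain comparison."""
    h = urlparse(url).netloc.lower()
    return h[4:] if h.startswith("www.") else h

def same_site_links(page_url, hrefs):
    keep = []
    for href in hrefs:
        if href.startswith(("javascript:", "mailto:", "#")):
            continue                               # pseudo-links
        absolute = urljoin(page_url, href)         # resolves relative paths too
        if urlparse(absolute).scheme not in ("http", "https"):
            continue
        if host(absolute) != host(page_url):
            continue                               # off-site (facebook, googleads, ...)
        if absolute.endswith((".xml", ".rss")) or "/feed" in absolute:
            continue                               # rss/xml feeds
        keep.append(absolute)
    return keep
```

    Because everything is normalized to an absolute URL before the same-domain test, it works the same whether the original href was `/about`, `page2.html`, or a full address on another domain.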

  4. Hi, I'm trying to make a bot that will back up my MySQL database using phpMyAdmin. I can schedule a bot to do it, but every time it saves the database it opens a dialog box asking where to save the file. It is phpMyAdmin v3.5.2.2. Any help will be greatly appreciated; here is my code so far:

     

    clear cookies
    navigate("http://mydomain.net/phpmyadmin/", "Wait")
    wait(5)
    type text(<username field>, "root", "Standard")
    type text(<password field>, "password", "Standard")
    click(<id="input_go">, "Left Click", "No")
    wait(5)
    click(<innertext="Export">, "Left Click", "No")
    wait(5)
    click(<id="buttonGo">, "Left Click", "No")
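    If the end goal is just a scheduled backup, one alternative worth considering is skipping the browser entirely and calling mysqldump from a scheduled script, since the dump is then written straight to a file and no save dialog ever appears. A sketch under stated assumptions (the database name, user, and output directory are placeholders, and the password is assumed to live in ~/.my.cnf so it never appears on the command line):

```python
import subprocess
from datetime import date

def build_dump_cmd(db, user, out_dir="backups"):
    """Build a mysqldump command that writes the dump to a dated .sql file."""
    out_file = "{}/{}-{}.sql".format(out_dir, db, date.today().isoformat())
    cmd = ["mysqldump", "--user", user, "--result-file", out_file, db]
    return cmd, out_file

cmd, path = build_dump_cmd("mydatabase", "root")
# subprocess.run(cmd, check=True)  # uncomment to actually run the dump
```

    The same command line can be put directly into a Task Scheduler task, with no bot involved.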

  5. Hi Guys

     

    I wonder if this feature is included at all: I'm writing a script that needs to get a variable from the user before it does its tasks. At the moment, I'm using an AutoIt script called from a batch file. It is set up in Task Scheduler automatically and runs once only, to post to blogs etc.; however, it needs to grab a variable to get the right photos/text. Is there a way of calling the bot from a Task Scheduler task in Windows and passing a variable to it? (e.g. enter this on the command line:

    schtasks /create /tn "blogpostcombination1234" /tr "C:\bot.bat" /sc once /st 17:35:49 /sd 11/27/2012 /ru User /rp pass

     

    The bat file looks like this:

     

     

    bot.exe /play blogpostcombination1234

    )

     

    Any ideas?
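    One common workaround when the runner can't accept arguments directly is a hand-off file: the scheduled task writes the variable to a known text file first, and the bot's first action reads it back. A minimal Python illustration of the pattern (the file name is an assumption; on the uBot side the read step would use its file-reading function):

```python
from pathlib import Path

PARAM_FILE = Path("bot_param.txt")  # assumed hand-off location, next to the bot

def write_param(value):
    # Step 1: the scheduled .bat writes the variable before launching the bot...
    PARAM_FILE.write_text(value, encoding="utf-8")

def read_param():
    # Step 2: ...and the bot reads it back as its first action.
    return PARAM_FILE.read_text(encoding="utf-8").strip()
```

    In the .bat this is just an `echo` of the value into the text file on the line before the `bot.exe /play ...` call.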

  6. Hello again

     

    The script was working great (thanks, by the way, Josh!); however, now it is doing a similar thing even with CSS turned off. My script grabs login details from a file, creates an email account, then goes to create a lens. I have attached the squidoo signup portion of my script; if anyone has some ideas it will be greatly appreciated!

     

    Just on a side note, could it be that one of the members on this forum is working to counteract these scripts for squidoo?

     

     

    allow css("No")
    navigate("http://www.squidoo.com/", "Wait")
    click(<innertext="Join Us">, "Left Click", "No")
    wait(5)
    change attribute(<email field>, "value", $next list item(%my list items))
    click(<type="submit">, "Left Click", "No")
    wait(1)
    click(<type="submit">, "Left Click", "No")
    wait(3)
    change attribute(<id="username">, "value", $next list item(%my list items))
    wait(1)
    change attribute(<id="member_password">, "value", $next list item(%my list items))
    type text(<name="_squidcap_e">, $solve captcha(<id="captchaImg">), "Standard")
    wait(1)
    click(<innertext="Sign up!">, "Left Click", "No")
    wait for browser event("Everything Loaded", "")
    wait(1)
    allow css("Yes")
    navigate("http://www.squidoo.com/lensmaster/dashboard", "Wait")

  7. Hi

     

    I'm trying to write a script that will create several accounts in parallel. The data is created in a CSV file, and I want uBot to use one row per thread. My code isn't quite working, however (the row offset variable doesn't increment for each thread but stays at the same value, so the same row is used on every thread); could someone point me in the right direction, please? Here is an example of what I have thus far:

     

    ui open file("File", #file)
    create table from file(#file, &database)
    set(#rownumber, 0, "Global")
    loop($table total rows(&database)) {
        thread {
            in new browser {
                set(#thisrownumber, #rownumber, "Local")
                increment(#rownumber)
                navigate("http://ausi.mail.everyone.net/email/scripts/serviceMenu.pl?user=new&EV1=13315611512215007", "Wait")
                wait(1)
                click(<create account link>, "Left Click", "Yes")
                wait(3)
                type text(<username field>, $table cell(&database, #thisrownumber, 0), "Standard")
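    The underlying issue is a race: every thread reads the shared row counter only after the loop has already advanced it, so they can all see the same value. The usual fix is to hand each thread its own copy of the row index at creation time, illustrated here in Python threading terms (names are illustrative):

```python
import threading

results = []
lock = threading.Lock()

def create_account(row_number):
    # row_number is this thread's own copy, fixed when the thread was created
    with lock:
        results.append(row_number)  # stand-in for "use row 'row_number' of the table"

threads = [threading.Thread(target=create_account, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# each row index is handled exactly once, regardless of thread scheduling
```

    In uBot terms, the equivalent is to set the local copy (#thisrownumber) and increment the shared counter in the loop body before the thread block starts, rather than inside the thread, where all threads may read the counter at the same moment.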

  8. Here is the squidoo signup bot.

     

    My only guess is that it's the way the uBot browser is reading the CSS, or an error in squidoo's CSS, because I looked at this form in FF, IE, Chrome and Safari and the page looked pretty similar in all of them. When I opened it up on an Android device, it was missing the username and password fields just like in the uBot browser.

     

    So I turned css off right from the start and voila! All 3 fields appeared.

     

    Next there was a problem with filling in the email field: when you do this, the other 2 fields disappear. I tried filling the fields in reverse order, but any time the email field was filled in, the other 2 fields disappeared. So I tried filling in the email field first and then clicking submit to see what would happen, and sure enough the other 2 fields reappeared.

     

    Hope this helps!

     

    squidoo-signup.ubot

     

    That was the same problem I was getting; I didn't realise you could turn CSS off! Awesome work! The only problem is that when I try to download the file, I get a server error:

     

    403 - Forbidden: Access is denied.

    You do not have permission to view this directory or page using the credentials that you supplied.

     

    Is that because I'm a new user on the forums?

  9. Hello all. I'm new here; I've been playing around with uBot for a bit and it has been great! However, I want to create new squidoo accounts from scratch and create lenses. I've tried several times, but the login page doesn't show the required fields and doesn't progress. Once I have made an account, I can log in and create a lens; it's just the signup process... Can someone please PM me with some help?
