UBot Underground

ibotubot

Fellow UBotter
  • Content Count: 88
  • Joined
  • Last visited
  • Days Won: 2

Posts posted by ibotubot

  1. So I am wondering: are members whose profile and the threads they created have been erased allowed back with a new account?

     

    The famous Stuna, who had no idea about UBot and just resold Hello Insomnia's bots while claiming he did everything himself...

     

    We just need people to be aware of who this person is: someone who gives UBot bots a very bad name (look at his own bots lol).

     

    Here is the friendly PM he sent last year; you cannot click the user since the account no longer exists (see the images linked below):

     

    https://imgur.com/kP73cWt

    https://imgur.com/ARUMxSy

     

     

     

    How do I know it is him? Well, certain people are just so smart about creating a new identity:

     

    You could look at his YT channel and see that the icons of the software are the exact same ones he used before, but then again, coincidence, right?

     

    His profile:

    http://network.ubotstudio.com/forum/index.php/user/38659-seodog/

     

    Now, let's look at the contact info

     

    https://imgur.com/wPykcPf

     

    So, now let's search for that Skype handle:

     

    And voilà, we find it on Paxful (PS Stuna/SEOdog: all the mentioned pages are on archive.org, so feel free to change anything :) )

     

    https://imgur.com/XJIdOC3

     

    We could keep digging, but why bother? Everything is right in front of us.

     

    PS: He is trying to promote his new site on BHW and has already been banned again for that lol

    https://imgur.com/uf3q2Oj

     

    Also banned on Reddit for spamming:

    https://imgur.com/HxfWh7Q

     

    Images here:

    https://imgur.com/a/VB314gs

     

    Archive.org

    https://web.archive.org/web/20191130194955/https://paxful.com/offer/6vjoWYdED17

    https://web.archive.org/web/20191130193954/http://network.ubotstudio.com/forum/index.php/user/38659-seodog/

     


  2. The easiest way to do that is to use a wait command and put 86400; there are 86,400 seconds in a day.

     

    Put that at the end of your loop.

     

    Regards,

    CD

    Does that actually work for you? I am using the built-in browser, and if it sits idle for a longer period (I never checked exactly how long, but 1 hour is likely enough), it just won't load any URL at all anymore. HTTP get etc. still seems to work, just not the built-in browser.
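
    For reference, the suggestion above boils down to something like this (the loop count and URL are just placeholders of mine, not part of the original advice):

    loop(365) {
        comment("... the actual daily task goes here ...")
        navigate("https://www.example.com","Wait")
        wait for browser event("Everything Loaded","")
        comment("pause 86400 seconds (24 hours) before the next pass")
        wait(86400)
    }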

  3. Is this still not working for you guys? I am looking to fully automate a task now but wonder if it's even worth trying. Since UBot cannot be left running for hours or longer periods without navigating (it then just does not load any URL at all anymore), I need to schedule it to run, say, every 30 minutes and then close and repeat.

     

    I was going to use the scheduler, but I cannot test this, as it's important that it works properly (it does now when I run it manually).

  4. You do realize that is not how SEMrush works? Sure, basically you simply scrape Google (easy peasy), but then comes this:

     

    - Your IP will sooner or later get blocked

    - Local results don't come from adding local modifiers to the keyword; someone in Miami searching "marketing" gets totally different results than you do, based on their IP.

     

    Your concept is rather basic: scrape Google and then post the results into a DB (local/remote). You could NEVER build a replacement for SEMrush for $90/month ;)

     

    Now, of course, if you only want to scrape from your own IP then it's fine, but you should mention that to people who are not really well versed in this space.

  5. Hi Buddy,

     

    Yes, that is indeed good to remember. However, in my case the single command in the define worked when I clicked "run node" as well as in a loop; it just would not run when I called my custom define.

    For some reason it seems to be working now; the other day even several restarts did not solve it.

     

    On a side note, is there any way to even use the private bot bank? (I'd otherwise just do all the defines in dedicated tabs, then use that bot as an include as needed.)
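
    For context, this is roughly the kind of pattern being discussed: a custom define called from the main flow instead of running its node manually (the define name and URL below are just placeholders):

    define load start page {
        comment("single command inside the define")
        navigate("https://www.example.com","Wait")
        wait for browser event("Everything Loaded","")
    }
    comment("calling the custom define from the main flow")
    load start page()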

  6. As already stated, just make sure you get high-quality proxies. Many of them are either already blacklisted or known to belong to data centers... The best for this would be residential proxies, but those are more costly and might be harder to get depending on which country you are from (keep in mind the proxy should match that country, or they might just not let you cash out).

  7. I should have been more precise (the screenshot did not work): I wanted to sort by column A THEN column B, in that order.

     

    Imagine, for example, a TV series: you have Season 1, Season 2... but then each season has episodes 1, 2, 3...

     

    So the output should be:

    Season 1, 1
    Season 1, 2
    Season 2, 1
    Season 2, 2

     

    But as you can see from the example below, it just sorts column A and then sorts column B in isolation in the same process. Hope that makes more sense now.

    clear table(&data)
    set table cell(&data,0,0,"Season 1")
    set table cell(&data,0,1,2)
    set table cell(&data,1,0,"Season 1")
    set table cell(&data,1,1,1)
    set table cell(&data,2,0,"Season 2")
    set table cell(&data,2,1,2)
    set table cell(&data,3,0,"Season 2")
    set table cell(&data,3,1,1)
    plugin command("TableCommands.dll", "sort table", &data, 0)
    alert(&data)
    plugin command("TableCommands.dll", "sort table", &data, 1)
    alert(&data)
    
    

    I tried working on a solution using just tables and lists, but realized it turned into way too much effort, so I will rather just use Ayman's SQLite plugin.

     

    The above example is very simple and could likely be solved by adding another sort, but what if I need to sort by A, B, C and then mix descending with regular ascending order (for letters and numbers)...
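
    A sketch of one table-only workaround (my own idea, not a proper multi-column sort): build a composite key in a helper column and sort once on that. Note the numeric part would need zero-padding to sort correctly as text once it goes past 9.

    comment("build a helper key in column 2 from columns 0 and 1")
    set(#row,0,"Global")
    loop($table total rows(&data)) {
        set table cell(&data,#row,2,"{$table cell(&data,#row,0)}|{$table cell(&data,#row,1)}")
        increment(#row)
    }
    comment("a single sort on the key column gives the A-then-B order")
    plugin command("TableCommands.dll", "sort table", &data, 2)
    alert(&data)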

  8. Hi Nick,
     
    Basically I had this working as a normal command, but it just would not work all day inside a define function. I just tested it again today and it seems to be working now:
     
    define $scrape content {
        set(#page,$plugin function("HTTP post.dll", "$http get", "https://www.site.com", "", "", "", ""),"Global")
        set(#xpath,"//*[contains(concat( \" \", @class, \" \" ), concat( \" \", \"main-content\", \" \" ))]","Global")
        set(#content,$plugin function("HeopasCustom.dll", "$Heopas Xpath Parser", #page, #xpath, "InnerText", ""),"Global")
        return(#content)
    }
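
    Calling the define afterwards is then just like any other function, e.g. (hypothetical usage line):

    alert($scrape content())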
    
    

    Thanks!

  9. Yes, that's what I wanted to do. For some reason it just did not work all day yesterday IN THE DEFINE. It always worked as a regular command, but in an effort to get everything more organized I wanted to start using more defines. Today the define actually seems to work, using HTTP get and XPath parsing (it does not render the page and takes a hard-coded XPath from a variable):

    define $scrape content {
        set(#page,$plugin function("HTTP post.dll", "$http get", "https://www.fpl.com/smart-meters/you.html", "", "", "", ""),"Global")
        set(#xpath,"//*[contains(concat( \" \", @class, \" \" ), concat( \" \", \"main-content\", \" \" ))]","Global")
        set(#content,$plugin function("HeopasCustom.dll", "$Heopas Xpath Parser", #page, #xpath, "InnerText", ""),"Global")
        return(#content)
    }
    
    
  10. Sure thing. I am not sure why it is not working with the variable, as I can see in the debugger that the variable has the right value. When I pass that exact value directly it does work, but obviously I want to keep it dynamic.

     

    So I tried to just return #content, but since the scrape does not work, that value is empty as well. When I manually run the scraping nodes it works just fine.

    
    define $Simple scrape of content(#URL, #elementScrape) {
        comment("Set passed css selector to variable")
        set(#elementScrape,#elementScrape,"Global")
        comment("Set passed URL to variable")
        set(#URL,#URL,"Global")
        comment("We are grabbing first character of element")
        set(#kindOfElement,$eval("var element = \"{#elementScrape}\";
    var element1 = element.charAt(0);
    element1;"),"Global")
        comment("Now navigate to the URL")
        navigate(#URL,"Wait")
        wait for browser event("Everything Loaded","")
        if($comparison(#kindOfElement,"= Equals",".")) {
            then {
                set(#elementScrape,$eval("var element = \"{#elementScrape}\";
    var element1 = element.substr(1).trim();
    element1;"),"Global")
                wait(2)
                set(#content,$scrape attribute(<class=#elementScrape>,"innertext"),"Global")
                wait(1)
                alert(#content)
                return($scrape attribute(<class=#elementScrape>,"innertext"))
            }
            else {
            }
        }
    }
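
    For completeness, this is roughly how the define gets called (the URL and selector here are placeholders of mine):

    alert($Simple scrape of content("https://www.example.com", ".main-content"))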
    
    

    Thanks!

  11.  

    In node view try making it:

    <id=#elementScrape>
    

    I think the variable part is sorted and is working; for some reason it sometimes works the way you posted, and sometimes I need to go into code view and use "{#var}".

     

    Now I am just trying to get this to work in a define function. If I run the node manually inside the function it works fine, but when I just call the function it does not work (it neither populates the scraped variable nor the return variable).
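
    To illustrate the two variants mentioned (just a sketch; the attribute and variable names are examples):

    comment("node view style: drop the variable straight into the element selector")
    set(#content,$scrape attribute(<id=#elementScrape>,"innertext"),"Global")
    comment("code view style: wrap the variable in quotes and braces")
    set(#content,$scrape attribute(<id="{#elementScrape}">,"innertext"),"Global")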
