UBot Underground

ibotubot

Fellow UBotter
  • Content Count: 88
  • Joined
  • Last visited
  • Days Won: 2

Everything posted by ibotubot

  1. You would get blocked soon anyway; use SEMrush etc. to do it more easily. A user agent is also not the same as an actual browser, and Google is one of those companies that is very hard to bot (changing class names on reload, etc.).
  2. So I am wondering: members whose profile and the threads they created have been erased, are they allowed back with a new account? The famous Stuna, who had no idea about UBot and just resold Hello Insomnia's bots while claiming he did everything... We just need people to be aware of who someone is here, someone who gives UBot bots a very bad name (look at his own bots lol). The friendly PM he sent last year is attached; I cannot click the user, as the account no longer exists (see attached images). https://imgur.com/kP73cWt https://imgur.com/ARUMxSy How do I know it is him? Well, certain people are j
  3. Same here. The server is likely down.
  4. How is this any different from simply adding this to functions.php?
     add_filter( 'auto_update_plugin', '__return_true' );
     add_filter( 'auto_update_theme', '__return_true' );
  5. This is a great read https://www.blackhatworld.com/seo/stuna-bots-is-a-scammer-he-is-now-accusing-me-of-scamming.1036479/
  6. Does that actually work for you? I am using the built-in browser, and if it sits idle for a longer period (I never checked exactly how long, but one hour is likely enough), it just won't load any URL at all anymore. HTTP get etc. still seems to work, just not the built-in browser.
  7. Is this still not working for you guys? I am looking to fully automate a task now but wonder if it is even worth trying. Since UBot cannot be left running for hours or longer periods without navigating (it then just does not load any URL at all anymore), I need to schedule it to run, say, every 30 minutes, then close and repeat (see the scheduler sketch after the post list). I was going to use the scheduler, but I cannot test this, and it is important that it works properly (it does now when I run it manually).
  8. You do realize that is not how SEMrush works? Sure, you can simply scrape Google (easy peasy), but then comes this:
     - Your IP will sooner or later get blocked.
     - Local queries are not driven by local modifiers in the keyword; someone in Miami who searches "marketing" gets totally different results than you do, based on their IP.
     Your concept is rather basic: scrape Google and then post the results into a DB (local/remote). You could NEVER build a replacement for SEMrush for $90/month. Now, of course, if you only want to scrape with your own IP then it is fine, but you should mention that to people th
  9. I do have 3 voices on my Win 10 machine but still nothing. Hmm, looks like I have to do some digging, as that feature would come in handy for a project right now (see the voice-listing sketch after the post list). Thx
  10. Hi Nick, did you have to install anything for the text to speech to work? It is not working on my end out of the box. I installed WinLame and SOX, but it is still not working. Thanks
  11. Hi buddy, yes, that is indeed good to remember. However, in my case the single command in the define worked when I clicked "run node", as well as in a loop; it just would not run when I called my custom define. For some reason it seems to be working now; the other day even some restarts did not solve it. On a side note, is there any way to even use the private bot bank? (I'd otherwise just do all defines in dedicated tabs, then use that bot as an include as needed.)
  12. As already stated, just make sure you get high-quality proxies. Many of them are either already blacklisted or known to be data centers... The best for this would be residential proxies, but those are more costly and might be harder to get depending on which country you are from (and keep in mind the proxy should match that country, or they might just not let you cash out).
  13. I figured it is easier to use SQL queries for that, as I tried it earlier with, say, 3 columns. Also, while this example was pretty easy, with 3 columns some could be strings and others numbers that would need to be sorted descending, etc. It just seemed to save way more time to use SQL for this. Appreciate the replies though!
  14. Thanks for this. I am playing around with Blend now, trying to get the hang of it. I found the event trigger for a button click, but I am trying to bind it to an open-file dialog. Any pointers on that one?
  15. I should have been more precise (the screenshot did not work): I wanted to sort by column A THEN column B, in that order. Imagine TV series as an example: you have Season 1, Season 2..., but then each season has Episode 1, 2, 3.... So the output should be:
      Season 1, 1
      Season 1, 2
      Season 2, 1
      Season 2, 2
      But as you can see from the example below, it just sorts column A, then sorts column B in isolation within the same process. Hope that makes more sense now.
      clear table(&data)
      set table cell(&data,0,0,"Season 1")
      set table cell(&data,0,1,2)
      set table cell(&data,1,0,"Seaso
  16. This can be closed. Solution for me: use the SQLite plugin and then sort in there, which gets around the text-sorting limitation (see the SQLite sketch after the post list).
  17. Hi Nick, basically I had this working as a normal command, but it just did not work all day in a define function. I tested it again today and it seems to be working now:
      define $scrape content {
          set(#page,$plugin function("HTTP post.dll", "$http get", "https://www.site.coml", "", "", "", ""),"Global")
          set(#xpath,"//*[contains(concat( \" \", @class, \" \" ), concat( \" \", \"main-content\", \" \" ))]","Global")
          set(#content,$plugin function("HeopasCustom.dll", "$Heopas Xpath Parser", #page, #xpath, "InnerText", ""),"Global")
          return(#content)
      }
      Thanks!
  18. Yes, that is what I wanted to do. For some reason that just did not work all day yesterday IN THE DEFINE. It always worked as a regular command, but in an effort to get everything more organized I wanted to start using more defines. For now, the define actually works today, using HTTP get and XPath parsing (it does not render the page and gets a hard-coded XPath into a variable; see the Python sketch of the same idea after the post list):
      define $scrape content {
          set(#page,$plugin function("HTTP post.dll", "$http get", "https://www.fpl.com/smart-meters/you.html", "", "", "", ""),"Global")
          set(#xpath,"//*[contains(concat( \" \", @class, \" \" ), c
  19. Hi all, I want to sort a table which has a header, first by, say, column A and then by column B, as one would do in Excel. I could not think of another solution, given that, for example, putting each column (A and B) into a list and then sorting it would obviously not be accurate.
  20. Sorry: I tried both Ayman's HTTP plugin and the Heopas plugin for the HTTP get.
  21. Hi all, I am noticing that a regular HTTP get inside a function renders the page, while an HTTP get command does not render the page in the browser. What could be causing this? Obviously, if the page has to be rendered in the browser, there is no point in using HTTP get, as it is then just as slow as navigate. Thanks,
  22. I tried the same with HTTP get and XPath, but once I use the variable in the XPath I get the same issue (in addition to noticing that HTTP get does not work properly within the function). Thanks again for the help.
  23. Sure thing. I am not sure why it is not working with the variable, as I can see the variable has the right value in debug. When I pass that exact value directly it does work, but obviously I want to keep it dynamic. I also tried to just return #content, but since the scrape does not work, that value is empty as well. When I manually run the nodes for scraping, it works just fine.
      define $Simple scrape of content(#URL, #elementScrape) {
          comment("Set passed css selector to variable")
          set(#elementScrape,#elementScrape,"Global")
          comment("Set passed URL to variable")
          set(#URL,#URL,"Global")
          comme
  24. I think the variable part stuck and is working; for some reason it sometimes works the way you posted, and sometimes I need to go into code view and use "{#var}". Now I am just trying to get this to work in a define function. If I run the node manually inside the function it works fine, but when I just call the function it does not work (it neither populates the scrape variable nor the return variable).
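
Sketch for post 7 (run every 30 minutes, then close and repeat): a minimal external runner, written here in Python only as an illustration, assuming the task has been compiled to a standalone executable; the path and the 30-minute interval are placeholders, not anything from the posts above.

    import subprocess
    import time

    BOT_EXE = r"C:\bots\mybot.exe"   # placeholder path to the compiled bot
    INTERVAL_SECONDS = 30 * 60       # run every 30 minutes

    while True:
        # Start the bot, wait for it to exit, then sleep out the rest of the interval.
        started = time.time()
        subprocess.run([BOT_EXE], check=False)
        elapsed = time.time() - started
        time.sleep(max(0, INTERVAL_SECONDS - elapsed))

Closing and restarting the bot on each pass sidesteps the "browser stops loading URLs after sitting idle" problem described in posts 6 and 7, since every run starts with a fresh process.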
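Sketch for posts 9 and 10 (text to speech): one way to confirm which voices Windows itself exposes, and that they actually speak, independently of UBot. This is a Python sketch assuming the third-party pyttsx3 package is installed; it is only a diagnostic aid, not how UBot's own text-to-speech command works.

    import pyttsx3  # third-party package: pip install pyttsx3

    engine = pyttsx3.init()  # uses the Windows SAPI5 driver by default on Win 10

    # List every voice the OS reports (post 9 mentions seeing 3 of these).
    for voice in engine.getProperty("voices"):
        print(voice.id, "-", voice.name)

    # Quick audible test with the default voice.
    engine.say("Text to speech test")
    engine.runAndWait()

If the voices list and speak fine here but UBot stays silent, the problem is more likely on the UBot/WinLame/SOX side than with Windows itself.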
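Sketch for posts 13, 15, 16 and 19 (sorting by column A, then column B): the season/episode example done directly in SQLite, which is essentially what the SQLite-plugin route boils down to. A minimal Python sketch using the standard sqlite3 module; the table and column names are placeholders.

    import sqlite3

    rows = [
        ("Season 1", 2),
        ("Season 1", 1),
        ("Season 2", 2),
        ("Season 2", 1),
    ]

    con = sqlite3.connect(":memory:")  # throwaway in-memory database
    con.execute("CREATE TABLE episodes (season TEXT, episode INTEGER)")
    con.executemany("INSERT INTO episodes VALUES (?, ?)", rows)

    # ORDER BY sorts by season first, then by episode within each season,
    # and the INTEGER column sorts numerically instead of as text.
    for season, episode in con.execute(
        "SELECT season, episode FROM episodes ORDER BY season, episode"
    ):
        print(season, episode)

This prints Season 1/1, Season 1/2, Season 2/1, Season 2/2, i.e. exactly the order asked for in post 15.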
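Sketch for posts 17 and 18 (HTTP get plus XPath): the same "fetch the raw HTML without rendering and pull one element by class" idea, as a minimal Python sketch. It assumes the third-party requests and lxml packages; the URL and the class-contains XPath are taken from the posts purely as an example, and the plugin calls used in the posts are not reproduced here.

    import requests
    from lxml import html

    URL = "https://www.fpl.com/smart-meters/you.html"  # example URL from post 18
    XPATH = ('//*[contains(concat(" ", @class, " "), '
             'concat(" ", "main-content", " "))]')     # same class-contains XPath as in the posts

    page = requests.get(URL, timeout=30)  # plain HTTP get, no browser rendering
    tree = html.fromstring(page.content)

    matches = tree.xpath(XPATH)
    content = matches[0].text_content() if matches else ""  # roughly the "InnerText" of the match
    print(content)

Because this never renders the page, it behaves like the HTTP get command rather than the in-browser navigate discussed in posts 21 and 22.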