ibotubot
Content Count: 88
Days Won: 2
Posts posted by ibotubot
-
So I am wondering: are members whose profile and created threads have been erased allowed back with a new account?
The famous Stuna, who had no idea about UBot and just resold Hello Insomnia's bots while claiming the work was his own...
We just need people to be aware of who someone is here: someone who gives UBot bots a very bad name (look at his own bots, lol).
Here is the friendly PM he sent last year; I cannot click the user, as the account no longer exists (see attached images).
How do I know it is him? Well, certain people are just so smart when creating a new identity:
You could look at his YT channel and see that the icons of the software are the exact same ones he used before, but then again, coincidence, right?
His profile:
http://network.ubotstudio.com/forum/index.php/user/38659-seodog/
Now, let's look at the contact info
So, now let's search that skype:
And voila, we find it on Paxful. (PS Stuna/SEOdog: all mentioned pages are on archive.org, so feel free to change anything.)
We could keep digging but why, everything is right in front of us.
PS:
He is trying to promote his new site on BHW, and has already been banned again for that, lol.
He has also been banned on Reddit for spamming.
Images here:
Archive.org
https://web.archive.org/web/20191130194955/https://paxful.com/offer/6vjoWYdED17
-
Same here. The server is likely down.
-
How is this any different from simply adding this to functions.php?
add_filter( 'auto_update_plugin', '__return_true' );
add_filter( 'auto_update_theme', '__return_true' );
-
Thanks for the share. Great read.
This is a great read https://www.blackhatworld.com/seo/stuna-bots-is-a-scammer-he-is-now-accusing-me-of-scamming.1036479/
-
The easiest way to do that is to use a wait command and put 86400, since there are 86,400 seconds in a day.
Put that at the end of your loop.
Regards,
CD
Does that actually work for you? I am using the built-in browser, and if it sits idle for a longer period (I never checked exactly how long, but one hour is likely enough), it just won't load any URL at all anymore. HTTP get etc. still seems to work, just not the built-in browser.
-
Is this still not working for you guys? I am looking to fully automate a task now, but I wonder if it is even worth trying. Since UBot cannot be left running for hours or longer periods without navigating (it then just does not load any URL at all anymore), I need to schedule it to run, say, every 30 minutes, then close and repeat.
I was going to use the scheduler, but I cannot test this, and it is important that it works properly (it does now when I run it manually).
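If the built-in scheduler turns out to be unreliable, the "run every 30 minutes, then close and repeat" pattern can also be driven from outside the bot. A minimal Python sketch, assuming a compiled bot at a hypothetical path (`C:\bots\mybot.exe` is made up for illustration):

```python
import subprocess
import time

# Hypothetical path to the compiled bot -- adjust to your setup.
BOT_PATH = r"C:\bots\mybot.exe"
INTERVAL_SECONDS = 30 * 60  # relaunch every 30 minutes

def run_once(timeout=25 * 60):
    """Launch the bot once and kill it if it overruns its window."""
    proc = subprocess.Popen([BOT_PATH])
    try:
        proc.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        proc.kill()  # hung browser session: force a fresh start next round

def run_forever():
    """Run the bot, wait, and repeat indefinitely."""
    while True:
        run_once()
        time.sleep(INTERVAL_SECONDS)
```

Calling `run_forever()` keeps the bot process short-lived, which sidesteps the stale-browser problem entirely since each run starts from a fresh process.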
-
You do realize that is not how SEMrush works? Sure, you can simply scrape Google (easy peasy), but then comes this:
- Your IP will sooner or later get blocked.
- Local queries are not just a matter of adding local modifiers to the keyword: someone in Miami searching for "marketing" gets totally different results than you do, based on their IP.
Your concept is rather basic: scrape Google and then post the results into a database (local or remote). You could NEVER build a replacement for SEMrush for $90/month.
Now, of course, if you only want to scrape from your own IP then it's fine, but you should mention that to people who are not really well versed in this space.
-
I do have 3 voices on my Win 10 machine, but still nothing. Hmm, looks like I have to do some digging, as that feature would come in handy for a project right now. Thx.
-
In this video we talk about the latest update and look at most of the commands and functions in it.
Hi Nick,
Did you have to install anything for the text-to-speech to work? It is not working on my end out of the box. I installed WinLAME and SoX, but it is still not working.
Thanks
-
Edit: Works for me again.
-
Hi Buddy,
Yes, that is indeed good to remember. However, in my case the single command in the define worked when I clicked "run node" as well as in a loop; it just would not run when I called my custom define.
For some reason it seems to be working now; the other day, even a few restarts did not solve it.
On a side note, is there any way to even use the private bot bank? (I'd otherwise just put all defines in dedicated tabs, then use that bot as an include as needed.)
-
As already stated, just make sure you get high-quality proxies. Many of them are either already blacklisted or known to be data centers... The best for this would be residential proxies, but those are more costly and might be harder to get depending on which country you are from (and keep in mind the proxy should match that country, or they might just not let you cash out).
-
I figured it is easier to use SQL queries for that, as I tried it earlier with, say, 3 columns. Also, this example was pretty easy, but with 3 columns some could be strings and others numbers that would need to be sorted descending, etc.
It just seemed to save way more time to use SQL for this. Appreciate the replies though!
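For readers outside UBot, the SQL approach described here is a one-liner: a single `ORDER BY` handles multiple columns of mixed types, with per-column ascending/descending. A minimal sketch using Python's built-in sqlite3 (table and column names are made up for illustration):

```python
import sqlite3

# In-memory table with a text column and a numeric column,
# mirroring the mixed-type, multi-column sort discussed above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shows (season TEXT, episode INTEGER)")
conn.executemany(
    "INSERT INTO shows VALUES (?, ?)",
    [("Season 1", 2), ("Season 2", 1), ("Season 1", 1), ("Season 2", 2)],
)

# One query sorts by both columns at once; mixing ASC and DESC is trivial,
# and INTEGER columns sort numerically rather than as text.
rows = conn.execute(
    "SELECT season, episode FROM shows ORDER BY season ASC, episode DESC"
).fetchall()
# rows -> [('Season 1', 2), ('Season 1', 1), ('Season 2', 2), ('Season 2', 1)]
```

Adding a third column or flipping a direction only changes the `ORDER BY` clause, which is why SQL scales better here than chained single-column sorts.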
-
Thanks for this. I am playing around with Blend now, trying to get the hang of it. I found the event trigger for a button click, but I am trying to bind it to an open-file dialog. Any pointers on that one?
-
I should have been more precise (the screenshot did not work): I want to sort by column A THEN column B, in that order.
Imagine TV series as an example: you have Season 1, Season 2, ..., and each season has Episode 1, 2, 3, ...
So the output should be
Season 1 , 1
Season 1 , 2
Season 2, 1
Season 2 , 2
But as you can see from the example below, it just sorts column A, and then sorts column B in isolation in the same process. I hope that makes more sense now.
clear table(&data)
set table cell(&data,0,0,"Season 1")
set table cell(&data,0,1,2)
set table cell(&data,1,0,"Season 1")
set table cell(&data,1,1,1)
set table cell(&data,2,0,"Season 2")
set table cell(&data,2,1,2)
set table cell(&data,3,0,"Season 2")
set table cell(&data,3,1,1)
plugin command("TableCommands.dll", "sort table", &data, 0)
alert(&data)
plugin command("TableCommands.dll", "sort table", &data, 1)
alert(&data)
I tried working on a solution using just tables and lists, but realized it turned into way too much effort, so I will rather just use Ayman's SQLite plugin.
The above example is very simple, so it could likely be solved by adding another sort, but what if I need to sort by A, B, C and mix descending with regular ascending (for letters and numbers)...
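The underlying issue is that two independent single-column sorts are not the same as one multi-key sort. In a general-purpose language the desired "A then B" ordering is a single sort with a compound key; a minimal Python sketch using the same Season/Episode sample data:

```python
# The sample table from above, as (season, episode) rows.
rows = [
    ("Season 1", 2),
    ("Season 1", 1),
    ("Season 2", 2),
    ("Season 2", 1),
]

# A tuple key compares column A first and only falls back to column B
# on ties -- unlike running two separate single-column sorts.
rows.sort(key=lambda r: (r[0], r[1]))
# rows -> [('Season 1', 1), ('Season 1', 2), ('Season 2', 1), ('Season 2', 2)]

# Mixing directions: ascending text, descending numbers
# (negate the numeric part of the key).
desc = sorted(rows, key=lambda r: (r[0], -r[1]))
# desc -> [('Season 1', 2), ('Season 1', 1), ('Season 2', 2), ('Season 2', 1)]
```

This extends naturally to sorting by A, B, C: just add more elements to the key tuple.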
-
This can be closed.
Solution for me: use the SQLite plugin and then easily sort in there to get around the text-sorting limitation.
-
Hi Nick,
Basically I had this working as a normal command, but it just did not work all day inside a define function. I tested it again today and it seems to be working now.
define $scrape content {
    set(#page,$plugin function("HTTP post.dll", "$http get", "https://www.site.com", "", "", "", ""),"Global")
    set(#xpath,"//*[contains(concat( \" \", @class, \" \" ), concat( \" \", \"main-content\", \" \" ))]","Global")
    set(#content,$plugin function("HeopasCustom.dll", "$Heopas Xpath Parser", #page, #xpath, "InnerText", ""),"Global")
    return(#content)
}
Thanks!
-
-
Yes, that's what I wanted to do. For some reason it just did not work all day yesterday IN THE DEFINE. It always worked as a regular command, but in an effort to get everything more organized I wanted to start using more defines. Today the define actually seems to work, using http get and XPath parsing (it does not render the page and reads a hard-coded XPath from a variable):
define $scrape content {
    set(#page,$plugin function("HTTP post.dll", "$http get", "https://www.fpl.com/smart-meters/you.html", "", "", "", ""),"Global")
    set(#xpath,"//*[contains(concat( \" \", @class, \" \" ), concat( \" \", \"main-content\", \" \" ))]","Global")
    set(#content,$plugin function("HeopasCustom.dll", "$Heopas Xpath Parser", #page, #xpath, "InnerText", ""),"Global")
    return(#content)
}
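The XPath in the define uses the classic `contains(concat(' ', @class, ' '), ' main-content ')` trick to match one token inside a multi-valued class attribute. The same "fetch raw HTML, extract inner text by class" idea can be sketched with Python's standard library alone (the sample HTML and the `main-content` class here are just illustrative):

```python
from html.parser import HTMLParser

class ClassTextExtractor(HTMLParser):
    """Collect the inner text of elements whose class attribute contains a
    given token, mirroring the contains(concat(...)) XPath pattern above."""

    def __init__(self, token):
        super().__init__()
        self.token = token
        self.depth = 0      # > 0 while inside a matching element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.depth or self.token in classes:
            self.depth += 1

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.chunks.append(data)

    def text(self):
        return "".join(self.chunks).strip()

page = '<div class="main-content extra"><p>Hello</p></div><div class="other">skip</div>'
parser = ClassTextExtractor("main-content")
parser.feed(page)
# parser.text() -> "Hello"
```

Token matching (splitting the attribute on whitespace) is what makes `class="main-content extra"` match while a substring like `class="main-content-footer"` would not.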
-
Hi all,
I want to sort a table which has a header, first by, say, column A and then by column B, as one would do in Excel. I could not think of another solution, given that, for example, putting each column (A and B) into a list and then sorting would obviously not be accurate.
-
Sorry:
I tried both Ayman's HTTP plugin and the Heopas plugin for the HTTP get.
-
Hi all,
I am noticing that a regular http get inside a function renders the page, while an http get command does not render the page in the browser.
What could be causing this? Obviously, if the page has to be rendered in the browser, there is no point in my using http get, as it is then just as slow as navigate.
Thanks,
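The distinction at stake here is that a plain HTTP GET only transfers bytes: the response comes back as raw source, no JavaScript runs, and nothing is rendered. A self-contained Python sketch (it spins up a throwaway local server rather than hitting a real site):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# The page a browser would render; over raw HTTP the <script> is just text.
PAGE = b"<div id='out'>raw</div><script>/* never executed by a GET */</script>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the console quiet

# Serve on an ephemeral local port in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# An HTTP GET returns the source byte-for-byte: no rendering, no JS.
body = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/").read()
server.shutdown()
```

That speed difference (no rendering step) is exactly why http get is normally much faster than navigate; if it behaves like navigate, something else is rendering the result.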
-
I tried the same with http get and XPath, but once I put the variable into the XPath I get the same issue (in addition to noticing that http get does not work properly within the function).
Thanks again for the help
-
Sure thing. I am not sure why it is not working with the variable, as I can see the variable has the right value in debug. When I pass that exact value directly, it does work, but obviously I want to keep it dynamic.
So I tried to just return #content, but as the scrape does not work, the value is empty as well. When I manually run the nodes for scraping, it works just fine.
define $Simple scrape of content(#URL, #elementScrape) {
    comment("Set passed css selector to variable")
    set(#elementScrape,#elementScrape,"Global")
    comment("Set passed URL to variable")
    set(#URL,#URL,"Global")
    comment("We are grabbing first character of element")
    set(#kindOfElement,$eval("var element = \"{#elementScrape}\"; var element1 = element.charAt(0); element1;"),"Global")
    comment("Now navigate to the URL")
    navigate(#URL,"Wait")
    wait for browser event("Everything Loaded","")
    if($comparison(#kindOfElement,"= Equals",".")) {
        then {
            set(#elementScrape,$eval("var element = \"{#elementScrape}\"; var element1 = element.substr(1).trim(); element1;"),"Global")
            wait(2)
            set(#content,$scrape attribute(<class=#elementScrape>,"innertext"),"Global")
            wait(1)
            alert(#content)
            return($scrape attribute(<class=#elementScrape>,"innertext"))
        }
        else {
        }
    }
}
Thanks !
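The `charAt(0)` check in the define above is doing simple selector dispatch: a leading `.` means a class selector, and the prefix is stripped before scraping. The same logic in plain Python, as a minimal sketch (the `tag` fallback for bare names is my assumption, not part of the original define):

```python
def classify_selector(selector):
    """Split a simple CSS selector into (kind, name).

    '.main-content' -> ('class', 'main-content')
    '#header'       -> ('id', 'header')
    'div'           -> ('tag', 'div')   # assumed fallback for bare names
    """
    selector = selector.strip()
    if selector.startswith("."):
        return "class", selector[1:]
    if selector.startswith("#"):
        return "id", selector[1:]
    return "tag", selector
```

Usage: `classify_selector(".main-content")` yields `("class", "main-content")`, which maps onto the `<class=#elementScrape>` branch of the define; an `#id` selector would need a corresponding `<id=...>` branch.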
-
In node view try making it:
<id=#elementScrape>
I think the variable part stuck and is working; for some reason it sometimes works the way you posted, and sometimes I need to go into code view and use "{#var}".
Now I am just trying to get this to work in a define function. If I run the node manually inside the function it works fine, but when I just call the function it does not work (it neither populates the scrape variable nor the return variable).
Different Browser Results In A Google Search
in Scripting
Posted
You would get blocked soon anyway; use SEMrush etc. to do it more easily. A user agent is also not the same as an actual browser, and Google is one of those companies that is very hard to bot (changing class names on reload, etc.).