thewebsitegurus · Posted February 21, 2010
Does anyone know how to get the current site's PageRank with uBot? Maybe using JavaScript?
ayzo · Posted February 21, 2010
This might help: http://abhinavsingh.com/blog/2009/04/getting-google-page-rank-using-javascript-for-adobe-air-apps/ But I think an easier way would be to just use the sub window command and have uBot navigate to a PageRank-checking script and scrape the result.
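If you'd rather do it outside of uBot, the same scrape-the-checker idea in rough PHP terms looks something like the sketch below. The checker URL and the regex are made-up placeholders; you'd adapt both to whatever PR-checking page you actually point it at:

```php
<?php
// Sketch: fetch a PR-checker page and scrape the rank out of its HTML.
// The checker URL and the regex are made-up placeholders, not a real service.
function scrapePageRank($url) {
    $checker = 'http://example.com/prcheck.php?url=' . urlencode($url); // hypothetical
    $ch = curl_init($checker);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0');
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false) {
        return null; // request failed
    }
    // Assumes the checker prints something like "PageRank: 5" in its output.
    if (preg_match('/PageRank:\s*(\d+)/i', $html, $m)) {
        return (int)$m[1];
    }
    return null; // pattern not found
}

echo scrapePageRank('http://www.example.org/');
```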
thewebsitegurus · Posted February 21, 2010
Yeah, that was the only reference I could find as well. I can't get it to work, though; the article itself warns that the script "cannot be used directly for web applications, mainly because of cross domain XHR limitations." I also can't find any PR-checking sites that don't require a captcha. I'm using this for SERP scraping, so it needs to be fully automated.
Aaron Nimocks · Posted February 21, 2010
> I also can't find any PR-checking sites that don't require a captcha. I'm using this for SERP scraping, so it needs to be fully automated.
Just find a free PR script and install it on your own server; then you can browse to your own site and get the rank without a captcha. Here is one for free.
thewebsitegurus · Posted February 21, 2010
> Just find a free PR script and install it on your own server; then you can browse to your own site and get the rank without a captcha.
That's a good way to get your server IP blocked by Google; I've already been down that road. I was trying to use uBot because of the private proxy support. There should be a way to do this with JavaScript. I'm pretty sure that's how all of the Firefox/Chrome extensions are doing it.
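From what I can tell, the extensions hit Google's toolbar endpoint directly, passing a checksum ("ch") of the URL; that checksum routine is what the article above implements. Roughly, the request side looks like the sketch below. hashURL() here is just a stub standing in for the real checksum from the article, so the dummy value it returns won't actually be accepted by Google:

```php
<?php
// Sketch only: the real "ch" checksum is the algorithm implemented in the
// article linked earlier. The stub below returns a dummy value so the code
// runs end to end, but Google's endpoint will reject it.
function hashURL($url) {
    return '6' . sprintf('%u', crc32($url)); // NOT the real checksum
}

function toolbarPageRank($url) {
    $ch = hashURL($url);
    $query = 'http://toolbarqueries.google.com/tbr?client=navclient-auto'
           . '&features=Rank&ch=' . $ch . '&q=info:' . urlencode($url);
    $response = @file_get_contents($query);
    // A successful response looks roughly like "Rank_1:1:5".
    if ($response !== false && preg_match('/Rank_\d+:\d+:(\d+)/', $response, $m)) {
        return (int)$m[1];
    }
    return null;
}
```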
alcr · Posted February 21, 2010
> There should be a way to do this with JavaScript. I'm pretty sure that's how all of the Firefox/Chrome extensions are doing it.
Jim is great with JavaScript. Send him a PM and he'll give you some advice.
webautomationlab · Posted February 21, 2010
> That's a good way to get your server IP blocked by Google; I've already been down that road. I was trying to use uBot because of the private proxy support.
I use a Perl PR checker at a query every 3 seconds, and I have never been blocked. I would guess I have done about 2 million lookups in the last year, on two different ISP-assigned IPs.
thewebsitegurus · Posted February 22, 2010
> I use a Perl PR checker at a query every 3 seconds, and I have never been blocked. I would guess I have done about 2 million lookups in the last year, on two different ISP-assigned IPs.
I own a backlink service, so I perform a large volume of PR checks. I've found that if you make back-to-back PR requests, Google will ban you after about 700 of them. Adding a random delay between requests will likely double that. I'm currently building a custom app that will be pulling PR non-stop all day.
> Jim is great with JavaScript. Send him a PM and he'll give you some advice.
Thanks man, but I ended up building this as a PHP app instead. uBot was just becoming too cumbersome.
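The random-delay part is trivial to bolt onto a PHP loop. A rough sketch, with checkPR() as a placeholder for whichever lookup function you actually use:

```php
<?php
// Sketch: spread PR lookups out with a randomized delay between requests.
// checkPR() is a placeholder for whatever lookup function you actually use.
$urls = array('http://example.com/', 'http://example.org/');

foreach ($urls as $url) {
    $rank = checkPR($url);              // hypothetical lookup function
    echo $url . ' => ' . $rank . "\n";
    usleep(mt_rand(1000000, 5000000));  // random 1-5 second pause
}
```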
webautomationlab · Posted February 22, 2010
> I've found that if you make back-to-back PR requests, Google will ban you after about 700 of them.
With a 3-second delay, from my laptop on my home connection, I typically pull 15,000 consecutive results at a time, sometimes as many as 40,000. I have never been banned.
thewebsitegurus · Posted February 22, 2010
lol. 40,000 PR requests at a 3-second delay each is 40,000 × 3 s = 120,000 s, about 33.3 hours. For most SERP-scraping bots, a 3-second delay just takes up too much time. If you use a handful of private proxies randomly taking turns, the delay applies per proxy rather than overall, and you can get that down to about 1.5 hours for 40k.
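In rough PHP terms the rotation is just picking a random proxy per request; a sketch, with the proxy list obviously a placeholder:

```php
<?php
// Sketch: rotate requests across a pool of private proxies so each IP keeps a
// slow request rate while overall throughput multiplies. Proxies are placeholders.
$proxies = array('1.2.3.4:8080', '5.6.7.8:8080', '9.10.11.12:8080');
$urls    = array('http://example.com/', 'http://example.org/');

function fetchViaProxy($url, $proxy) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

foreach ($urls as $url) {
    $proxy = $proxies[array_rand($proxies)]; // random proxy each request
    $html  = fetchViaProxy($url, $proxy);
    // ...scrape the rank out of $html as in the earlier sketch...
}
```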
webautomationlab · Posted February 22, 2010
Proper workflow and time management keep me from sitting around watching my PR script run, waiting for the results. YMMV.