UBot Underground

Everything posted by Gogetta

  1. Try either renaming or deleting this folder: AppData\Roaming\UBot Studio. Use the Task Manager to make sure none of your compiled bots or browser.exe is running. After you have renamed or deleted the folder, try opening one of your bots.
  2. Thanks! That's a handy little plugin. I never even thought to look for a plugin to help with XPath expressions.
  3. Take a look at Frank's regex example: http://network.ubotstudio.com/forum/index.php?/topic/7162-using-regex-to-catch-text-between-sections/

     navigate("https://www.flashback.org/t1045448", "Wait")
     comment("Replace the line breaks before using Frank\'s regex example.")
     set(#DocNoLineBreaks, $replace($document text, $new line, $nothing), "Global")
     clear list(%quoted)
     add list to list(%quoted, $find regular expression(#DocNoLineBreaks, "(?<=<div class=\"post-clamped-text\">).*?(?=</div>)"), "Delete", "Global")
  4. It's not a bug. You are simply not clearing the data at the beginning of your loop. The way you have it now, when your script loops, the response data from the previous request is sent along with the next GET request. Either clear the cookies and headers or use the thread command to run a new instance on each loop. See the code below:

     ui text box("Proxy:", #Proxy)
     clear list(%ips)
     loop(20) {
         set(#running, "true", "Global")
         thread {
             testProxy()
             set(#running, "false", "Global")
         }
         loop while($comparison(#running, "=", "true")) {
             wait(0.2)
         }
     }
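     For illustration, the testProxy define (which the snippet above calls but doesn't show) might look something like this sketch. The HTTP Post plugin call and its parameter order are assumptions, and the IP-echo URL is just a placeholder:

     define testProxy {
         comment("Assumption: the HTTP Post plugin\'s $http get; the exact parameter order may differ.")
         set(#response, $plugin function("HTTP post.dll", "$http get", "http://ipinfo.io/ip", #Proxy), "Global")
         comment("Record the IP that answered so each loop\'s result can be checked on its own.")
         add item to list(%ips, $trim(#response), "Don't Delete", "Global")
     }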
  5. Yes, it's still working. An update went out earlier today, and indexing rates are pretty good again since then. Not sure about the exact percentage, but I would say at least 50% of my links index almost instantly, with more links continuing to index a few hours afterwards.
  6. Version 2.3 Has Been Released! New Features Have Been Added! Watch the Auto Mode demo video here. Watch the Index Checker setup video here. The Rerun Mode video is here.
  7. There is no issue on my end with starting G-INDEXER. You probably have too many urls saved in the UIState.xml, and it's freezing up when you start it. Try opening data/UIState.xml in Notepad and removing the urls between the <urls></urls> tags (see the sketch below). Also, when you do a fresh reinstall on a new computer, download it directly from the link that was sent to your email rather than copying the G-INDEXER folder to another PC. If you have any other issues, email or Skype me. Thanks.
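     For illustration only, the relevant part of data/UIState.xml might look like the snippet below; the surrounding file structure is an assumption, and only the <urls></urls> tags come from the post. Deleting every line between the two tags, so they are left empty, is what clears the saved urls:

     <urls>
         http://example.com/saved-url-1
         http://example.com/saved-url-2
     </urls>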
  8. Well, I ran a test with all 3rd party plugins disabled and compiled an empty bot with no code in it. VirusTotal still came back alerting me that it contains a virus. So yeah, it's not just the plugins that trigger the warnings.
  9. Not sure what's not working for you. I have been working on and testing the Brute Force the entire weekend and have been getting great results. Please see the video below, which I just recorded, showing how well the Brute Force is working since I added some new sites. I am running site list version 7, which was released on Friday. Check which version you are using in the data/bruteIndex.dat file; if you're running an older list, close and reopen G-INDEXER. You can also add me on Skype so I can help you figure out if it's your settings. Check my profile for my Skype name. Thanks.
  10. Yes, it's still working. I have been testing it out and making some improvements over the last few days. I'll record a new video showing the Brute Force once I get some time.
  11. You can find their key here. What else do you need help with, exactly?
  12. Yes, I spoke with dman2306 over PM to figure out why he wasn't able to index his links with Brute Force, and it's because he was running an outdated site list. As of right now, G-INDEXER checks for and downloads a new site list only when the software is reopened. In the next update, however, it will check for a new site list at startup and then every 24 hours while it is running (see the sketch below).
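      As a rough sketch of what that 24-hour check could look like in UBot script (the download URL and file path are made-up placeholders, not G-INDEXER's real ones):

      thread {
          loop while($comparison("true", "=", "true")) {
              comment("Hypothetical site list URL and local path, for illustration only.")
              download file("http://example.com/sitelists/bruteIndex.dat", "{$special folder("Application")}\data\bruteIndex.dat")
              comment("Sleep 24 hours (86400 seconds) before checking again.")
              wait(86400)
          }
      }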
  13. Yes, XEvil can be used to solve the reCAPTCHAs when using the Google Submit. You would need to set up your computer's hosts file to intercept the call to 2captcha.com (see the example below). Look online for information on doing so, or send me a PM and I can walk you through it. If you're using just the Brute Force and not the Google Submit, the cost of solving the captchas is much lower. Not sure how it compares, since I have never used Magic Index. Not sure if Google Submit is working like it used to for indexing pages. Some people say it's working, but that you need a lot of proxies to index a large amount of links.
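      For reference, the usual hosts-file trick for XEvil is a single line in C:\Windows\System32\drivers\etc\hosts that points 2captcha.com at the machine running XEvil (127.0.0.1 here assumes XEvil runs on the same computer):

      127.0.0.1 2captcha.com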
  14. The only function that requires quality proxies is the Google Submit. With Brute Force, pretty much anything should work; I have even used Tor proxies and never had a problem indexing. Try increasing the Brute Force timeout and see if that helps. Well, if Storm Proxies isn't complaining about the way G-INDEXER is using their proxies, I would just say stay with them. I really like their services, and when my subscription ends I will resubscribe, since I have never had a problem using them. I did hear back from them, though, and they told me the same as you: that they had a problem with their proxies.
  15. More than likely the entire domain or subdomain was already banned. Try creating a short url on goo.gl by hand from another page on the same domain; if you get the same error, it's because that domain is banned.
  16. Yes, I am working on another update right now that will include threads. Right, I don't suggest using it on any of your websites, because it's a more blackhat way of forcing the index.