UBot Underground


Gogetta last won the day on September 29 2019

Gogetta had the most liked content!

Community Reputation

263 Excellent

About Gogetta

  • Rank
    Advanced Member

Profile Information

  • Gender
    Not Telling

Contact Methods

  • Yahoo
  • Skype

System Specs

  • OS
    Windows 8
  • Total Memory
    More Than 9Gb
  • Framework
    v3.5 & v4.0
  • License
    Developer Edition


  1. Try either renaming or deleting this folder: AppData\Roaming\UBot Studio. Use the Task Manager to make sure none of your compiled bots or browser.exe is running. After you have renamed or deleted the folder, try opening one of your bots.
  2. Thanks! That's a handy little plugin. I never even thought to look for a plugin to help with XPath expressions.
  3. Take a look at Frank's regex example: http://network.ubotstudio.com/forum/index.php?/topic/7162-using-regex-to-catch-text-between-sections/

     navigate("https://www.flashback.org/t1045448", "Wait")
     comment("Replace the line breaks before using Frank\'s regex example.")
     set(#DocNoLineBreaks, $replace($document text, $new line, $nothing), "Global")
     clear list(%quoted)
     add list to list(%quoted, $find regular expression(#DocNoLineBreaks, "(?<=<div class=\"post-clamped-text\">).*?(?=</div>)"), "Delete", "Global")
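The same flatten-then-match idea can be shown in Python. This is an illustrative sketch, not UBot code: strip the line breaks first (so `.*?` does not have to cross newlines), then use the same lookbehind/lookahead pattern from the snippet above.

```python
import re

def extract_quoted(html):
    """Remove line breaks, then capture everything between the quote divs,
    mirroring the lookbehind/lookahead pattern in the UBot snippet."""
    flat = html.replace("\r", "").replace("\n", "")
    return re.findall(r'(?<=<div class="post-clamped-text">).*?(?=</div>)', flat)

sample = ('<div class="post-clamped-text">first\nquote</div>'
          '<p>x</p>'
          '<div class="post-clamped-text">second</div>')
print(extract_quoted(sample))  # → ['firstquote', 'second']
```

Note the lazy `.*?`: a greedy `.*` would run to the last `</div>` on the page and merge the quotes into one match.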
  4. It's not a bug. You are simply not clearing the data at the beginning of your loop. The way you have it now, when your script loops, the response data from the previous request is sent along with the next GET request. Either clear the cookies and headers, or use the thread command to run a new instance on each loop. See the code below:

     ui text box("Proxy:", #Proxy)
     clear list(%ips)
     loop(20) {
         set(#running, "true", "Global")
         thread {
             testProxy()
             set(#running, "false", "Global")
         }
         loop while($comparison(#running, "=", "true")) {
             wait(0.2)
         }
     }
  5. Yes, it's still working. An update went out earlier today, and indexing rates are pretty good again since then. Not sure about the exact percentage, but I would say at least 50% of my links index almost instantly, with more links continuing to index a few hours afterwards.
  6. Version 2.3 has been released, with new features added! Watch the Auto Mode demo video here, the Index Checker setup video here, and the Rerun Mode video here.
  7. There is no issue on my end with starting G-INDEXER. You probably have too many urls saved in UIState.xml, and it's freezing up when you start it. Try opening data/UIState.xml in Notepad and removing the urls between the <urls></urls> tags. Also, when you do a fresh reinstall on a new computer, download it directly from the link that was sent to your email rather than copying the G-INDEXER folder to another PC. If you have any other issues, email or Skype me. Thanks.
  8. Well, I ran a test with all 3rd-party plugins disabled and compiled an empty bot with no code in it. VirusTotal still came back alerting me that it contains a virus. So yeah, it's not just the plugins that trigger the warnings.
  9. Not sure what's not working for you. I have been working on and testing the Brute Force all weekend and have been getting great results. Please see the video below that I just recorded, showing how well the Brute Force is working since I added some new sites. I am running site list version 7, which was released on Friday. Check which version you are using in the data/bruteIndex.dat file. If you're running an older list, close and reopen G-INDEXER. You can also add me on Skype so I can help you figure out if it's your settings. Check my profile for my Skype name. Thanks.