UBot Underground

Gogetta


Posts posted by Gogetta

  1. Take a look at Frank's regex example

    http://network.ubotstudio.com/forum/index.php?/topic/7162-using-regex-to-catch-text-between-sections/

    navigate("https://www.flashback.org/t1045448", "Wait")
    comment("Replace the line breaks before using Frank\'s regex example.")
    set(#DocNoLineBreaks, $replace($document text, $new line, $nothing), "Global")
    clear list(%quoted)
    add list to list(%quoted, $find regular expression(#DocNoLineBreaks, "(?<=<div class=\"post-clamped-text\">).*?(?=</div>)"), "Delete", "Global")
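    Outside of UBot, the same lookaround idea can be sketched in Python. This is a minimal sketch; the sample HTML is made up to mirror the `post-clamped-text` markup the regex targets. `re.DOTALL` lets `.` match across newlines, which is what replacing the line breaks accomplishes in the UBot script above:

    ```python
    import re

    # Hypothetical sample mirroring the markup the regex targets.
    html = ('<div class="post-clamped-text">First quoted\npost</div>'
            '<div class="post-clamped-text">Second quote</div>')

    # Lookbehind/lookahead capture only the text between the tags.
    # re.DOTALL lets "." cross newlines, so no line-break stripping is needed.
    quoted = re.findall(r'(?<=<div class="post-clamped-text">).*?(?=</div>)',
                        html, flags=re.DOTALL)
    print(quoted)  # ['First quoted\npost', 'Second quote']
    ```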
    
    
  2. It's not a bug. You are simply not clearing the data at the beginning of your loop. As your script is written now, the response data from the previous request is sent along with the next GET request each time the loop repeats. Either clear the cookies and headers, or use the thread command to run a new instance on each loop. See the code below:

    ui text box("Proxy:", #Proxy)
    clear list(%ips)
    loop(20) {
        set(#running, "true", "Global")
        thread {
            testProxy()
            set(#running, "false", "Global")
        }
        loop while($comparison(#running, "=", "true")) {
            wait(0.2)
        }
    }
    define testProxy {
        plugin command("HTTP post.dll", "http auto redirect", "Yes")
        plugin command("HTTP post.dll", "http max redirects", 5)
        set(#soup, $plugin function("HTTP post.dll", "$http get", "https://www.whatsmyip.com/", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36", "", #Proxy, ""), "Local")
        add item to list(%ips, $plugin function("File Management.dll", "$Find Regex First", #soup, "(?<=id=\"shownIpv4\">).*?(?=<\\/p>)"), "Don\'t Delete", "Global")
        set(#soup, $nothing, "Local")
    }
    
    

    I don't have Storm Proxies but it should work when you run it.

  3. Can you tell me if this product is still working after all the G Updates, and what the average time and % of indexing rate is? <_<

     

    Yes, it's still working. An update went out earlier today. Indexing rates are pretty good again since the latest update. Not sure about the exact percentage, but I would say at least 50% of my links index almost instantly, with more links continuing to index over the next few hours.

  4. G-Indexer is not loading properly. When I start it, it does not start. I have tried it on many different PCs and not a single one is working. Is there going to be a fix? I also find that the software is very unresponsive.

     

    There is no issue on my end with starting G-INDEXER. You probably have too many URLs saved in the UIState.xml, and it's freezing up when you start it. Try opening data/UIState.xml in Notepad and removing the URLs between the <urls></urls> tags.
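    If editing the file by hand is error-prone, the cleanup can be scripted. Here is a minimal sketch in Python, assuming (as described above) that UIState.xml keeps the saved URLs inside a single <urls>...</urls> element; the element names come from the post, but the miniature file structure below is an assumption for illustration:

    ```python
    import re

    def clear_saved_urls(xml_text):
        # Empty the <urls> element but keep the tags so the file stays well-formed.
        # (?s) lets "." match newlines, since the saved URLs span many lines.
        return re.sub(r"(?s)<urls>.*?</urls>", "<urls></urls>", xml_text)

    # Hypothetical miniature of the file's structure.
    sample = "<state><urls><url>http://a/</url><url>http://b/</url></urls></state>"
    print(clear_saved_urls(sample))  # <state><urls></urls></state>
    ```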

     

    Also, when you do a fresh reinstall on a new computer, download it directly from the link that was sent to your email rather than copying the G-INDEXER folder to another PC. If you have any other issues, email or Skype me. Thanks.

  5. You cannot use certain plugins that have licensing. They are encrypted, and so is UBot to a point.

     

    Send your bots to AVs to run through; that should help.

     

    Regards,

     

    CD

     

    Well, I ran a test with all 3rd party plugins disabled and compiled an empty bot with no code in it. VirusTotal still came back alerting me that it contains a virus. So yeah, it's not just the plugins that trigger the warnings.

  6. Hi, I have G-Indexer, but the Brute Force mode does not seem to be working anymore?

     

    Any idea what's going on?

     

    Not sure what's not working for you. I have been working on and testing the Brute Force all weekend and have been getting great results. Please see the video below that I just recorded, showing how well the Brute Force is working since I added some new sites.

     

     

    I am running site list version 7, which was released on Friday. Check which version you are using in the data/bruteIndex.dat file. If you're running an older list, close and reopen G-INDEXER. You can also add me on Skype so I can help you figure out if it's your settings; check my profile for my Skype name. Thanks.

  7. Hi, is this indexer still working with the brute force option today?

     

    Yes, I spoke with dman2306 over PM to figure out why he wasn't able to index his links with Brute Force and it's because he was running an outdated site list. As of right now G-INDEXER checks and downloads a new site list when the software is reopened. However, in the next update the software will check for a new site list when the software starts and every 24 hours while the software is running.

  8. Hi, you say that you support Xevil for Google captcha solving. Could you please advise whether it is the XEvil software from Xrumer. I figured that it may be cheaper to run the XEvil application than to use 2captcha. Please could you advise how XEvil works and whether it can be run for Google submit function?

     

    Thank you!

     


     

    Yes, XEvil can be used to solve the reCAPTCHAs when using the Google Submit. You would need to set up your computer's hosts file to intercept the call to 2captcha.com. Look online for information on doing so, or send me a PM and I can walk you through it.
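    The hosts-file trick works because XEvil can serve a 2captcha-compatible API on your own machine, so redirecting the domain makes the software's captcha requests land on XEvil instead. A sketch of the entry (the loopback address assumes XEvil runs on the same PC; on Windows the file lives at C:\Windows\System32\drivers\etc\hosts):

    ```
    # Send requests for 2captcha.com to this machine, where XEvil is listening.
    127.0.0.1    2captcha.com
    ```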

     

    I have no idea how much this software even taps into 2captcha. I have not seen our 2captcha balance reduce at all (and we have processed close to 300,000 URLs so far)...

     

    If you're using just the Brute Force and not the Google Submit, the cost of solving the captchas is much lower.

     

    I am using Magic Index. Is it different from this? It also doesn't require Gmail accounts.

    Thanks

     

    Not sure how it compares since I have never used Magic Index.

     

    What method are we using now, Brute Force only, or does the normal method work again too?

     

    Not sure if Google Submit is working like it used to for indexing pages. Some people say it's working, but that you need a lot of proxies to index a large number of URLs. Give it a try on a few URLs and see.

     

    Hi. Give me your email or Skype. I have one private question before buying.

     

    Sent you a PM.

     

    Hey guys, just a word of warning. I don't know if this has been stated or not, but DO NOT use the force indexing on the money site. I have reviewed the sites it submits to, and they are Chinese sites that can hurt your money sites. I wouldn't even advise them on T2 unless you know how to clean the links. Not trashing the tool, but be very careful with the force indexing option. It has its place, but I don't know if it's been stated not to use it towards the money site at all.

     

    Yes, it's stated at the top of the sales page, as well as in a word of warning within the software. Using the Google Submit function is safe for your own sites and domains; however, using the Brute Force directly on your own pages is not recommended.

  9. I've been comparing my buddy's proxies vs mine for the past 2 hours.

     

    Same supplier, same setup, totally different results.

     

    My proxies work with Scrapebox and on other tools like GSA SER, etc.

     

    However, when I use mine with G-Indexer, it's like they aren't working, because nothing gets indexed.

     

    When I use his, almost instant indexation.

     

    I've been messing with this for too long and can't pinpoint any difference.

     

    Is there perhaps a specific preference on proxies (semi, dedicated, location, etc)?

     

    Thx

     

    The only function that requires quality proxies is the Google Submit. With Brute Force, almost anything should work; I have even used Tor proxies and never had a problem indexing. Try increasing the Brute Force timeout and see if that helps.

     

    Update on Storm Proxies. They eventually wrote back and said that they had a problem with their PayPal account and needed everyone to resubscribe. I ended up going with Proxy Rack, but so far have not seen any URLs index. With Storm I was at least seeing close to 100% indexing within 24 hours.

     

    Any recommendations? It would seem that reverse connect proxies are a better way to go; otherwise we would burn through massive amounts of proxies fast.

     

    Well, if Storm Proxies isn't complaining about the way G-INDEXER is using their proxies, I would just stay with them. I really like their service, and when my subscription ends I will resubscribe, since I have never had a problem using them. I did hear back from them, and they told me the same as you: they had a problem with their PayPal account.

  10. Hi - ended up purchasing 4 licenses (so far). Been running it 24/7 ever since. It seems to be improving with each version. On the proxies link, you are directing people to Storm Proxies. We already had 2 x 200 thread subscriptions. The only software we were using with Storm was G-Indexer (3 and 15 minute gateways). We got 2 "love letters" today telling us our accounts had been terminated (might have been for a different reason - not sure).

     

    We are processing about 20,000 URLs each day on each license (only 4 installed so far), so about 80,000 URLs. Can you recommend any other proxy provider (or method) that we can use to keep processing these requests?

     

    I sent you a PM.

  11. Bought it.

     

    1) Is it correct that brute force doesn't make use of multiple threads yet? (any plans to change that?)

     

    2) Why don't you recommend brute force for new sites?

     

    Thanks

     

    Yes, I am working on another update right now that will include threads.

     

    Right, I don't suggest using it on any of your own websites because it's a more blackhat way of forcing the index.
