UBot Underground

[Sell] Traffic Titan - New Traffic Software With Source Code



I wouldn't even waste your time on HMA, as Google is aware of all their IPs; using them wouldn't make a difference.

 

If HTTP POST is the only way to use the rotating proxies right now, I'd suggest releasing it that way. Most people already have the HTTP POST plugin, and if not, it's worth the investment anyhow.

 

Then, if enough people request a version without it, start figuring out a way around it.



 

Open to other VPN suggestions. I may still add HMA in, though; let's hear what others have to say!

 

I got it working without HTTP Post :) Thankfully Python is an option.

 

Update underway, just need to make a quick video explaining how to use it!


V 1.0.2 

 

Added: Backconnect proxy option

 

Important: Please watch the video to understand how this option works. In short:

 

1. Nothing will happen until your first proxy rotation (so if your proxies rotate every 10 minutes, nothing may happen for up to 10 minutes 30 seconds). This is to avoid getting caught in the middle of a rotation.

2. Always set the total time to be lower than the rotation time by at least 1-2 minutes. So if you have 6 proxies and 3 threads that visit just the homepage, each thread can only visit for 4 minutes (8 minutes of total time used in the rotation, leaving some wiggle room).

 

The hits are distributed as follows: each proxy is used once per rotation, on a random seed URL.
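The timing rule above can be sketched in Python. This is only an illustration of the arithmetic, not Traffic Titan's actual code; the function name and the 2-minute default buffer are my own.

```python
import math

def safe_visit_minutes(rotation_min, num_proxies, num_threads, buffer_min=2):
    """Longest visit each thread can make per rotation.

    Each proxy is used once per rotation, so every thread handles
    ceil(num_proxies / num_threads) visits and must finish them all
    before the next IP rotation, leaving a safety buffer.
    """
    visits_per_thread = math.ceil(num_proxies / num_threads)
    usable = rotation_min - buffer_min
    return usable / visits_per_thread

# The changelog's example: 10-minute rotation, 6 proxies, 3 threads
print(safe_visit_minutes(10, 6, 3))  # -> 4.0 minutes per visit
```

This reproduces the worked example: 6 proxies across 3 threads means 2 visits per thread, and 8 usable minutes split two ways is 4 minutes each.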

 


Thanks for the update Nick!

 

Wanted to follow up on the feedback Chris provided earlier; I wholeheartedly agree with him. I have purchased pretty much every script source that Nick has put out and have learned something from every one of them (and developed some enhanced products with them as well). Nick goes above and beyond in helping his customers, listens to our feedback and suggestions, and tries to provide solutions to situations as they come up.

 

All Nick's scripts have been top notch, and Traffic Titan is no exception. Highly recommend the script (and anything Nick puts out), with 2 updates already released in just the first few weeks!

 

And like a few other commenters, even if I don't need one of his script sources, I pick it up anyway to keep supporting him, and I'll most likely learn something from it as well.

 

Thanks as always Nick!



Thank you for the feedback :)

V 1.0.3

 

Fixed: After Retry variable in GUI named incorrectly

Updated: user agent list for Chrome & Firefox

 

This is a rather important update: a misnamed variable could cause some of the retry-links and after-retry functionality not to work properly, so please update!

 

The Chrome and Firefox user agent lists have been completely revamped; all user agents are updated and each one was tested.

 

Also tested some things for next week, including using plugins, (possibly) Firefox profiles, and random incognito mode (Chrome only right now; not sure about Firefox, I'll try to figure that out during the week). Finally, still unsure about the VPN situation; if anybody wants that, let me know, but so far nobody has said much on the subject.


Hello Nick,

Really a good script! But I have two questions:

1. I know that Google doesn't like browsers with an empty cache and no cookies. Can you add a function so that I can load a big list of websites, the bot visits 5-10 random sites from the list, and then comes to my site?

2. How do I schedule the bot so that, for example, it runs 5 visits at 9am, 7 clicks at 11:20, and so on?

Thanks 

Peter


With the intention of giving ideas and suggestions, and following pbsolution's comment: it would be great if you added some of the functions Traffic Spirit has, such as the traffic-source percentage and the traffic curve, so the traffic could be made more realistic.



I fully intend on creating a module that will basically be a referrer system, where the bot clicks through to the website from another URL. As for being multiple levels deep, I don't think Google can track this, unless they all have Google Analytics, I suppose? Not sure; PM me your thoughts!

 


 

Not quite sure what you're after with the traffic-source percentages and traffic curve, but PM me!


So I got to thinking about this last night: Google has profiles that they track across multiple browsers and applications. You pretty much can't use the internet without Google knowing a lot about you and your history.

Unfortunately, of all the programs I've come across on the internet, none of them actually have or build a profile that looks like a real internet user.

Every software out there comes in looking like a brand new user with no history.

So I got to thinking about what that implies, and I think that, in theory, what we need to do to simulate this is build a tool that actually has a unique profile and history. Meaning every time a profile accesses the internet, it does so from a single IP and has a search history.

Think about how you use the internet and what that looks like to Google. You probably always visit certain topics, read specific websites, and are part of specific groups based on your interests.

This would be no different for anyone else.

Except in the case of bots like ours, there is no profile. There is no history of traffic or interests. So to Google we just appear like a baby that popped out of the oven.

It's not natural.

So I have to ask the experts on here about some of this stuff: is it possible to build those types of things with UBot?

Meaning a natural-looking profile, where the software creates a profile for the browser and ONLY uses that profile to visit and search within given parameters of niches, interests, etc., and logs in from a static IP address for that profile?


A built-up profile of history, cookies, randomized plugins, etc. would be nice. Or you can disable WebRTC and call it a day.

Google can only track things if a user allows them to. By default, Firefox and Chrome allow WebRTC.

Firefox is much easier to lock down, so that things like browser history, cache, cookies, plugins, and much more can't be tracked.

Chrome is harder, as there isn't an option to disable WebRTC out of the box; it requires a plugin, and some work partially, some fully, some are broken, etc. You have to test a bunch of them.
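On the Firefox side, WebRTC can be switched off with the `media.peerconnection.enabled` preference. A rough Python sketch of writing those prefs into a profile's user.js (the helper name is made up; Firefox reads user.js on startup and applies it over prefs.js):

```python
import os

# Prefs that disable WebRTC in Firefox. media.peerconnection.enabled is
# the main switch; media.navigator.enabled also blocks getUserMedia.
WEBRTC_PREFS = {
    "media.peerconnection.enabled": False,
    "media.navigator.enabled": False,
}

def write_user_js(profile_dir, prefs):
    """Write a user.js into a Firefox profile folder so the prefs are
    applied every time that profile starts."""
    lines = []
    for name, value in prefs.items():
        if isinstance(value, bool):
            js_value = "true" if value else "false"
        else:
            js_value = repr(value)
        lines.append('user_pref("%s", %s);' % (name, js_value))
    content = "\n".join(lines) + "\n"
    with open(os.path.join(profile_dir, "user.js"), "w") as f:
        f.write(content)
    return content
```

Point `profile_dir` at a folder under the Firefox Profiles directory mentioned later in this thread.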

 

Also, being as big as Google is, keep in mind that at some point a real traffic user is going to be logged into Google as well, so you might build in a sequence to log into Google first.

Then there is also randomized bouncing to consider. Not all traffic results in a non-bounce, so it's good to have some bounces, and even to bounce from one site and click another search result.

I could go on and on. I've had a Google traffic software for years now that I've built up and am in the process of converting. Thank you Nick, this release saved me time as the basis for the overall project; now I just need to build in whatever else is needed.


Yes, but in this case we WANT Google to have a profile built on the user. Otherwise it doesn't look like normal traffic. We can do everything else right, but if the profile isn't there, Google will wonder why all this traffic is blocked or looks brand new.


Hello everyone,

I want to make a list of sites (blogs, forums, Q&A, etc.) which are niche-related, so it would be a big list with, for example, 20,000 sites or more.

I want the bot to go to a random number of these sites, surf there a little bit, then go to Google and search for my keyword, and then go to my site and stay there, with filled cookies and cache.

That would look more like a natural user.

So in coding terms:

1. Import my list.

2. Surf to 5-10 sites from the list.

3. Search on Google for my keyword.

4. Click my site and surf there for a while.

And I'm willing to pay for this. I have had good results with sites with no links, only bought traffic; they jumped from page 2 to page 1 of the SERPs without any links. I think I can do this too with a bot like this, for easy keywords.

Peter

PS: Here I found an interesting article from Rand on this.
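Peter's four steps could be sketched like this in Python. Everything here is hypothetical (function names, the plan structure); the actual browsing is left abstract as a list of actions a bot could then execute with cookies and cache intact:

```python
import random

def build_visit_plan(warmup_sites, keyword, target_url,
                     min_warmup=5, max_warmup=10):
    """Ordered plan: random warm-up visits, a Google search, the target.

    warmup_sites is the big niche-related list; visiting a few of them
    first means the browser no longer arrives with an empty cache and
    no cookies.
    """
    lo = min(min_warmup, len(warmup_sites))
    hi = min(max_warmup, len(warmup_sites))
    count = random.randint(lo, hi)
    plan = [("visit", url) for url in random.sample(warmup_sites, count)]
    plan.append(("google_search", keyword))
    plan.append(("visit", target_url))
    return plan

sites = ["http://example%d.com" % i for i in range(100)]
plan = build_visit_plan(sites, "my keyword", "http://mysite.com")
```

A scheduler (Peter's question 2) could then simply run `build_visit_plan` N times at each configured clock time.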


Update: 1.0.4 - Extensions

 

Added: ability to load 0 to 5 random extensions per browser

 

Added: Chrome/Firefox folders for the user agents & extensions
 
I have added in a few extensions to the folders already that you can use.
 
Important, please read:
 
Extensions are pulled from the new Chrome & Firefox folders; each has a folder inside named "Extensions".
 
The bot will load 0-5 extensions, chosen at random from that folder; if the folder only contains 3, it picks 0-3.
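The random pick amounts to something like this (an illustrative Python sketch, not Traffic Titan's actual code; the function name is mine):

```python
import glob
import os
import random

def pick_extensions(folder, ext=".crx", cap=5):
    """Pick 0..min(cap, available) extension files at random, mirroring
    the '0-5 extensions, or 0-3 if only 3 exist' behaviour."""
    files = sorted(glob.glob(os.path.join(folder, "*" + ext)))
    how_many = random.randint(0, min(cap, len(files)))
    return random.sample(files, how_many)
```

For Firefox the same call would use `ext=".xpi"` against the Firefox -> Extensions folder.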
 
Please make sure the extensions don't open a new page when installed; most of them will, so you will need to test this:
 
I will have to make a longer guide later but for now do the following:
 
1. Install the extension in your normal Chrome or Firefox browser first and wait for it to be fully installed. If it opens a new page, move on to the next one. These extensions may be supported later, but for now they're not.
 
2. Don't forget to remove the extensions from your main browser if you don't want them.
 
How to get the files:
 
For Chrome:
 
It looks for the .crx file; you can get that by using this website: http://chrome-extension-downloader.com/ then save the .crx file to the Chrome -> Extensions folder inside the Traffic Titan folder.
 
For Firefox:
 
It looks for the .xpi file. Right-click the "Add to Firefox" button, choose "Save Link As", then choose the Firefox -> Extensions folder inside the Traffic Titan folder.

Hi Nick,

 

so its traffic comes from direct entry only?

 

thanks bro

 

Yes, at least for now. I want to add a way for it to go to another URL first, then click the link to your website. One thing to note is that there likely won't be a way to fake the referrer, unless maybe you use a plugin; but then it would have to be configured before it's loaded somehow, so it's pretty specific.


Regarding the profiles bit, I haven't looked into it yet, but there must be a way to create a profile, right? We may be able to make a profile generator which creates profiles for Traffic Titan to use.


Update 1.0.5

 

Fixed: bug in the backconnect proxy ip check causing an error

 

This is pretty urgent, so I wanted to make sure it got done today. There was an error that one of you found (not sure if they want to be named) that could basically cause the program to just stop. This update fixes that.

 

Moving forward

 

I am off to FL tomorrow for a week. I'll still be around online and on email, and working with some of you as well, but I won't be on all day like I normally am, so there won't be an update next week; it will come the week after. Around that time I'll also release the Google part of this, since I know some of you have been waiting. We have the backconnect proxies, and so far nobody has come running to me begging for a VPN to be added, so it seems like a good time to do that.

 

If anybody feels like reverse engineering the profiles, feel free; otherwise I'll start looking into that. I know it's possible, as I've seen it elsewhere. I believe you can find the Firefox profiles in:

C:\Users\YOURUSERNAMEHERE\AppData\Roaming\Mozilla\Firefox\Profiles

Replace YOURUSERNAMEHERE with your username.

  • 3 weeks later...
V 1.0.6 7/15/2016

 

Added: blacklist to filter URLs which contain blacklist items

 

How to use:

 

You can add a blacklist.txt file to the application folder to use a blacklist that works on URLs.

 

If a URL contains a blacklist item, it is removed (so Traffic Titan won't navigate to it).

 

Example usage:

 

wp-login.php will trigger and remove this URL: http://imautobots.com/wp-login.php so that Traffic Titan will not navigate to that page.

 

Please keep in mind that adding items like http or // or .com can end up removing the whole list, so choose your blacklist items wisely.
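The rule described above is a plain substring filter, which also explains the warning: a broad item like "http" matches every URL. A Python sketch of the idea (illustrative only, not Traffic Titan's actual code):

```python
def apply_blacklist(urls, blacklist_items):
    """Drop any URL that contains a blacklist item as a substring."""
    return [u for u in urls
            if not any(item in u for item in blacklist_items)]

urls = ["http://imautobots.com/", "http://imautobots.com/wp-login.php"]
print(apply_blacklist(urls, ["wp-login.php"]))
# -> ['http://imautobots.com/']
```

With `["http"]` as the blacklist, the same call returns an empty list, which is exactly the "whole list removed" failure mode to avoid.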


Update will be Saturday, not Friday.

 

 

Hello Nick,

any news about the Google search and the profiles?

Greetings 

Peter

 

Hi Peter. For the Google part, I just need to figure out where I am going to add it in and how to display it. For the profile part, I am not sure how to generate them; I would like to know if anybody does.


What about using Pash's new Windows Automation Plugin?

http://www.screencast.com/t/LDxGfr2zD2

I used another plugin of Pash's to fill out the text box... not sure if it's currently possible using the Windows Automation Plugin, but it's an idea.


V 1.0.7

 

Fixed: blank browser issue by removing a problematic plugin for Chrome

 

Still working on some other updates, but since I won't get those done tonight, I thought I'd push this small one: some of you may have experienced blank browsers loading up, and this should fix it.


For the profiles, you can find the cookies, history, and extensions at:

C:\Users\YOURUSERNAMEHERE\AppData\Roaming\Mozilla\Firefox\Profiles

The places.sqlite file has a bunch of info, including the history. There is also a form history (another .sqlite file).

So it's possible to modify those; it's just a matter of piecing it together, as there are a lot of little parts. For example, when you add a site to the history, you also have to add the host, as well as a reference to when it was visited, and so on.
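As a rough Python sketch of "piecing it together" with the standard sqlite3 module: the real places.sqlite splits history between a `moz_places` row (the URL) and `moz_historyvisits` rows (each visit, with a microsecond timestamp), and the real schema has many more columns (rev_host, frecency, guid, ...) than the cut-down tables here, so treat this strictly as a demonstration of the idea:

```python
import sqlite3
import time

def seed_history(db_path, url, title, visit_time_us=None):
    """Insert one URL plus one visit into simplified stand-ins for
    Firefox's moz_places / moz_historyvisits tables."""
    visit_time_us = visit_time_us or int(time.time() * 1_000_000)
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS moz_places (
                       id INTEGER PRIMARY KEY, url TEXT, title TEXT,
                       visit_count INTEGER, last_visit_date INTEGER)""")
    cur.execute("""CREATE TABLE IF NOT EXISTS moz_historyvisits (
                       id INTEGER PRIMARY KEY, place_id INTEGER,
                       visit_date INTEGER)""")
    cur.execute("INSERT INTO moz_places (url, title, visit_count, last_visit_date) "
                "VALUES (?, ?, 1, ?)", (url, title, visit_time_us))
    place_id = cur.lastrowid  # the visit row must point back at the place
    cur.execute("INSERT INTO moz_historyvisits (place_id, visit_date) "
                "VALUES (?, ?)", (place_id, visit_time_us))
    con.commit()
    con.close()
    return place_id
```

Writing to a live profile's places.sqlite would additionally require Firefox to be closed and every extra column filled consistently.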

