UBot Underground

Making blacklist



Ok, I have a file with a bunch of URLs that I navigate in a loop.

And let's say I want to make a blacklist of domains [but remember that the URLs above may point to specific pages on a domain, so one domain can account for 50 URLs in the first file] so the bot won't navigate to them.

I was trying to think of some idea but I really can't come up with one. Anybody?

It could even be clearing out the list at the start, before navigating anywhere, but I don't know how to do that with .txt files.


I can't follow exactly what you want to do.

 

But it sounds like you want to separate things out and make sure the result is saved. The way to do that is to create separate lists as you go through, and then save them.


Ok, I'll put it more clearly ;)

Let's say I have FileA and FileB.

I want the bot to check every address in FileA, and if it's also in FileB, delete it from the list, then save the list at the end. Any idea?
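UBot has its own visual scripting commands for this, but the underlying logic can be sketched in plain Python (the filenames `FileA.txt` and `FileB.txt` are placeholders for your actual files):

```python
# Sketch: remove from FileA every URL that also appears in FileB (exact match),
# then save the cleaned list back over FileA.
def clean_list(file_a="FileA.txt", file_b="FileB.txt"):
    # Load the blacklist into a set for fast exact-match lookups.
    with open(file_b) as f:
        blacklist = {line.strip() for line in f if line.strip()}
    # Load the URL list, keeping only entries not on the blacklist.
    with open(file_a) as f:
        urls = [line.strip() for line in f if line.strip()]
    kept = [u for u in urls if u not in blacklist]
    # Save the cleaned list at the end, as requested.
    with open(file_a, "w") as f:
        f.write("\n".join(kept) + "\n")
    return kept
```

Note this only removes exact matches: `http://google.com` in the blacklist deletes only that exact line, not other pages on the same domain.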


Okay, very nice and thanks for it; that solves my problem, but only partially :) So now we have this cleaner, which cleans on a 1:1 match:

if http://google.com is found, it deletes it.

But let's say the first file [the one to be cleaned] has:

http://google.com/page1.html

http://google.com/page2.html

http://google.com/page3.html

 

and file2 [the blacklist] has just the main domain:

http://google.com

 

Is there a way to modify the list cleaner for that case?
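One way to sketch that modification in Python: instead of comparing full URLs, compare domains, so a bare `http://google.com` in the blacklist also catches `http://google.com/page1.html`. The filenames are placeholders, same as before:

```python
from urllib.parse import urlparse

# Sketch: drop any URL whose domain appears in the blacklist, so a blacklist
# entry of http://google.com also removes http://google.com/page1.html etc.
def clean_by_domain(file_a="FileA.txt", file_b="FileB.txt"):
    # Reduce each blacklist entry to its domain (netloc), e.g. "google.com".
    with open(file_b) as f:
        blocked = {urlparse(line.strip()).netloc for line in f if line.strip()}
    with open(file_a) as f:
        urls = [line.strip() for line in f if line.strip()]
    # Keep only URLs whose domain is not blacklisted.
    kept = [u for u in urls if urlparse(u).netloc not in blocked]
    with open(file_a, "w") as f:
        f.write("\n".join(kept) + "\n")
    return kept
```

With the example above, all three `google.com/page*.html` URLs would be removed by the single `http://google.com` blacklist entry. (If you need `www.google.com` and `google.com` treated as the same domain, you'd have to normalize the netloc first; this sketch compares them literally.)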

