UBot Underground

Aaron Nimocks

Moderators

Posts posted by Aaron Nimocks

  1. I tried this for about 30 minutes (sent you an email too).

     

     I really don't know why I can't scrape Yahoo URLs. It almost seems like a software issue, but I don't like pointing the blame. I know the code WILL work, and it does, but it ONLY scrapes the one result I actually right-click and scrape on. Then if I right-click and scrape on a result that didn't save and run it again (using the exact same format!), it will now scrape that result too.

     

     Now when I go to the next page, none of them will scrape unless I physically right-click and scrape that individual result. Once again, it will ONLY scrape that one.

     

     I'm at a loss here.

  2. Sure. First, my bad page is in Turkish.

     The URL:

    http://www.yeniprogram.gen.tr/download/17855/MP3-Rocket.html

     

     At the top of the page, there is a breadcrumb section. On my example page it says:

     

    Ana Sayfa > Internet > Dosya Paylaşımı > MP3 Rocket

     

     I would like to get the names, i.e.:

     

    Internet

    Dosya Paylasimi

    MP3 Rocket.

     

     I can probably get that one done today too (there is a rough sketch of the splitting step at the end of this post). Do you want them in a list with each name on a separate line, or in a variable?

     

    For Billywizz here is your tutorial.
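
     In the meantime, here is a rough Python sketch (outside UBot) of the splitting step, assuming the breadcrumb line has already been scraped as one string. How you locate that breadcrumb element on the page itself is site-specific and not shown here.

         # Sketch only: assumes the breadcrumb was already scraped as one string.
         breadcrumb = "Ana Sayfa > Internet > Dosya Paylaşımı > MP3 Rocket"

         # Split on the ">" separator and trim the whitespace around each name.
         names = [part.strip() for part in breadcrumb.split(">")]

         # Drop "Ana Sayfa" (the home link) so only the wanted names remain.
         for name in names[1:]:
             print(name)
         # Internet
         # Dosya Paylaşımı
         # MP3 Rocket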

  3. It is very good to have lots of tutorial videos. But most of the time, example pages have perfect HTML. Most of the time, I encounter very bad HTML coding. So please make a tutorial on scraping a badly coded HTML page (ex: no class, no span tag, or a class/id used multiple times, so it is difficult to differentiate fields).

     

    Can you post an example of a bad page that you would like done?

  4. To expand/elaborate, let's say you have a CSV file that has "unlimited" fields (for this example, say 50, and also 50 rows), and I want to get this UBot to choose the correct variable from each row based on what that particular site requires, BUT you have to call out a specific field....

     

     I don't really know offhand how you would do this.

     

     At first thought, I would say if you know exactly which field you always want to call up, then put the name of that field in each field. So if A12's data was equal to "butter ball", then change it to "A12 butter ball". Then, after you get the data, just replace "A12" with $nothing.

     

     My recommendation would be to make the CSV file in a different format so that you can easily read it with UBot (there is a rough sketch of what I mean at the end of this post).

     

    Guess it is hard to see what you are trying to make that would require that many columns and rows.

     

     Also, thanks for the blog bot purchase. I'm not the creator, just an affiliated reseller. :)
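
     On the CSV format point above, here is a rough Python sketch (outside UBot) of what I mean by an easier-to-read format: give the file a header row so each field can be pulled out by name instead of by position. The file name and column names here are just made up for the example.

         # Sketch only: assumes a header row such as "site,username,password".
         import csv

         with open("accounts.csv", newline="", encoding="utf-8") as f:
             for row in csv.DictReader(f):
                 # Each row becomes a dict keyed by the header names, so you can
                 # grab whichever field a particular site requires by name.
                 print(row["username"], row["password"])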

  5. If I am understanding you correctly, as an example, in your CSV you would have:

     

     First 10 rows are Digg username/passwords

     Next 10 rows are StumbleUpon username/passwords

     Next 10 rows are whatever username/passwords

     

     If you want to access certain ones in the script, then you first need to load all of them into a list, as shown in numerous tutorials. Now you KNOW which list positions hold which logins.

     

     So if you want to get a random StumbleUpon login, you would just set the list position to a random number from 10-19 and then you have it.

     

     If you are going through a loop and using the 10 Digg logins, then you would set the list position to 0 and then loop 10 times to only use the Digg ones.

     

     If you wanted to use the StumbleUpon ones, then you set the list position to 10 and loop until 19 (there is a rough sketch of this at the end of this post).

     

    Am I on the right track on what you are trying to do?
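
     Here is a rough Python sketch (outside UBot) of that list-position idea, assuming the file loads into one list in order, each row is just a username and a password, positions 0-9 are the Digg logins, and positions 10-19 are the StumbleUpon ones. The file name is made up for the example.

         # Sketch only: "logins.csv" is a made-up name; adjust to your own file.
         import csv
         import random

         with open("logins.csv", newline="", encoding="utf-8") as f:
             logins = list(csv.reader(f))   # one list, kept in file order

         # Random StumbleUpon login: pick a position from 10 through 19.
         su_user, su_pass = logins[random.randint(10, 19)]
         print(su_user, su_pass)

         # Digg loop: start at position 0 and loop 10 times (positions 0-9).
         for position in range(10):
             digg_user, digg_pass = logins[position]
             print(digg_user, digg_pass)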

  6. What exactly are you trying to make?

     

     Do you just want something that will log into article directories and submit them? I can make a video tutorial on how to do this one, since I already made something similar. If you just tell me exactly what you want done, I can most likely do this Friday or Saturday. I wouldn't be able to do it before then, though.
