UBot Underground

Martin

Fellow UBotter
  • Content Count

    10
  • Joined

  • Last visited

Community Reputation

0 Neutral

About Martin

  • Rank
    Member

Profile Information

  • Gender
    Not Telling

System Specs

  • OS
    Windows 8
  • Total Memory
    < 1Gb
  • Framework
    v3.5
  • License
    Standard Edition

  1. Martin

    Write to csv

    Good question. When I think about it, I actually only need the first three backlink URLs, so the final file would look like:

        Domain1,backlinkURL1,backlinkURL2,backlinkURL3
        Domain2,backlinkURL1,backlinkURL2,backlinkURL3
        Domain3,backlinkURL1,backlinkURL2,backlinkURL3
        Domain4,backlinkURL1,backlinkURL2,backlinkURL3
        etc.

    Of course, it could fluctuate below three backlink URLs (0, 1, 2, or 3 backlinks per domain). If a domain has no backlinks, there's no need to insert it into the list at all. That means in some places the file could look like:

        ...
        DomainT,backlinkURL1,backlinkURL2,backlinkURL3
        DomainL,backlinkURL1,

    (A sketch of this trimming logic appears after this list.)
  2. Martin

    Write to csv

    P.S. Sorry, forgot to click 'Attach this file'. Here's the attachment.
  3. Martin

    Write to csv

    Hi guys, I'm a little stuck. The bot I'm working on is basically supposed to:
    1) Load a list of domains from a file into a list
    2) Go to Yahoo!'s domain explorer and do a search
    3) Scrape the backlinks (just on the first page of Yahoo's results)
    4) Write the domain that was checked, along with its backlink URLs, into one row of a csv
    The resulting file should look like this in principle:

        Domain1,backlinkURL1,backlinkURL2,backlinkURL3,backlinkURL4, etc.
        Domain2,backlinkURL1,backlinkURL2,backlinkURL3,backlinkURL4, etc.
        Domain3,backlinkURL1,backlinkURL2,backlinkURL3,backlinkURL4, etc.

    (A sketch of this loop appears after this list.)
  4. Gogetta, thank you so much. It's working. Now for some serious scraping :-) - Martin
  5. Sounds great. When you get a chance, can you please attach the updated version of your bot? Thank you very much for your help. - Martin
  6. Thank you both for your help and advice so far. @Gogetta, the script isn't producing the result I expected: the text file I specify to hold the scraped URLs is empty after the run. Here's what I did:
     Search term: "Powered by vBulletin" AND inurl:register.php
     Pages to Scrape: 2
     Search within: Anytime
     Delay between pages: 30
     In 'Save Location' I specified a text file on my hard drive. Gogetta, is it working for you if you do the same? Best regards, Martin
  7. Can you show me a screen dump of how you made that work? Thanks, Martin
  8. Hello, I'm doing some scraping in Google, and for certain search phrases I get more than 100 results. That means I have to navigate to the next page and keep scraping while the 'Next' link exists on the page. However, I can't figure out how to identify the 'Next' link. The identifiers I find when using 'Choose by attribute' in UBot are not actually present in the code. Anyone know how to do this? Also, how do I create the flow? Obviously, I want to keep navigating to the next page and scraping the URLs until I hit the last page, and then go to the next search phrase and repeat the process. Not sure
     (A sketch of this flow appears after this list.)
  9. Martin

    Session cookies

    Never mind guys... I think I can handle it with a JavaScript command: location.reload(true)
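
The sketches below are general illustrations of the flows discussed in the posts above. They are plain Python rather than UBot script, and every file name and helper function in them is a made-up placeholder.

For the 'Write to csv' question (load a list of domains, scrape each domain's backlinks, then write one row per domain), a minimal sketch of the loop could look like this. scrape_backlinks() is a hypothetical stand-in for the Yahoo! domain explorer search-and-scrape steps:

    import csv

    def scrape_backlinks(domain):
        # Hypothetical placeholder for the steps that search Yahoo!'s domain
        # explorer for `domain` and scrape the backlink URLs from the first
        # results page. It should return a list of URL strings.
        return []

    # Load the domains, one per line (the file name is an assumption).
    with open("domains.txt") as f:
        domains = [line.strip() for line in f if line.strip()]

    # Write one row per domain: the domain first, then its backlink URLs.
    with open("backlinks.csv", "w", newline="") as out:
        writer = csv.writer(out)
        for domain in domains:
            backlinks = scrape_backlinks(domain)
            writer.writerow([domain] + backlinks)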
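
For the follow-up about keeping only the first three backlink URLs and skipping domains with none, the row-writing part of that loop becomes a slice plus a guard. A self-contained sketch with made-up example data:

    import csv

    # Made-up example data: domain -> list of scraped backlink URLs.
    scraped = {
        "domain1.com": ["http://a.example/1", "http://a.example/2",
                        "http://a.example/3", "http://a.example/4"],
        "domain2.com": [],                      # no backlinks at all
        "domain3.com": ["http://b.example/1"],  # fewer than three
    }

    with open("backlinks.csv", "w", newline="") as out:
        writer = csv.writer(out)
        for domain, backlinks in scraped.items():
            if not backlinks:
                continue  # skip domains with no backlinks entirely
            # Keep at most the first three backlink URLs in the row.
            writer.writerow([domain] + backlinks[:3])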
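
For the Google pagination question, the flow is a loop over search phrases with an inner loop that scrapes the current page and clicks 'Next' while that link still exists. A sketch of just the control flow, with hypothetical placeholders for the navigate/scrape/click steps:

    def search(phrase):
        # Placeholder: navigate to the Google results page for `phrase`.
        pass

    def scrape_urls_on_page():
        # Placeholder: scrape the result URLs from the current page.
        return []

    def next_link_exists():
        # Placeholder: check whether a 'Next' link is present on the page.
        return False

    def click_next():
        # Placeholder: click the 'Next' link to load the following page.
        pass

    search_phrases = ["first phrase", "second phrase"]  # made-up examples
    all_urls = []

    for phrase in search_phrases:
        search(phrase)
        while True:
            all_urls.extend(scrape_urls_on_page())
            if not next_link_exists():
                break        # last results page reached for this phrase
            click_next()     # otherwise move to the next results page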