UBot Underground

LordPorker

Fellow UBotter
  • Content Count

    120
  • Joined

  • Last visited

Community Reputation

7 Neutral

About LordPorker

  • Rank
    Advanced Member

Profile Information

  • Gender
    Male
  • Location
    UK

System Specs

  • OS
    Windows 7
  • Total Memory
8 GB
  • Framework
    v3.5
  • License
    Developer Edition

Recent Profile Visitors

2702 profile views
  1. Didn't work out for me, although it worked with another automation suite that I use. Oh well...
  2. So I'm creating a traffic bot right now, and I'm finding it difficult to get the bot to click on a random link on the page. Has anyone managed to solve this? Note: I don't want to scrape links and then go to the page; it's important to actually click the link.
  3. Thanks Pash. But can we work directly with a .txt file with this plugin, i.e. delete, say, a URL?
  4. No, the actual var (or value) present in the list.
  5. Hi Pash. Is there a command in your plugin that can delete a variable name (or anything else that I specify) from an actual .txt file? Re the .txt file: can we also delete duplicates, and delete by using regex?
  6. Thanks, but my issue is working directly with a text file (not UB's virtual list). I need to delete an item from an actual .txt file. Tried it with remove from file, but no luck there (as this command is only meant for UB's own lists). Why can't we work directly with a .txt file? It's a joke, as getting an item from a list and then deleting it directly is a basic operation in programming, lol. Oh well.
  7. So I pasted a regex that I crafted into find regular expression: (?<=<span>)[0-9]{13}(?=</span></td>) But as soon as I click OK, it strips out the important curly brackets. Is this a known issue? Thanks.
  8. I kinda follow you, but the issue is that I'm taking a random URL from an actual file. If I delete position 0, then the random URL is still in the text file. So far I've managed to grab a random line from a list using this: set(#listing,$random list item($list from file("C:random.txt")),"Global") The next issue is deleting said line from the list, ideally using a variable to search for and delete the line.
  9. Kinda stumped with this too; why can't UB simply take a line directly from a file (even better would be the choice to select a random one), and then, once it's taken, delete it? Loading a large file into UB must take a toll on resources, no? Anyway, I've saved the file into UB using: add list to list with list from file. Now that I have my (virtual) list, I need to take a URL and then delete it. Can anyone explain the process?
  10. Thanks Abbas for the tip re NP++. What I was trying to get across was: how can I get rid of duplicates in a file via UB? What I'm currently doing is: add list to list, using list from file (here I set delete duplicates) > and then save all the data back to the original file (this also overwrites the original data).
  11. Also, is there a way to delete duplicates in the file without copying the file content into add list to list and relying on the delete duplicates option? I would've thought there was a straightforward command for this procedure.
  12. Thanks Pash, I'll try that, but UB shouldn't keep looping the damn message (thus causing more issues!). Btw, the error msgs were part of a while loop, so there's half the answer. Maybe open a ticket and report the bug.
  13. Sorry to raise an old thread, but I'm having the same issue. Using add list to list (which has its own delete duplicates feature) is OK, but sometimes you'll still end up with duplicates in the main file. To simplify the steps: I'm scraping a site and saving all of the results straight to a file (via the append to file command). How can I target the duplicates in the actual file? Thanks.
  14. Sounds like a plan! I'll try that, and should hopefully sort it out. Thanks!
  15. So I'm using the append to file function (always using: to end), and on the first run it saves all the URLs OK. But after that, instead of appending each URL at the end (on a new line), it just adds them onto the last URL. Example: http://www.ebay.co.uk/itm/PERSONALISED-CUTE-SLEEPING-PIGS-FUNNY-PHOTO-BIRTHDAY-OTHER-CARD-/371658720542?hash=item568896051e:g:ppoAAOSwv0tVNAnDhttp://www.ebay.co.uk/itm/PERSONALISED-CUTE-SLEEPING-PIGS-FUNNY-PHOTO-BIRTHDAY-OTHER-CARD-/371658720542?hash=item568896051e:g:ppoAAOSwv0tVNAnD http://www.ebay.co.uk/itm/PEPPA-PIG-MDF-SLING-BOOKCASE-NEW-BEDROOM-FURNITURE-OFFICIA
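The lookaround pattern in post 7 is sound; the problem described there is the UBot editor mangling it, not the regex itself. A quick way to sanity-check the pattern outside UBot is to run it through Python's re module (the sample HTML below is made up for illustration):

```python
import re

# Sample HTML resembling the page in question (invented for this sketch).
html = "<td><span>4051234567890</span></td>"

# The same pattern from post 7: the lookbehind and lookahead anchor the match
# on the tags but keep them out of the matched text, so only the 13-digit
# number is returned.
pattern = r"(?<=<span>)[0-9]{13}(?=</span></td>)"

match = re.search(pattern, html)
print(match.group())  # 4051234567890
```

If the pattern works here but fails inside UBot, the curly-bracket stripping in the editor is the culprit.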
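Posts 8 and 9 ask for a grab-a-random-line-then-delete-it operation on the file itself rather than on UB's virtual list. UBot's own syntax differs, but the underlying logic maps to this Python sketch (the temp-file path and sample URLs are made up here):

```python
import os
import random
import tempfile

# Hypothetical stand-in for the post's "C:random.txt".
path = os.path.join(tempfile.gettempdir(), "random_urls.txt")
with open(path, "w") as f:
    f.write("http://a.example/1\nhttp://b.example/2\nhttp://c.example/3\n")

# Read every non-empty line and pick one at random.
with open(path) as f:
    lines = [ln.strip() for ln in f if ln.strip()]
choice = random.choice(lines)

# Rewrite the file without the chosen line, so it can't be picked again.
lines.remove(choice)
with open(path, "w") as f:
    for ln in lines:
        f.write(ln + "\n")
```

The key point is that there is no in-place "delete line" on a plain text file: you read all lines, drop the one you took, and overwrite the file, which is effectively what UB's list-from-file plus save-to-file round trip does.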
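Posts 10, 11, and 13 all circle the same task: removing duplicate lines from the results file itself. The load-dedupe-overwrite round trip described in post 10 is the standard approach; as a sketch of that logic in Python (file path and contents invented for illustration):

```python
import os
import tempfile

# Hypothetical scrape-results file containing a duplicate line.
path = os.path.join(tempfile.gettempdir(), "scrape_results.txt")
with open(path, "w") as f:
    f.write("http://a.example\nhttp://b.example\nhttp://a.example\n")

# dict.fromkeys keeps the first occurrence of each line in original order,
# which mirrors UB's "delete duplicates" option on add list to list.
with open(path) as f:
    unique = list(dict.fromkeys(ln.strip() for ln in f if ln.strip()))

# Overwrite the original file with the de-duplicated lines.
with open(path, "w") as f:
    for ln in unique:
        f.write(ln + "\n")
```

Running the dedupe once after the scrape finishes, rather than after every append, keeps the file I/O to a single read and a single write.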
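The run-together URLs in post 15 are the classic symptom of appending text without a trailing newline: each append starts exactly where the previous one ended. The fix is to write the newline together with each URL, sketched here in Python (the file name and example URLs are placeholders):

```python
import os
import tempfile

# Hypothetical output file for the scraped URLs.
path = os.path.join(tempfile.gettempdir(), "appended_urls.txt")
open(path, "w").close()  # start from an empty file

def append_line(path, url):
    # Writing the newline with the URL guarantees the next append begins on
    # a fresh line instead of running into the previous URL.
    with open(path, "a") as f:
        f.write(url + "\n")

append_line(path, "http://www.ebay.co.uk/itm/example-one")
append_line(path, "http://www.ebay.co.uk/itm/example-two")
```

In UBot terms, the equivalent is making sure each appended value ends with a line break before it is handed to append to file.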