UBot Underground


Posts posted by LordPorker

  1. Hey Tony,

     Try this logic:

     Set #TotalRecords = List Total

     Set #RandomRecord = $Rand(0, #TotalRecords - 1)

     Now select your item from the list (%list is the list you took the total from; $list item, unlike $random list item, takes an index, which you'll need again for the delete):

     set(#listing, $list item(%list, #RandomRecord), "Global")

     and then delete it.

     Hope it helps.

     

    Thanks, but my issue is working directly with a text file (not UB's virtual list). I need to delete an item from an actual .txt file.

     

    Tried it with remove from file, but no luck there (that command is only meant for UB's own lists). Why can't we work directly with a .txt file? It's a joke, as getting an item from a list and then deleting it directly is a basic function in programming, lol. Oh well. :unsure:

  2. I kinda follow you, but the issue is that I'm taking a random url from an actual file. If I delete position 0, the random url is still in the text file.

     

    At the moment, I've managed to grab a random line from a list using this:

    set(#listing, $random list item($list from file("C:\\random.txt")), "Global")
    
    

    The next issue is deleting said line from the list, ideally using a variable to search for and delete the line.

  3. Kinda stumped with this too; why can't UB simply take a line directly from a file (even better would be the option to select a random one) and then, once it's taken, delete it?

     

    Saving a large file into UB must take a toll on resources, no?

     

    Anyway, so I've saved the file into UB using: add list to list with list from file. Now that I have my (virtual) list, I need to take a url and then delete it. Can anyone explain the process?
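     A sketch of that process in UBot script (the list name %urls, the variables #index / #listing, and the path are placeholders): load the file into a list, draw a random index, grab the item at that index, remove it by index, then save the list back over the file:

    clear list(%urls)
    add list to list(%urls, $list from file("C:\\urls.txt"), "Delete", "Global")
    set(#index, $rand(0, $subtract($list total(%urls), 1)), "Global")
    set(#listing, $list item(%urls, #index), "Global")
    remove from list(%urls, #index)
    save to file("C:\\urls.txt", %urls)

     Because remove from list works by index, drawing the index first (rather than using $random list item) lets you grab and delete the same line; save to file overwrites the original, so the deletion sticks in the actual .txt.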

  4. Thanks Abbas for the tip re NP++.

     

    What I was trying to get across was: how can I get rid of duplicates in a file via UB? What I'm currently doing is add list to list, using list from file (with delete duplicates set), and then saving all the data back to the original file (which also overwrites the original data).

  5. Sorry to raise an old thread, but I'm having the same issue. Using add list to list (which has its own delete duplicates feature) is OK, but sometimes you'll still end up with duplicates in the main file.

     

    To simplify the steps, I'm scraping a site and saving all of the results straight to a file (via the append to file command).  How can I target the duplicates in the actual file?

     

    Thanks.
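     A sketch of that round trip in UBot script (the list name and path are placeholders): once the scrape has finished, reload the file through add list to list with Delete Duplicates on, then save the de-duplicated list straight back over the same file:

    clear list(%urls)
    add list to list(%urls, $list from file("C:\\results.txt"), "Delete", "Global")
    save to file("C:\\results.txt", %urls)

     Run this once at the end of the scrape; since save to file overwrites, only the unique lines remain in the actual file.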

  6. So I'm using the append to file function (always using: to end), and on the first run it saves all the urls OK. But after that, instead of appending the urls at the end (on a new line), it just tacks them onto the last url.

     

    Example:

    http://www.ebay.co.uk/itm/PERSONALISED-CUTE-SLEEPING-PIGS-FUNNY-PHOTO-BIRTHDAY-OTHER-CARD-/371658720542?hash=item568896051e:g:ppoAAOSwv0tVNAnDhttp://www.ebay.co.uk/itm/PERSONALISED-CUTE-SLEEPING-PIGS-FUNNY-PHOTO-BIRTHDAY-OTHER-CARD-/371658720542?hash=item568896051e:g:ppoAAOSwv0tVNAnD
    http://www.ebay.co.uk/itm/PEPPA-PIG-MDF-SLING-BOOKCASE-NEW-BEDROOM-FURNITURE-OFFICIAL-/371153019547?hash=item566a71a29b:g:zG4AAOSwBLlVetpd
    http://www.ebay.co.uk/itm/Pig-Out-Snack-Bowl-/272164565955?hash=item3f5e45b3c3:g:N5MAAOSwvgdW4dk2
    

    How do I get it to save on the next new line?

     

    Thanks.
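     One likely fix (a sketch; #url and the path are placeholders) is to make the newline part of the text you append, via $new line:

    append to file("C:\\urls.txt", "{#url}{$new line}", "End")

     With the newline written after each url, the next append always starts on a fresh line.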

  7. ***Update***

     

    Ok, I've added the two min/max price vars together, like so:

    comment("Check for min / max prices")
    if($comparison($add(#ebay_minimum_price, #ebay_maximum_price), "=", $nothing)) {
        then {
            alert("nothing")
        }
        else {
            alert("price inc")
        }
    }
    

    The above will at least check if there is a min/max price value.

  8. Thank you both, that works great, but I'm going to throw a spanner into the works.

     

    With the minimum and maximum price inputs, the user can make the bot choose one of four possible paths:

     

    1) No price input for either min/max

     

    2) Only a price for min

     

    3) Only a price for max

     

    4) Both prices for min/max

     

    How would you check for all possible scenarios?
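     A sketch that covers all four paths with nested ifs and $comparison (the alert texts are placeholders for the real branches):

    if($comparison(#ebay_minimum_price, "=", $nothing)) {
        then {
            if($comparison(#ebay_maximum_price, "=", $nothing)) {
                then {
                    alert("1) no min, no max")
                }
                else {
                    alert("3) max only")
                }
            }
        }
        else {
            if($comparison(#ebay_maximum_price, "=", $nothing)) {
                then {
                    alert("2) min only")
                }
                else {
                    alert("4) both min and max")
                }
            }
        }
    }

     Checking min first and then max inside each branch means every combination of empty / filled inputs lands in exactly one of the four alerts.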

  9. Those are the default plugins that come with Ubot. If you delete them, Ubot will download them again.

    But Ubot Launcher is the best tool to enable / disable plugins.

     

    Keep in mind: if you enable / disable plugins via the Ubot Studio Plugins Menu, always restart Ubot Studio! Some plugins don't work correctly if you don't restart Ubot Studio.

     

    That's why I ONLY enable / disable plugins via Ubot Launcher. And I always restart if I want to enable a new plugin.

     

    Dan

     

    uBot Launcher is preventing UB from reloading those plugins OK.

     

    I've deactivated all of the above plugins, and am only using ExBrowser, and it's working fine.

  10. Hi.

     

    Just looking to keep UB lean (esp when compiling), and need some advice on what to keep and what to ditch.

     

    Context: I'll be using Dan's ExBrowser and Abbas Recaptcha plugin. I won't be working with FTP/databases.

     

    So which plugins are crucial to UB, and which ones can I get rid of?

     

    Advanced file,
    Database commands,
    FTP commands,
    Table commands,
    Windows commands

     

    Thanks.
