UBot Underground


Showing results for tags 'lists'.

Found 15 results

  1. Hi there, I'm trying to figure out how to handle proxies in my multithreaded bot. Right now, there's a master list of search terms it uses and a master list of proxies it pulls from. For the search terms, it uses standard threading and next list item, and for the proxies it grabs a random proxy from the master list. Everything works great for the most part, but now I'm trying to figure out how to handle bad/blocked proxies. I had originally tried copying the global Master Proxy List to a local list inside the thread, pulling a random proxy from that list,
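The trouble with copying the master list into each thread is that removals diverge per thread. One common pattern is a single shared pool guarded by a lock, so a proxy reported bad disappears for every thread at once. UBot aside, here's a minimal Python sketch of that pattern (class and method names are hypothetical):

```python
import random
import threading

class ProxyPool:
    """Shared proxy pool: threads draw random proxies and can
    report bad ones, which removes them for every thread."""

    def __init__(self, proxies):
        self._lock = threading.Lock()
        self._proxies = list(proxies)

    def get_random(self):
        # Lock so a concurrent removal can't race the choice.
        with self._lock:
            if not self._proxies:
                raise RuntimeError("no proxies left")
            return random.choice(self._proxies)

    def report_bad(self, proxy):
        with self._lock:
            if proxy in self._proxies:
                self._proxies.remove(proxy)
```

The same idea applies in UBot with a global list plus whatever locking primitive your threading setup provides.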
  2. I'm looking for a simple way to shuffle a list in Ubot. What I'm wanting to do is randomly pick from a big list of hashtags, without choosing the same hashtag more than once. Seems the easiest solution would be to shuffle the hashtag list and then grab the first x number of tags. Perhaps someone has already done this before or perhaps there is a better way to do this. Thanks!
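"Shuffle and take the first x" is exactly sampling without replacement. As a Python analogue of the logic (UBot's own syntax differs):

```python
import random

def pick_unique_tags(hashtags, x):
    """Return x hashtags chosen at random with no repeats:
    equivalent to shuffling the list and taking the first x."""
    return random.sample(hashtags, x)
```

In any environment without a built-in sample, a Fisher-Yates shuffle followed by slicing the first x items gives the same result.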
  3. I am in need of some help here, big-time! I am utterly lost and bewildered, and I don't have a clue as to why this is so hard for me to grasp... I am having a huge problem trying to figure out the logic of how I should handle the list comparing behind a seemingly simple Twitter follow and unfollow structure. Please allow me to explain, and perhaps someone with experience can enlighten me as to what I am doing wrong. I want to create a bot that goes to twitter, scrapes users to follow, follows them, then checks after a day or two who followed back, and unfollows those that don't follow back
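The list comparison at the core of this is a set difference: "everyone I followed" minus "everyone following me now". A Python sketch of just that step (function and list names are illustrative, not UBot syntax):

```python
def to_unfollow(followed, followers_now):
    """Users we followed who have not followed back:
    the set difference of the two scraped lists."""
    return sorted(set(followed) - set(followers_now))
```

In UBot terms, that means looping over the "followed" list and checking each item for membership in the "followers" list, keeping only the misses.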
  4. Hi guys, I have a scraped list and I'd like to be able to show the results within the Load HTML command. I've tried this using a loop (also tried just posting the variable and also the list), but I can't get each item to appear on a new line within the HTML. Sounds simple, but it just doesn't seem to want to work for me. Can anybody help, please? set(#results,$scrape attribute(<outerhtml=w"<a href=\"/web/*/*\">*</a>">,"innertext"),"Global") add list to list(%theresults,$list from text(#results,$new line),"Delete","Global") loop while($comparison($list total(%theresults),"> Gr
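HTML ignores plain newlines, which is the usual cause here: the items have to be joined with `<br>` tags (or wrapped in block elements) before loading. A tiny Python illustration of the join:

```python
def list_to_html(items):
    """One item per rendered line: join the list with <br> tags,
    since literal newlines collapse to spaces in HTML."""
    return "<br>".join(items)
```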
  5. OK, so I'm trying to figure out how to load a list of accounts via a UI command, then rotate through the accounts in sequential order to perform a task. Also, the task goes through a list of users on a website; how do I add those users to a blacklist so that the bot does not run the same task on them twice?
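Sequential rotation is just an index that wraps around with modulo, and the blacklist is a membership check plus an add. A Python sketch of that control flow (names are made up for illustration):

```python
def run_tasks(accounts, users, blacklist):
    """Rotate accounts in order (wrapping around) and skip any
    user already in the blacklist; processed users are added
    to the blacklist so they are never handled twice."""
    done = []
    i = 0
    for user in users:
        if user in blacklist:
            continue  # already processed on an earlier run
        account = accounts[i % len(accounts)]  # wrap to the first account
        i += 1
        blacklist.add(user)
        done.append((account, user))
    return done
```

Persisting the blacklist to a file between runs is what makes the "never twice" guarantee survive restarts.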
  6. set(#links,"http://www.aliexpress.com/item/Exquisite-Portable-USB-Mini-Flexible-Silicone-PC-Keyboard-Foldable-for-Laptop-Notebook-WH-Suzie/2036904886.html? http://www.aliexpress.com/item/Fighting-Nation-Russian-backlit-gaming-keyboard-gamer-led-backlight-3color-breathing-switchable-light-wired-USB-for/32415574297.html? http://www.aliexpress.com/item/2-4GHz-G-Mouse-II-C120-Air-Mouse-T10-Rechargeable-Wireless-GYRO-Air-Fly-Mouse-Keyboard/32424642668.html? http://www.aliexpress.com/item/Modern-Design-Pure-White-Ultra-Thin-Design-2-4GHz-Wireless-Keyboard-Cover-Mouse-Kit-for-Desktop/32325530848.html
  7. Hey guys, I scraped a list that ended up pulling a full address, and I want to parse each of the line items to columns in a table (see attached picture); how do I do this? I tried 'add the list to table as column', but when I save it to a CSV and then open the file, only the first line ("Amf") of the whole address was saved. Looking to have each line go to a different column (Name, Address, phone #, etc.). Any idea? Probably an easy fix, but I can't figure it out.
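The key step is splitting the multi-line address on newlines so each line becomes one column of a single row, then writing rows (not columns) to the CSV. A Python sketch of that transformation (column order assumed from the post: name, address, phone):

```python
import csv
import io

def address_to_row(address_block):
    """Split one multi-line scraped address into a single CSV row,
    one column per non-empty line (Name, Address, Phone, ...)."""
    return [line.strip() for line in address_block.splitlines() if line.strip()]

def rows_to_csv(rows):
    """Write the rows out in CSV form and return the text."""
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()
```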
  8. Hey there! Newbie ubot fan here. I'm wondering why I'm getting this error. I'd say I'm correctly using "set list position", which is supposed to reset my list in a loop when using "next list item". Any idea? My script so far: clear cookies allow javascript("Yes") clear table(&tablacostosenvio) add list to list(%urls, $list from file("urls-looma.txt"), "Delete", "Global") clear list(%codigospostales) add list to list(%codigospostales, $list from file("codigospostales.txt"), "Delete", "Global") set list position(%codigospostales, 0) loop($list total(%urls)) { navigate($next list i
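A frequent pitfall with this shape of script is resetting the inner list's position once, before the outer loop, instead of on every outer pass, so the inner list runs dry after the first URL. The intended control flow, as a plain Python analogue:

```python
def cross_pairs(urls, codes):
    """For each URL, walk the full postcode list from the start:
    the inner iteration restarts on every outer pass (the
    equivalent of calling set list position inside the loop)."""
    pairs = []
    for url in urls:
        for code in codes:  # position resets here each time
            pairs.append((url, code))
    return pairs
```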
  9. Hello all, I have pulled a list of names using the scrape attribute function, but in my list the names appear with a lot of spaces in them, for example: John (spaces after, too), Mike, Sally, Carol. My question is: how can I remove all the spaces, then use this list as a comparison to say if list item = Mike, then do this... Thank you
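The comparison fails because "Mike   " is not equal to "Mike"; trimming the surrounding whitespace from every item first fixes it. In Python terms (UBot has its own trim/replace functions for the same job):

```python
def clean_names(names):
    """Strip leading and trailing whitespace from every scraped
    name so exact comparisons like item == "Mike" work."""
    return [n.strip() for n in names]
```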
  10. Here is what I'm trying to do: add list to list(%GET THE NAME FROM A VARIABLE, $plugin function("TableCommands.dll", "$list from table", &data, "Column", #column number), "Don\'t Delete", "Global") Why: I want to create 10 lists from 1 CSV file, but there are certain commands I use on all columns: add list to list, set list position, clear list, then set variable to the first list item. If I could choose the name of the list from a variable that was set from another list, I could then just loop everything. I'm open to ideas, thanks
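In languages that don't allow computed variable names, the usual substitute is a map from name to list: one container holds all ten lists, keyed by names drawn from another list, and the per-column commands run in a single loop. A Python sketch of the idea (the CSV layout here is assumed):

```python
import csv
import io

def columns_as_named_lists(csv_text, names):
    """Read a CSV and expose each column as a list keyed by a
    name taken from another list, instead of 10 fixed list names."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    return {name: [row[i] for row in rows] for i, name in enumerate(names)}
```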
  11. Hey, I'm trying to scrape a page for some keywords and place them in a list, then I want to check that list against another list from a text file. I then want the bot to click all those keywords (links) that are on both of the lists. I've tried different solutions and guides for many, many hours now without any luck. Would love some help, or some quick advice on how I should solve this.
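"On both of the lists" is a list intersection; computing it first, then clicking only those items, keeps the click loop simple. A Python sketch of the intersection step (names illustrative):

```python
def common_keywords(scraped, from_file):
    """Keywords present in both lists, keeping the scraped order
    (so clicks happen in page order)."""
    wanted = set(from_file)
    return [k for k in scraped if k in wanted]
```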
  12. Hi, can someone give me some advice on the best way to do a url:keyword list? So basically I need a list of keywords with corresponding URLs. I have seen this done in this format before: http://myurl.com:mykeyword http://myurl.com,mykeyword Which was done in text files. Is this the best way to do it? Maybe a user interface way? Considering one URL may have multiple keywords, it may get a bit repetitive doing it that way; any suggestions? Also, if I do it that way, how do I go about importing that into ubot as the URL into 1 var and the matching keyword into another var? Just FYI I am not new to cod
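One subtlety with the `url:keyword` format: the URL itself contains a colon (`http://`), so the line has to be split on the last colon, not the first. A Python illustration of the parse:

```python
def parse_url_keyword(line):
    """Split 'http://myurl.com:mykeyword' on the LAST colon, so
    the colon inside 'http://' is left intact; returns (url, keyword)."""
    url, keyword = line.rsplit(":", 1)
    return url, keyword
```

Using a comma or tab separator instead avoids the ambiguity entirely, since URLs rarely contain either.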
  13. Hi there, currently building an ebay scraper, but I'm having one problem - for search results, after the initial search results, it comes up with "more items related to" as well as "x items found from eBay international sellers". This can be avoided, it says how many results it found, so I can just scrape them all and then only keep the first *insert however many results it found* - how do I do this though? Is there a way to delete multiple items from a list, from a certain point onwards? Or is there a better way to get around this?
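Since the page states how many "real" results there are, the simplest fix is exactly as described: scrape everything, then truncate the list at that count. In Python this is a slice; UBot's equivalent would be looping a delete-from-position command, or copying only the first n items to a fresh list:

```python
def keep_first(items, n):
    """Drop everything from position n onward, keeping only the
    first n results (the count the page reports)."""
    return items[:n]
```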
  14. I posted this before, but I made a bot to go to facebook and search for a word, thus giving me a list of pages. I want to scrape those pages into a list and capture each one. My code is: type text(<aria-label="search">, #search, "Standard") clear list(%pages) add list to list(%pages, $scrape attribute(<class="text">, "fullhref"), "Delete", "Global") but when it scrapes the data it only gives me one link and not all the links. This is all happening within a javascript window, I presume. Any help will be much appreciated! I am including screen shots to show you what I mean. Yes
  15. Hey everyone, Got another issue I can't seem to figure out. This one should be easy, but for some reason I just can't put it together. So I've got a huge list with 100k+ rows that I need to use over the course of around 100 loops. But when I try and create a table out of the file, I think it's too big because Ubot just locks up and stops responding. I was wondering if there is any way to add 2,000 rows from the file for each loop, so I don't have to upload the entire 100K and grab from that in my loops. Basically: upload from file 2,000 rows and add to a list, complete the loop using
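Reading the file in fixed-size chunks, one chunk per loop, is a standard way around loading 100k rows at once. A Python sketch of the chunking generator (the 2,000-row size is the poster's number):

```python
from itertools import islice

def read_chunks(lines, chunk_size):
    """Yield successive lists of chunk_size lines, so the whole
    100k-row file never has to sit in memory as one table."""
    it = iter(lines)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            return  # file exhausted
        yield chunk
```

Each loop iteration then works on one chunk (e.g. `for chunk in read_chunks(open("big.txt"), 2000): ...`), clearing it before the next.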