UBot Underground

VaultBoss

Fellow UBotter
  • Content Count: 790
  • Joined
  • Last visited
  • Days Won: 34

Everything posted by VaultBoss

  1. Always look for a specific element on the success page (the very first page the visitor usually gets redirected to after signup); if you find it, you're signed in, otherwise you're not. Wrap the verification logic in a loop with, say, 3 retries, and if after 3 retries you're still not signed in, you can assume something is wrong and move on, as sketched below. Hope this helps...
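     A minimal sketch of that retry loop, assuming the success page shows an element with id "welcome" (a placeholder - swap in whatever element YOUR target site actually uses):

     set(#signedin, "false", "Global")
     set(#tries, 0, "Global")
     loop while($both($comparison(#tries, "<", 3), $comparison(#signedin, "=", "false"))) {
         comment("The <id=\"welcome\"> element is an assumption - pick one unique to your success page")
         if($exists(<id="welcome">)) {
             then {
                 set(#signedin, "true", "Global")
             }
             else {
                 wait(2)
                 increment(#tries)
             }
         }
     }
     if($comparison(#signedin, "=", "false")) {
         then {
             comment("3 retries and still not signed in - assume something is wrong and move on")
         }
     }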
  2. Get rid of that line of code
  3. You can't do what you want, @uniquebot. The KWs (keywords) are not public knowledge; they're not indexed by Google.
  4. Nope! Lists can be set to "Local", but tables have no Advanced options setting like that; they are sort of 'Global' by default.
  5. Use a replace applied to the text/list item/table cell where you store each scraped email, using this regex: .*@ and replace with $nothing. You'll be left with only what you need/want.
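     For instance (assuming the scraped address sits in a variable #email):

     set(#email, "someuser@example.com", "Global")
     comment("Strip everything up to and including the @ - only the domain part remains")
     set(#email, $replace regular expression(#email, ".*@", $nothing), "Global")
     comment("#email now holds: example.com")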
  6. They have different meanings, though... $either is the OR, while $both is the AND operator in UBS. If you want to find/scrape only when ALL the keywords (2 or more) are present, not when only SOME of them are, then the way to go is $both (it takes 2 conditions, but you can cascade them for more). On the other hand, if you want to scrape ALL instances, whether one keyword, two or more are present, in any combination, then the way to go is $either.
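     A quick sketch of the difference, using two made-up keywords checked against a variable #text:

     set(#text, "ubot studio multithreading tutorial", "Global")
     comment("$both = AND: true only when BOTH keywords are present")
     if($both($contains(#text, "ubot"), $contains(#text, "multithreading"))) {
         then {
             alert("Both keywords found")
         }
     }
     comment("$either = OR: true when AT LEAST ONE keyword is present")
     if($either($contains(#text, "ubot"), $contains(#text, "scrapebox"))) {
         then {
             alert("At least one keyword found")
         }
     }
     comment("For 3 or more conditions, cascade: $both(A, $both(B, C))")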
  7. Lists are zero-indexed. That means the 1st list item is at position 0, the 2nd at position 1, and so forth. When you loop through a list by its total, you should deduct 1 from $list total, because a list with, say, 3 items has them numbered 0, 1, 2, while the list total function returns 3. As such, if you try to access list item index 3, you will get an error (there is no such list item index).
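     In code, the indexing looks like this (a minimal sketch with a 3-item list):

     clear list(%items)
     add item to list(%items, "first", "Don\'t Delete", "Global")
     add item to list(%items, "second", "Don\'t Delete", "Global")
     add item to list(%items, "third", "Don\'t Delete", "Global")
     comment("$list total returns 3, but the valid positions are 0, 1 and 2")
     comment("The last valid index is always $list total minus 1")
     set(#index, 0, "Global")
     loop($list total(%items)) {
         alert($list item(%items, #index))
         increment(#index)
     }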
  8. No, you wouldn't integrate it in UBot... it only helps you run your computer faster. Ultimately, that will help your bots run faster and smoother too, but indirectly.
  9. And even MORE - today, it's free: http://www.ubotstudio.com/forum/index.php?/topic/13919-tools-useful-software-process-lasso-for-taming-nasty-resource-eating-windows-processes-and-services/&do=findComment&comment=77614
  10. I felt this is worth letting all UBotters know: today, the GAOTD (GiveAwayOfTheDay) website has one of THE most powerful tools for anybody to grab for free: http://www.giveawayoftheday.com/process-lasso-pro-6-5/?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+giveawayoftheday%2Ffeed+%28Giveaway+of+the+Day%29 I am not trying to promote GAOTD per se, although it is a very valuable resource and website, but I am sure many people here would find this piece of software extremely useful. I have no affiliation with either GAOTD or the software dev for this one; just my deep appreciation.
  11. Why don't you scrape all the data (the 3 different sets) into a list and apply various data-cleaning steps on the list after that, with regex for instance, to keep only what you need? Usually, when the page you scrape is poorly coded class/id-wise, it is best to just grab as much as you can and clean things up within UBS. Hope this helps you...
  12. Here is some code that would help you scrape the relevant data and then clean it up to retrieve only what you need (the age) from it:

     load html("<div class=\"about\"><a class=\"link\" href=\"viewprofile.aspx?profile_id=53515440\">Roxie610</a> 40 Actively seeking a relationship. <font color=\"green\"> Online Now</font></div>")
     set(#Age, $scrape attribute(<innerhtml=w"<a class=\"link\" href=\"viewprofile.aspx?profile_id=*\">*</a>*</font>">, "innerhtml"), "Global")
     set(#Age, $replace regular expression(#Age, "<[^>]*>", $nothing), "Global")
  13. I've changed your code a lil' bit:

     clear list(%userid)
     clear list(%headline)
     clear table(&tempTable2Save)
     add list to list(%userid, $scrape attribute(<class="link">, "innertext"), "Don\'t Delete", "Global")
     add list to list(%headline, $scrape attribute(<class="headline">, "innertext"), "Don\'t Delete", "Global")
     add list to table as column(&tempTable2Save, 0, 0, %userid)
     add list to table as column(&tempTable2Save, 0, 1, %headline)
     save to file("C:\\Users\\eric\\Desktop\\Bots\\pof_users.csv", &tempTable2Save)

     You were trying to save the lists you created into a single file directly; put them into a table first and save the table instead, as above.
  14. This one would also remove the extension from the file name, if you want that: (?<=(\\))[a-zA-Z0-9\-_]*(?=\.) This way, your c:\temp\test\myfile.xxx becomes just myfile...
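     In UBS that could look like this (a sketch - note the backslashes get doubled once more by the script's own string escaping):

     set(#path, "c:\\temp\\test\\myfile.xxx", "Global")
     comment("Keep only the characters between the last backslash and the dot")
     set(#filename, $find regular expression(#path, "(?<=\\\\)[a-zA-Z0-9\\-_]*(?=\\.)"), "Global")
     comment("#filename now holds: myfile")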
  15. Also, inside your threads, you can still use $list item and the incrementing variable safely, as long as you declare them "Local" within a DEFINE that you call for each thread.
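     A bare-bones sketch of that pattern (assuming a %urls list to work through; the wait is a rough way to give each thread time to grab its index before it changes):

     define processItem(#position) {
         comment("Declared Local, so every thread keeps its own copy")
         set(#myurl, $list item(%urls, #position), "Local")
         in new browser {
             navigate(#myurl, "Wait")
             comment("...per-thread scraping/actions go here...")
         }
     }
     set(#index, 0, "Global")
     loop($list total(%urls)) {
         thread {
             processItem(#index)
         }
         increment(#index)
         wait(1)
     }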
  16. Go to your friend, Google... and type the WHOLE string below into the search box (don't just click the link): site:http://www.ubotstudio.com/forum multithreading This will bring up lots of related threads, where you will find the advice you're seeking, as well as video tutorials, etc... The best way to start is to watch what/how other people did it already.
  17. As per the instructions here: http://www.ubotstudio.com/forum/index.php?/topic/13858-scripting-forum-rules-and-guidelines/ ...this seems to be the best-fitting forum for your post. You should delete the other 2 ill-fitting ones, as you would only piss off the mods with them. As for multi-threading... it looks like you might need to try your hand at a few simpler things first. Multi-threading poses issues even for very experienced coders, just so you know... In particular, in your case, you need to make use of the in new browser and thread commands in your UBS (if you have them in the PRO version).
  18. Loop through all the pages and continue to add list to list till you finish scraping. Only AFTER that, add the list to a table, if you wish. The way you're doing it will, indeed, overwrite the list collected from the previous page.
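     Roughly like this (a sketch, assuming 5 pages and a next-page link with class "next" - adjust both to your target site):

     clear list(%scraped)
     loop(5) {
         comment("Keep appending - add list to list does NOT overwrite what's already there")
         add list to list(%scraped, $scrape attribute(<class="link">, "innertext"), "Don\'t Delete", "Global")
         click(<class="next">, "Left Click", "No")
         wait for browser event("Everything Loaded", "")
     }
     comment("Only now, after ALL the pages are done, move the list into a table")
     add list to table as column(&results, 0, 0, %scraped)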
  19. By what you describe, you do not have a list there, but a variable containing two links separated by a new line, which only LOOKS LIKE a list; the two links are in fact under the same 'umbrella', so to speak. Naturally, your looping brings back the same link, because it calls the same variable twice. Also, like I always advocate across this forum, try to get into the habit of dropping $next list item from your daily toolset and replacing it with $list item, to make sure you call exactly the list item/element you want. In order to have your list be a real list, as you want it, split the variable at the new line and add the pieces to an actual list.
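     Something like this (a sketch; #links stands in for the variable holding your two newline-separated URLs):

     clear list(%links)
     comment("Split the variable at the new line - each URL becomes its own list item")
     add list to list(%links, $list from text(#links, $new line), "Delete", "Global")
     comment("Now $list item(%links, 0) and $list item(%links, 1) return different links")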
  20. You need to loop and compare your existing URLs (from the csv) with the scraped URLs from the page. As long as you do the looping correctly, you won't have issues. My guess is that your code is faulty on the looping side of things. Most people make the mistake of looping with $next list item, in my experience. The best way is to use $list item instead, in conjunction with an indexing, cycling variable. That way you will always refer to an existing list element and you will not get the 'List out of range' error anymore. BUT you would still have to address the exceptions in any case - as in, cover the cases your comparison doesn't account for.
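     The looping/comparison part could be sketched like this (assuming %existing holds the csv URLs and %scraped the ones from the page):

     set(#index, 0, "Global")
     loop($list total(%scraped)) {
         comment("$list item with an index can never run past the end like $next list item can")
         set(#url, $list item(%scraped, #index), "Global")
         comment("$contains treats the list as newline-separated text - a rough membership test")
         if($contains(%existing, #url)) {
             then {
                 comment("Already in the csv - skip it")
             }
             else {
                 add item to list(%existing, #url, "Delete", "Global")
             }
         }
         increment(#index)
     }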
  21. You can refer to data in a table cell specifically. As with lists, it is always better to refer to the element directly rather than indirectly... in other words, do not use $next list item, but $list item instead. To do so, you will have to loop through the list using an indexing counter (of elements = rows in the case of lists; in the case of tables, either rows OR columns). Lists are basically tables with a single dimension (a single row of data). So to access a table cell, you will need a looping, indexing variable to cycle the rows OR the columns, whichever you prefer, in the first place.
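     For example, walking down the first column of a table, cell by cell (a sketch, with &mytable as a placeholder name):

     set(#row, 0, "Global")
     loop($table total rows(&mytable)) {
         comment("Address the exact cell at row #row, column 0 - no 'next item' guesswork")
         alert($table cell(&mytable, #row, 0))
         increment(#row)
     }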
  22. Strangely enough, on THIS forum, I can see the LIKE button and can hit it, but I haven't been able to see the results (the people WHO already liked a post) for more than 2 months now... Is it the same for all of you guys? I just liked the post above (illmill's) as a test. Do YOU see the LIKE there?
  23. Sorry, Buddy... Tried to, but I don't see a Like button either...
  24. I'm a bit puzzled... what do you need the pj's for? I don't even recall where the heck are mine... Working from home has its perks, like this one. My troubles start when I wanna get out, though - 'cause THEN my clothes might be a bit... 'obsolete' LOL