UBot Underground

crazyflx

Moderators
  • Content Count: 279
  • Days Won: 8
Everything posted by crazyflx

  1. I don't know that you can. I deleted the icon resource entirely, and somehow, that same small icon still shows up inside the upper left hand corner of uBot. Anybody with more knowledge on this topic care to chime in? I've been changing icons on my bots for quite some time now, and I've never been able to figure out how to get rid of/change that one. Does anybody know how?
  2. Ohhhh, I'm an idiot, I didn't understand your question. I thought you were trying to pull data from a URL that was in block text format and turn it into a list. There is an incredibly easy way to turn a block of text from the UI of uBot into a list. Here it is: http://img442.imageshack.us/img442/8683/userinterfaceexample.jpg (a rough Python sketch of the same idea appears after this list).
  3. Can you provide the URL you're trying to pull the data from?
  4. Give this a go (it works for me): random click modified.ubot
  5. Out of the 5 or 6 people who downloaded the two bots above, does nobody have any input?
  6. I downloaded another bot somewhere on the forum that seemed to work with multi-threading; however, I just can't get it to work. I tried it in IE windows, sub windows & FF windows. I even tried just running thread commands with no windows at all, and nothing seems to work.
  7. I've attached two bots to this post. The first one performs an actual task inside of a "thread -> in sub window" but there is not a second thread command. So essentially, it is just performing one task, and it works perfectly. The second bot is an EXACT match to the first, only I added a SECOND thread command after the first, and the second thread command is an EXACT match to the first thread command, with a simple change of a string of words...it doesn't work at all. Feel free to download both of them & compare. Maybe somebody can tell me why it doesn't work. keyword scraper wor
  8. That will make life a lot easier. The less I have to mess with JavaScript, the happier I am (because I know diddly about JS).
  9. I'm not sure if this has been mentioned on the forum before, but I remember a long time ago I had a question about this. I was trying to make a bot, and was using the randomly generated last name & adding 4 random numbers to it (like this: salvadore3927). I was using that "formula" to generate random usernames when signing up to sites. I kept running into a problem though...most sites have a limitation on username length, and I would randomly get usernames that were beyond that length and the bot would stop. Now, I knew what that limitation was, but I had no way to "shorten" my usernames (a rough Python sketch of the trim-to-length idea appears after this list).
  10. Yes, sometimes it does, sometimes it doesn't, and sometimes it scrapes the same thing twice. It looks that way...after doing some reading on the topic, both in the thread you provided & a couple of others I found, it seems it just isn't meant to "actually" multi-thread just yet...maybe in the future.
  11. I'll head on over there in a minute to read about it. Thanks. It saves the file, but there is nothing in it...however, if I move those same commands OUTSIDE the "thread" command, it works perfectly & saves the file WITH the appropriate contents (so I know I didn't screw up when "selecting" what to scrape). I tried it with & without subwindows, and in both cases it saves a blank .txt file. But without subwindows & threads (letting each process run individually & not simultaneously), it saves the .txt file with the correct contents. In other words, it simply doesn't work.
  12. I haven't been around for a little while, so maybe I missed something, but I can't get this thread command to work. I've attached a bot, anybody care to tell me why it doesn't work? I've tried it multiple ways, however the two I've attached were the two that I thought would be the most successful...however, they don't work at all. (I just clicked the first two things I came across on uBot to test with...you'll see what I mean). test.ubot test no subwindow.ubot
  13. I'm sure everybody who has ever really started to "get into" using uBot has run into the issue of having problems opening/logging into/scraping emails from certain free email providers. Well, here is a nice little trick (that I've kept to myself for a little too long now) that will help you get around all that without any forwarding/changing providers/etc, etc, etc. It's called mail2web (see here: http://mail2web.com/ ). Using their incredibly SIMPLE HTML-BASED interface, you can log into any email provider's inbox & open/scrape/read (anything) right from their interface. What's more…
  14. No problem at all, happy to help. As for only pulling the URLs that are listed under the recent posts, the only way that would be possible is if there were some sort of characteristic those URLs all shared. For instance: http://thisisablog.com/recent-posts/this-is-a-post-url.htm Then you would use: Choose by Attribute -> href|*/recent-posts/*|wildcards, followed by Add to List -> List of Sites URLs -> Scrape Chosen Attribute|href. But I doubt that each blog you're navigating to is going to have the same URL format as every other blog. (A rough Python sketch of the wildcard matching idea appears after this list.)
  15. Check out my attached example. Change the "thisisablog.com" to a real blog URL & then add a "save to file -> List of Sites URLs" command to see what you've saved. If you've got any questions let me know. Example.ubot
  16. For future reference, when trying to solve a problem, you'll get a significantly better answer if you describe in detail EXACTLY what it is you're trying to do & where you're trying to do it. Tell me (on this thread or via PM if you'd like) what you're trying to scrape/gather & from what URL you're trying to scrape/gather it, and I'll give you a very detailed explanation on how to do it (so that you learn "how to fish" as opposed to being "given a fish" so to speak).
  17. I posted this exact response on your other thread (which is virtually the same question): Here you go man, a working example that does the following: Visits the page where you need to save all the feeds, scrapes that page's XML URLs & adds them to a list, then downloads each XML file from the list of URLs AND dynamically creates the file names for them. If you have any questions, let me know. P.S. - It will save the file to "my documents" with the file name 1.xml. If there were 3 URLs to download, they would be saved as 1.xml, 2.xml, 3.xml, etc. It uses an incremented variable as the file name to save.
  18. Here you go man, a working example that does the following: Visits the page where you need to save all the feeds, scrapes that page's XML URLs & adds them to a list, then downloads each XML file from the list of URLs AND dynamically creates the file names for them. If you have any questions, let me know. P.S. - It will save the file to "my documents" with the file name 1.xml. If there were 3 URLs to download, they would be saved as 1.xml, 2.xml, 3.xml, etc. It uses an incremented variable as the file name to save (a rough Python sketch of this idea appears after this list). You'll see when you download the example & check out th…
  19. I edited the OP, but for those who have already read/reread it, I'll mention it here. I originally had the checkout page on a site of mine, but I had that site hosted with JustHost & that turned out to be a nightmare. I've since let the site "die" so to speak, so if you're interested in either a single user license for this software ($20 USD) or an unlimited user license ($99 USD & allows you to package with bots you sell to others an unlimited amount of times) please just PM me & I'll give you my PayPal email address to send the funds to. Once I receive the funds, I'll email
  20. I'll have to talk to my programmer about that. I normally sell the software for $20. If you'd like a license to hand this software out to the buyers of your bots, I'd sell it for $99, which would allow you to package it along with your bots to as many people as you'd like. It WON'T however give you the ability to SELL it as a standalone app. Yes, sorry about the delay, I haven't been on this forum as much lately I'm afraid.
  21. Trust me, I'm not saying a single bad word about uBot. I've used uBot to make more money than probably anything else I've ever used...bar none. You have to admit, however, that when you're making hundreds of accounts a day, having a bot verify them all takes a while (not so much the programming, but actually having it verify each account takes quite some time).
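Sketch for post 2: the uBot UI trick shown in the screenshot above boils down to splitting a block of text into one list item per line. A minimal Python illustration of that idea (not uBot's own syntax; the sample names are made up):

```python
# Turn a block of text (one entry per line) into a list,
# dropping blank lines and stray whitespace.
block_text = """salvadore
martinez

rodriguez"""

items = [line.strip() for line in block_text.splitlines() if line.strip()]
print(items)  # ['salvadore', 'martinez', 'rodriguez']
```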
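Sketch for post 9: the length problem described there can be handled by trimming the name before appending the random digits. A rough Python illustration, not uBot's own syntax; the 12-character limit is an assumed example, not taken from the post:

```python
import random

MAX_LENGTH = 12  # assumed site limit, purely for illustration

def random_username(last_name, max_length=MAX_LENGTH):
    """Append 4 random digits to a last name, trimming the name first
    so the finished username never exceeds the site's limit."""
    digits = f"{random.randint(0, 9999):04d}"
    trimmed = last_name[:max_length - len(digits)]
    return trimmed + digits

print(random_username("salvadore"))  # e.g. 'salvador3927'
```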
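Sketch for post 14: the "Choose by Attribute -> href|*/recent-posts/*|wildcards" step amounts to keeping only the links whose href matches a wildcard pattern. A minimal Python stand-in using the standard library (the sample HTML snippet is hypothetical):

```python
import fnmatch
from html.parser import HTMLParser

PATTERN = "*/recent-posts/*"  # wildcard pattern, as in the uBot example

class HrefCollector(HTMLParser):
    """Collect hrefs from <a> tags that match the wildcard pattern."""
    def __init__(self):
        super().__init__()
        self.matches = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if fnmatch.fnmatch(href, PATTERN):
                self.matches.append(href)

html = '<a href="http://thisisablog.com/recent-posts/this-is-a-post-url.htm">post</a>'
collector = HrefCollector()
collector.feed(html)
print(collector.matches)  # only the URL under /recent-posts/
```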
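Sketch for posts 17/18: the "incremented variable as the file name" idea, shown in plain Python rather than uBot; the URLs and the save folder below are placeholders, not the attached example's actual values:

```python
import urllib.request
from pathlib import Path

# Placeholder list standing in for the scraped XML URLs.
xml_urls = [
    "http://example.com/feed1.xml",
    "http://example.com/feed2.xml",
]

save_dir = Path.home() / "Documents"  # stands in for "my documents"

# Download each URL and save it as 1.xml, 2.xml, 3.xml, ...
for counter, url in enumerate(xml_urls, start=1):
    target = save_dir / f"{counter}.xml"
    urllib.request.urlretrieve(url, str(target))
    print(f"Saved {url} -> {target}")
```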