UBot Underground

crazyflx

Moderators
  • Content Count: 279
  • Joined
  • Last visited
  • Days Won: 8

Posts posted by crazyflx

  1. I don't know that you can. I deleted the icon resource entirely, and somehow, that same small icon still shows up inside the upper left hand corner of uBot.

     

    Anybody with more knowledge on this topic care to chime in?

     

    I've been changing icons on my bots for quite some time now, and I've never been able to figure out how to get rid of/change that one.

     

    Does anybody know how?

  2. Hi,

     

    I'm not trying to pull anything so far.

     

    I just want a UI block text field from uBot, with one item per line, to become a list variable in uBot... without doing anything weird.

     

    Thanks,

     

    Cheers,

     

    Ohhhh, I'm an idiot, I didn't understand your question. I thought you were trying to pull data from a URL that was in block text format and turn it into a list.

     

    There is an incredibly easy way to turn a set of block text from the UI of uBot into a list. Here it is:

     

    http://img442.imageshack.us/img442/8683/userinterfaceexample.jpg
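The screenshot walks through uBot's own UI commands; for readers who want the logic spelled out, here is a rough Python sketch of the same idea (the function name is mine, not uBot's):

```python
# Turn UI block text (one item per line) into a list, the same
# transformation the uBot example in the screenshot performs.
def block_text_to_list(block_text):
    # splitlines() breaks on newlines; the filter drops blank lines
    return [line.strip() for line in block_text.splitlines() if line.strip()]

print(block_text_to_list("dog\ncat\n\nbird\n"))  # ['dog', 'cat', 'bird']
```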

  3. I've attached two bots to this post. The first one performs an actual task inside of a "thread -> in sub window" but there is not a second thread command. So essentially, it is just performing one task, and it works perfectly.

     

    The second bot is an EXACT match to the first, only I added a SECOND thread command after the first, and the second thread command is an EXACT match to the first thread command, with a simple change of a string of words...it doesn't work at all.

     

    Feel free to download both of them & compare. Maybe somebody can tell me why it doesn't work.

    keyword scraper works.ubot

    keyword scraper doesn't work.ubot

  4. Great share crazy! Just to let you know, I think this will be included in the new pro & developer license also.

     

     

    I believe that the Pro & Dev licences will have string handling functions like Trim, so no need to mess about with javascript functions like substring, etc. ;)

     

    That will make life a lot easier. The less I have to mess with JavaScript, the happier I am (because I know diddly about JS).

  5. I'm not sure if this has been mentioned on the forum before, but I remember a long time ago I had a question about this.

     

    I was trying to make a bot, and was using the randomly generated last name & was adding 4 random numbers to it (like this: salvadore3927). I was using that "formula" to generate random usernames when signing up to sites.

     

    I kept running into a problem though...most sites have a limitation on username length, and I would randomly get usernames that were beyond that length and the bot would stop. Now, I knew what that limitation was, but I had no way to "shorten" my username or make sure that it was only X characters long.

     

    I came up with a solution, but forgot to come back on here & provide it so others who had the same problem might be able to find it...so, here it is (it's actually very simple).

     

    http://img593.imageshack.us/img593/1211/examplep.jpg

     

    I've also attached a bot that you can download that will randomly generate a new username on each press of the play button, and each time it does it makes sure that it isn't longer than 10 characters & then it displays the newly "trimmed" username on screen.

    Example.ubot
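For anyone who can't load the screenshot, here is the gist of the trick as a Python sketch (the uBot version does the trimming with a JavaScript substring call; the names and the 10-character limit here are just illustrative):

```python
import random

MAX_LEN = 10  # assumed site limit, matching the attached example bot

def make_username(last_name, max_len=MAX_LEN):
    # last name plus 4 random digits, e.g. salvadore3927
    candidate = last_name + str(random.randint(1000, 9999))
    # trim to the site's limit, like JavaScript's substring(0, max_len)
    return candidate[:max_len]

print(make_username("salvadore"))  # always 10 characters or fewer
```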

  6. might actually be a limitation, which Seth will expand. It seemed even in Praney's example, sometimes it scrapes, sometimes it doesn't.

     

    Yes, sometimes it does, sometimes it doesn't and sometimes it scrapes the same thing twice.

     

    I think Seth explained it before in one of the threads mentioned above. It was only introduced to sort the problem with dialog boxes.

     

    Praney

     

    It looks that way...after doing some reading on the topic, both on the thread you provided & a couple of others I found, it looks like it just isn't meant to "actually" multi-thread just yet...maybe in the future ;)

  7. Hi Rob,

     

    There has been an extensive debate about Multi-Threading lately...

     

    Refer to my post :

     

    http://ubotstudio.com/forum/index.php?/topic/5132-multitasking/page__view__findpost__p__2272

     

    Praney :)

     

    I'll head on over there in a minute to read about it. Thanks :)

     

     

    how is it not working for you? Is it not saving to file? Are you wondering about the sub windows disappearing after the process in the sub window is over?

     

    It saves the file, but there is nothing in it...however, if I move those same commands OUTSIDE the "thread" command, it works perfectly & saves the file WITH the appropriate contents (so I know I didn't screw up when "selecting" what to scrape).

     

    I tried it with & without subwindows, and in both it saves a blank .txt file.

     

    But, without subwindows & threads (letting each process run individually & not simultaneously) it saves the .txt file with the correct contents.

     

    In other words, it simply doesn't work.

  8. I haven't been around for a little while, so maybe I missed something, but I can't get this thread command to work.

     

    I've attached a bot, anybody care to tell me why it doesn't work?

     

    I've tried it multiple ways, however the two I've attached were the two that I thought would be the most successful...however, they don't work at all.

     

    (I just clicked the first two things I came across on uBot to test with...you'll see what I mean).

    test.ubot

    test no subwindow.ubot

  9. I'm sure everybody who has ever really started to "get into" using uBot has run into the issue of having problems opening/logging into/scraping emails from certain free email providers.

     

    Well, here is a nice little trick (that I've kept to myself for a little too long now) that will help you get around all that without any forwarding/changing providers/etc, etc, etc.

     

    It's called mail2web (see here: http://mail2web.com/ ).

     

    Using their incredibly SIMPLE HTML-BASED interface, you can log into any email provider's inbox & open/scrape/read (anything) right from their interface.

     

    What's more, since you can log into their interface with ANY email provider, you no longer need to code separate subs/scripts for each email provider you need to retrieve mail from!

  10. Thanks Crazyfix, that works great.

     

    Is there any way i could maybe just pull the urls that are listed under the recent posts section of the blogs ?

     

    No problem at all, happy to help.

     

    As for only pulling the URLs that are listed under the recent posts, the only way that would be possible would be if there were some sort of characteristic those URLs all shared.

     

    For instance:

     

    http://thisisablog.com/recent-posts/this-is-a-post-url.htm

     

    Then you would use:

     

    Choose by Attribute -> href|*/recent-posts/*|wildcards

    Add to List -> List of Sites URLs -> Scrape Chosen Attribute|href

     

    But I doubt that each blog you're navigating to is going to have the same URL format as every other blog.
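The two uBot commands above map onto this Python sketch (the function name is hypothetical; Python's fnmatch handles the same * wildcards that the uBot attribute chooser uses):

```python
import fnmatch

# Keep only hrefs that match the wildcard pattern, analogous to
# Choose by Attribute -> href|*/recent-posts/*|wildcards
def choose_by_href(hrefs, pattern="*/recent-posts/*"):
    return [h for h in hrefs if fnmatch.fnmatch(h, pattern)]

hrefs = [
    "http://thisisablog.com/recent-posts/this-is-a-post-url.htm",
    "http://thisisablog.com/about.htm",
]
print(choose_by_href(hrefs))  # only the /recent-posts/ URL survives
```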

    Ok, I'm back, sorry to keep bugging you guys, but I'm really having a hard time getting this.

     

    I haven't really been able to do anything that I set out to today, and I know it's not the software. I just don't get the concepts of variables and constants and how it all works together just yet. I've been watching the tutorials all day.

     

    But that's not why I wrote this. I want to know how to scrape or gather RSS feeds. I saw one post that said to page scrape and get these <B></B>, but I can't seem to do that for some reason. I don't even see them when I right click on the page.

     

    Is there something I'm missing? And I don't want a bot to do it, I want to actually learn how.

     

    For future reference: when trying to solve a problem, you'll get a significantly better answer if you describe in detail EXACTLY what it is you're trying to do & where you're trying to do it.

     

    Tell me (on this thread or via PM if you'd like) what you're trying to scrape/gather & from what URL, and I'll give you a very detailed explanation of how to do it (so that you learn "how to fish" as opposed to being "given a fish" so to speak).

  12. I posted this exact response on your other thread (that is virtually the same question):

     

    Here you go man, a working example that does the following:

     

    Visits the page where you need to save all the feeds

     

    Scrapes that page's XML URLs & adds them to a list

     

    Downloads each XML file from the list of URLs AND dynamically creates the file names for them

     

    If you have any questions, let me know.

     

    P.S. - It will save the file to "my documents" with the file name 1.xml

     

    If there were 3 urls to download, they would be saved as:

     

    1.xml

    2.xml

    3.xml

     

    etc, etc, etc.

     

    It uses an incremented variable as the file name to save. You'll see when you download the example & check out the source code.

    Example.ubot
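If you'd rather see the incremented-filename idea outside of uBot, here is a minimal Python sketch (function names are mine, not from the attached bot):

```python
import urllib.request

def feed_filenames(urls):
    # the incremented counter produces 1.xml, 2.xml, 3.xml, ...
    return [f"{i}.xml" for i, _ in enumerate(urls, start=1)]

def save_feeds(urls, dest_dir="."):
    # download each feed URL to its numbered file name
    for url, name in zip(urls, feed_filenames(urls)):
        urllib.request.urlretrieve(url, f"{dest_dir}/{name}")
```

Calling save_feeds with three URLs would write 1.xml, 2.xml, and 3.xml into dest_dir, just as the attached bot writes them to "my documents".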

  13. Here you go man, a working example that does the following:

     

    Visits the page where you need to save all the feeds

     

    Scrapes that page's XML URLs & adds them to a list

     

    Downloads each XML file from the list of URLs AND dynamically creates the file names for them

     

    If you have any questions, let me know.

     

    P.S. - It will save the file to "my documents" with the file name 1.xml

     

    If there were 3 urls to download, they would be saved as:

     

    1.xml

    2.xml

    3.xml

     

    etc, etc, etc.

     

    It uses an incremented variable as the file name to save. You'll see when you download the example & check out the source code.

    Example.ubot

  14. I edited the OP, but for those who have already read/reread it, I'll mention it here.

     

    I originally had the checkout page on a site of mine, but I had that site hosted with JustHost & that turned out to be a nightmare. I've since let the site "die" so to speak, so if you're interested in either a single user license for this software ($20 USD) or an unlimited user license ($99 USD & allows you to package with bots you sell to others an unlimited amount of times) please just PM me & I'll give you my PayPal email address to send the funds to.

     

    Once I receive the funds, I'll email & PM you a download link for the software.

  15. Can't you add a batch of accounts instead of 1 by 1?

     

    I'll have to talk to my programmer about that.

     

    Is there some sort of offer you can give for ubot programmers so that we are able to hand this out to our users

     

    would help loads if something like this was possible

     

    If it is then please PM - I want one

     

    thanks

     

    abs

     

    I normally sell the software for $20. If you'd like a license to hand this software out to the buyers of your bots, I'd sell it for $99, which would allow you to package it along with your bots to as many people as you'd like. It WON'T however give you the ability to SELL it as a standalone app.

     

    Any time now.

     

    Yes, sorry about the delay, I haven't been on this forum as much lately I'm afraid.

    Don't you go talking bad about ubot...

     

    Still, this tool will be helpful.

     

    Trust me, I'm not saying a single bad word about uBot. I've used uBot to make more money than probably anything else I've ever used...bar none.

     

    You have to admit however, that programming a bot to verify accounts when you're making hundreds a day...it takes a while (not so much to program, but actually having it verify each account takes quite some time).
