UBot Underground

drstew

Members
  • Content Count

    20
  • Joined

  • Last visited

Community Reputation

1 Neutral

About drstew

  • Rank
    Member

  1. Thanks Pash. I don't actually have the skills to add to this code you've provided, which is why I want to hire someone to complete this for me. I've sent you a PM.
  2. Hi Guys, I need a very simple bot created to perform the following:
     1. Navigate to the URL.
     2. Fill in the registration form and store the details.
     3. Navigate to an internal URL.
     4. Retrieve and store the details.
     I'd like this bot to be multi-threaded and support proxies. A clear spec will be provided to the developer. Thanks. (A Python sketch of the multi-threaded, proxied shape of this appears after this list.)
  3. Hey Folks, I just wanted to give a shout out to Patrick! I recently posted a thread looking for a developer to make me a quick small bot for some automation. I had a quick chat with Patrick on Skype and sent over a spec sheet, and he had a bot to me with a very short turnaround. Not only that, the quality of the bot is fantastic; it does everything I needed it to. I've now commissioned Patrick for another bot. His communication has also been very responsive, even though we're in timezones at opposite ends of the world. I highly recommend Patrick and will be using his services in the future.
  4. Hi Guys, I need a very simple bot created to perform the following:
     1. Log in to the website.
     2. Navigate to the page with the form to submit.
     3. Select and cut the top xx lines from a .txt file into memory.
     4. Paste the cut lines into the website form.
     5. Click Submit.
     6. Pause 5-10 secs.
     7. Repeat steps 3-6 until the .txt file is empty.
     Please contact me with your quotes and questions. Thanks. (A Python sketch of this workflow appears after this list.)
  5. I must be tired. As it turns out, my bot is actually working correctly.
  6. Thank you for the response BotBuddy. My issue is that I have a list of URLs and I want to grab the meta keywords and description for each one. It grabs the first URL's metadata and that's it. When using the $meta keyword nodes, do they just pick up the metadata automatically if a site has it listed in its head tags?
  7. Hey Guys, I thought this would be pretty easy to do, but for some reason my little bot will only get the first URL in the list. I'm sure it will be easy for someone to point out what I'm doing wrong. Thank you. scrapeMetaKeywords.ubot (A Python sketch of this scraping loop appears after this list.)
  8. It's all good, I figured it out: I didn't have the $list total set for the loop cycles. Thanks again for your help, I really appreciate it. The level of support on this forum is amazing.
  9. Thank you for your help, I really appreciate it. I know it's pretty noob stuff; I haven't used uBot in some time, and you're right, I should go back over the tutorials. I just tried running it again and the loop is still failing (error: "could not loop") with the count added. It's not even browsing to the first site; as soon as it hits the loop, it fails. The loop looks OK to me, so could it mean I'm loading the list incorrectly from the file?
  10. So I gave it a shot and made the attached script (saveimages.ubot) to try to save an image per list entry, but my loop is failing. Any ideas how I should be doing this? http://resellersrus.net/savingimages.png
  11. Perfect, thanks mate, that was exactly what I needed.
  12. Thanks BotBuddy, that is exactly what I wanted. Do you know if you can resize the images? Or maybe I can use some other software to do a mass image file resize?
  13. Hey Guys, I have a list of URLs that I would like to browse to, taking a screenshot of each homepage and saving the image at a certain size. Does anyone know if this is possible and how it might be done? (A Python sketch of one approach appears after this list.)
  14. Hey Guys, I'm a uBot user, and at the moment I'm working on a project in which I need some data scraped quickly. I would like to hire someone to make me a uBot script to scrape results from a directory site I have access to. This will be easy for many of you; I just don't have the time to do it at the moment. Please PM me if you're interested. Thanks.
  15. Oh my, thanks for your assistance. I managed to figure it out just before I refreshed this site. Thanks for the tip on $my documents; I'll be using that from now on, as I swap between computers every day.
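
Referenced from post 2 above: a minimal Python sketch of the multi-threaded, proxy-aware shape of that registration bot, using requests and a thread pool. Every URL, form field, and proxy below is a placeholder assumption; the real spec was only provided privately to the developer.

    # Register on a site and retrieve details, one worker per account,
    # each routed through its own proxy. All values are hypothetical.
    from concurrent.futures import ThreadPoolExecutor
    import requests

    PROXIES = ["http://10.0.0.1:8080", "http://10.0.0.2:8080"]  # placeholders

    def run_account(proxy):
        session = requests.Session()
        session.proxies = {"http": proxy, "https": proxy}
        # Fill in and submit the registration form (placeholder fields).
        session.post("https://example.com/register",
                     data={"user": "name", "pass": "secret"})
        # Navigate to the internal page and return its details.
        return session.get("https://example.com/account/details").text

    # Multi-threaded: one thread per proxy/account.
    with ThreadPoolExecutor(max_workers=len(PROXIES)) as pool:
        results = list(pool.map(run_account, PROXIES))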
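Referenced from post 4 above: a minimal Python sketch of that form-paste workflow, using Selenium. This is not the commissioned uBot script; every URL, field name, and credential below is a placeholder assumption.

    # Sketch of the post-4 workflow: log in, cut the top xx lines from a
    # .txt file, paste them into a form, submit, pause, repeat until empty.
    import random
    import time
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    LINES_PER_BATCH = 10            # stands in for the "top xx lines"
    SOURCE_FILE = "input.txt"       # hypothetical source file

    driver = webdriver.Chrome()

    # Step 1: log in (placeholder URL and field names).
    driver.get("https://example.com/login")
    driver.find_element(By.NAME, "username").send_keys("user")
    driver.find_element(By.NAME, "password").send_keys("pass")
    driver.find_element(By.NAME, "login").click()

    while True:
        # Step 3: cut the top lines into memory, rewrite the remainder.
        with open(SOURCE_FILE) as f:
            lines = f.readlines()
        if not lines:
            break                   # step 7: the file is empty, so stop
        batch, remainder = lines[:LINES_PER_BATCH], lines[LINES_PER_BATCH:]
        with open(SOURCE_FILE, "w") as f:
            f.writelines(remainder)

        # Steps 2, 4, 5: open the form page, paste the batch, submit.
        driver.get("https://example.com/form")
        driver.find_element(By.NAME, "content").send_keys("".join(batch))
        driver.find_element(By.NAME, "submit").click()

        # Step 6: pause 5-10 seconds between submissions.
        time.sleep(random.uniform(5, 10))

    driver.quit()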
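Referenced from posts 6-9: a minimal Python sketch of the meta-scraping loop, using requests and BeautifulSoup. The file name and tab-separated output are assumptions. The key point mirrors the uBot fix in post 8: the loop has to run once per list entry ($list total), otherwise only the first URL is processed.

    # Scrape meta keywords and description for every URL in a list.
    import requests
    from bs4 import BeautifulSoup

    with open("urls.txt") as f:     # hypothetical file, one URL per line
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:                # one iteration per entry, not just the first
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")

        # Meta keywords/description only exist if the site declares them
        # in its head tags, so fall back to an empty string when absent.
        keywords = soup.find("meta", attrs={"name": "keywords"})
        description = soup.find("meta", attrs={"name": "description"})
        print(url,
              keywords.get("content", "") if keywords else "",
              description.get("content", "") if description else "",
              sep="\t")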
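Referenced from posts 10-13: a minimal Python sketch of the screenshot-and-resize task, using Selenium for capture and Pillow for resizing. The URL file, output folder, and target size are assumptions; the original posts solved this in uBot.

    # Take one homepage screenshot per URL in the list and resize it.
    from pathlib import Path
    from selenium import webdriver
    from PIL import Image

    TARGET_SIZE = (320, 240)        # hypothetical output size
    OUT_DIR = Path("screenshots")
    OUT_DIR.mkdir(exist_ok=True)

    with open("urls.txt") as f:     # hypothetical file, one URL per line
        urls = [line.strip() for line in f if line.strip()]

    driver = webdriver.Chrome()
    for i, url in enumerate(urls):  # the loop runs over every list entry
        driver.get(url)
        path = OUT_DIR / "site_{}.png".format(i)
        driver.save_screenshot(str(path))

        # Resize the captured screenshot to a fixed size and overwrite it.
        with Image.open(path) as img:
            img.resize(TARGET_SIZE).save(path)

    driver.quit()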