UBot Underground

whoami

Fellow UBotter
  • Content Count

    422
  • Joined

  • Last visited

  • Days Won

    8

Everything posted by whoami

  1. I can't download it; it's giving me an AWS error: <Error><Code>AllAccessDisabled</Code><Message>All access to this object has been disabled</Message><RequestId>8A1FC033B8D477B7</RequestId><HostId>mBJz3DsIiaLQHgQGAZnkuCQvshyUiNKjilcp+UI0nEcCRgv5bbBAqp38XrcYUQUVOqEBGTfR404=</HostId></Error>
  2. I think this is going to be useful for most advanced UBotters; it's a way of helping you understand more technology. But if this is not accepted on this forum, please let me know.... IRC was the best 10 years ago, and other software has since eclipsed it, but many savvy people keep using it for direct communication with particular individuals. So, if you have never heard of or configured eggdrops, this is something similar. First you will need to install NodeJS — I will assume you already have it, so you should have NPM installed too. Then just install the irc library: npm install irc
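The steps the post describes can be sketched as follows. This is a minimal sketch, not the post's actual bot: the library name (`irc`) comes from the post, but the server, nickname, and channel are placeholders I've chosen, and the event API shown is the node-irc library's documented `Client`/`addListener` interface.

```shell
# Assumes Node.js and npm are already installed. Install the irc library with:
#   npm install irc

# Write a minimal eggdrop-style bot to bot.js (server/channel are placeholders):
cat > bot.js <<'EOF'
var irc = require('irc');

// Connect to a server and join a channel.
var client = new irc.Client('irc.libera.chat', 'MyUbotBot', {
  channels: ['#mychannel']
});

// Log every channel message to the console.
client.addListener('message', function (from, to, message) {
  console.log(from + ' => ' + to + ': ' + message);
});
EOF

echo "bot.js written; run it with: node bot.js"
```

Once it connects, the bot sits in the channel and prints everything said there, which is the same base you would extend for eggdrop-like triggers.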
  3. I've added the 2 courses for learning UBot. We have the basic course: http://network.ubotstudio.com/forum/index.php/topic/19958-sell-yellowpages-scraper-source-code-video-tutorial-guide-55min/ We have the intermediate course: http://network.ubotstudio.com/forum/index.php/topic/19969-sell-video-tutorial-25-hours-and-full-source-code-of-mantacom-scraper/ And soon to come is the advanced course, which will be a Fiverr promoter built with several plugins, showing you how to do it. It will probably be 5 hours long or more, depending on features. The course will be sold with the source code
  4. Hey mate, well.. As you can see it is not a simple course with theoretical explanations; it is a course on how to build something very concrete, and in the process I explain certain important things you need to know in order to accomplish what you want to do using uBot Studio. For example, in the First Unit (there are 3) you will learn:
     • Creating your own commands for re-using code.
     • Sending parameters to the commands for inner actions.
     • Using wildcards for Scrape Attribute and for Click.
     • Reverse engineering for scraping encrypted attributes.
     • Faking Browser/User-Agent, Referrer and Proxy/
  5. Hi, I speak Spanish too. Whatever you need, you know where to find me.
  6. I was asked to add to the course: a filter-by-rank selector on the scraper (as a GUI checkbox). What else would you guys like?
  7. Hello everyone, I just wrote a blog entry explaining how you can download all images from a profile, pagination included, with one line of bash. This can be executed on any OS, including Windows if you set up the curl module. What it does is use curl to mass-scrape and paginate a tumblr profile: you get a list of URLs that is processed with a while loop around the curl call, and everything is saved in the folder you run the command from. But first… you might need to install cURL on your server; don't worry, it's easy: sudo apt-get install curl
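The pagination idea can be sketched like this. Note the assumptions: the blog name is a placeholder, and the `/api/read` endpoint with `num`/`start` offset parameters is tumblr's old XML API as I remember it — the post's actual one-liner may differ, so check the linked blog entry for the real command.

```shell
# Placeholder blog; replace with the profile you want to scrape.
BLOG="staff.tumblr.com"

# Build a paginated API URL: 20 posts per request, offset by $1.
build_url() {
  echo "https://${BLOG}/api/read?type=photo&num=20&start=$1"
}

# Walk the first three pages (offsets 0, 20, 40).
offset=0
while [ "$offset" -lt 60 ]; do
  url="$(build_url "$offset")"
  echo "would fetch: $url"
  # Uncomment to actually collect the image URLs into a file:
  # curl -s "$url" | grep -o 'https://[^"<]*media\.tumblr\.com[^"<]*' >> photo-urls.txt
  offset=$((offset + 20))
done
```

From there, piping `photo-urls.txt` into another curl loop (or `wget -i photo-urls.txt`) downloads every image into the current folder, which matches what the post describes.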
  8. Those keywords probably don't have AdWords results? You can check by searching for them in your browser and seeing whether any campaigns show up for those specific terms. But if you type something like "lawyer new york" or "flowers california" there will most probably be an AdWords campaign. NOTE: I changed nameservers to Cloudflare, so the DEMO will be down for a little bit.
  9. http://i.imgur.com/zOyQwLX.jpg Hey mates, this is for you if you want to learn a new method for running a multithreaded browser operation that gets links and scrapes them at the same time, choosing how many browsers you want open, using different commands with local variables. What you will learn:
     • Creating "Commands" with parameters.
     • Reverse engineering a website so you can scrape it (manta.com).
     • Multithreading using browsers (you choose how many browsers scrape links, or data from links).
     • Tricks for faking another computer and referrer.
     • Basics of regular expressions (email, phone, links
  10. Excellent, thanks for the idea! I will of course take it into account.
  11. I'm glad you liked it, mate. I'm preparing the last unit of the second one. I had some sound problems, but I'm fixing them with Adobe Audition. I will probably upload tomorrow or Sunday, and will notify you when it is ready. After the second, I will make an advanced course using some plugins, or getting OAuth tokens from different social networks' APIs for making requests, also using Abbas' Advance Plugin with PHP; right now I'm having problems adapting the cURL module to it, so once I have it working I will make the third one. More coming, you bet. I like this and I don't care about failing live, because everyb
  12. Thanks mate! I'm about to finish a second course, of intermediate difficulty. It shows how to multitask using different browsers and more, taking nice tips from this course but applying them in a better way. It is for manta.com and will show how to reverse engineer it and bypass some attribute encryption using wildcards.
  13. It is; what is the problem? You type a keyword and nothing displays? Maybe there are no campaigns for it. But if you type the test keywords that appear in it, with other bid-city combinations, it should work correctly. Let me know what the issue is; a screenshot would help.
  14. http://content.screencast.com/users/r0dvan/folders/Snagit/media/a249cff6-302a-47b8-b6a2-f43c93b82618/10.04.2016-19.02.png This is a source that contains functions for getting suggestions from all these networks: Google, Bing, Yahoo, Wikipedia, Amazon and the Android Marketplace. You can adapt it into your SaaS software; it is also very fucking well documented, so you will learn:
     • Creating functions so you can reuse them.
     • Merging arrays.
     • Making requests to the JSON response gateways of Google, Bing, Yahoo, Wikipedia, Amazon and Android with file_get_contents.
     • Iterating an array to push into a global array.
     • Iterat
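The same request pattern the PHP source builds with file_get_contents can be sketched from the command line. The Google suggest endpoint and its `client`/`q` parameters shown here are an assumption from memory, not taken from the source being sold; the other networks would each need their own gateway URL.

```shell
# Build a suggestion-gateway URL for a seed keyword.
# Endpoint and parameters are assumptions (Google's suggest gateway as commonly documented).
suggest_url() {
  echo "https://suggestqueries.google.com/complete/search?client=firefox&q=$1"
}

for seed in "buy flowers" "lawyer"; do
  # Naive URL-encoding of spaces only, for illustration.
  q="$(printf '%s' "$seed" | sed 's/ /%20/g')"
  echo "would request: $(suggest_url "$q")"
  # Uncomment to actually fetch; the gateway returns a JSON array of suggestions:
  # curl -s "$(suggest_url "$q")"
done
```

Each response is a JSON array whose second element lists the suggestions, which is what you would merge into the global array, as the source's documentation describes.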
  15. You can test me with 1 bot if you want. If everything works ok, assign more. PM me.
  16. http://content.screencast.com/users/r0dvan/folders/Snagit/media/8840781a-3c2b-497d-8ad8-ab11c4240e47/10.04.2016-18.29.png I have something very fucking juicy for you, Marketer: stop wasting money on shitty descriptions and titles that won't help increase your CTR, clicks, conversions and more. Make sure you know how the biggest of the biggest run their campaigns so you don't fuck around with your wallet. Save time, save energy, cut the shit out, buy this, and put it to work on your server. In case you don't know how to install it, I might offer you 10 min of support to make it work on yo
  17. Hey guys.. I'm offering you today the full source code to scrape Yellow Pages, and not only that: I recorded the whole time it took me to code it, explaining all the caveats to take into account when coding a scraper. LEVEL: beginner http://content.screencast.com/users/r0dvan/folders/Snagit/media/3247d260-2e09-4ee0-8f47-dc7c28ac0202/10.04.2016-17.52.png This is great for learning how to:
     • Scrape using Scrape Attribute
     • Scrape using Scrape Page
     • Get data into tables
     • Save data to CSV
     • Create the user interface
     • Loop between list items in a better way
     • Make your bot reliable
     http://content.screen
  18. Hello Abbas.. I'm having a difficult time adding the cURL library for PHP. I would really appreciate your help with instructions on how to include it in the php folder and use it. Thanks!
  19. So, let's be clear: are you making API requests for this? Do you use a token and secret to access the API?
  20. I have added another premium source code. This is a Keyword Suggestion Scraper Tool for Google, Bing, Yahoo, Wikipedia, Amazon and Android. It goes 2 levels deep and in total grabs more than 500 keywords in less than 1 minute. GO TO DOWNLOAD HERE: http://wizardofbots.com/network/pro...ogle-bing-yahoo-wikipedia-amazon-and-android/ You will learn:
     • Creating functions so you can reuse them.
     • Merging arrays.
     • Making requests to the JSON response gateways of Google, Bing, Yahoo, Wikipedia, Amazon and Android with file_get_contents.
     • Iterating an array to push into a global array.
     • Iterate the global array to store li
  21. This is PHP code that takes the previous wget command and walks through a txt list of URLs, downloading each of them. You can add more wget options so you can crawl and download an entire site (do a mirror) and so on. It includes extensive documentation on how to use wget with custom options to achieve what you want. http://wizardofbots.com/network/product/mass-url-website-cloner-in-php/ It's great for:
     • Learning
     • Profit
     • Downloading all the URLs to view offline when on a plane
     • Saving hours, or no longer downloading stupid programs that might infect your PC
     • Understanding how tech works
     • Supporting more coding lesson
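The core loop can be sketched in shell rather than PHP. This is a minimal sketch of the same idea, not the sold source: read `urls.txt` one line at a time and hand each URL to wget. The sample URLs are illustrative.

```shell
# Example URL list, one URL per line; replace with your own.
cat > urls.txt <<'EOF'
https://example.com/
https://example.org/page.html
EOF

# Walk the list and download each entry.
count=0
while IFS= read -r url; do
  [ -n "$url" ] || continue           # skip blank lines
  count=$((count + 1))
  echo "would download: $url"
  # Uncomment to actually fetch each page with its assets,
  # links rewritten for offline viewing:
  # wget -p -k "$url"
done < urls.txt

echo "processed $count URLs"
```

Swapping in heavier wget options here (e.g. `--mirror` for a full site copy) is exactly the kind of customization the product's documentation covers.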
  22. Learn how to download a full URL so you can view it offline (clone the URL's images, js, css, folders, html) with only 1 line of code: http://wizardofbots.com/network/how-to-download-html-css-js-and-images-from-specific-url/ You can use this if you want to clone sites and edit them directly in Dreamweaver, recreating the design and modifying it with css. It's like this, both together:
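A commonly used form of such a one-liner is the following; this is an assumption on my part, since the exact command is in the linked article, not reproduced here.

```shell
# Typical wget flags for cloning one page with all its assets:
#   -p  download page requisites (images, css, js)
#   -k  convert links so the local copy works offline
#   -E  save pages with an .html extension
#   -H  allow requisites to be fetched from other hosts (CDNs)
url="https://example.com/"
cmd="wget -E -H -k -p $url"
echo "$cmd"
# Drop the variables and run the command directly to actually clone the page.
```

The result is a local folder you can open in Dreamweaver and restyle, as the post suggests.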
  23. Hello everyone, I have faith in this community: thanks to UBot I learned many things, and thanks also to this community that always answers and is full of very good people like Dan, TJ, and so on. I feel indebted to most of you, so I decided to make my coding journey (10,000 hours of code) in all the languages I can, so I can share the knowledge. There will also be updates with sources that go deeper into tasks, for which I will need to charge at least $1 USD per source, in order to have a coffee or a beer. Or anything, you know; this way I can fund this journey and you guys can
  24. Aymen, just wondering: did you add the dropdown and images as you told us you would?