UBot Underground

ronaldod

Members
  • Content Count: 30
  • Joined
  • Last visited
  • Days Won: 1

ronaldod last won the day on October 31 2017

ronaldod had the most liked content!

Community Reputation: 4 Neutral

About ronaldod
  • Rank: Advanced Member

Profile Information
  • Gender: Male

System Specs
  • OS: Windows 10
  • Total Memory: More than 9 GB
  • Framework: unsure
  • License: Professional Edition

Recent Profile Visitors: 1833 profile views
  1. And how do I send that to the browser?
  2. Make a button that is connected to a defined command containing all the commands.
  3. For certain sites I use rotating proxies (10-minute delay). That solved many problems for me.
  4. In some cases I use $exists, or the timeout in the wait-for-browser event, and often both. But that can also cause new errors, as scraping can fail badly on pages that don't load correctly.
  5. Don't have that problem at this moment. It looks like it is fixed.
  6. If I remember right, Amazon uses a "normal" captcha with just a bunch of letters. You just need to send in the picture; it is not the same solution as Google's reCAPTCHA, where you need to send in the site key.
  7. You need to scrape the data-sitekey. Take a look at the instructions on how to find it at: https://2captcha.com/2captcha-api#solving_recaptchav2_new You just need to change the way the demo code scrapes it from the site.
  8. Same issue still here, even on simple pages. It happens to me if the page had a timeout, and then it repeats every time I next use a scrape command, $exists, or page scrape. Maybe we should make a poll on how many people have this problem, as I think this issue is more common than posted here.
  9. Also, a few sites I use it on changed their element locations. New scrape codes...
  10. REQUIREMENTS: CRON not needed for trials! Recommended: PHP 5.3+. Recommended: VPS or dedicated server. Read the whole post, as it does more than just basic licensing.
  11. Why not do a match on the links you would like to have? .com/ etc.
  12. AWS: I left that place because of flawed performance. Even opening Task Manager caused 100% load. As said before, VPSes have problems: they are mostly oversubscribed, meaning you get much less than what you pay for. If you look in detail, you'll see they mostly give you only 50% of a CPU thread, not 100%. They like to advertise SSDs as fast, but without any guarantee on CPU performance.
  13. Take a look at this. I think this is what you want, and more. http://network.ubotstudio.com/forum/index.php/topic/19075-sell-ubotlocker-unleashed-licensing-complete-mod/
  14. What you can do to avoid this is use "new browser" a lot. That avoids a lot of browser errors.
  15. I ran ubots from NUCs, old PCs, and VPSes, all without problems. Only when I use video streaming do I need a lot of power, but that is the video plugin in the browser. For the rest, no performance issues; more problems with scrape bugs.
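The reCAPTCHA advice in items 6 and 7 can be sketched outside of UBot Studio as well. Below is a minimal Python sketch of the flow the linked 2captcha page documents: scrape the data-sitekey from the page source, submit it with the page URL to 2captcha's in.php endpoint, and poll res.php until the token comes back. The regex, the API key, and the target URL are illustrative assumptions, not values from the original posts.

```python
# Hedged sketch of the 2captcha reCAPTCHA v2 flow described at
# https://2captcha.com/2captcha-api#solving_recaptchav2_new
# API key and page URL below are placeholders, not real values.
import re
import time
import urllib.parse
import urllib.request
from typing import Optional


def extract_sitekey(html: str) -> Optional[str]:
    """Pull the data-sitekey attribute out of the raw page source."""
    match = re.search(r'data-sitekey="([^"]+)"', html)
    return match.group(1) if match else None


def solve_recaptcha(api_key: str, sitekey: str, page_url: str) -> str:
    """Submit the sitekey to 2captcha and poll until a token is returned."""
    params = urllib.parse.urlencode({
        "key": api_key,
        "method": "userrecaptcha",
        "googlekey": sitekey,
        "pageurl": page_url,
    })
    with urllib.request.urlopen("http://2captcha.com/in.php?" + params) as resp:
        body = resp.read().decode()          # "OK|<captcha id>" on success
    if not body.startswith("OK|"):
        raise RuntimeError("2captcha rejected the task: " + body)
    captcha_id = body.split("|")[1]

    while True:
        time.sleep(10)                       # 2captcha suggests polling every few seconds
        query = urllib.parse.urlencode({
            "key": api_key, "action": "get", "id": captcha_id,
        })
        with urllib.request.urlopen("http://2captcha.com/res.php?" + query) as resp:
            answer = resp.read().decode()
        if answer != "CAPCHA_NOT_READY":     # literal string used by the API
            return answer.split("|")[1]      # the g-recaptcha-response token
```

The returned token would then be injected into the page's g-recaptcha-response field before submitting the form, which in UBot Studio you would do with your own change-attribute or JavaScript step.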