UBot Underground

ilovepizza

Fellow UBotter
  • Content Count

    21
  • Joined

  • Last visited

Community Reputation

2 Neutral

About ilovepizza

  • Rank
    Member

Profile Information

  • Gender
    Not Telling

System Specs

  • OS
    Windows 8
  • Total Memory
    4 GB
  • Framework
    v3.5 & v4.0
  • License
    Developer Edition

Recent Profile Visitors

2943 profile views
  1. I see. So this is more like a known fact and not a bug then... I may need to switch to the ExBrowser plugin sooner or later...
  2. Hi all, I don't want to report a bug yet because it may be just me... but when I navigate to ipleak.net with the built-in browser, I can see my true IP under the WebRTC results (and the proxy IP in the remaining results). I am using UBot Dev Edition 5.9.43, built-in Chrome 49, with elite private proxies. (Naturally, with a Firefox browser that has WebRTC disabled, the same test is negative.) Anyone else seeing this? A sketch of the kind of WebRTC check ipleak.net runs is included after this post list. BW, J
  3. Hi Aymen, could you post another download link pls? Thanks
  4. Fantastic Dan, will try it later! Thank you!
  5. Thank you guys for the suggestions. I was using private proxies and even tried without any proxy. The site wasn't blocking or refusing requests when I navigated straight to it without http get. Since I am new to socks, I was wondering if the site could potentially block requests when no browser is used. Could that be the case, theoretically? (There is a sketch of how a server might tell the difference after this post list.)
  6. Or you could do it this way...

     clear cookies
     clear all data
     navigate("http://www.tripadvisor.ca/Restaurants-g155032-Montreal_Quebec.html","Wait")
     wait(1)
     add list to list(%company name,$scrape attribute(<class="property_title ">,"innertext"),"Delete","Global")
     add list to table as column(&restaurant,0,0,%company name)
     add list to list(%company url,$scrape attribute(<class="property_title ">,"fullhref"),"Delete","Global")
     set(#RowCounter,0,"Global")
     loop($list total(%company url)) {
         navigate($next list item(%company url),"Wait")
         wait(1)
         add list to list(%reviews,$s
  7. Hi guys, I am just starting out with the HTTP Post plugin and have tried to find the answer to the following question on the forum, but no luck. I am on UBot 5.5 revision 13. Can you let me know why I get a 403 error if I try "http get" on this site: http://myip.is/ while it works perfectly fine with Google? This works with Google:

     ui text box("Keyword:",#Keyword)
     set(#google_results,$plugin function("HTTP post.dll", "$http get", "https://www.google.com/search?q={#Keyword}", $plugin function("HTTP post.dll", "$http useragent string", "Firefox 27.0 Win7 64-bit"), "", #Pr
  8. OK, so I uninstalled Aymen's HTTP Post and now the built-in socks container works again. I will post this in the HTTP Post Plugin thread and see if anyone knows more.
  9. I was using Aymen's plugin instead of the socket container because I would like to use proxies. Oddly enough, the socket navigate get (your code above) does not return any content, even though I was using this successfully before. I am beginning to think there might be an issue with my setup. All I did recently was update to revision 13 and install Aymen's plugin. Can anyone verify that Aymen's HTTP Get works fine with myip.is?
  10. Hi guys, I am just starting out with the HTTP Post plugin and have tried to find the answer to the following question on the forum, but no luck. Can you let me know why I get a 403 error if I try "http get" on this site: http://myip.is/ This works with Google:

      ui text box("Keyword:",#Keyword)
      set(#google_results,$plugin function("HTTP post.dll", "$http get", "https://www.google.com/search?q={#Keyword}", $plugin function("HTTP post.dll", "$http useragent string", "Firefox 27.0 Win7 64-bit"), "", #Proxy, ""),"Global")

      But it does not work with myip.is (403):

      set(#myipis,$plugin function("HTTP post.d
  11. OK, it seems the reqid is assigned per "session" (e.g. 450186). Once the page is loaded, the reqid is set. If you reload the Google profile, you get a fresh reqid, and the difference between the two is the number of seconds that have passed between the refreshes. That means the "reqid-to-be-assigned" counts upwards with every second that passes: if 100 seconds have passed, the new reqid will be 450286. The first digit of the reqid is a counter for the number of requests made with that reqid; in our example it would be the 4 in 450186. With every request this number counts up (there is a small sketch of this arithmetic after this post list). Now w
  12. Hi all, I am trying to scrape Google reviews via socks. The problem: Google only displays approx. 8 reviews when the Google profile is loaded. The next batch of remaining reviews is loaded when the "More" button is pressed; I guess it is loaded via JavaScript. The Google profile URL remains unchanged. My question: is there any way to scrape all reviews of a profile via socks? (A rough sketch of one possible approach is included after this post list.) PS: I haven't tried Aymen's HTTP Post plugin. Would it work with Aymen's plugin? How? Any idea is appreciated. Thank you. ----- I am sorry, I am very new to this, so maybe none of this makes much sense, but he
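
Sketch for item 2 (the WebRTC leak): a minimal browser-side snippet, written in TypeScript rather than UBot script, of roughly the kind of check ipleak.net performs. WebRTC's ICE candidate gathering does not go through the browser's HTTP proxy settings, so the candidate strings can contain the machine's real IP even when normal page traffic uses the proxy. The STUN server URL is only an example.

    // Minimal sketch of a WebRTC IP check; runs in a browser page, not in UBot.
    // ICE candidate gathering bypasses the HTTP proxy, so candidates can expose the real IP.
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // public STUN server (example)
    });

    pc.onicecandidate = (event) => {
      if (event.candidate) {
        // Candidate strings look like "candidate:... udp ... <ip> <port> typ srflx ..."
        console.log(event.candidate.candidate);
      }
    };

    // Creating a dummy data channel and an offer is enough to trigger candidate gathering.
    pc.createDataChannel("probe");
    pc.createOffer().then((offer) => pc.setLocalDescription(offer));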
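
Sketch for items 5, 7, and 10 (the 403 from myip.is and the question of whether a site can block non-browser requests): a server only sees the request headers, so a request with a missing or unusual User-Agent or Accept header can be refused with a 403 even though the same page loads fine in a browser. Below is a hedged TypeScript sketch (Node 18+, not UBot script) comparing a bare request with one that sends browser-like headers; the header values are examples only.

    // Sketch: the same URL requested without and with browser-like headers.
    // Servers that filter non-browser clients often key on User-Agent/Accept,
    // so the two requests can receive different status codes.
    const url = "http://myip.is/";

    async function probe(headers: Record<string, string>, label: string): Promise<void> {
      const res = await fetch(url, { headers });
      console.log(`${label}: HTTP ${res.status}`);
    }

    async function main(): Promise<void> {
      await probe({}, "bare request");
      await probe(
        {
          "User-Agent":
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0 Safari/537.36",
          Accept: "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        },
        "browser-like request",
      );
    }

    main().catch(console.error);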
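
Sketch for item 11 (the reqid pattern): a small TypeScript illustration of the behaviour described there, under the assumption that it generalizes beyond the observed example. A fresh reqid is the old base plus the number of seconds elapsed, and the leading digit is bumped once per request made with the same reqid.

    // Illustration of the reqid behaviour described in item 11 (assumed, not confirmed).
    // Example from the post: base reqid 450186; 100 seconds later a fresh one is 450286,
    // and each request made with a reqid bumps its leading digit (4 -> 5 -> 6 ...).

    function freshReqid(baseReqid: number, secondsElapsed: number): number {
      // The "reqid-to-be-assigned" counts up by one per second.
      return baseReqid + secondsElapsed;
    }

    function nextRequestReqid(reqid: number): number {
      // Bump the leading digit to count another request made with the same reqid.
      const digits = String(reqid);
      const lead = Number(digits[0]) + 1;
      return Number(String(lead) + digits.slice(1));
    }

    console.log(freshReqid(450186, 100)); // 450286, matching the example in the post
    console.log(nextRequestReqid(450186)); // 550186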
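
Sketch for item 12 (loading the remaining reviews): when the "More" button pulls reviews in via JavaScript, the browser is simply firing an extra background HTTP request. One possible approach is to find that request in the browser's developer tools (network tab) and replay it directly, advancing whatever offset or reqid parameter it carries. The TypeScript below is only a shape sketch; the endpoint, the query parameters, and the batch size of 8 are hypothetical placeholders, not Google's real review API.

    // Sketch only: the endpoint and query parameters are hypothetical placeholders.
    // The idea is to replay the XHR that the "More" button triggers, paging through
    // the batches by advancing its offset parameter.
    async function fetchReviewBatch(profileUrl: string, offset: number): Promise<string> {
      const batchUrl = `${profileUrl}?reviews=true&start=${offset}`; // placeholder parameters
      const res = await fetch(batchUrl, {
        headers: { "User-Agent": "Mozilla/5.0" }, // minimal browser-like header (example)
      });
      return res.text();
    }

    async function main(): Promise<void> {
      const profileUrl = "https://example.com/google-profile"; // placeholder URL
      for (let offset = 0; offset < 40; offset += 8) { // roughly 8 reviews per batch, per item 12
        const html = await fetchReviewBatch(profileUrl, offset);
        console.log(`batch at offset ${offset}: ${html.length} bytes`);
      }
    }

    main().catch(console.error);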