UBot Underground

steelersfan

Fellow UBotter
  • Content Count: 203
  • Joined
  • Last visited
  • Days Won: 10

Everything posted by steelersfan

  1. YOU never heard of VirusTotal, so it must be garbage. Yet it is widely used by everyone when checking out software across many different mediums. Yet YOU, the person with a clear bias and agenda, deem it to be worthless and suggest people ignore it. NO! You are being a biased, disingenuous, and hostile little spin weasel, nothing new here... From your post, you DON'T seem to understand his or anyone's concern, and only seem concerned yourself with making excuses for ubot and downplaying the reality that it faces as a product that is supposed to produce clean software that won't trigger pot
  2. The core software has not been updated in over 6 months. The last time I complained about this, the owner just came along and tried to spin what I said, and mentioned an update that was like 5 months old (so technically not 6 months, okay fair enough). Then the thread was deleted altogether. Now, here we are a long time after that, and STILL NO UPDATES! Yet I am foolishly paying for them, as I suspect many of you are as well. What a waste of money, just to get people who are hostile and secretive to come here and make excuses, spin, placate, and outright lie by omission that this program is in
  3. I want to get this. Is it still up to date with the current Yelp? When was it last updated? Any discount available? Thank you.
  4. I agree, I mean why even bother subscribing? I didn't cancel mine just to see how long updates would take, and because it is peanuts to me. Also it gives me the absolute right to bitch and rant, which I am damn well going to do. But I expected as much. There is no incentive to take care of such a small base of loyal customers; making money will always come first. The only customer bases that get catered to are the ones who are large enough to depend upon for profits. Sadly, we aren't enough people, so there is no incentive to please this group. We get bastardized and marginalized because th
  5. Why are we paying for updates when none come for 6 months? There is a multitude of things that need to be fixed, and countless upgrades that are constantly ignored. Why are we made to wait so long for any updates at all? Why is there NO COMMUNICATION OF DIRECTION? Why are questions like mine ignored or, worse yet, deleted? And why would any business owner adopt such a cowardly way to deal with their customers' valid concerns?
  6. Nothing you have said has directly addressed the point I made. Typical of disingenuous people. I don't care how long it took you. SSUB probably took just as long, and you sold it at a lower price point, perhaps a mistake on your part, but nonetheless you made that choice. Now you have the audacity to tell your customers, who have put up with your piss-poor support and communication throughout owning SSUB, that they can get 30% off a price that was inflated 3 times, for nothing really different? You have to be either a really deceptive and greedy person, or really daft. And I'm not going to b
  7. Yeah, "big" and 30% are two different things... 30% is small; big is 60%. So it's either pay 102 bucks (nearly 3 times the cost of SSUB when new, with the discount!!!), or I have to be stuck with a broken and nearly abandoned tool...
  8. LOL! 147 bucks with 30% off is a slap in the face to owners of the software that you abandoned along with them. So much for a "big" discount... Good luck bloodsucking this already tiny community dry; I doubt many will be interested after being treated as such...
  9. Very well said, my friend! These are my thoughts exactly as well. This guy really lives up to the name "lazy", and so much so that it is pathetic! It should be lazy/greedy/inconsiderate botter! This kind of crap happens way too much in online business in general. He'll be having "family issues", maybe he is breast-feeding 6 kids at once, got into 3 car accidents and found out he had cancer all in the same hour, grandma died, dog ran away, etc. Massive excuses and no actual consideration for customers at all. The only reason he even has a market is because he has the only UI tool worth using for
  10. Not for nothing, but Lazy Botter, you have a real bad habit of dropping out of touch and ignoring things for long stretches of time. That is a horrible way to handle customer service, and it makes people not want to give you their money easily, even if you have a monopoly on UI builders, which are needed for professional-looking bots. Seriously man, the fix for this current problem is not even that hard to do and push out. You have no excuse to be so uncaring toward your customer base; it is really annoying! Have some consideration for the people who put money in your pockets, this kind of hiding and ignorin
  11. Any news on an SSUB update? And any ETA on the new program? Also, it would be nice if the new program could do data grids, graphs, and pie charts, and hopefully section partitioning. How about it?
  12. Indeed, thanks to your help and Code Docta! Thanks, guys! Oh, and I look forward to your tutorials' launch! If you need any help with ideas or lesson plans, let me know. I would be glad to help, and you know I have a history of asking challenging and instructive questions! Questions that I'm sure everyone will benefit from at the intermediate level.
  13. So I figured it out! It feels kind of hacky, but I think it is a fine way to loop the process... I just used "loop while" with "contains" on the comments I got back, checking for the word nextPageToken. So what I did was renew the comments page within the loop. All is working as intended now! The code is a lot cleaner than doing it the ubot way, a lot less complicated, and a lot less prone to mistakes or needed rewrites (since it works entirely through the YouTube API).
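      For anyone who wants to see the same idea outside of ubot, here is a rough sketch of that loop in plain Python. This is not my actual ubot script, just the same logic; the API key and video id below are obviously placeholders.

          import json
          import urllib.parse
          import urllib.request

          API_URL = "https://www.googleapis.com/youtube/v3/commentThreads"

          def fetch_all_comments(api_key, video_id):
              # Keep requesting pages while the response still comes back with a
              # nextPageToken, the same check as the "loop while contains nextPageToken" idea.
              comments = []
              page_token = None
              while True:
                  params = {
                      "key": api_key,
                      "videoId": video_id,
                      "part": "snippet",
                      "maxResults": 100,        # per-request cap
                      "textFormat": "plainText",
                  }
                  if page_token:
                      params["pageToken"] = page_token
                  url = API_URL + "?" + urllib.parse.urlencode(params)
                  data = json.loads(urllib.request.urlopen(url).read().decode("utf-8"))
                  for item in data.get("items", []):
                      comments.append(item["snippet"]["topLevelComment"]["snippet"]["textDisplay"])
                  page_token = data.get("nextPageToken")   # missing on the last page
                  if not page_token:
                      break
              return comments

          # Placeholders: swap in a real API key and video id before running.
          print(len(fetch_all_comments("YOUR_API_KEY", "YOUR_VIDEO_ID")))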
  14. Hmm, I am having trouble figuring out how to loop this properly. I was thinking that I needed "loop while", and I thought about using "exists", but that has to scan a page, and I am unsure how to make it work. How can I look into the file and check whether there is another get request to make, so I can set the loop while?
  15. So I have had a bit of trouble absorbing all of this, but then I got hold of the json plugin and started again from scratch. This time around I think I have a firm understanding of the process. So here I will try to explain what I have found, and hopefully it will help me finally get this done, as well as help others who struggle with this kind of thought process. Here is the start of the script:

          ui text box("YouTube API Key:",#ytapikey)
          ui text box("Video Id:",#videoid)
          divider
          divider
          set(#getcomments,$plugin function("HTTP post.dll", "$http get", "https://www.googleapis.com/youtube/v3/commentThreads?key={#yta
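      The snippet above is the ubot version. For comparison, here is a rough sketch of the same first call in plain Python; the exact parameter list in my script may differ, and the key and video id are placeholders.

          import json
          import urllib.parse
          import urllib.request

          # Stand-ins for the two ui text box inputs (#ytapikey and #videoid)
          api_key = "YOUR_API_KEY"
          video_id = "YOUR_VIDEO_ID"

          params = urllib.parse.urlencode({
              "key": api_key,
              "videoId": video_id,
              "part": "snippet",
              "maxResults": 100,
              "textFormat": "plainText",
          })
          url = "https://www.googleapis.com/youtube/v3/commentThreads?" + params

          # One GET request, like the $http get call; the response body is JSON text.
          response = json.loads(urllib.request.urlopen(url).read().decode("utf-8"))
          print(response.get("nextPageToken"))
          print(len(response.get("items", [])))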
  16. Much thanks, brother! I couldn't get the subscription to work for me; you saved the day!
  17. I'm confused as to how to implement this, Nick. Is the long wall of code within the "#json string" variable what comes back after the YouTube API is called? Meaning, I can place the variable holding my response within the "#json string" variable, correct? Or is that actual code that I will need to write into the "#json string" variable?
  18. Yes, can you please update this? Are you alive, buddy?
  19. So, let me see if I understand the code: First I do a get request by setting a variable to the get request, plugging in the API data string needed. It then spits out the data plus the next page token. (I would scrape the comment data somehow to preserve it at this point as well, correct?) Next I set that nextPageToken to another variable via the set command, scraping it with the json parser. Then I use the API string again, but with the next page token in the API string, which will bring me the data needed to repeat the process until all comments are reached? I would just have to figure out ho
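      Roughly, in plain Python terms, I picture the middle steps like this. Just a sketch; the file name is a made-up stand-in for wherever the raw response got saved.

          import json

          # Suppose this file holds the raw text that came back from the GET request
          # (a hypothetical saved copy of what #getcomments contains).
          raw_response = open("comments_page.json", encoding="utf-8").read()
          data = json.loads(raw_response)

          # Preserve the comment data from this page.
          page_comments = [
              item["snippet"]["topLevelComment"]["snippet"]["textDisplay"]
              for item in data.get("items", [])
          ]

          # Set the next page token aside for the follow-up request.
          next_page_token = data.get("nextPageToken")   # missing on the last page

          # The follow-up request is the same API string plus &pageToken=<next_page_token>.
          if next_page_token:
              print("request the next page with pageToken=" + next_page_token)
          else:
              print("last page reached;", len(page_comments), "comments on this page")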
  20. When I run the API in http get, here is what I get:

          {
           "kind": "youtube#commentThreadListResponse",
           "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/F1bK0Wo4VY6v25XGUBJOlK7iXd0\"",
           "nextPageToken": "QURTSl9pMy04NTVUSE1IOEtVWS0yQlFxUlpPdFFUY2RtLWZ3c0Z5cUkyalNESG93R1ZSM21kTmM2T1p4S3puUG5Dd1FPR3lIQ1pJSWR1VEJCQWItWjJnelowdExZWTBJemlVMm5YOW1TeEJhTWNDQTZWTnFiSzBSYjFQQ3RaNm9TaEI5",
           "pageInfo": {
            "totalResults": 10,
            "resultsPerPage": 10
           },
           "items": [
            {
             "kind": "youtube#commentThread",
             "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/WDLSlWYZUCyHIGbMzZC8iFqyQJY\"",
             "id": "z23njbupps3dhrqclacdp4334f0
  21. I have that plugin already. Honestly, it is too much trouble to do this with it. It uses OAuth as opposed to direct key calls, it has no documentation, and it is limited to the same maximum as this method. It's also sadly a PITA to use, and this method is a lot easier, as it just uses the public key to run API functions in one HTTP GET request. If that plugin had some actual documentation and good training on how to set it up properly and use it, I would recommend it. However, it does not, so I wouldn't recommend it at all.
  22. How would I scrape the next page token? I think that is where I am stuck.
  23. Hmm, so that string would scrape the first page, then go to the next page and scrape it as well? I don't see how it would work to keep going page after page, or would I have to loop that string in ubot and make it happen over and over until there are no more results?
  24. I have a bot that scrapes YouTube comments, and the way I have it set up currently, it uses ubot to get the data, not http get. I want to use http get, or the YouTube API, because I don't want a video to load on the end user's machine when they run the bot (a very unprofessional outcome!). My first choice is to use http get, but when I get the video URL data, no comments come up at all. Is there a way around this, without having to use the API? This would be my preferred method, because the API limits the comments to 100, and I want to get all comments for any given video (sometimes in the thousands!)