Weird Issue, $Subtract_Lists Works In Ubot But Doesn't When Bot Is Compiled


Hi,

 

This is really strange. I made a bot, and part of it does the following:

 

  1. Scrapes the news urls (%urls) from a website and saves them to a text file (the file's contents are loaded back as %fileUrls).
  2. The next time it runs (e.g. the next day), it scrapes again and saves only the newer urls (%newUrls) to the file, by subtracting %fileUrls from %urls (the core of this step is sketched just below).
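
The core of step 2, using the same commands that appear in the full script later in this thread, is roughly this (a minimal sketch; it assumes %urls already holds the freshly scraped urls):

add list to list(%fileUrls,$list from file("c:\\urls.txt"),"Delete","Global")
add list to list(%newUrls,$subtract lists(%urls,%fileUrls),"Delete","Global")
append to file("c:\\urls.txt","{$new line}{%newUrls}","End")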

 

To test the source code, I manually removed a few urls from the file and then ran the bot. It correctly showed the new urls and saved them to the file, which shows the design is correct.

 

But when I compile it, it always returns "0" for the number of new urls.

 

I used the alert command to show the $list total of every list and compiled again.

It shows the correct numbers inside UBot (running the source code), but the compiled bot doesn't show the right count for the result of subtracting %fileUrls from %urls.
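
The debug checks were just alerts on $list total, something along these lines:

alert("fileUrls: {$list total(%fileUrls)}")
alert("urls: {$list total(%urls)}")
alert("newUrls: {$list total(%newUrls)}")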

 

I'm really stuck on this. Have you guys had a similar issue?

 

What would be an alternative method for subtracting lists and saving the result?
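
One workaround I'm considering is to skip $subtract lists entirely and build the difference with a manual loop, something like the sketch below (not tested in a compiled bot; the #position counter and the $contains check against the raw file text are just my assumptions about how to do it):

clear list(%newUrls)
set(#fileText,$read file("c:\\urls.txt"),"Global")
set(#position,0,"Global")
loop($list total(%urls)) {
    if($contains(#fileText,$list item(%urls,#position))) {
        then {
        }
        else {
            add item to list(%newUrls,$list item(%urls,#position),"Delete","Global")
        }
    }
    increment(#position)
}
append to file("c:\\urls.txt","{$new line}{%newUrls}","End")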

 

Thanks


Pash, I'm not sure if you have the ExBrowser plugin installed, but I use it.

 

Here is the code; I tested it, and it (doesn't) work exactly as I described.

 

Create an empty file named "urls.txt" on drive C: before running it.

clear all data
comment("Load the urls that were saved on the previous run")
add list to list(%fileUrls,$list from file("c:\\urls.txt"),"Delete","Global")
plugin command("ExBrowser.dll", "ExBrowser Launcher", "Chrome", "", "")
plugin command("ExBrowser.dll", "ExBrowser Navigate", "https://webdesign.tutsplus.com/categories/html")
comment("Scrape the post titles and their urls with ExBrowser")
set(#titles,$plugin function("ExBrowser.dll", "$ExBrowser Scrape List Elements", "//li[contains(@class, \'posts__post\')]/article/header/a[contains(@class, \'posts__post-title\')]/h1[contains(@class, \'nolinks\')]"),"Global")
set(#urls,$plugin function("ExBrowser.dll", "$ExBrowser Scrape List Elements Attribute", "//li[contains(@class, \'posts__post\')]/article/header/a[contains(@class, \'posts__post-title\')]", "href"),"Global")
add list to list(%titles,$list from text(#titles,$new line),"Delete","Global")
add list to list(%urls,$list from text(#urls,$new line),"Delete","Global")
comment("New urls = scraped urls minus the urls already in the file")
add list to list(%newUrls,$subtract lists(%urls,%fileUrls),"Delete","Global")
alert("New urls: {$list total(%newUrls)}")
append to file("c:\\urls.txt","{$new line}{%newUrls}","End")

Run it inside UBot and then compile it.

 

If there are urls in the "urls.txt" file after a run, remove some of them manually and run it again.

Oh, sorry, I don't have the ExBrowser plugin.


I tried to replace the ExBrowser code with UBot's built-in commands. It crashes every time while scraping the urls and closes itself (I'm so happy I bought ExBrowser!).

 

Actually, you just have to replace this one line with UBot's built-in command(s) for scraping the urls of the titles on the page (you can ignore the title scraping):

set(#urls,$plugin function("ExBrowser.dll", "$ExBrowser Scrape List Elements Attribute", "//li[contains(@class, \'posts__post\')]/article/header/a[contains(@class, \'posts__post-title\')]", "href"),"Global")
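
With UBot's built-in browser, the replacement might look something like the lines below. This is only a sketch; the wildcard class selector is my guess at how the XPath translates to UBot's element syntax, and it would replace both the set(#urls, ...) line and the add list to list(%urls, ...) line, since $scrape attribute already returns a list:

navigate("https://webdesign.tutsplus.com/categories/html","Wait")
wait for browser event("Everything Loaded","")
add list to list(%urls,$scrape attribute(<class=w"*posts__post-title*">,"fullhref"),"Delete","Global")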


Thanks, HelloInsomnia, for trying the code.

 

Did you remove some of the lines from the urls.txt file after the first run (which gathers the urls) and then run it again?

 

My customer tested the compiled bot and confirmed that it doesn't work on his system either!

 

I found no related bug report on https://tracker.ubotstudio.com/, so I can't be sure it will be covered in an update.

I ran it with a blank file, removed 4 lines and ran it again, and I believe I also ran it without the file. I know I tried it a few ways and it worked fine for me.


I'm having similar issues with my compiled bots. I've been going back and forth with support about it.

 

So it's not just me.

 

I'd appreciate it if you could let me know whether they acknowledge the issue and have a solution for it.
