UBot Underground


While I was scraping items from YouTube (mobile version), I got these errors:

 

http://imgur.com/FIIDeyl    >>>>  Error converting value True to type blah blah blah

 

http://imgur.com/nUdLvP0 >>>>   Cannot deserialize JSON blah blah

 

 

The errors always happen when the list reaches the same number of items (1,090). Any advice?

 

thanks

 

 

 

Luis Carlos



Thanks Buddy, this is my script. The video in the example has many comments; I just want to scrape the outer HTML after clicking "Show more" one hundred times, in order to load enough comments.

set user agent("iPhone")
navigate("http://m.youtube.com/#/watch?v=FYJIR21KDEM", "Wait")
wait(5)
comment("Open the comments section")
click(<innertext=w"COMMENTS*">, "Left Click", "No")
wait(5)
comment("Click \"Show more\" 100 times to load more comments")
loop(100) {
    click(<outerhtml=w"<span role=\"button\" class=\"*\" title=\"Load more posts\" tabindex=\"0\"*>Show more</span>">, "Left Click", "No")
    wait(4)
    plugin command("OSCommandsFunctions.dll", "os free memory")
    run javascript("window.scrollTo(0, document.body.scrollHeight);")
}
comment("Scrape the outer HTML of every matching image element into %test")
add list to list(%test, $scrape attribute(<outerhtml=w"<img src=\"*\" width=\"*\" height=\"*\" alt=\"\" class=\"*\" oid=\"*\">">, "outerhtml"), "Delete", "Global")


Well.

 

I don't have that plugin, but after I removed it and reduced the 100 to about 10, it ran fine for me. Why are you scraping images, or rather their HTML tags? From your comment above I thought you were targeting comments. Anyway...

 

I think (and I am guessing here) that you could be consuming too much memory, so UBot is dying on you. In this case, the $scrape attribute command is probably what is hurting you.

 

At this point, I would switch to the "$element offset" node and scrape each element separately. After grabbing a few, save them to a file (appending as you go), reset your list, and continue using the "$element offset" node.
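A rough sketch of that element-offset approach in uScript might look like the following. The selector, loop count, and file path here are placeholders I made up for illustration, not tested values:

```
comment("Scrape one element at a time by offset, appending to disk as we go")
set(#offset, 0, "Global")
loop(2000) {
    comment("Grab the element at the current offset (selector is a placeholder)")
    set(#item, $scrape attribute($element offset(<tagname="img">, #offset), "outerhtml"), "Global")
    comment("Append it to a file so no large list builds up in memory")
    append to file("C:\\scrape\\comments.txt", "{#item}{$new line}", "End")
    increment(#offset)
}
```

Since only one scraped value is held in a variable at a time, memory usage should stay flat no matter how many elements you walk through.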

 

Just my thoughts.

 

Buddy


Same issue still here, even on simple pages. It happens to me when the page has a timeout, and then it repeats every subsequent time I use a scrape command, whether $exists or page scrape.

Maybe we need to make a poll to see how many people have this problem, as I think this issue is more common than what is posted here.

