UBot Underground

Cannot Deserialize Json Array Into Type 'system.boolean'



Hey guys,

 

How do you fix this error: "Cannot deserialize JSON array into type 'System.Boolean'"? I was trying to scrape a particular value on a page inside a loop, with a 10-second interval between scrapes.

 

Thanks!

  • 1 month later...

I have the same issue

 

I am looping this code through different Amazon product codes (I have not included that part, as it is not where the crash occurs). After about a minute, and maybe 300 reviews from Amazon, it crashes with this deserialize JSON error. Do I need to slow it down? Most pages are fine, but then suddenly something trips it up. After stopping the script, both "add list to list" commands keep crashing when run manually, until I run the bot again.

 

loop($subtract(#do_list_total,1)) {
    navigate($next list item(%URL),"Wait")
    comment("Open the full review listing for the current product")
    click(<class="a-link-emphasis a-text-bold">,"Left Click","No")
    loop(12) {
        if($exists(<(innertext=w"Next page*" AND outerhtml=r"(<a href=)")>)) {
            then {
                comment("Scrape this page of reviews, write them to the CSV, then advance")
                add list to list(%stars,$scrape attribute(<data-hook="review-star-rating">,"innertext"),"Don\'t Delete","Global")
                add list to list(%review,$scrape attribute(<id=w"customer_review-*">,"innertext"),"Don\'t Delete","Global")
                add list to table as column(&reviewResult,0,1,%stars)
                add list to table as column(&reviewResult,0,2,%review)
                set(#review_count,$list total(%stars),"Global")
                loop(#review_count) {
                    add item to list(%fill_dp,$list item(%dp_codes,#dp_counter),"Don\'t Delete","Global")
                }
                add list to table as column(&reviewResult,0,0,%fill_dp)
                append to file("{#filepath}\\{#botname}\\reviews.csv",&reviewResult,"End")
                append to file("{#filepath}\\{#botname}\\reviews.csv",$new line,"End")
                clear list(%fill_dp)
                clear list(%review)
                clear list(%stars)
                clear table(&reviewResult)
                click(<innertext=w"Next page*">,"Left Click","No")
                wait for browser event("Everything Loaded","")
            }
            else {
                comment("Last page: scrape once more without clicking Next")
                wait for browser event("Everything Loaded","")
                add list to list(%stars,$scrape attribute(<data-hook="review-star-rating">,"innertext"),"Don\'t Delete","Global")
                add list to list(%review,$scrape attribute(<id=w"customer_review-*">,"innertext"),"Don\'t Delete","Global")
            }
        }
    }
    increment(#dp_counter)
}
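Crashes that only show up after a few hundred requests are often rate-related. As a general illustration (this is Python, not UBot syntax, and the function names are hypothetical), pacing each request and retrying on transient failures looks like this:

```python
import time

def fetch_with_retry(fetch, retries=3, delay=0.5, backoff=2.0):
    """Call fetch(); on failure wait and retry, re-raising after the last attempt."""
    for attempt in range(retries):
        try:
            return fetch()
        except Exception:
            if attempt == retries - 1:
                raise          # out of attempts: surface the real error
            time.sleep(delay)  # pause before retrying
            delay *= backoff   # wait longer each time
```

The same idea applies in UBot: a wait between page loads, and re-running a scrape that comes back empty, rather than letting one bad response kill the whole run.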


True, but it would still be good to know what the error is actually about. Do you think it is a bug or an error in my code?

 

For those who are interested, here is the Regex and XPath solution, which seems to get around this issue. It scrapes Amazon reviews.

 

add list to list(%section,$plugin function("XpathPlugin.dll", "$Generic Xpath Parser", $document text, "//*[contains(@id, \'customer_review\')]", "outerhtml", "False"),"Don\'t Delete","Global")
set(#list_count,$list total(%section),"Global")
set(#counter,0,"Global")
loop(#list_count) {
    add item to list(%Stars,$find regular expression($list item(%section,#counter),"(?<=title=\")([0-9]\\.[0-9])(?= out of 5 stars)"),"Don\'t Delete","Global")
    increment(#counter)
}
add list to table as column(&resultsTable,0,0,%Stars)
add list to list(%Review,$plugin function("XpathPlugin.dll", "$Generic Xpath Parser", $document text, "//*[contains(@class, \'a-size-base review-text review-text-content\')]", "innertext", "False"),"Don\'t Delete","Global")
loop(#counter) {
    add item to list(%trim_review,$trim($next list item(%Review)),"Don\'t Delete","Global")
}
add list to table as column(&resultsTable,0,1,%trim_review)
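For anyone who wants to test the star-rating regex outside UBot, here is a minimal Python sketch. The sample HTML line is an assumption modeled on Amazon's review markup; the pattern itself is the same one used above:

```python
import re

# Same pattern as the UBot script: a rating like "4.0" pulled from
# the title attribute, e.g. title="4.0 out of 5 stars"
STAR_PATTERN = re.compile(r'(?<=title=")[0-9]\.[0-9](?= out of 5 stars)')

def extract_star_ratings(html):
    """Return every star rating found in the given HTML string."""
    return STAR_PATTERN.findall(html)
```

The lookbehind/lookahead pair keeps only the numeric rating in the match, so no post-processing of the surrounding text is needed.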

Edited by charliefinale
