UBot Underground

juno

Members
  • Content Count

    9
  • Joined

  • Last visited

  • Days Won

    1

Posts posted by juno

  1. I've been scraping a site for probably 2 years now, and I just noticed today that the URL changed to HTTPS. Is there something they might be doing to keep my bot from navigating to that URL? When I call it, the built-in browser stays on the default UBot page and refuses to load the HTML. This is happening in uncompiled mode.

     

    I'm running the latest version, 6.04 Developer Edition.
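
    For reference, the call itself is just a plain navigate with the wait option, something like this (the URL here is a placeholder, not the real site):

    navigate("https://www.example.com/target-page", "Wait")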

  2. I'm using an AWS EC2 image (Windows Server 2016) to run a compiled bot (most current version of UBot), and the GUI never comes up, despite running as administrator and changing the executable's compatibility settings. When I look at running programs, it shows 0% CPU usage and "Loading..." (screenshot attached).

     

    When I check the Windows event logs, I see Event ID 1002, Source: Application Hang. It also states, "The program my_compiled_bot.exe version 4.0.0.0 stopped interacting with Windows and was closed."

     

    I can launch a lot of these, and Task Manager will list each one as they all keep hanging.

     

    Any ideas how to troubleshoot this?  

     

    [attached screenshot: post-31874-0-43221900-1543409248_thumb.png]

  3. Thanks so much, Code Docta! One question: if I already have the data in table form from scraping an HTML table, why would I need to do these steps where I write it to a file just to read it back into a table again?

     

    save to file("{$special folder("Desktop")}\\test-data.csv","r1,cell 2,cell 3,cell 4,cell 5
    r2,cell 2,cell 3,cell 4,cell 5")
    comment("INSERT INTO db_table (col1, col2, col3, col4, col5) VALUES ('cell1', 'cell2', 'cell3', 'cell4', 'cell5')")
    clear table(&goes into DB)
    create table from file("{$special folder("Desktop")}\\test-data.csv",&goes into DB)
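
    Just to show what I mean, here is a rough sketch of what I was expecting to be able to do once the scrape has already filled &goes into DB, with no file in between (sketch only, the cell position is just an example):

    comment("assuming the scraped data is already sitting in &goes into DB")
    set(#first cell, $table cell(&goes into DB, 0, 0), "Global")
    comment("cells can be read straight from the table, so no save to file / create table from file round trip")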
    
    
    
    
  4. I have scraped an HTML table with as many as 100 rows and 5 columns. Practically everything I have been trying treats the entire table as one variable. For instance, I'm able to write the table to a file.

     

    What I want to be able to do is loop through each row, one by one, and explode the row out into 5 variables: cell1, cell2, cell3, cell4, cell5.

     

    I then want to use these (while still remaining in the loop for this row) in a SQL query:

    INSERT INTO db_table (col1, col2, col3, col4, col5) VALUES ('cell1', 'cell2', 'cell3', 'cell4', 'cell5')

     

    It seems very simple, but I can't quite make this happen in UBot. How do you break up (or explode) table rows into variables like that?
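
    Here is the rough shape of what I am after, as a sketch only: &scraped table is a placeholder name for whatever table the scrape fills, and I have left the actual database insert as a comment because I do not know which database command or plugin to use for that part.

    set(#row, 0, "Global")
    loop($table total rows(&scraped table)) {
        set(#cell1, $table cell(&scraped table, #row, 0), "Global")
        set(#cell2, $table cell(&scraped table, #row, 1), "Global")
        set(#cell3, $table cell(&scraped table, #row, 2), "Global")
        set(#cell4, $table cell(&scraped table, #row, 3), "Global")
        set(#cell5, $table cell(&scraped table, #row, 4), "Global")
        comment("INSERT INTO db_table (col1, col2, col3, col4, col5) VALUES ('{#cell1}', '{#cell2}', '{#cell3}', '{#cell4}', '{#cell5}')")
        increment(#row)
    }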

     

    Thanks.

  5. I'm saving a page scrape to a file, but the file ends up being a zero-byte file. I did a right-click > View Source on the page in the browser panel, and none of the tabular data is present in the source, only a text statement within the HTML saying it is "forbidden to access this site using an automated program".

     

    I'm not sure exactly how the server knew I was using UBot; I cleared cookies and set a referrer and a user agent, but it still knew. Maybe all the JavaScript on the site could tell from the mouse movement that it wasn't a natural person.

     

    How do I get around all of this with UBot? Is it possible?
