vader
Fellow UBotter

Content Count: 9
Joined
Last visited
Community Reputation: 0 Neutral

About vader
Rank: Newbie

Profile Information
Gender: Not Telling

System Specs
OS: Windows 8
Total Memory: < 1Gb
Framework: v3.5
License: Standard Edition
-
Ok, got it. Thanks.
-
@IRobot Thanks a lot for that code!!...my bot is now functioning properly. My code was pretty close to yours, but the one thing I was missing (which may have been the cause of my error) was the "Clear List %row" at the beginning of the loop. Can you explain why this is needed? I "assumed" that clear list would clear everything out and I'd just keep looping through Row 1 data over and over....apparently I just don't understand how clear list works. Anyway, thanks again for providing the sample code, it was a HUGE help and big timesaver for me. Cheers!
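(For anyone else who hits this: the clear matters because "Add to List" appends. Without clearing at the top of each pass, row 2's fields land after row 1's, so position 1 keeps pointing at row 1 data. A rough Python sketch of the idea, not uBot syntax:)

```python
# Illustrative Python sketch (not uBot syntax) of why the row list
# must be cleared each pass: without the clear, the new row's fields
# are appended AFTER the old ones, so index 0 still holds row 1 data.
rows = [
    "row1_item1,row1_item2,row1_item3,row1_item4",
    "row2_item1,row2_item2,row2_item3,row2_item4",
]

row_fields = []
first_fields = []
for line in rows:
    row_fields.clear()                  # the "Clear List %row" step
    row_fields.extend(line.split(","))
    first_fields.append(row_fields[0])  # always the CURRENT row's field 1

print(first_fields)  # ['row1_item1', 'row2_item1']
```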
-
I don't think I'm missing any commas. My input file is being created by Excel (Save as > CSV), so I'm not manually inputting any commas. Excel's default CSV format is to organize things like I originally listed (no comma after the last value in a row). Since this is standard CSV formatting, I'm puzzled as to why my row1 and row2 data is getting mixed together when I put it into the form fields. Perhaps I'm just missing a Ubot command somewhere in my loop that'll solve the problem (I'm very new to Ubot still). Thanks.
-
vader started following Trouble filling in forms w/ data from CSV
-
I'm having some issues pulling data from a CSV to fill in some HTML forms. The problem appears to be with how the CSV is being read. For example, data in a typical CSV looks like this:

row1_item1, row1_item2, row1_item3, row1_item4
row2_item1, row2_item2, row2_item3, row2_item4
row3_item1, row3_item2, row3_item3, row3_item4

I'm trying to fill in 4 form fields, but I'm getting some quirks when Ubot cycles to the 4th field. Here is what I get the first loop through for values for each field (using the sample format above as an example):

1. row1_item1
2. row1_item2
3. row1_item3
4. row1_it
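(The per-row behavior being described can be sketched in Python rather than uBot — read one record at a time so row 1's last field and row 2's first field can never run together; inline data stands in for the CSV file:)

```python
# Minimal Python sketch (not uBot) of the intended per-row behavior:
# treat each line as its own record so the 4th field of row 1 never
# bleeds into the 1st field of row 2.
import csv
import io

data = (
    "row1_item1,row1_item2,row1_item3,row1_item4\n"
    "row2_item1,row2_item2,row2_item3,row2_item4\n"
)

for row in csv.reader(io.StringIO(data)):
    field1, field2, field3, field4 = row  # exactly four fields per row
    print(field4)
# prints:
# row1_item4
# row2_item4
```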
-
Seth, Thanks for the tips. I "think" I follow what you're saying...I'm still a big newb, but will play around w/ what you suggested. One question though, how can I separate the City/State/Zip (ex. Everett, MA 02149) since there aren't any <BR>s in between those? I really need to have that data in separate columns of my final DB. I guess if worst comes to worst I could do some creative Excel cleanup (text to columns) after, but I was hoping to avoid that. Thanks.
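(One way to do the City/State/Zip split, sketched in Python rather than uBot — the "City, ST 12345" shape is assumed from the example in the post:)

```python
import re

# Hedged sketch: split a "City, ST 12345" line into three pieces.
# Assumes a two-letter state code and 5-digit ZIP, as in the example.
line = "Everett, MA 02149"
m = re.match(r"(?P<city>.+?),\s*(?P<state>[A-Z]{2})\s+(?P<zip>\d{5})", line)
city, state, zip_code = m.group("city", "state", "zip")
print(city, state, zip_code)  # Everett MA 02149
```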
-
Thanks for the reply. I stepped back a couple of DIVs and this is what I get:

<DIV class=dealer><DIV class=dealerorder>1</DIV>
<DIV class=dealerinfo>
<DIV class=dealerdetail><A onmouseover="window.status=''; return true;" onmouseout="window.status=''; return true;" href="results.aspx?cs=2&dealer=206944"><STRONG>Acme Cars of Boston </STRONG></A></DIV>
<DIV>100 Broadway<BR>Everett, MA 02149<BR>(617)381-9000<BR><STRONG>2.7 miles away</STRONG> </DIV></DIV>
<DIV class=action> <
-
I'm trying to scrape some addresses and I'm running into problems b/c the target site doesn't really have any detailed labeling in the code. For example:

</strong></a></div>
<div>
100 Broadway<br />
Everett, MA 02149<br />
(617)381-9000<br />
<strong> 2.7 miles away</strong>
</div>
</div>

So, I can get the street address (100 Broadway), but I can't figure out how to isolate City, State, Zip, & Phone. I really want to pull all of these elements separately (as opposed to chucking the entire address into 1
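(Since the <br> tags are the only separators in that block, one approach — sketched in Python, not uBot — is to split on them and strip whatever markup is left in each piece:)

```python
import re

# Sketch based on the snippet above: split the address block on <br>
# tags, then strip any remaining markup from each piece.
html = ("100 Broadway<br /> Everett, MA 02149<br /> "
        "(617)381-9000<br /><strong> 2.7 miles away</strong>")

parts = [re.sub(r"<[^>]+>", "", piece).strip()
         for piece in re.split(r"<br\s*/?>", html, flags=re.IGNORECASE)]
print(parts)
# ['100 Broadway', 'Everett, MA 02149', '(617)381-9000', '2.7 miles away']
```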
-
Figured this out after the ~10th time through. Seems there was a random space in the outerhtml that was throwing off the matching, thus producing zero results. Grrr.
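(For future reference, a whitespace-tolerant match sidesteps this whole class of bug; sketched in Python since uBot's matching works differently:)

```python
import re

# Sketch: turn each literal space in the target into \s+ so a stray
# extra space in the scraped outerhtml can't break the match.
scraped = "<DIV  class=dealerdetail>"   # note the extra space
target = "<DIV class=dealerdetail>"
pattern = target.replace(" ", r"\s+")
print(bool(re.search(pattern, scraped)))  # True
```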
-
I'm trying to follow along with the Ubot video which demonstrates how to make a Google kw scraper. I'm up to the part where the keywords get scraped, put in a list, and then saved to a txt file. When I run it, my txt file keeps coming up empty. I've re-done it a few times now, but keep getting the same results (nothing). Any help is greatly appreciated. Thanks. google-keyword-scraper-bot.ubot