UBot Underground

christojuan

Members
  • Content Count
    38
  • Joined
  • Last visited
  • Days Won
    1

christojuan last won the day on August 24 2017

christojuan had the most liked content!

Community Reputation

5 Neutral

About christojuan

  • Rank
    Advanced Member

Profile Information

  • Gender
    Not Telling

System Specs

  • OS
    Windows 7
  • Total Memory
    8 GB
  • Framework
    unsure
  • License
    Developer Edition

Recent Profile Visitors

2230 profile views
  1. Hey - thanks for your response. Unfortunately, I don't understand what you are suggesting. Let me try asking from another angle: the Google scraper part of this bot is here: https://www.screencast.com/t/QUmxmt0fnr70 I added the highlighted section to try to get the script to take the 2 input keywords %keyword_inputs and place them in the first column of the table, next to the associated URL result. It should look like this: kw1, url 1; kw1, url 2; kw1, url 3; ... kw1, url 30 AND kw2, url 1; kw2, url 2; kw2, url 3; ... kw2, url 30, all of the above in the same table. The problem is that
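The table shape described above (every result row prefixed with the keyword that produced it) can be sketched outside of UBot. This is a hypothetical Python illustration of the pairing logic only, not the actual bot script; the function and data names are invented.

```python
# Hypothetical sketch (Python, not UBot script): build a table where each
# row is [keyword, url], repeating the keyword once per scraped result.
def build_keyword_table(results_by_keyword):
    """results_by_keyword maps each input keyword to its list of result URLs."""
    rows = []
    for keyword, urls in results_by_keyword.items():
        for url in urls:
            rows.append([keyword, url])  # keyword lands in column 1 of every row
    return rows

table = build_keyword_table({
    "kw1": ["url 1", "url 2"],
    "kw2": ["url 1"],
})
# table == [["kw1", "url 1"], ["kw1", "url 2"], ["kw2", "url 1"]]
```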
  2. Hello, The script below does an excellent job of navigating to a list of URLs, taking a full-page screenshot, saving it with a unique name, and then saving it to a specified folder. I have 2 additional things that I'd like it to do and would appreciate some input: 1) Desktop and mobile screenshots - The script appears to take a screenshot based on the current size of the window. In the example screenshot below: https://www.screencast.com/t/RVzft6Gd7LpY test-new-0.png is a screenshot of a responsive website that was generated when I manually narrowed the window before running the
  3. Hi, I'm trying to modify a free Google Scraper example script by Nick (Hellloinsomnia): http://imautobots.co...oogle-scraper/ (note that in another post there is a fix/update (http://network.ubotstudio.com/forum/index.php/topic/21896-ez-google-scraper-given-key-not-present-error/?do=findComment&comment=133868) that you may find necessary if you decide to download and help). There are 4 tabs, and the one named Google is where the scrape is executed. The script is awesome, but I'm trying to dump the URL results from each keyword query search into a table where row 1/col 1 is keyword 1 url
  4. Hey Nick! Just changing those 2 lines prevented saving, but I changed line 73 to the same as your suggestion for line 103 and it worked! Thanks for your help! Chris
  5. Hi Nick, I have 1.9.1.0 (File Management) - just redownloaded to confirm that it's the most current - and 1.2.20 for the Local Dictionary (just downloaded yesterday). I also have these other plugins and versions: https://goo.gl/tcQVn7 Any insight on a fix would be awesome. Thanks! Chris
  6. Hi, Nick has a very cool looking free Google scraper here: http://imautobots.com/downloads/ez-google-scraper/ I was excited to download and learn about scraping Google with proxies, so I added all of the required Aymen plugins and tried to run the script, but I keep getting this error: https://www.screencast.com/t/kVbSmCimz3 Has anyone run into this? If so, do you have any suggestions for a fix? Any feedback/help would be appreciated. Thanks! Chris P.S. I am running the developer version of 5.9.55
  7. Hey Nick - Thanks for the reply. The first 3 lists work fine for me too. The issue is that List04 is not capturing the URLs associated with each of the 3 listings in the Google Local Pack. BTW, I am using Chrome 49 and I tried using the user agent you referenced, but no go. I'm trying to get each of the 3 URLs like the first one in this example: https://www.screencast.com/t/hiZZGuACaLe where you can see https://www.collaborativelawrlh.com/ Is it possible to pull that with XPath? Any help/insights would be appreciated. Chris
  8. Hi, I'm trying to scrape results from Google using XPath, but I'm struggling with one issue. As you can see below, I am able to scrape: List01 - top 10 Google search results; List02 - Google Local Pack titles; List03 - Google AdWords URL. But I am also trying to scrape the URLs associated with List02 into List04. The XPath that I am applying below works when using a tool called SeoTools for Excel (which allows use of XPath to scrape data), but it does not seem to be working here. Any insights would be greatly appreciated. clear list(%list01) clear list(%list02) clear list(%list03) clear list(%list
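One way to sanity-check this kind of selection outside of UBot or SeoTools is to run it against the raw HTML directly. The sketch below uses Python's standard library to pull the hrefs that sit alongside a list of titles; the markup, class names, and URLs are invented stand-ins, not the real Google Local Pack HTML.

```python
# Hypothetical sketch (Python stdlib, not UBot): select each entry's anchor
# element, then read its href attribute. The markup is invented for
# illustration; real Google markup differs.
import xml.etree.ElementTree as ET

snippet = """
<div class="pack">
  <div class="entry"><a href="https://example-one.com">One</a></div>
  <div class="entry"><a href="https://example-two.com">Two</a></div>
</div>
"""

root = ET.fromstring(snippet)
# ElementTree supports a limited XPath subset: child steps and
# [@attr='value'] predicates are enough for this shape.
urls = [a.get("href") for a in root.findall("./div[@class='entry']/a")]
# urls == ["https://example-one.com", "https://example-two.com"]
```

If the selection comes back empty here too, the problem is the expression; if it works here but not in the scraper, the page the scraper sees is likely different from the one the browser shows.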
  9. Awesome. Thanks very much Nick!
  10. Hi, the script below generates rows of results in columns B, C, and D that SHOULD align in the same rows as the data in column A, BUT B/C/D are generated in the rows AFTER the column A results are complete. See: https://www.screencast.com/t/9AXG7l2Zp https://www.screencast.com/t/BLLygP0T4hX This is all in spite of what (to me) appears to be a correct use of $newline in rows 39 and 76. Could someone please take a look and tell me how I can include the search terms in A in the aligned rows with the search terms in B, C, and D? Thanks very much! clear list(%list11) clear list(%list22) clear list(%lis
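For what it's worth, the alignment being asked for - row i of the table holding the i-th item of every list, rather than lists B/C/D starting after list A ends - can be sketched in Python. This is a hypothetical illustration of the zipping idea only, not the UBot fix, and the list contents are invented.

```python
# Hypothetical sketch (Python, not UBot): merge four parallel lists so that
# each output row holds one item from each list, side by side.
col_a = ["term1", "term2"]
col_b = ["b1", "b2"]
col_c = ["c1", "c2"]
col_d = ["d1", "d2"]

# zip stops at the shortest list, so rows stay aligned even if one
# list comes up short instead of spilling into later rows.
rows = [list(row) for row in zip(col_a, col_b, col_c, col_d)]
# rows == [["term1", "b1", "c1", "d1"], ["term2", "b2", "c2", "d2"]]
```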
  11. Never mind. I figured it out:
      (//div[@id='ppcSummary']/table/tbody/tr/td/table[@class='blue stat_table']/tbody/tr/td[@class='stat_label']/h3[contains(text(),'IMPRESSIONS')]/../../td[2])
      (//div[@id='ppcSummary']/table/tbody/tr/td/table[@class='green stat_table']/tbody/tr/td[@class='stat_label']/h3[contains(text(),'CLICKS')]/../../td[2])
      (//div[@id='ppcSummary']/table/tbody/tr/td/table[@class='orange stat_table']/tbody/tr/td[@class='stat_label']/h3[contains(text(),'CONTACTS')]/../../td[2])
  12. Hi, below is an example of some code from a page I am trying to scrape. Here is the XPath that I am trying to use to scrape that data: (//table[@class='green stat_table']/tbody/tr/td[@class='stat_label']/h3[contains(text(),'CLICKS')]/../../td[2])[6] That XPath does a good job of grabbing the "37" or whatever number is in the location where the h3 reads "CLICKS". That's fine for this page, but some pages don't work, because the trailing index in /../../td[2])[6] - the [6] - sometimes needs to be [5] or [4], etc. SO, I'm thinking that since the h2 "PPC SUMMARY" is the consistently un
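The idea of replacing a brittle positional index like [6] with a text anchor can be sketched in plain Python: walk the rows, keep the one whose h3 says CLICKS, and take the sibling cell. This is a hypothetical illustration (stdlib, not UBot or lxml), and the markup below is an invented fragment in the shape described above, not the real page.

```python
# Hypothetical sketch (Python stdlib, not UBot): find the value cell by
# anchoring on the h3 text instead of a positional index like [6].
import xml.etree.ElementTree as ET

snippet = """
<div>
  <table class="green stat_table"><tr>
    <td class="stat_label"><h3>CLICKS</h3></td><td>37</td>
  </tr></table>
</div>
"""
root = ET.fromstring(snippet)

value = None
for row in root.iter("tr"):
    cells = row.findall("td")
    h3 = cells[0].find("h3") if cells else None
    if h3 is not None and "CLICKS" in (h3.text or ""):
        value = cells[1].text  # the td[2] from the XPath, found by label
        break
# value == "37"
```

The same anchor-on-text approach is what the contains(text(),'CLICKS')/../../td[2] expression does in XPath; the point is that once the label is the anchor, the trailing [6] is no longer needed.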
  13. Absolutely perfect! I'll tell you what, it's the selfless help from people like you, and Nick (Code Docta), Dan (bot-factory), Buddy S., Aymen, Deliter, and Valo (and that's just in the few weeks that I have been posting) that has made this work. This is why ubot will work for me and I will continue to subscribe. I love the learning and very much appreciate the help.
  14. Hey Nick - Thanks very much for your response and solution. https://www.screencast.com/t/9554pJKoN3 For some reason, I'm getting a script error. I'm not experienced enough to understand how to debug in this scenario. Can you please help? Chris