fakankun posted November 6, 2010

OK, here is what I'm trying to do. Without UBot, I would normally right-click the link on this page: http://links2rss.com/convert.php (the one titled "link") and then click "Save link as", which downloads the webpage. How would you do that with UBot? And please, no theory; I've been at this for hours and I need a working solution. NOTE: I don't just want to scrape the URL to a file. I want to actually download or save the page. So please don't tell me how to save the link; I need to know how to download or save the page.
IRobot posted November 6, 2010

Please do not start multiple threads on the same topic. Continue the thread that you started: http://ubotstudio.com/forum/index.php?/topic/5268-how-do-i-save-what-i-download-to-a-folder-instead-of-a-file/ All the questions that you've asked were answered in that thread.
New Guy posted November 6, 2010

Here's how I would do it; see the attached bot. Make sure the file that you save it to is an .xml file, though, or you can just rename it afterward. File Save.ubot
fakankun (author) posted November 6, 2010

Thanks. It didn't work when I ran it, though. What would I need to change or add on my end in order for it to work?
fakankun (author) posted November 6, 2010

Never mind, I got it to work. But what I still don't get is: why can't I just save the files to a folder instead of into a pre-existing file? I don't want to have to create 250 files first, you know. Isn't there a way to just download continuously and have everything go into its own folder, like normal downloads do?
MiriamMB posted November 6, 2010

If you are saving to file, the file does not have to exist beforehand. You can set the file name to a variable, and then append another, incremented variable that gives the file name a number. So if you want to create a file named blue.xml and add a number to every subsequent .xml file named blue: set the word "blue" to a variable, set the number 0 or 1 to another variable, and increment the number variable on each pass. When you're saving to the file, decide where you want the files saved. It would look like this, for example: Save to file > $document folder\variable "blue"variable "1".xml
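UBot builds this file path visually, but the same incremented-filename idea can be sketched in plain Python (a minimal illustration; the function name and arguments are my own, not part of UBot):

```python
import os

def save_with_counter(pages, folder, base="blue", ext=".xml"):
    """Save each string in `pages` to folder/base<N>.ext, incrementing N."""
    os.makedirs(folder, exist_ok=True)  # the target folder need not exist beforehand
    paths = []
    for counter, content in enumerate(pages, start=1):
        path = os.path.join(folder, f"{base}{counter}{ext}")
        with open(path, "w", encoding="utf-8") as f:
            f.write(content)
        paths.append(path)
    return paths
```

This produces blue1.xml, blue2.xml, and so on; none of the files has to be created in advance.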
crazyflx posted November 6, 2010

I posted this exact response in your other thread (which is virtually the same question). Here you go, man: a working example that does the following:

- Visits the page where you need to save all the feeds
- Scrapes that page's XML URLs and adds them to a list
- Downloads each XML file from the list of URLs AND dynamically creates the file names for them

If you have any questions, let me know. P.S. It will save the files to "My Documents" with file names starting at 1.xml. If there were 3 URLs to download, they would be saved as 1.xml, 2.xml, 3.xml, etc. It uses an incremented variable as the file name to save. You'll see when you download the example and check out the source code. Example.ubot
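The attached Example.ubot can't be reproduced here, but the three steps it describes can be sketched in Python. This is a rough equivalent under my own assumptions: a naive regex href scrape and urllib downloads stand in for UBot's scraping and download commands.

```python
import re
import urllib.request
from pathlib import Path

def scrape_xml_urls(html):
    """Pull every href ending in .xml out of a page (naive regex scrape)."""
    return re.findall(r'href="([^"]+\.xml)"', html)

def download_feeds(page_url, dest_folder):
    """Visit page_url, scrape its .xml links, save them as 1.xml, 2.xml, ..."""
    html = urllib.request.urlopen(page_url).read().decode("utf-8", errors="replace")
    dest = Path(dest_folder)
    dest.mkdir(parents=True, exist_ok=True)
    for i, url in enumerate(scrape_xml_urls(html), start=1):
        # Incremented counter becomes the file name, just as the bot does
        urllib.request.urlretrieve(url, str(dest / f"{i}.xml"))
```

For real pages an HTML parser is more robust than a regex, but the flow (scrape a list of URLs, then loop and save with an incremented file name) is the same one the bot follows.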
fakankun (author) posted November 6, 2010

Thanks, thanks, many thanks, man!