Automator to download files from web that have changed since last week?
-
I wish to automate downloading an MP3 from a specific web directory (I have permission, and it is not password protected) each week, but only the single MP3 file that has changed since the previous week. I run a small LP/Internet radio station and we syndicate certain shows (with permission), and each one has its own unique way of getting files to its affiliates. This one show dumps its files into a web directory (http://www.ecoshock.net/affiliates/). I can easily write an Automator workflow that will launch Safari, go to that directory, and download only the files that start with ES_ and end with _Affiliates.mp3, but I'd like to download only the most recently added MP3 that matches those criteria. I assume it could be accomplished by adding a line or two of AppleScript to the Automator workflow, but I do not know how to write AppleScript. Here's my current Automator workflow:
1) Launch Application - chooses Safari
2) Get Specified URLs - points to the directory I listed above
3) Get Link URLs from Webpages - only return URLs in the same domain as the starting page
4) Filter URLs - ALL: Name contains ES_, Name contains _Affiliates.mp3
5) Download URLs
Right now this downloads to a folder on the Desktop, but eventually I will have it add the file to iTunes, add some tags, create a Smart Playlist, and then close Safari. I ass-u-me I can swap in a line of AppleScript for step 4 above or add it right after, but I am open to other options. PS: If it were accessible via FTP I could also synchronize the directory using Transmit or Fetch, but that isn't an option. I even checked with the Panic developers and they confirmed.
-
Answer:
It looks like the filename contains the date of the show. You could use something like this: wget "http://www.ecoshock.net/affiliates/ES_`date +%y%m%d`_Affiliates.mp3" ...and wrap that in the aforementioned cron job. Schedule it to run weekly after they've dropped the MP3, the morning after or something along those lines. You may need to install wget (or curl), which entails its own set of tasks, but isn't too hairy. On reasonable Linux hosts, both of those tools should be standard issue.
terrapin at Ask.Metafilter.Com
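As a rough sketch of what "wrap that in a cron job" could look like, the command can be saved in a small shell script. The script name and download folder below are only placeholders, and it assumes wget is installed (for example via Homebrew or MacPorts):

    #!/bin/sh
    # fetch_ecoshock.sh - download this week's affiliate MP3 (filename built from today's date)
    DATE=$(date +%y%m%d)
    URL="http://www.ecoshock.net/affiliates/ES_${DATE}_Affiliates.mp3"
    DEST="$HOME/Desktop/EcoShock"   # example destination folder
    mkdir -p "$DEST"                # create the folder if it doesn't exist yet
    wget -P "$DEST" "$URL"          # -P saves the file into that directory

Make it executable with chmod +x and point cron at it; a sample crontab line appears further down the thread.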
Other answers
Thanks, all. I think I was able to combine all the tips here with the features I want from Automator (import into iTunes, and add tags). I used "Run Shell Script" at the start of my Automator workflow with curl:
curl http://www.ecoshock.net/affiliates/ES_`date +%y%m%d`_Affiliates.mp3 > ~/Documents/EcoShock/ES_`date +%y%m%d`_Affiliates.mp3
The programmer has informed me that the show is updated in that directory on Saturday evenings, so I have set it to run late Saturday evening. Will test tonight. Thanks again!
terrapin
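For anyone copying that step, a slightly more defensive sketch of the same "Run Shell Script" body (the ~/Documents/EcoShock path is just an example, and mkdir -p makes sure the folder exists before curl writes into it):

    DATE=$(date +%y%m%d)
    mkdir -p ~/Documents/EcoShock
    curl "http://www.ecoshock.net/affiliates/ES_${DATE}_Affiliates.mp3" \
         -o ~/Documents/EcoShock/"ES_${DATE}_Affiliates.mp3"

The -o switch is discussed in a later answer; it avoids the problems that can come up when redirecting output with >.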
The solution is a relatively trivial script of just a few lines run daily by a cron job. Don't mess about with AppleScript and the multiple hacky steps you've proposed.
turkeyphant
I don't know how one would get the timestamp... Could you just download the URLs that aren't already downloaded?
Monochrome
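For what it's worth, here is a sketch of that approach with no date math at all: pull the directory listing, extract the matching filenames, and fetch only the ones not already on disk. It assumes the listing page contains the literal filenames, and the destination folder is only an example:

    DEST=~/Documents/EcoShock   # example destination folder
    mkdir -p "$DEST"
    curl -s http://www.ecoshock.net/affiliates/ \
      | grep -o 'ES_[0-9]*_Affiliates\.mp3' \
      | sort -u \
      | while read f; do
          # skip files we already have
          [ -f "$DEST/$f" ] || curl -o "$DEST/$f" "http://www.ecoshock.net/affiliates/$f"
        done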
Ah, also - you'd want to make sure the job runs on the same day as the day they dropped the file, or the output of the 'date' command would create a filename that doesn't exist on the webserver.
jquinby
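One way to hedge against that, sketched below using the BSD/macOS form of date (the -v option; Linux date uses a different syntax): if today's filename isn't on the server, step back one day at a time for up to a week. The destination path is again just an example.

    for i in 0 1 2 3 4 5 6; do
        DATE=$(date -v-${i}d +%y%m%d)
        URL="http://www.ecoshock.net/affiliates/ES_${DATE}_Affiliates.mp3"
        # -I sends a HEAD request; -f makes curl fail on a 404 instead of succeeding
        if curl -s -f -I "$URL" > /dev/null; then
            curl -o ~/Documents/EcoShock/"ES_${DATE}_Affiliates.mp3" "$URL"
            break
        fi
    done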
You don't even need a webserver; you should be able to do it all locally. And there are multiple ways to get timestamp info if you can't just use the filename or one of the other methods already mentioned.
turkeyphant
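For example (just a sketch), a HEAD request with curl will usually report the file's Last-Modified time without downloading it:

    curl -sI "http://www.ecoshock.net/affiliates/ES_`date +%y%m%d`_Affiliates.mp3" | grep -i '^Last-Modified'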
Thanks for the ideas! Chiming in to say, in case it wasn't clear, that I am a serious noob when it comes to many of the suggested methods. My colleague is reading this thread, and he understands the wget part, but not how to "wrap it in a ... cron job".
terrapin
cron is a program that runs on your Mac. You'll need to configure it from the command line, but it can be used to run scripts and whatnot on a regular schedule - once daily, every 1st and 15th of the month, once a year on New Year's Day, every 15 minutes, and so on. http://code.tutsplus.com/tutorials/scheduling-tasks-with-cron-jobs--net-8800.
jquinby
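As a sketch, the weekly Saturday-evening schedule mentioned above would look something like this after running crontab -e (the script path is only a placeholder; 0 23 * * 6 means 11:00 PM every Saturday):

    # minute hour day-of-month month day-of-week  command
    0 23 * * 6 /Users/yourname/bin/fetch_ecoshock.sh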
Nice. curl is fabulously useful. One suggestion - if redirecting output like that goes wonky, try using the "-o" (output) switch like so: curl http://www.ecoshock.net/affiliates/ES_`date +%y%m%d`_Affiliates.mp3 -o "~/Documents/EcoShock/ES_`date +%y%m%d`_Affiliates.mp3"
jquinby
Don't put double quotes around the ~ part of the path, or the shell will fail to expand ~ to the pathname of your home folder. In fact there's no need to use double quotes at all in this instance, since the pathname contains no spaces; if it did, you could use something like ~/"Folder Name With Spaces/Eco Shock/ES_`date +%y%m%d`_Affiliates.mp3"
flabdablet
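In other words, something like this sketch, with ~ left outside the quotes so the shell can expand it:

    curl "http://www.ecoshock.net/affiliates/ES_`date +%y%m%d`_Affiliates.mp3" \
      -o ~/"Documents/EcoShock/ES_`date +%y%m%d`_Affiliates.mp3"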