Efficient Methods for Downloading Level 2 QuikSCAT Data

Postby podaac » Mon Nov 28, 2016 2:41 pm

User Question:
What is the most efficient method for requesting and downloading, via FTP, significant quantities (say, a full global month or a full global year) of QuikSCAT L2B v3.1 (12.5 km resolution) swath data? Downloading individual swath files via the web interface is tedious, to say the least. QuikSCAT data do not seem to be available through HiTIDE or REVERB.

Thanks for your help and attention.
---------------------------------------------------------------------
Reply to User:
We released this new L2B QuikSCAT dataset just a few weeks ago, and you are correct: it is not yet available in HiTIDE, which would be the most appropriate tool for your use case.

It will be in HiTIDE very soon!

In the meantime, may I offer this little workflow, which leverages the PO.DAAC web service for granule discovery. For example, to stream a full month of granule endpoints (FTP and OPeNDAP links), you could enter this in a web browser:

http://podaac.jpl.nasa.gov/ws/search/gr ... rPage=1000

You can do much more with the UNIX web client wget. For example, from a UNIX command line:

% wget "http://podaac.jpl.nasa.gov/ws/search/granule/?datasetId=PODAAC-QSX12-L2B31&startTime=2005-03-01&endTime=2005-03-31&itemsPerPage=1000" -O QS_31.xml
% grep ftp QS_31.xml | cut -c15-117 > QS_31_FTP.out
% wget -i QS_31_FTP.out

In three steps we have: 1) executed the web service call and saved the output to 'QS_31.xml'; 2) extracted just the FTP URLs and cleaned them up; 3) downloaded the files from the cleaned list of FTP URLs.

Of course, if you want to be more elegant, this could all be accomplished in a line or two of UNIX commands.
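For instance, the three steps above can be collapsed into a single pipeline. This is just a sketch, using the same web-service URL as before; the grep -o pattern is my own substitution for the fixed-column cut, pulling each FTP URL out directly:

```shell
# One-liner version of the workflow above:
# 1) query the granule search service and stream the XML to stdout,
# 2) extract every FTP URL from the response,
# 3) feed the URL list straight into a second wget for download.
wget -qO- "http://podaac.jpl.nasa.gov/ws/search/granule/?datasetId=PODAAC-QSX12-L2B31&startTime=2005-03-01&endTime=2005-03-31&itemsPerPage=1000" \
  | grep -o 'ftp://[^"<]*' \
  | wget -i -
```

The `-i -` flag tells the second wget to read its download list from standard input, so no intermediate files are needed.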

The ideal service is HiTIDE, of course, so we will make it available as soon as possible.
---------------------------------------------------------------------
Additional Reply to User:
What was previously recommended should definitely help.

In addition, I think you should consider using our new PO.DAAC Drive service: https://podaac-uat.jpl.nasa.gov/drive/

Aside from providing an API for querying our data archive and downloading data files in bulk with command-line tools and scripts (e.g., curl and wget), PO.DAAC Drive offers an additional capability: external users can mount a virtual drive on their machine for even easier access, which allows entire directories of data to be transferred in a single grab.
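To illustrate the bulk-download side, here is a hedged sketch of a recursive wget against Drive. The dataset path below is a hypothetical placeholder (browse Drive in a web browser to find the real directory for your dataset), and the credential variables are assumed to hold your Earthdata login:

```shell
# Sketch: recursive bulk download from PO.DAAC Drive with wget.
# NOTE: DATASET_PATH is a hypothetical placeholder, not a real path.
DRIVE_BASE="https://podaac-uat.jpl.nasa.gov/drive/files"
DATASET_PATH="allData/quikscat/L2B12"   # hypothetical; check Drive for the real layout
wget --user="$EARTHDATA_USER" --password="$EARTHDATA_PASS" \
     --recursive --no-parent --no-directories --accept '*.nc' \
     "$DRIVE_BASE/$DATASET_PATH/"
```

Here `--no-parent` keeps wget from climbing above the target directory, and `--accept '*.nc'` limits the grab to the data files themselves.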

PO.DAAC recently presented the relatively new PO.DAAC Drive service during the October 2016 EOSDIS Webinar, which can be viewed at the following link: https://www.youtube.com/watch?v=VZmzWANEKBs

As you'll see when accessing PO.DAAC Drive, you must first obtain a NASA Earthdata account (which I'm sure you already have); it is used for user authentication and improved usage metrics.
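For scripted access, one standard way to supply Earthdata credentials to wget and curl non-interactively is a ~/.netrc file keyed to the Earthdata login host; the username and password below are placeholders you would replace with your own:

```shell
# Store Earthdata credentials in ~/.netrc so wget/curl can
# authenticate without prompting (replace the placeholders).
cat > ~/.netrc <<'EOF'
machine urs.earthdata.nasa.gov
    login YOUR_EARTHDATA_USERNAME
    password YOUR_EARTHDATA_PASSWORD
EOF
chmod 600 ~/.netrc   # the file must not be readable by other users
```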

Please continue to let us know if you need any additional assistance with these services.