SMAP – Tools

Documents

Name – Description
ATBD Documents – Algorithm theoretical basis documents are listed with the products available for download.
Product Specification Documents – Product specification documents are listed with the products available for download.
ASF SMAP User Guide – For Level 1 products, a streamlined guide to accessing data, using SMAP products, understanding acronyms and abbreviations, and more.
Radar Backscatter Calibration, L1B_S0_LoRes and L1C_S0_HiRes Beta Level Data Products – Provides analysis and assessment of the calibration quality of the SMAP radar normalized backscatter cross-section for the L1B_S0_LoRes and L1C_S0_HiRes beta-level data products. Dated 8/5/2015.
SMAP Handbook – Written in 2013 as a compendium of information on the project near its time of launch. Contains essential information on programmatic, technological, and scientific aspects of the mission.
Ancillary Data Reports (table includes links to cited data) – Application-related descriptions of datasets used with science-algorithm software in generating SMAP science-data products, as well as links to EASE grid information relevant to SMAP products. Also included are links to the ancillary data cited in the reports.
Publications – Lists publications on remote sensing of soil moisture since 2001. Please submit additional publications to uso@asf.alaska.edu with "SMAP Publications" in the subject line.

Tools

Name – Description
ASF Software Tools – ASF offers several tools that can be used on many datasets.
SMAP Analysis Client – Interactive application developed by the Jet Propulsion Laboratory (JPL)/NASA for Soil Moisture Active Passive (SMAP) data. Users may want to start with the calendar icon on the far right of the page.
GDAL – The Geospatial Data Abstraction Library helps explore the general contents of SMAP Hierarchical Data Format (HDF) 5 data. The gdalinfo tool summarizes the data structure in the file, and the gdal_translate tool can extract the SAR data from that structure and store it in a more versatile GeoTIFF (see the sketch following this table).
ArcGIS (commercial GIS system) – Recognizes the HDF5 structure and can extract the SAR data on the fly, but it does not calculate standard statistics for this data layer and is slow to render the data this way. Performance can be significantly improved by converting the HDF5 HH data layer into a GeoTIFF file using the gdal_translate tool.
HDFView – Provided by The HDF Group; offers two ways to look at the data. For quantitative analysis, the selected data layer must be opened as a spreadsheet. For visual analysis, HDFView provides the image view. The program has no options for stretching data in a statistical fashion, but the user can manually change the brightness and contrast.
Panoply – Developed by NASA's Goddard Institute for Space Studies; primarily used for lower-resolution global datasets.
Interactive Data Language – The Interactive Data Language (IDL) provides a more programmatic means to visualize the SMAP HDF5 data. The IDL H5 browser has very limited functionality for changing visual value ranges and stretching the imagery for visualization purposes, but it does give users the ability to view the HDF5 products.
Worldview quick-look tool – This tool from NASA's EOSDIS provides the capability to interactively browse global, full-resolution satellite imagery and then download the underlying data. Most of the 100+ available products are updated within three hours of observation.
HDF Group – The HDF Group offers a list of software using HDF5.
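
For readers who want to script the GDAL route described above, the following is a minimal sketch using the GDAL Python bindings, assuming GDAL 2.1 or later built with HDF5 support. The granule name and the HDF5 subdataset path are placeholders, not actual SMAP product paths; take real subdataset names from the listing step.

    # Minimal sketch, assuming GDAL >= 2.1 with HDF5 support.
    # The granule name and subdataset path are placeholders only.
    from osgeo import gdal

    granule = "SMAP_L1C_S0_HIRES_example.h5"  # hypothetical file name

    # Summarize the file structure (the gdalinfo step).
    ds = gdal.Open(granule)
    for name, description in ds.GetSubDatasets():
        print(name, "->", description)

    # Extract one data layer into a GeoTIFF (the gdal_translate step).
    # Substitute a real subdataset name from the listing above.
    hh = gdal.Open('HDF5:"%s"://Sigma0_Data/cell_sigma0_hh' % granule)
    gdal.Translate("sigma0_hh.tif", hh, format="GTiff")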


International Polar Year – GIIPSY – Space Task Group Meetings

2006

Briefing to WMO, GEO and IPY JCOMM SCOBS, Oct. 12 (Drinkwater)
Briefing to IGOS Cryosphere Team, Oct. 16, ESTEC
Briefing at NSIDC, Oct. 25, Boulder
Presentation at IGS Nordic Branch, Tromsø, Oct. 2006 (Lytle)
GIIPSY participation in CSA Radarsat-1 Archive Processing Meeting, Nov. 8, Ottawa
Briefing to WCRP CLiC, Boulder, Colorado, Dec. 6 (Drinkwater)
GIIPSY Team meeting, Dec. 12

Tuesday December 12, 2006, 6:00PM-8:30PM
AGU Fall Meeting, San Francisco
Marriott Convention Center, Room Sierra K

PERSPECTIVES (~45–60 min)

10 min GIIPSY Overview and Meeting Objectives (K. Jezek)
~5 min discussion on how GIIPSY fits into IPY (D. Carlson)
~5 min IGOS and GIIPSY (V. Ryabinin)
~5 min ESA presentation on ESA IPY AO proposals (Einar-Arne Herland, ESA)
~5 min CSA perspective on GIIPSY and Radarsat-1 Archive (P. Briand)
~5 min Japanese perspective on GIIPSY (J. Ukita)
~5 min NASA perspective on GIIPSY (S. Martin)
~5 min Ground Station perspective on processing large volumes of data (N. La Belle-Hamer)
~5 min DLR perspective on GIIPSY (K. Jezek for I. Hajnsek)

TOPICS FOR DISCUSSION (K. Farness) (1–1.5 hours)

1) Community input to flight agencies on archival data processing and new data acquisitions

2) Identify key IPY legacy data set(s)
  What?
  Where?
  When? (for example, during windows of seasonal melt, or alternatively during cycles of ICESat data acquisitions?)

3) Related issues and potential additional resources
  Sensors and Spacecraft
  Acquisition Planning
  Receiving Ground Stations
  Processing Facilities
  Calibration and Validation
  Distribution

4) Data management issues
  Processing and product distribution
  Product format
  Metadata tagging for data explicitly collected for IPY including:
    In situ data
    GIIPSY pull from the CSA archive

FINAL PARTICIPANTS LIST

2007

Space Task Group meeting in Geneva, January 17-19, 2007
IPY Launch Event, Paris, March 1, 2007
Space Task Group Telecon, June 15, 2007 – (ESA Data Portfolio)
Space Task Group Telecon, August 8, 2007 – (ASAP (RADARSAT-1) Portfolio)
IICWG, October 2007, ESRIN, Frascati
Space Task Group Meeting, November 26-27, 2007, Darmstadt (Agenda, Presentation, and Background Material)

Geneva, WMO Headquarters

Day 1

Day 2
  • Leaving an IPY legacy
  • “The Polar Snapshot” (K. Jezek); Example Acquisitions
  • Geo Contribution (M. Rast and E. Sarukhanian)
  • Mechanisms for collecting IPY data requirements
  • SCOBS survey of IPY sat. data needs (E. Sarukhanian)
  • Other Agency IPY AOs (All)
  • Establishing the priorities for near-term/medium-term/long-term actions
  • Acquisition planning/Tasking satellites (All)
  • Data Management/Metadata standards (All)
  • Archiving and Data Distribution (All)
  • Data Policy (All)
  • Discussion on Agency Commitments
  • Baseline plans for data acquisitions
  • Archiving and distribution

Day 3 (half-day)
  • Missing Agencies
  • Consolidation of Action Items
  • Time and place of next Meeting
  • Meeting Closure

Summary Documents

Report to CM-7
Slides presented by T. Mohr to CM-7
STG1 Summary

2008

STG SAR Workshop, March 2008, St. Hubert (Agenda, Presentations, and Background Material)
Space Task Group Meeting, May 5-6, 2008, ESRIN Frascati, Italy (Details and Summary Reports)
STG IPY Session, SCAR, July 2008, St. Petersburg, Russia
STG SAR Workshop 2, Sept. 30-Oct. 1, 2008, DLR Oberpfaffenhofen, Germany (Details)

Canadian Space Agency, St. Hubert, Canada

As part of ongoing IPY Space Task Group activities, the Canadian Space Agency hosted an IPY SAR/InSAR workshop in response to an action item from STG-1.

The goal of the meeting was to develop an acquisition strategy for SAR and InSAR data that achieves the maximum number of IPY science objectives while distributing the acquisition load across the different agencies, with the understanding that no single agency can accommodate all of the tasks. This will require a level of coordination between the space agencies that has not yet been attempted. The workshop focused on data acquisition requirements, as outlined by the scientific community, based on agency strategic plans for IPY and the unique capabilities of their systems. Data processing and distribution issues were briefly addressed, but the primary objective was to populate the archive given the unique opportunity of IPY.

The workshop's main tasks were:

1) To review existing GIIPSY science requirements as outlined in the Global Inter-agency IPY Polar Snapshot Year (GIIPSY) Strategy Document.

2) To present the agencies' strategic priorities in line with IPY science activities.

3) To present and review SAR data that are being collected, and that are planned for collection, during the IPY.

4) To present the satellite and ground-segment operators' system capabilities and constraints related to the acquisition of data in support of IPY.

5) To attempt to forge a coordinated / multi-agency SAR acquisition plan for the remainder of IPY acquisitions.

To meet these ambitious objectives, the invited attendees (members of the science and operational communities; space agencies operating SAR satellites; space agencies with data transmission, ground segment, and processing capabilities; and ground segment operators) were asked to contribute actively to achieving the workshop objectives.

Logistical Information

Map to CSA

AGENDA and PRESENTATIONS

Agenda
Meeting Introduction (YC)
STG Background (YC)
GIIPSY Background (KJ)
ESA Portfolio (HL)
ESA and IPY (YD)
ESA Constraints (HL)
DLR Strategic Objectives (ED)
DLR IPY Data Constraints (ED)
TerraSAR Capabilities (DF)
TerraSAR Data Access (DF)
CSA Strategic Objectives (YC)
R1 Constraints (RSJ)
R2 Allocation (DdL)
ASF and JAXA (DA)
Ice Services (DL)
CNES Complementary Optical Studies (ET)
Summary and Action Items (YC and KJ)
Meeting Minutes
Participants List

Data Acquisition Plans

Strawman Acquisition Strategy Spreadsheet
Antarctic Acquisition Template
Sea Ice Acquisition Template
Arctic Land Ice Acquisition template

DLR, Oberpfaffenhofen, Germany

As a follow-up to the STG action to enhance coordination among the SAR space agencies in response to the IPY scientific objectives, a SAR/InSAR coordination workshop was held on March 5-6, 2008, at the Canadian Space Agency. As a result of this meeting, the SAR-operating agencies agreed on strategic guidelines for imaging activities and for thematic priorities as outlined by the science community. Despite this significant step forward and the considerable effort being devoted to fulfilling these commitments, there is still a need to develop a consolidated SAR acquisition plan and to discuss the way forward for a SAR processing strategy and data dissemination plan.

As part of the SAR coordination process, the German Aerospace Center (DLR) is organising a second meeting to address the harmonisation and coordination of the acquisition plans. The objectives of this meeting are:

* Consolidate the current SAR planning and imaging activities occurring under the auspices of IPY/STG in order to avoid gaps and overlaps and optimize resources (i.e. thematic / instrument matrix, common planning tool, etc.)
* Distribute imaging load according to Agencies’ capacities and priorities, and develop acquisition plans
* Look at a short-, medium-, and long-term planning approach to continue the acquisitions (if at all possible)

The meeting will be held at the DLR facilities in Oberpfaffenhofen (Germany).
The workshop dates have now been established as Sept. 30 – Oct. 1, 2008.

Logistical Information

Directions to DLR
Hotels close to DLR
Hotels in Munich
DRAFT Agenda

GIIPSY Draft Acquisition Plan Recommendation for TSX.

These plans were prepared to provide DLR with an estimate of SAR loading for Antarctic regions south of about 80°S. Comments are welcome. These are in addition to any individual proposals and to several super-site areas that the STG is considering in Greenland and Antarctica.

Filchner Ascending
Filchner Descending
TAM Ascending
TAM Descending

JAXA IPY Web Site (Comments welcomed by Dr. Masanobu Shimada)

Japanese: JAXA website | JAXA IPY Dataset page

English: JAXA website | JAXA IPY Dataset page

PALSAR Animations. Antarctic: /A/B/C

ESA/ESRIN Acquisitions and Plans

Information on upcoming ERS-2 acquisition from O’Higgins Station
Acquisition map for O’Higgins Station
Animation of ESA SAR data over Lincoln Sea. Courtesy Ron Kwok

Participants List

Agenda

Presentations
1) Buckreuss and Roth, DLR
2) Crevier, CSA
3) Jezek, OSU
4) Del Rio Vera, Drinkwater and Laur, ESA
5) Saint-Jean, CSA
6) Rigby, MDA
7) Hajnsek, DLR
8) Floricioiu, DLR
9) Braun, U Bonn
10) Helm and Miller, AWI
11) Hall, KSAT

Minutes
TSX Subgroup meeting (Thursday, Oct 2)
SAR Subgroup meeting Summary
IPY CGMS Report
Meeting Minutes
Followup Telecon Minutes

2009

STG Meeting, February 2009, Geneva (Agenda, Presentations, and Background Material)
3rd SAR Workshop, June 2009, Frascati (Agenda, Presentations, and Background Material)
STG Meeting, December 2009, Geneva (Agenda, Presentations, and Background Material)

2010

Oslo IPY Meeting and STG Meeting (June) (Agenda, Presentations, and Background Material)

Other

Polar Gateways Meeting, Barrow, Alaska, January 2008
SCAR/IASC IPY Open Science Conference, July 2008, St. Petersburg, Russia
IPY STG SAR Workshop, March 5-6, 2008, Canadian Space Agency, Montreal, Canada
2nd SAR Workshop, Sept. 30 – Oct. 1, 2008, DLR, Wessling, Germany
2008 AGU Fall Meeting (presentations and informal get-together)
2008 AGU Fall Meeting Task Force for Remote Sensing of Permafrost, G. Grosse and C. Duguay. Monday, December 15, 2008, 5:00-6:30 p.m., San Francisco Marriott, Pacific Room A (Agenda)
STG-4, Feb. 3-4, 2009, WMO, Geneva


SMAP – Data and Imagery

SMAP Data Products

Instrument events timetable

The SMAP baseline science-data products will be publicly available through two NASA-designated data centers, the Alaska Satellite Facility (ASF) and the National Snow and Ice Data Center (NSIDC) DAAC. ASF will distribute the Level 1 radar products, and NSIDC will distribute the radiometer and Level 2-4 products. All products will conform to the HDF5 standard.

SMAP synthetic aperture radar (SAR) data have a spatial resolution of 1-3 km over the outer 70% of the swath (L1C_S0_HiRes product). The low-resolution radar data are ‘slices’ of resolution approximately 5 km x 30 km (L1B_S0_LoRes product). The radiometer data have a spatial resolution (IFOV) of 39 km x 47 km, nominally referred to as 40-km resolution (L1B_TB product). The L1C_TB data products are resampled TB data on a 36-km Earth grid and will have spatial resolution ~10% greater than the L1B_TB data (depending on the resampling method).

SMAP science-data products will be generated on the Science-Data System production system using science software developed on the SDS Testbed. The science software is based on algorithms for each product described in the Algorithm Theoretical Basis Documents (ATBDs). 

(Beta Level 1 radar products now available for download)

Short Name | Description | Gridding (km) | Latency* | ATBD** | Product Spec Doc | Source
L1A_Radar | Raw radar data in time order (half orbit) | – | 12 hrs | n/a | View | ASF
L1A_Radiometer | Radiometer raw data in time order | – | 12 hrs | n/a | – | NSIDC
L1B_S0_LoRes | Low-resolution radar σ0 in time order (half orbit); backscatter analysis document | 5 x 30 | 12 hrs | View | View | ASF
L1B_TB | Radiometer TB in time order | 36 x 47 | 12 hrs | View | – | NSIDC
L1C_S0_HiRes | High-resolution radar σ0 (half orbit, gridded); backscatter analysis document | 1 (1-3)*** | 12 hrs | View | View | ASF
L1C_TB | Radiometer TB (half orbit, gridded) | 36 | 12 hrs | View | – | NSIDC
L2_SM_A | Soil moisture (radar, half orbit) | 3 | 24 hrs | View | – | NSIDC
L2_SM_P | Soil moisture (radiometer, half orbit) | 36 | 24 hrs | View | – | NSIDC
L2_SM_A/P | Soil moisture (radar/radiometer, half orbit) | 9 | 24 hrs | View | – | NSIDC
L3_F/T_A | Freeze-thaw state (radar, daily composite) | 3 | 50 hrs | View | – | NSIDC
L3_SM_A | Soil moisture (radar, daily composite) | 3 | 50 hrs | View | – | NSIDC
L3_SM_P | Soil moisture (radiometer, daily composite) | 36 | 50 hrs | View | – | NSIDC
L3_SM_A/P | Soil moisture (radar/radiometer, daily composite) | 9 | 50 hrs | View | – | NSIDC
L4_SM | Soil moisture (surface and root zone) | 9 | 7 days | View | – | NSIDC
L4_C | Carbon net ecosystem exchange (NEE) | 9 | 14 days | View | – | NSIDC

*Mean latency under normal operating conditions (defined as time from data acquisition by the observatory to availability to the public data archive).

**Algorithm Theoretical Basis Documents (ATBDs) provide the physical and mathematical descriptions of the algorithms used in the generation of science-data products. The ATBDs include a description of variance and uncertainty estimates and considerations of calibration and validation, exception control, and diagnostics. Internal and external data flows are also described.

***Over outer 70% of swath

Ancillary Data Reports

The SMAP Ancillary Data Reports below hold application-related descriptions of datasets used with science-algorithm software in generating SMAP science-data products, as well as links to EASE grid information relevant to SMAP products.

These reports may undergo additional updates as new ancillary datasets or processing methods become available.

Antarctic Coastline

International Polar Year – GIIPSY Documents, Publications, and Mission Plans

Documents

Science Data Requirements and Mission Planning

The following files summarize a community assessment of GIIPSY satellite data requirements. Detailed requirements and justifications are discussed in the GIIPSY Science Requirements Document. A consolidated and numbered set of science themes is listed in the GIIPSY Thematic Objectives. These were presented to and tentatively adopted by the Space Task Group. A spreadsheet linking GIIPSY thematic objectives by number to specific observational requirements is included as the GIIPSY Check List. The GIIPSY Check List also refers to the image and digital maps listed below.

GIIPSY Science Requirements Document
GIIPSY Thematic Objectives
GIIPSY Check List

Image Maps of Coverage

Antarctica and Greenland Ice Sheets

Arctic and Antarctic Sea Ice

Glaciers

Permafrost

Mission Plans


Seasat – Technical Challenges – 4. Data Cleaning (Part 2)

4.3 Prep_Raw.sh


After development of each of the software pieces described previously in this section, the entire data cleaning process was driven by the program prep_raw.sh. This procedure was run on all of the swaths that were output from SyncPrep to create the first version of the ASF online Seasat raw data archive (the fixed_ files). Analysis of these results is covered in the next section. What follows here are examples of the prep_raw.sh process and intermediate outputs.

Time Filtering in Stages: Each plot shows range line number versus MSEC metadata value. Top row is before filtering; bottom is after. From left to right: (a) Stage 1 — attempt to fix all time values > 513 from the linear trend. (b) Stage 2 — fix stair steps resulting from sticking clock on satellite. (c) Stage 3 — final linear fix before discontinuity removal.
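
As a rough illustration of the stage-1 repair, the sketch below replaces MSEC values that deviate from the fitted linear trend by more than the 513-count threshold named in the caption. It is a minimal sketch assuming numpy; the function name is illustrative, not the actual prep_raw.sh code, and a production version would need a fit that is itself robust to the outliers, e.g. by iterating the fit and repair.

    # Illustrative stage-1 style repair (not the actual prep_raw.sh code):
    # replace MSEC values more than 513 counts off the linear trend.
    import numpy as np

    def fix_outlier_times(msec, threshold=513):
        lines = np.arange(len(msec))
        slope, intercept = np.polyfit(lines, msec, 1)  # linear trend
        trend = slope * lines + intercept
        return np.where(np.abs(msec - trend) > threshold, trend, msec)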


Data Set #1

Decoded Signal Data
Time Gap Corrections
Stair Corrections
Time Value Corrections

Data Set #2

Decoded Signal Data
Time Gap Corrections
Stair Corrections
Linear Time Restored
Decoded Time Data from Section 2.1 Examples
Cleaned Time Data from 2.1 Examples

4.4 Results of Prep_Raw.sh

By November 15, 2012, the beta version of prep_raw.sh was delivered to ASF operations. It was run on individual swaths at first, with results spot-checked. Once confidence in the programs increased, all remaining Seasat swaths were processed through this decoding and cleaning software en masse. Overall, from the 1,840 files that SyncPrep created, 1,470 were successfully decoded, and 1,399 of those made it through the prep_raw.sh procedure to create a set of fixed decoded files. The files that failed comprise 242 GB of data, while 3,318 GB of decoded swath data files were created.

Processing Stage | # Files | Size (GB)
Capture | 38 | 2610
SyncPrep | 1840 | 2431
Original Decoded | 1470 | 3585
Fixed Decoded | 1399 | 3318
Good Decoded | 1346 | 3160
Bad Decoded | 53 | 157

Summary of Data Cleaning

93% of the captured data made it through SyncPrep (2,431 of 2,610 GB)
92% of that data decoded (3,585 GB ≈ 92% of 2,431 GB x 1.6 decode-expansion factor)
93% of that data was “fixed” (3,318 of 3,585 GB)
95% of that data is considered “good” (3,160 of 3,318 GB)
OVERALL: ~80% of SyncPrep’d data is “good”

Reasons for Failures

  • SyncPrep: Not all captured files could be interpreted by SyncPrep
  • Original Decoded: Because the decoder needs to interpret the subcommutated headers, it is more stringent on maintaining a “sync lock” than SyncPrep
  • Fixed Decoded: Some files are so badly mangled that a reasonable time sequence could not be recovered
  • Bad Decoded: Several subcategories of remaining data errors are discussed in section 5

4.5 Addition of fix_start_times

During analysis of the fixed metadata files, it was discovered that bad times occurred at the beginning of many files. This was not a big surprise: the nature of the sync code search is such that many errors occur in places where the sync codes cannot be found, which is why the files were broken in the first place. It is therefore expected that the beginnings of many of the swath files will have bad metadata, and thus bad times. When bad times are included in the linear trend, the results are unpredictable.

Bad Start Time Examples

To fix this problem, yet another level of filtering was added to the processing flow – this time a post-filter to fix the start times. The code fix_start_times replaces the first 5,000 times in a file with the linear trend resulting from the next 10,000 lines in the file. This code was added as a post-processing step to follow prep_raw.sh and run on all of the decoded swath files.
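
A minimal sketch of that replacement step is shown below, assuming numpy. The 5,000 and 10,000 constants come from the text; the function body itself is illustrative, not the actual fix_start_times code.

    import numpy as np

    def fix_start_times(msec, n_fix=5000, n_fit=10000):
        # Fit a linear trend to the n_fit lines following the suspect region...
        lines = np.arange(n_fix, n_fix + n_fit)
        slope, intercept = np.polyfit(lines, msec[lines], 1)
        # ...and overwrite the first n_fix times with that trend.
        repaired = msec.astype(float).copy()
        repaired[:n_fix] = slope * np.arange(n_fix) + intercept
        return repaired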

Final Processing Flow: With the addition of the fix_start_times code, the processing flow for data decode and cleaning is finally completed.


Fixed Start Time Examples

Written by Tom Logan, July 2013


Seasat – Technical Challenges – 2. Decoder Development

Starting in the summer of 2012, ASF undertook the significant challenge of developing a Seasat telemetry decoder in order to create raw data files suitable for focusing by a synthetic aperture radar (SAR) correlator. In this case, that means processible by ROI, the Repeat Orbit Interferometry package developed at Jet Propulsion Laboratory. In addition to creating the range lines out of minor frames, the decoder must interpret the 18 fields in the headers to create a metadata file describing the state of the satellite when the data was collected.

The main challenges in decoding the raw telemetry were:

  1. Overcoming bit error problems
  2. Properly forming major lines from a variable number of minor frames
  3. Maintaining sync lock
  4. Discovering sentinels marking data collection boundaries

2.1 Problems with Bit Fields

All 10 of the bit fields proved to be unreliable, and, thus, with the exception of the fill flag, they are ignored by all of the software developed during this project. This section describes the ways in which the bit fields are unreliable.

Each Seasat minor frame contains 8 bits to record time and status. These bits encode 18 metadata fields, subcommutated in the first 10 minor frames of each range line. Of these 18 fields, 10 are single-bit fields; four should be constant for a datatake; two should change rarely; and two should change steadily. Unfortunately, due to the high bit error rate (BER) of the telemetry data, even fields that should be constant show high variability. The following plots, showing decoded metadata values plotted over 800,000 range lines, drive home the enormity of the bit error problems in these data.

Bits per Sample

Bits Per Sample: This value should always be 5, as the parameter never changed during the entire Seasat mission.

PRF rate code

PRF Rate Code: The PRF rate code should be a 4 for the entire mission, since this satellite parameter never changed.

Last Digit of Year

Last Digit of Year: The Seasat mission occurred entirely in 1978, so the last digit of the year should always be 8.

Station Code

Station Code: The Station Code should be a constant for any given datatake. 5: Fairbanks, Alaska; 6: Goldstone, Calif.; 7: Merritt Island, Fla.; 9: Oakhanger, United Kingdom; 10: Shoe Cove, Newfoundland.

Day of Year

Day of Year: For a given datatake, the day of year should change at most once, since any single datatake cannot exceed even an hour in duration, much less an entire day.

Delay to Digitization

Delay to Digitization: The delay to digitization parameterizes the time between emission of a pulse from the satellite and recording of return echoes. Used to calculate the slant range to the first pixel, the delay should change only a handful of times in any given raw data file based upon changes in orbital altitude.

Clock Drift

Clock Drift: The spacecraft clock drift records the timing error of the spacecraft clock. This should be a smoothly changing field, generally in the 2,000- to 3,000-millisecond range. It is not known how this field was originally created, only that it is vital in getting reasonable geolocations for processed imagery.

MSEC

MSEC: This field records the millisecond of the day when the data were acquired and should, therefore, be a linearly increasing field with an exact slope of 1/PRF.

Given all of the fallout from these truly bizarre plots, it is no surprise that attempts to use the fill flag quickly proved difficult; the bit errors are so pervasive that the field is unreliable.

2.2 Determining Minor Frame Numbers

Several oddities in the raw data are exacerbated by the high BER. First, the data are organized into 1,180-bit minor frames, which means each is 147.5 bytes long. Although the half-byte offset was easy to deal with, it turns out that sync codes may actually appear at 147, 147.5, or 148 bytes from each other at seemingly random places in the raw data file – a topic addressed in section 2.3.

Moreover, a variable number of minor frames need to be combined to create a single range line. Some lines contain 59 minor frames and some contain 60. Considering that the frame number in the minor frames is only 7 bits, and no major frame numbers exist in the telemetry, the “simple” task of finding the start of each major line was at times quite difficult. Synchronization codes can be either byte-aligned or non-byte-aligned, and partial lines occur on a regular basis. As a result, the minor frame numbers eventually had to be determined by context.

The current frame number is determined using three previous minor frame numbers and the next frame number — along with a handful of heuristics. For example, if a gap is found in consecutive minor frame numbers, the following rules are applied:

  • If the next frame is 1 and the last was either 59 or 60, assume this is frame zero
  • Else if (next_frame - last_frame) == 2, put this frame in sequence
  • Else if the last two frames are in sequence, try to put this one in sequence:
    – If the last frame < 59, put this frame in sequence
    – If the last frame was 59 and the last line was length 60, then this HAS to be frame zero, since two length-60 lines never occur in a row
  • Else if the last2 and last3 frames are in sequence and the last frame is 0, set this frame to 1
  • Otherwise, the error was not fixed!

Even beyond these rules, additional checks for a bad frame number 0 and major frames that spuriously showed more than 60 minor frames still had to be performed.
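
The rules above translate roughly into the sketch below. It is illustrative only: the variable names are assumptions, and the real decoder applies the further checks just mentioned (a bad frame number 0, and lines spuriously longer than 60 minor frames).

    # Sketch of the contextual frame-number repair. prev3, prev2, prev1
    # are the three preceding minor-frame numbers, nxt is the next one,
    # and prev_line_len is the length (59 or 60) of the previous line.
    def fix_frame_number(prev3, prev2, prev1, nxt, prev_line_len):
        if nxt == 1 and prev1 in (59, 60):
            return 0                          # start of a new major line
        if nxt - prev1 == 2:
            return prev1 + 1                  # slot this frame into sequence
        if prev1 - prev2 == 1:                # last two frames in sequence
            if prev1 < 59:
                return prev1 + 1
            if prev1 == 59 and prev_line_len == 60:
                return 0                      # never two length-60 lines in a row
        if prev2 - prev3 == 1 and prev1 == 0:
            return 1
        return None                           # the error could not be fixed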

Aside: Bit Errors in Frame Numbers

Random bit errors in the middle of a line: In these decoded minor frame numbers, we see that lines 1, 2, 4 and 5 have no frame number errors; they are in sequence starting from 1 and going up to 59 or 60 by ones. Meanwhile, line 3 has 24 frame numbers in a row that are in error.

Partial lines: The first line is missing minor frames 5-9; the second line is complete; the third line has repeated minor frames 15-19; and the fourth line is the decoder’s attempt to enforce the fact that at most 60 minor frames form a range line. The fifth line is missing minor frames 23-26; the sixth line is complete; the seventh line has repeated minor frames 32-36; and, again, the eighth line is the decoder’s attempt to enforce the fact that at most 60 minor frames form a range line. 

Multiple lines of bit errors: This example shows how bad random bit errors can be, even if no minor frames are actually missing. Incredibly, in five lines, 122 minor frames are in error out of 298 total, giving a 40 percent error rate for these frame numbers. Perhaps even more incredibly, the ASF Seasat decoder managed to fix all of these frame numbers.

  • Original non-fixed frame numbers: 

  • Frame numbers fixed by the ASF Seasat decoder:

2.3 Maintaining Sync Lock

One very important aspect of decoding telemetry data is maintaining a sync lock: The decoding program must be able to find the synchronization codes that occur at the beginning of each minor frame.

Early in the development of the decoder, it was determined that the sync codes are just as susceptible to bit errors as the rest of the data. Initially, finding sync codes required a considerable amount of searching in the file, with the hope that no false positives would be encountered. After much development and testing, it was determined that in order to maintain sync lock, some number of bit errors had to be allowed in the sync code. Therefore, the code was configured to allow 7 bit errors per sync code out of 24 bits. Values less than this needlessly split datatakes (single passes of data over a given ground station). Values greater than this showed too many “false positive” matches for sync codes.
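
A tolerant match of this kind amounts to a Hamming-distance test, as in the minimal sketch below. The SYNC_CODE value is a placeholder, not the actual Seasat sync pattern.

    # Accept a candidate 24-bit word if it differs from the sync
    # pattern in at most 7 bit positions (Hamming distance <= 7).
    SYNC_CODE = 0xFAF320       # placeholder, not the real Seasat sync word
    MAX_BIT_ERRORS = 7

    def is_sync(word24):
        return bin((word24 ^ SYNC_CODE) & 0xFFFFFF).count("1") <= MAX_BIT_ERRORS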

As a result of this extensive analysis, a pattern was determined in the location of sync codes. That is, a byte-aligned, 147.5-byte frame followed by a non-byte-aligned, 147.5-byte frame, repeated 14,217 times, followed by a single instance of a 147-byte frame. In code form:
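
The listing below is a sketch of that pattern, not the original production code; offsets are in bits, where 1,180 bits = 147.5 bytes, and the constants come from the sentence above.

    # Sketch of the sync-code spacing, as bit offsets from the start
    # of the file (e.g. itertools.islice(sync_bit_offsets(), 10)).
    def sync_bit_offsets():
        pos = 0
        while True:
            for _ in range(14217):
                yield pos          # byte-aligned 147.5-byte frame
                pos += 1180
                yield pos          # non-byte-aligned 147.5-byte frame
                pos += 1180
            yield pos              # single 147-byte frame
            pos += 147 * 8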

Once this pattern was established, most problems with locating sync codes abated.

2.4 Data Sentinel Values – Breaking Datatakes

The next problem involved bad sections of data that defied attempts to match frame numbers. The only solution is to break the datatake into multiple pieces, closing the current output file when problems arise, and creating a new output file when sync is regained. This is much like what SyncPrep does, except that the ASF decoder has to be more stringent in its rules for maintaining sync since it must be able to properly build range lines in addition to just finding sync codes.

In addition to losing sync lock as a result of BER, two additional cases arose that will break a datatake into segments: either 60 occurrences of the fill flag in a row, or the repeated occurrence of frame number 127. The fill flag is a valid field but is so unreliable that it can be trusted only after many consecutive hits. Frame number 127 proved to be a sentinel for no data; it occurred thousands of times in areas where no valid SAR data were being collected. Either occurrence will also cause the ASF Seasat decoder to close the current output file and create a new one.
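
As a sketch, the two extra break conditions might be checked as below. The variable names are assumptions, as is the exact repeat count that triggers the frame-127 sentinel.

    # Illustrative break tests for a datatake, given the most recent
    # fill flags and minor-frame numbers seen by the decoder.
    def should_break_datatake(recent_fill_flags, recent_frame_numbers):
        # Trust the (noisy) fill flag only after 60 consecutive hits.
        if len(recent_fill_flags) >= 60 and all(recent_fill_flags[-60:]):
            return True
        # Frame number 127 is a no-data sentinel when it repeats.
        if recent_frame_numbers[-2:] == [127, 127]:
            return True
        return False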

2.5 Results of Decoding

seasat_decoder:

  • Decodes raw signal data into unpacked byte signal data (.dat file)
    • 13,680 unsigned bytes of signal data per line
    • File size is always lines x 13,680 bytes in length
  • Decodes all headers to ASCII (.hdr file)
    • 20 columns of integer numbers per line
    • One header entry per line of decoded signal data
  • Additional features:
    • Allows both byte-aligned and non-byte-aligned minor frames
    • Deals with variable-length lines and partial lines
    • Fixes frame numbers from context if possible
    • Creates one or more output files per input based on sentinels
    • Assembles headers spread across 10 minor frames
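
Given those output conventions, a decoded pair can be read back as in this minimal sketch, assuming numpy; the file names are placeholders.

    import numpy as np

    SAMPLES_PER_LINE = 13680   # unsigned bytes of signal data per range line

    # Signal data: one row of 13,680 bytes per decoded range line.
    signal = np.fromfile("tape1_example.dat", dtype=np.uint8)
    signal = signal.reshape(-1, SAMPLES_PER_LINE)

    # Headers: 20 integer columns, one row per decoded range line.
    headers = np.loadtxt("tape1_example.hdr", dtype=int)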

In spite of all of the challenges and problems in the raw data, the ASF Seasat decoder is able to decode raw telemetry SAR data. Using five frame numbers in sequence and a handful of heuristics, telemetry data is decoded into byte-aligned, 8-bit samples. Concurrently, all of the metadata stored in the headers is decoded and placed in an external file.

The current strategy tried to err on the side of only allowing valid SAR data to be decoded. Still, up to 7 bit errors had to be allowed in a sync code match just to get through the raw data. In addition, the decoded header information is simply not reliable. For example, early in development, the ASF Seasat decoder broke one 7-GB chunk of raw data into 24 segments of decoded data, dumping a header at the beginning and end of each segment. Analysis of the decoded times in these headers showed that of the 48 dumped, 3 were completely zero and an additional 12 were in error. In other words, the decoded times did not make sense in context with the surrounding time values.

Thus, even after completing the decoder development with bit error tolerance, frame number heuristics, proper sync code detection, and known sentinel values for good data boundaries, the decoded Seasat archives were still nearly unusable in any reliable fashion.

Processing Stage | # Files | Size (GB)
Capture | 38 | 2610
SyncPrep | 1840 | 2431
Original Decoded | 1470 | 3585

Initial Data Recovery: 93 percent of the data captured from tape made it through SyncPrep; approximately 92 percent of that data was decoded (assuming a 1.6-expansion factor).

Aside: ASF Tape Archive File Names

When the tapes were captured onto disk, files were named based upon tape number and section of tape read. For example, the first part of tape1 was initially named SEASAT_tape1_01Kto287K.

This file was run through SyncPrep, which created multiple subfiles based upon its ability to maintain a sync lock, sometimes creating over 100 such numbered files, e.g. SEASAT_tape1_01Kto287K.000 to SEASAT_tape1_280Kto668K.020

Next, the files go through the ASF Seasat decoder, gaining yet another subfile number, but the prefix “SEASAT_” is removed. Note that this stage creates a file pair of {.dat, .hdr}, e.g. tape1_01Kto287K.018_000, tape1_01Kto287K.018_001, and tape1_01Kto287K.018_002 file pairs were all created from a single decode of SEASAT_tape1_280Kto668K.018.

Thus, for a single captured file, SyncPrep could make tens to a few hundred data segments, while the ASF Seasat decoder could break each of these files into even more sub-segments.

Written by Tom Logan, July 2013

ROI Configuration File Creation Program: Many command-line options are offered by create_roi_in, including the ability to process any amount of signal data or the exact amount of signal data needed to create a single 100-km² image framed by European Space Agency (ESA) framing standards.

Seasat – Technical Challenges – 9. From Swaths to Products

At this stage in the development of the ASF Seasat Processing System (ASPS): 1,346 cleaned raw signal swaths were created; ROI was modified to handle the Seasat offset video format; new state vectors were selected for use over two-line elements (TLEs); caltones were filtered from the range power spectra; data window position files were created…