Title: NPOESS Data Exploitation (NDE) Preliminary Design Review
1NPOESS Data Exploitation (NDE) Preliminary Design Review
- November 21, 2006
- NSOF Building 4231 Suitland Road Room 2001
2Welcome-J. Silva
3PDR Agenda
- 8:00 a.m. Welcome
- 8:15 a.m. Introduction/Background
- 9:00 a.m. Project Overview
- 10:45 a.m. NDE Context and Hardware
- 11:15 a.m. Algorithm Development
- 12:00 p.m. Lunch
- 1:00 p.m. Data Handling System
- 2:30 p.m. IT Security
- 3:00 p.m. Communications Study
- 3:40 p.m. Closing Remarks
4Agenda
- 8:00 a.m. Welcome
- 8:15 a.m. Introduction/Background
- 9:00 a.m. Project Overview
- 10:45 a.m. NDE Context and Hardware
- 11:15 a.m. Algorithm Development
- 12:00 p.m. Lunch
- 1:00 p.m. Data Handling System
- 2:30 p.m. IT Security
- 3:00 p.m. Communications Study
- 3:40 p.m. Closing Remarks
- Introduction/Background J. Silva
- Mission Design Drivers J. Silva
- Architectural Overview G. Goodrum
- Data Products T. Schott
5NDE Design Review Board
- Chairperson: Vanessa Griffin (NESDIS/OSD), Ground Systems Division Chief
- Mike Mignogno (NESDIS/OSD), Polar Program Manager
- Mike Tanner (NESDIS/HQ), Senior Program Advisor
- Reginald Lawrence (NESDIS/OSDPD), Environmental Satellite Processing Center Chief
- David Benner (NESDIS/OSDPD), Satellite Services Division Chief
- Mitch Goldberg (NESDIS/STAR), Satellite Meteorology and Climatology Division Chief
- Joseph Mulligan (NOAA/IPO), NPOESS Interface Data Processing Segment Lead
- Kevin Schrab, Ph.D. (NWS), Satellite Product Improvement Manager
- Brian Gockel (NWS/OST/SEC), Systems Engineer
- Lt. Amanda Bittinger (NESDIS/STAR), Satellite Oceanography and Climatology Division
- Charles MacFarland, NESDIS/OCIO Representative
6Request for Action
- Briefing Title
- Slide Number
- _________________________________________
- Comment/Question
- _________________________________________
- Background on Comment or Question
- _________________________________________
- Team response
- RFAs are available on tables at the back of the room
- RFAs are to be submitted to a Review Board Member
- All RFAs are due today by the end of the meeting (4:00 p.m.)
- Please state your questions clearly
- The Review Board will disposition RFAs
- Board Members are requested to convene to disposition RFAs
7The NDE IT Design Team
8Today's Objectives
- Objective: To ensure that NDE's design concepts are consistent with NPOESS and NOAA architectures and that NDE plans are feasible
- NDE Design Review Board roles and responsibilities
- Review NDE system conceptual design
- Review project plans necessary to reach Critical Design Review (CDR) in August 2007
- Review strategies (acquisition, development, testing, implementation, transition)
- Assess risks
- Request actions that will improve NDE designs and plans
9Assessment Criteria
- Does the proposed preliminary design satisfy the contractual requirements?
- Has the overall system framework been clearly presented?
- Has the functionality been appropriately allocated to subsystems?
- Have the interfaces between subsystems and to external entities, including users, been identified?
- Have the security features and controls been defined?
- Have delivery strategies been defined for the subsystems (build, buy, reuse)?
- Are the plans and schedules leading to CDR feasible?
10Background Documents
- NDE Requirements and Performance Measures
- Software Engineering
- Infrastructure
- Systems Management
- Data Retention and Archive
- Interfaces
- Operational Product Generation
- Communications
- Distribution
See the NDE Web site: http://projects.osd.noaa.gov/nde
11NDE Design Review Board Review Calendar
Date Deliverable Tasked
Nov 21 Requests for Action (RFA) Review Board
Dec 15 Action Responses to Board and Chairperson NDE Design Team
Jan 26 Final Report Design Board Chairperson
12Agenda
- 8:00 a.m. Welcome
- 8:15 a.m. Introduction/Background
- 9:00 a.m. Project Overview
- 10:45 a.m. NDE Context and Hardware
- 11:15 a.m. Algorithm Development
- 12:00 p.m. Lunch
- 1:00 p.m. Data Handling System
- 2:30 p.m. IT Security
- 3:00 p.m. Communications Study
- 3:40 p.m. Closing Remarks
- Introduction/Background J. Silva
- Mission Design Drivers J. Silva
- Architectural Overview G. Goodrum
- Data Products T. Schott
13Mission
- The NPOESS Data Exploitation (NDE) project will provide critical environmental products derived from NPP and NPOESS observations to NOAA's operational community in near-real time
14Capabilities Assessment: NDE's Transitions
Current State → Target State
DMSP, POES → NPP, NPOESS
End-to-End Control → Post-processing
Orbital Processing → Granule Processing
Stove-pipe Applications Processing → Centralized Data Processing
Bolted-on Security → Security Compliant with Standards
100 GB/day Ingest → 2.5 TB/day (NPP) Ingest
300 GB/day Distribution → 6 TB/day Distribution
Heritage Instruments → New Instrument Technologies
15NDE Project: Capabilities Assessment
- Current State → Target State
- Orbital Processing → Granule Processing
- Granule (data structure) Processing (target)
- Instrument specific
- SafetyNet
- Continuous ingest
- Near-real time: <30 minutes
- All data packaged as HDF5
- Orbital Processing (current)
- Near-real time: 156 minutes from observation to product
NDE systems must adapt to all new input formats, alter formats to user needs, and take advantage of improved latency.
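As a concrete illustration of the target state, a minimal sketch of a near-real-time latency check on an incoming HDF5 granule follows. It assumes the h5py package and a hypothetical ISO-8601 "ObservationEndTime" attribute; actual granule metadata layouts are defined by the IDPS interface documents.

```python
from datetime import datetime, timedelta, timezone

import h5py

NRT_THRESHOLD = timedelta(minutes=30)   # near-real-time goal from the slide

def granule_latency(path):
    """Time elapsed since the granule's observation end time (assumed attribute)."""
    with h5py.File(path, "r") as f:
        raw = f.attrs["ObservationEndTime"]           # hypothetical attribute name
        if isinstance(raw, bytes):
            raw = raw.decode()
        obs_end = datetime.fromisoformat(raw).replace(tzinfo=timezone.utc)
    return datetime.now(timezone.utc) - obs_end

latency = granule_latency("SVM01_granule.h5")         # hypothetical file name
print(f"latency={latency}, within NRT goal: {latency <= NRT_THRESHOLD}")
```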
16NDE Project: Capabilities Assessment
- Current State → Target State
- Stove-pipe Applications Processing → Centralized IT Architecture
- Centralized IT Architecture (target)
- Reusable objects
- Database technologies and data management
- Common CM, testing, and transition processes
- Common toolsets
- Centralized management and control
- Stove-pipe Applications Processing (current)
- Duplicated, redundant functionality
- High cost of management and control
NDE will establish the latest proven technologies and manage them in 3 basic environments: Development, System Test, and Operations.
17NDE Project: Capabilities Assessment
- Current State → Target State
- 300 GB/day Distribution → 6 (TBC) TB/day Distribution
- Increased customer base, increased benefits (target)
- Online subscriptions
- GUIs
- More dissemination options
- Push, pull, network, fiber, Internet, etc.
- Product benefits realized by a relatively small customer base (current)
- Limited number of dissemination options
- Orders require developer intervention
- FTP (pull) in most cases
NDE is to establish a Communications Architecture in collaboration with NOAA's Network Advisory Committee (NAC) and AWIPS.
18The Transition Challenge
- An environmental satellite industry challenge
- NESDIS history: 2½ years from data availability, through research, to provision of products to operational users (R2O)
- All new NOAA projects must have Transition Teams and submit Transition Plans for approval
- NDE submitted plans (MIRS, CrIS/ATMS, Sea Surface Temperature) developed with SPSRB oversight
- NDE's goals:
- Minimize IT obstacles as algorithms migrate through development, test, and operational environments
- Work with end users to prepare operations for products
19NDE Milestone Plan
20Agenda
- Introduction/Background J. Silva
- Mission Design Drivers J. Silva
- Architectural Overview G. Goodrum
- Data Products T. Schott
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
21NDE System Objectives
- Disseminate NPOESS Data Records to the NOAA operational community
- Generate and disseminate tailored NPOESS Data Records (versions of NPOESS Data Records in previously agreed alternative formats and views)
- Generate and disseminate NOAA-unique products (augmented environmental products constructed from NPOESS Data Records)
- Deliver NOAA-unique products, product processing elements, and associated metadata to CLASS for long-term archiving
- Provide services to customers, including NDE product training, product enhancement, and implementation support across NOAA
- Provide software for NPOESS Data Record format translation and other data manipulations
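The last objective, format translation software, might look like the following minimal sketch: copying one dataset from an HDF5 data record into a netCDF file. The file, dataset, and variable names are hypothetical; it assumes the h5py and netCDF4 packages and a simple 2-D array.

```python
import h5py
from netCDF4 import Dataset

def hdf5_to_netcdf(src, dst, dataset):
    """Copy one 2-D dataset from an HDF5 data record into a netCDF file."""
    with h5py.File(src, "r") as h5:
        data = h5[dataset][...]                  # read the full array
    with Dataset(dst, "w") as nc:
        nc.createDimension("y", data.shape[0])
        nc.createDimension("x", data.shape[1])
        var = nc.createVariable(dataset.split("/")[-1], data.dtype, ("y", "x"))
        var[:] = data

hdf5_to_netcdf("sst_granule.h5", "sst_granule.nc", "/All_Data/SST")  # hypothetical names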
22NPOESS Data Exploitation (NDE) Mission Context Diagram
[Diagram: the NPOESS Ground System delivers Data Records (xDRs) to the NOAA Environmental Satellite Processing Center (ESPC), where NDE generates tailored products and NOAA-unique products. NDE distributes xDRs, tailored products, and NOAA-unique products to near-real-time customers, and delivers xDRs, NOAA-unique products, and tailoring tools to CLASS for the climate community. NPOESS will deliver more than 100 different data records (xDRs).]
23Scope and Requirements
- Design, develop, and implement:
- Development Environments for NPP/NPOESS
- System Test Environment for NPP/NPOESS
- Operational Environment for NPOESS
24Vision: 3 Environments, Common Standards
Environmental Satellite Processing
Center
NDE System Test/Backup Environment
NDE Operational Environment
NDE Development Environment
Shared Capabilities, Common Standards
25Design Considerations
- Algorithm modularity
- Reuse
- Centralization
- Scalability
26IDPS / NPP NDE Integration Timeline
NDE-NPP Integration
NDE Critical Design
NDE Preliminary Design
NDE System Test Environment
Concepts in work
Note: New NPP IPAC Verification events (Performance Verification) are not yet planned.
27Risk Handling and Mitigation: Assessment of Impact and Project Plans
- Mitigating steps in NDE's current plans:
- Develop a risk management plan
- For IT work, schedule control via earned value management
- Develop lifecycle cost and benefit estimates
- Develop the project budget
- Plan for needed resources
- Design systems to support reliability requirements
- Monitor NOAA and DoC security requirements
- New risk-mitigating activities:
- Develop and implement a quality management plan to address system and data quality, quality assurance, and quality control
- Develop and implement NDE continuity plans that are coordinated among participating organizations
- Coordinate development of preliminary (pre-launch) data with the IPO and instrument designers
28Risk Management Summary
- The risk management plan will be revised as risks are handled, new risks are identified, or stakeholder views change
- Continued input from stakeholders and other sources
- Developers have a critical responsibility to identify issues, risks, and challenges and their potential impact on operations
- The preliminary design review will identify issues and challenges for key design elements
29Agenda
- Introduction/Background J. Silva
- Mission Design Drivers J. Silva
- Architectural Overview G. Goodrum
- Data Products T. Schott
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
30Product Definitions
- Products delivered by NPOESS contractor
- xDRs
- Raw Data Records (RDR)
- Sensor Data Records (SDR)
- Temperature Data Records (TDR)
- Environmental Data Records (EDR)
- Intermediate Products (IP)
- Pre-planned Product Improvements (P3I)
- Products produced by NOAA using xDR information
- Tailored xDRs and IPs
- NOAA Unique Products (NUPs)
31NPOESS Products Requirements
Integrated Program Office (IPO)
Deliverables
Contract Document
Requirements Document
NPOESS Integrated Operational Requirements
Document (IORD II) Dec 10, 2001
NPOESS Technical Requirements Document
Sensor, Temperature, Environmental Data Records
(xDRs)
NPOESS Data Exploitation (NDE)
(1) Tailored xDRs (2) NOAA-unique products (NUPs)
NDE Product Matrix
NDE Product Development
32Sample from Mission Goal Version of Product Matrix
Products Line Office
Mission Goal
Review
(CORL items in gray)
Master, WW, Climate, Ecosystems, CT spreadsheets
Motivation: Develop NPOESS products to meet all NOAA user requirements
33NPP Product Summary
NPP Totals
Data Records (xDRs/IPs/P3I) 39
NOAA Unique Products 59
Totals 98
34Acquisition Strategy: Emphasis on POES Mission Continuity
- The Nunn-McCurdy process restructured NPOESS acquisition efforts
- POES mission continuity might rely on NPP data operationally
35Define NPP Products
- POES and EOS mission continuity
- Essential that we address these products
- Development products
- Under development and could go operational prior to NPP launch
- New products
- Never provided to the user community and not under development within NESDIS
36NPP Mission Continuity Products
NOAA-Unique Products | NOAA-Unique Products | Tailored Products
CrIS Thinned Radiances | SST Anomalies | ATMS Radiances
Total Precipitable Water (ATMS) | Coral Reef Degree Heating | CrIS Radiances
Snow Cover (ATMS) | Coral Reef Bleaching | VIIRS Radiances
Precipitation Type/Rate (ATMS) | Tropical Rainfall Potential | OMPS Radiances
Surface Emissivity (ATMS) | Vegetation Fraction | Cloud Mask
Cloud Liquid Water (ATMS) | Hazard Support (Tropical) | Sea Surface Temperature (SST)
Sea Ice Cover/Concentration | Hazard Support (Volcano) | Ozone Profile
Snow Water Equivalent (ATMS) | Total Ozone (CrIS) | Ozone Total Column
Ice Water Path (ATMS) | Blended Ozone | Snow Cover
Land Surface Temperature | Outgoing Longwave Radiation | Imagery
Temperature Profiles (ATMS) | Outgoing Longwave Radiation | Ocean Color/Chlorophyll
Moisture Profiles (ATMS) | Absorbed Radiation | Vegetation Index
Blended Snow Cover | Rainfall Prediction | Active Fires
Harmful Algal Blooms | Tropical Cyclone Intensity | Atmospheric Temperature Profile
Regional Ocean Color | Tropical Rainfall Potential | Atmospheric Moisture Profile
Blended SST | Vegetation Fraction | Aerosol Optical Thickness
Surface Type | | Vegetation Cover
 | | Surface Albedo
 | | Cloud Cover/Layers
37NPP Development Products
NOAA-Unique Products | Tailored Products
CrIS Cloud Cleared Radiances | Aerosol Particle Size
Clear Sky Radiances (VIIRS) | Cloud Top Temperature
Polar Winds (VIIRS) | Cloud Top Pressure
Vegetation Health | Land Surface Temperature (VIIRS)
Vegetation Moisture |
Drought Indices |
Vegetation Thermal Conditions |
Leaf Area Index |
Fire Potential |
SST (AVHRR-like) |
Aerosol (AVHRR-like) |
Carbon products |
38NPP New Products
NOAA-Unique Products | Tailored Products
Integrated xDRs at CrIS Resolution | Cloud Base Height
Cloud Liquid Water Path (VIIRS) | Cloud Effective Particle Size
Cloud Ice Water Path (VIIRS) | Cloud Optical Thickness
Cloud Top Temperature (VIIRS) | Cloud Top Height (VIIRS)
Inversion Strength and Height | Ice Surface Temperature
Net Heat Flux |
Sea Ice Characterization (VIIRS) |
Suspended Matter |
Atmospheric Pressure Profile |
39Agenda
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
- Project Overview G. Sloan
- Requirements Mgmt L. Fenichel
- Development Approach E. Wilson
- Tool Selection E. Wilson
40A Reminder
- Parts of NPP, NPOESS, and NDE are subject to:
- ITAR (International Traffic in Arms Regulations)
- NOFORN classification
41Topics
- Requirements Management
- Development Approach
- Algorithm Development
- Data Handling System
- IT Security
- Data Distribution
- The Way Forward
42Master Phasing Schedule
43NDE Baselined Documents
Document Status
NDE Concept of Operations Draft Delivered 8/24/2006 Draft Delivered 11/7/2006 (Incorporated RIDS from Peer Review)
NDE Federal Enterprise Architecture Draft Delivered 8/24/2006
NDE Data Distribution Conceptual Design Draft Delivered 8/24/2006
NDE Stakeholder Survey Summary Status 2006 Draft Delivered 8/24/2006 Draft Re-Delivered 8/28/2006
NDE Network Services Design Draft Delivered 8/24/2006
NDE System/Subsystem Design Description Draft Delivered 8/24/2006
NDE Logical and Geographical Connectivity Design Draft Delivered 8/24/2006
Standards for NDE Algorithm Development, Delivery, Integration and Test Draft Delivered 8/24/2006
44NDE Baselined Documents (Cont'd)
Document Status
Project Plan Draft Delivered 8/2004 Draft Delivered 8/2005 Draft Delivered 8/2006 Draft Delivered 11/2006
NDE Systems Requirement Specification Draft Delivered 11/7/2006
NDE Configuration Management Plan Draft Delivered 5/18/2006 Draft Delivered 6/30/06 Draft Delivered 8/30/2006
Risk Management Plan Delivered 7/14/2006
45Management Approach
- Staff the NDE Design Team with experienced professionals
- Use design tools
- Prototype of the Development Environment
- IBM p570 using AIX
- Use NPP as the NDE risk-reduction mission
- Maximize use of COTS and GOTS (minimize custom code)
- Collaborate with similar, current projects
- Develop prototypes to validate requirements and design
- Sell off requirements as early as possible
46Lessons Learned from NASA's EOSDIS Core System
- Integrate algorithms with the Data Handling System early in the life cycle
- Test the system under full operational load: data-driven systems behave in strange ways
- Build in an Operator Dashboard and System Metrics for reporting
- NetIQ and CA's Unicenter are candidates
- Configuration Parameters
- Put them in a database for change control and an audit trail
- Document CPs: min/max values, defaults, suggested values
- Use a Rolling Wave technology refresh approach
- Constant effort
- Keeps budgets flat: no large peaks every 5 years
- Just-in-time hardware buys
47NDE Design Assumptions
- All NDE users are registered users
- Users get data by entering an NDE subscription
- Subscriptions are of two types (see the sketch below):
- Standing Order (no expiration date)
- Ad-Hoc (covers a limited time period)
- Subscriptions to NOAA-unique products to be archived to CLASS will be determined by the Archive Review Board
- NDE will subscribe to all IDPS xDRs (in lieu of implementing complex logic using the IDPS API to order only the xDRs needed)
- NDE will not distribute RDRs in near-real time to users (exception: Telemetry)
- Users will need to request RDRs from CLASS
- Product latency is customer- and product-specific
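As a rough illustration of the two subscription types, a minimal data model might look like the following sketch; the field names are assumptions for illustration, not the NDE schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Subscription:
    customer_id: str
    product_id: str
    start: date
    end: Optional[date] = None          # None = Standing Order (no expiration)

    def is_active(self, today: date) -> bool:
        """Standing Orders never expire; Ad-Hoc subscriptions end on their end date."""
        return self.start <= today and (self.end is None or today <= self.end)

standing = Subscription("cust42", "ATMS_SDR", date(2006, 11, 21))
ad_hoc = Subscription("cust42", "SST", date(2006, 11, 21), date(2006, 12, 21))
```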
48Data Access
- NDE will flag customers as authorized for data or not during data denial (a minimal sketch follows this list)
- Use the Initial Joint Polar System (IJPS) Data Denial Implementation Plan for initial authorization guidance
- NDE will maintain a customer database containing sufficient attributes to set the flag
- Restricted data will be delayed for subscriptions from unauthorized customers
- Timeliness is more an issue for NPOESS than for NPP
- Only authorized users will be notified of data access status
- End User License Agreements with customers restrict redistribution
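The delay-not-drop behavior for restricted data during a denial period could be captured by a rule like this sketch; the 24-hour hold and the parameter names are illustrative assumptions, not values from the IJPS plan.

```python
from datetime import datetime, timedelta

DENIAL_DELAY = timedelta(hours=24)      # assumed hold period, not from the plan

def release_time(produced_at, customer_authorized, product_restricted, denial_active):
    """Earliest time a product may be released to this customer."""
    if denial_active and product_restricted and not customer_authorized:
        return produced_at + DENIAL_DELAY   # delayed, not dropped
    return produced_at

# Example: an unrestricted product is released immediately.
print(release_time(datetime(2006, 11, 21, 8), False, False, True))
```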
49Agenda
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
- Project Overview G. Sloan
- Requirements Mgmt L. Fenichel
- Development Approach E. Wilson
- Tool Selection E. Wilson
50Requirements Management Matrix
ID Section Requirement Contract
SRS203 3.14.10 Development Lifecycle Tools - The Contractor will specify a suite of proven development lifecycle tools to enhance NESDIS capabilities in performing developmental and software maintenance tasks. The documentation will demonstrate that the selected tools are widely supported in the remote sensing software industry and the most likely to be known by future NESDIS support staff. Technologies in this category are CASE tools, modeling tools, 4th Generation Languages, Testing tools, and Requirements Tracking tools. SE3
SRS94 3.14.6 SEI Capability Maturity Model - The Contractor shall design the System so that its management capabilities can be evaluated in terms of the Software Engineering Institute's (SEI) Capability Maturity Model (CMM), with the goal of Level 2 certification during proposal evaluation and Level 3 certification three years after contract award. SM1
SRS201 3.14.8 NOAA IT Best Practices - The System shall be designed and built with NOAA IT "Best Practices" guidance from the NESDIS Information Technology Architecture Team (ITAT). SE8
51Requirements Management: The Process
- Define business objectives
- Contained in the Contract Requirements Matrices under the Desired Outcomes and Performance Standard columns
- Additional objectives detailed in the Concept of Operations
- Automate requirements analysis; maintain traceability (use DOORS)
- Reduce ambiguity
- Contractual requirements reviewed; System Requirements Specification (SRS) created
- Derive requirements to further capture NDE functionality
- Use-case-driven process
- Cause/effect tables
- Automate using UML-compliant tools
- Domain expert reviews and refinements
- Several iterations
- Capture and formalize detailed requirements
- Test cases generated
- Final round of domain expert reviews
52NDE Requirement Allocations
Allocations Number of SRS Requirements
Customer Services 16
Ingest 24
Production 26
Distribution 25
Product Management 22
Common Services (Infrastructure) 50
System Monitoring and Control 22
Security 1
Documentation 12
53NDE Requirements Traceability
ID Section Short Title Contract Allocation Use Case Test Case
SRS62 3.2.1.3.2 SARSAT Telemetry from IDPS XF1 Ingest UC101 TC25
SRS63 3.2.1.3.3 ADCS Data from IDPS XF1 Ingest UC101 TC26
SRS58 3.2.1.3.4 Product Subscriptions to IDPS XF1 Ingest UC102 TC44
SRS101 3.2.2.3.1 Available Data Product Projections PG5 Production UC181 TC124
SRS102 3.2.2.3.2 Available Data Product Aggregations PG6 Production UC182 TC131
- Maintained in the DOORS tool with active links to other NDE documents
- Telelogic tool suite (DOORS, TAU Developer)
- Use cases mapped
- Test cases mapped
- Test results tracked (a small audit sketch follows)
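The DOORS links above support exactly the kind of audit sketched below: a check, using the sample rows from this slide, that every requirement traces to at least one use case and one test case.

```python
# Sample rows from the traceability table above.
trace = {
    "SRS62":  {"use_cases": ["UC101"], "test_cases": ["TC25"]},
    "SRS63":  {"use_cases": ["UC101"], "test_cases": ["TC26"]},
    "SRS58":  {"use_cases": ["UC102"], "test_cases": ["TC44"]},
    "SRS101": {"use_cases": ["UC181"], "test_cases": ["TC124"]},
    "SRS102": {"use_cases": ["UC182"], "test_cases": ["TC131"]},
}

# A requirement with no linked use case or test case is a traceability gap.
gaps = [rid for rid, links in trace.items()
        if not links["use_cases"] or not links["test_cases"]]
print("untraced requirements:", gaps or "none")
```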
54Challenges Matrix
Challenge | Related Requirement | Impact | Mitigation
Maintain requirements traceability throughout the project life cycle (SRS201) | 3.14.8 NOAA IT Best Practices | Potential testing gaps and schedule delays | Use of tools (SRS201/automated tool suite)
Full specification of NDE System Elements and Components | N/A | Cost, quality, schedule | Employ a methodology (SRS204) where domain experts are involved early in the process; delivery at CDR
55Agenda
800 a.m. Welcome 815 a.m.
Introduction/Background 900 a.m. Project
Overview 1045 a.m. NDE Context and
Hardware 1115 a.m. Algorithm Development
1200 p.m. Lunch 100 p.m. Data
Handling System 230 p.m. IT Security
300 p.m. Communications Study 340 p.m.
Closing Remarks 400 p.m. PDR Board
Review of RFAs
- Project Overview G. Sloan
- Requirements Mgmt L. Fenichel
- Development Approach E. Wilson
- Tool Selection E. Wilson
56Development Approach: Requirements
ID Section Requirement Contract
SRS204 3.10.1 The Contractor shall identify a widely accepted software engineering methodology to be used on the project. SE12
SRS94 3.14.6 The Contractor shall design the System so that its management capabilities can be evaluated in terms of the Software Engineering Institute's (SEI) Capability Maturity Model (CMM), with the goal of Level 2 certification during proposal evaluation and Level 3 certification three years after contract award. SM1
SRS172 3.9.2 The Contractor shall evaluate candidate System Elements for operational fitness using Unit, Integration, Regression, Stress (load), and End-to-End System testing. SE4
SRS173 3.9.2.1 Each algorithm for creating NOAA-Unique products will be described in terms of explicit, expected test results prior to the installation of the NOAA-supplied algorithm on the operational product generation system. The NOAA-Unique algorithms must satisfy these test requirements. PG8
SRS182 3.11.6 The Contractor shall cooperate with algorithm developers to identify System Test procedures, standards, and the criteria to be applied in determining a system element's fitness for operational status. SE4
57Development Approach: Requirements-Based Testing
- What is it?
- Requirements-Based Testing (RBT) is a system implementation methodology that substantially reduces the risk of building the right system to the wrong requirements!
- Why are we concerned about that?
- Over half of all software project defects originate in the requirements phase
- 82% of application re-work is related to requirements errors
- Only 54% of initial project requirements are actually realized
- Requirements problems account for 44% of project cancellations
Statistics by James Martin and others, taken from "Eliminate the Testing Bottleneck," Borland White Paper, August 2006, p. 4.
58The Methodology of Requirements-Based Testing (RBT)
- It's about testing as an integral step of implementation, not a separate phase
- Design tests before designing the software: the test cases become the ultimate statement of the development requirements
- Iterate the requirements-to-test-case loop with users and domain experts until they concur that successful test execution will produce the result they want (not necessarily what they originally specified!)
- To ensure that:
- What we build is what we say
- and what we say is what we mean
59Traditional Implementation Hand-Off Phases
[Diagram: Natural Language Requirements → Design → Code, with the requirements separately interpreted into Test Plans.]
- Analysts talk to domain experts to express natural language requirements
- S/W designers interpret these into a design, which is then interpreted into code
- An independent test team interprets the requirements to develop test plans
- The result: unresolved ambiguities and inconsistencies between what was developed, what is tested, and what is really needed!
60RBT Focuses on Formalized Requirements and Test Cases
[Diagram: Requirements Analysts create formal requirements from natural language and create logical test cases; Users and Domain Experts review, fix, and validate the requirements and logical test cases; both feed a Formal Requirements and Test Case Repository. Developers study the logical test cases and review design and code against the validated test cases; Test Experts study the logical test cases and complete the physical test cases.]
61Requirements-Based Testing (RBT) Workflow
[Diagram: Requirements Quality (structure/formalize requirements, validate against business objectives, map against use cases, ambiguity analysis, requirements review by domain experts, requirement fixes) yields validated requirements. Logical Test Case Design (define/optimize test cases) yields logical test cases. Test Case Quality (test case review by requirements authors and by domain experts) yields a baseline of validated test cases. Design and Code Quality (developers and test experts study test cases; design and code review with test cases) yields a code baseline. Test Execution (create test procedures, execute tests).]
Diagram based on "Eliminate the Testing Bottleneck," Borland White Paper, August 2006, Fig. 2.
62Sample Technique for Formalized Requirements
REQT ID | CAUSES | EFFECTS
SRS80 | System is in Degraded Operations Mode; Customer meets pre-defined criteria (TBD); Customer has a pending data request | Pending data is distributed to the customer; Customer's pending data subscriptions are ignored while Degraded Operations Mode persists
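Read as executable logic, the SRS80 cause/effect row might look like the following minimal sketch; the pre-defined criteria are TBD in the source, so they appear here as an opaque boolean.

```python
def srs80_effects(degraded_mode, meets_criteria, has_pending_request):
    """Return the SRS80 effects triggered by the given causes."""
    if degraded_mode and meets_criteria and has_pending_request:
        return ["distribute pending data to customer",
                "ignore customer's pending subscriptions while the mode persists"]
    return []   # no SRS80 effects outside this cause combination

print(srs80_effects(degraded_mode=True, meets_criteria=True, has_pending_request=True))
```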
63Development Approach: Challenges
Challenge | Related Requirement | Impact | Mitigation
Build the NDE systems as we say, and say what we mean them to be | SRS94, SRS173, SRS182 | Unresolved ambiguities and inconsistencies between what gets developed, what gets tested, and what is really needed | Utilize the Requirements-Based Testing (RBT) methodology
Find a suite of development tools that supports the RBT implementation workflow | SRS203 | Worst case: develop RBT artifacts manually (using only Excel, Visio, Word) and develop a homegrown database to track baselines, team member updates, and requirements-to-test traceability | Identify the development functionalities needed to support RBT; establish criteria to evaluate tool performance w.r.t. the functions, taking into account interoperability between tools in a candidate suite
64Agenda
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
- Project Overview G. Sloan
- Requirements Mgmt L. Fenichel
- Development Approach E. Wilson
- Tool Selection E. Wilson
65Tool Selection Requirements
ID Section Requirement Contract
SRS93 3.8.3.2 The NDE Processing System Design will use COTS and Open Source software where it is possible, practical, and approved by the Government. SE10
SRS203 3.14.10 The Contractor will specify a suite of proven development lifecycle tools to enhance NESDIS capabilities in performing developmental and software maintenance tasks. The documentation will demonstrate that the selected tools are widely supported in the remote sensing software industry and the most likely to be known by future NESDIS support staff. Technologies in this category are CASE tools, modeling tools, 4th Generation Languages, Testing tools, and Requirements Tracking tools. SE3
SRS215 3.10.2 The contractor shall develop the data processing elements of the future NDE system using the latest proven technologies (programming languages, CASE tools, object repositories, data base management systems, etc.) that are appropriate for remote sensing data processing. SE13
66RBT Steps where NDE Expects to Employ Tools
[Diagram: highlights the RBT workflow steps to be tool-supported, including validation against business objectives, ambiguity analysis, the baseline of validated test cases, the code baseline, and the developers' and test experts' study of test cases.]
Diagram based on "Eliminate the Testing Bottleneck," Borland White Paper, August 2006, Fig. 2.
67IT Development Tools: Functional Categories
SUMMARY OF FUNCTIONAL CATEGORY SCORES
1. Interoperability (with respect to a Suite, and with Common Interchange Tools) 0
2. Requirements Specification Maintenance 0
3. Configuration Management 0
4. Requirements Verification 0
5. Engineering Management/Team Support 0
6. General Architectural Design and Modeling 0
7. Data Modeling 0
8. Business Process Modeling 0
9. Software Analysis, Design, and Construction 0
10. Extensibility 0
11. User Interface 0
12. OS/Network Support - Tool Server 0
13. Tool Administration and Vendor Support 12
14. Documentation, Training, and Ongoing Education 0
15. Vendor Comprehension and Licensing Offer 0
68Tool Selection Methodology
- Goal
- To define a process for trade-off comparisons between IT development tools
- That:
- Establishes a consistent approach regardless of the individuals performing the evaluation
- Produces quantitative results for comparison and justification
- Is conducted as objectively as possible
- ...and can also be applied to:
- Architecture and operational components
- Science development tools
69Sample Section: IT Development Tool Evaluation Worksheet
13. Tool Administration and Vendor Support 12
13.1 Ease of Installation (server/client combined) 50 <none> 0 10
13.2 Degree of Site/User Configurability 50 <none> 0 10
13.3 Vendor includes personal tool setup assistance in the purchase price 90 <none> 0 10
13.4 Number of Release Defects Reported by Vendor 50 <none> 100 0
13.5 Number of Un-reported Defects uncovered during Evaluation of Release 80 <none> 10 0
13.6 Number of old versions of the tool currently supported 65 (enter the number 1, 2, 3, 4) 0 5
13.7 Vendor provides online help support tool 50 <none> 0 10
70We Attempt to Establish an Evaluation Framework that is as Objective as Possible
- We evaluate each tool with respect to all functional categories and develop our own profile of a tool's capabilities, giving each tool a numeric score for each functionality
- For a suite of tools, we calculate functional category scores by combining the scores of all tools in the suite for each category (see the sketch below):
- Suite Functional Score = Min(Tool Scores) for a common function: the "weakest link"
- Suite Functional Score = Max(Tool Scores) for a specialized function: the "strongest component"
- A similar approach is applied to:
- Science Development Tools - K. Tewari
- Architectural Components - A. Al-Jazrawi
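The weakest-link/strongest-component rule reduces to a one-line min/max computation, sketched here with made-up scores.

```python
def suite_score(tool_scores, common):
    """Min for a common function (weakest link); max for a specialized one."""
    return min(tool_scores.values()) if common else max(tool_scores.values())

suite = {"ToolA": 8, "ToolB": 5, "ToolC": 9}      # made-up scores
print(suite_score(suite, common=True))             # 5, the weakest link
print(suite_score(suite, common=False))            # 9, the strongest component
```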
71IT Development Tool Suites Under Consideration
- IBM Rational
- RequisitePro, ClearCase, ClearQuest, Software Architect, (Test Manager)
- Telelogic
- DOORS, Synergy CM, Synergy Change, System Architect, Tau Developer, (Rhapsody), plus Mercury Test Director
- Borland
- Caliber DefineIT, Caliber RM, Silk Central, Together, StarTeam
72Tool Selection Challenges
Challenge | Related Requirement | Impact | Mitigation
Find a suite of development tools that interoperate smoothly, allowing us to move from step to step (or backwards) without manual cut-and-paste transfer of data | SRS203 | Vendor tool suites are a mix of in-house products and products from buy-outs, not as well integrated as heralded; members of some suites don't even run on the same set of platforms | Our evaluation methodology makes special provisions to evaluate a suite as a whole: tool interoperability is scored w.r.t. the suite context, and the suite score for that and other common functions is taken to be the "weakest link" of the suite members
Find a suite of development tools that can be expected to be supported several years into the future | SRS203 | IT tool development is currently a very dynamic area with many contenders, some of them with outstanding new products | Our evaluation criteria include vendor stability factors, as well as the number of years of experience with the tool (e.g., after a buyout); our down-select list is composed of 3 of the most well-known, stable vendors in this field
73Tool Selection Challenges
Challenge | Related Requirement | Impact | Mitigation
Business process modeling tools are closely tied to their vendor's middleware for collection of information to feed the model; they are not interchangeable with other vendors' middleware | SRS215 | Our first choice for modeling NDE business activities could force us toward middleware that might be inadequate for our near-real-time requirements | We will evaluate the middleware architectural components first, as that component will have the broadest impact on overall system performance; however, before making a final decision we will evaluate the business process tools to check that we aren't forced toward an unacceptable product; we may have to balance the gap in tools against the gap in architectural components
74Agenda
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
- Context G. Roth
- Environments G. Roth
- Hardware G. Roth
75NPOESS Context Diagram
AFWA (Air Force Weather Agency)
(Supports NPP)
FNMOC (Fleet Numerical Meteorology
and Oceanography Center)
NESDIS (National Environmental Satellite Data
and Information Service)
NPOESS (National Polar-orbiting
Operational Environmental Satellite System) (IDPS)
(Supports NPP)
NAVOCEANO (Naval Oceanographic Office)
76IDPS Context Diagram
SDS (Science Data Segment) (NASA)
NESDIS
NSIPS (NPOESS Science Investigator-led Processing
System) (cal val)
IDPS (Interface Data Processing Segment) (at NOAA)
SAN
NDE
CLASS (Comprehensive Large Array-data
Stewardship System) (archive)
77NDE Context Diagram
Users
ESPC (Environmental Satellite Processing Center)
IDPS (at NOAA)
CLASS (LTA) (Long Term Archive)
NDE
Algorithm Developers (STAR) (Center for
Satellite Applications and Research)
78NDE Environment Requirements
ID Section Requirement Contract
SRS83 3.14.3 The System shall be designed to support Operational, System Test, and Development Environments. SE2
SRS237 3.14.3.1 The Contractor shall provide an Operational Environment design that lowers the cost and risks of generating and distributing NPOESS-derived products to customers. SE2
SRS85 3.14.3.2 The System shall provide the capability to support the development of the Data Handling System and the development and integration of Scientific Algorithms in a segregated Development Environment. I3
SRS84 3.14.3.3 During the NPP mission, the System Test Environment will be segregated in a manner that supports "Quasi-Operational" product generation and distribution (Quasi-Operational is defined as 24 X 7 automated product generation and distribution with 8 X 5 Science and Operations support services). I1
SRS86 3.14.2.3 - During the NPP mission, the System will have the capability to support product volumes of 4 TB/day. - During the NPOESS C1 mission, the System will have the capability to support product volumes of 8 TB/day. - During the NPOESS C2 mission, the System will have the capability to support product volumes of 12 TB/day. I1
79NDE System
- IT Development Environment
- Where the Data Handling System (DHS) is developed
- Science Development and Integration Environment
- Science Algorithm Development Environment
- Science algorithms may be delivered from the STAR Collaborative Environment (CE)
- Algorithms integrated with the Data Handling System (DHS)
- Algorithm performance tuning
- Functional, end-to-end testing of the NDE System
- System Test Environment
- Stress testing, regression testing, and operations acceptance testing
- System and algorithm performance testing
- Serves as backup for the Operations Environment
- Augmented to make products during NPP (quasi-operational)
- Operations Environment
- DHS and algorithms
- High availability
- Low NDE latencies
- High product volume
80NDE System Configuration Management
81DHS Promotion and Integration Process
82Algorithm Promotion and Integration Process
83Hardware Design: Development Environment (Phase 1)
[Diagram labels: Development Environment CM; ESPC Network; CITS Administration LAN; StorNext]
Automated Tool Servers: Dell PE1850 servers, dual 3.0 GHz, 4 GB RAM (2 Win2003, 1 RHEL4), 42U rack
Storage: ESPC SAN expansion, 146 GB SATA drives, 2.33 TB usable, StorNext FS server, 42U rack
Processing: IBM p570, 16 x 2.2 GHz CPUs, 32 GB RAM, AIX 5.3, 42U rack
84Rolling Wave Capacity Upgrade Planning
[Diagram: Operations capacity, Data Direct Networks disk storage, SAN disk storage for ingest and distribution, and CPUs for processing and database, grows incrementally from N to n, where N < n < 2N (additional capacity).]
85IBM Virtualization
- Virtual I/O Server
- Shared Ethernet
- Shared SCSI and Fibre Channel-attached disk
subsystems - Supports AIX 5L V5.3 and Linux partitions
- Micro-Partitioning
- Share processors across multiple partitions
- Minimum partition 1/10th processor
- AIX 5L V5.3, Linux, or i5/OS
- Partition Load Manager
- Balances processor and memory requests
- Managed via HMC or IVM
[Diagram: a p5 system with dynamically resizable LPARs (AIX 5L V5.2, AIX 5L V5.3, Linux, i5/OS V5R3) across Micro-Partitioning, a Virtual I/O server partition providing storage sharing and Ethernet sharing over virtual I/O paths through the Hypervisor, and PLM-managed partitions (with PLM agents and a manager server) alongside unmanaged partitions. SLES 9 or RHEL AS 4 and above available on selected p5-570, p5-590, and p5-595 models; IVM on p5-560Q and below.]
86NDE Design Challenges
Challenge | Related Requirement | Impact | Mitigation
IDPS requires use of StorNext to write to the ESPC SAN vs. ESPC security zones | Imposed on NDE by IDPS and ESPC | Multiple NDE hosts also need to see the same file system on the ESPC SAN that IDPS uses; therefore NDE is also required to use StorNext as its metadata server | ESPC (Brian Callicott) is pursuing a technical solution
Processing multiple reads/writes to/from the SAN (utilizing StorNext) may not meet product latency requirements | SRS 133, 3.2.5.4 Data Product Latency Table, CD9 | File I/O to/from the SAN may not meet latency requirements | Early testing of StorNext from IDPS to the ESPC SAN and early interface testing of StorNext from the ESPC SAN to NDE
Risk of not having a Product Technical Baseline and model, which may impact the ability to correctly size the number of CPUs, RAM, disk, and internal networks required to meet latency requirements | SRS 133, 3.2.5.4 Data Product Latency Table, CD9 | Insufficient hardware sizing of future System Test and Operations Environments, or increased spending to overcompensate for lack of information | Develop a Product Technical Baseline and model with Product Development Lead Tom Schott
87Agenda
- 8:00 a.m. Welcome
- 8:15 a.m. Introduction/Background
- 9:00 a.m. Project Overview
- 10:45 a.m. Building Blocks and Hardware
- 11:15 a.m. Algorithm Development
- 12:00 p.m. Lunch
- 1:00 p.m. Data Handling System
- 2:30 p.m. Communications Study
- 3:30 p.m. Closing Remarks
- 4:00 p.m. PDR Board Review of RFAs
- Research to Operations M. McHugh
- SADIE K. Tewari
88Research to Operations
- GOAL: Faster, more efficient transition of research to operations
- HOW: By facilitating interaction between STAR and NDE developers
- Similar:
- Standards
- Configuration management (CM) tools
- CM procedures
- Common:
- Libraries, error handling, development and visualization tools, etc.
- Compatible:
- Architectures
89Similar Environments
90Research to Operations
STAR Collaborative Environment (SCE)
Create the Delivered Algorithm Package (DAP) for
an algorithm
Send DAP to the SADIE
Integrated Product Team
92Science Algorithm Development and Integration Environment (SADIE)
Receive DAP from outside of SADIE
Place received DAP under configuration management (CM) control
Ensure DAP has correct format
CM Manager
Baseline the DAP
Compile algorithm
Integrated Product Team
Execute algorithm using its DAP
NDE Development Team
Integrate algorithm into Data Handling System
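The first few SADIE steps (receive a DAP, place it under CM control, check its format, stage it for baselining) could be automated along these lines. This is a hypothetical sketch: the tarball layout, required contents, and CM directory are assumptions, since the actual DAP format is defined by the NDE algorithm delivery standards.

```python
import tarfile
from pathlib import Path

REQUIRED = {"source", "test_data", "docs"}   # assumed top-level DAP contents

def ingest_dap(dap_tarball, cm_root):
    """Receive a DAP, verify its format, and stage it in a CM-controlled area."""
    dest = Path(cm_root) / Path(dap_tarball).stem     # CM-controlled location
    with tarfile.open(dap_tarball) as tar:
        top_level = {m.name.split("/")[0] for m in tar.getmembers()}
        missing = REQUIRED - top_level
        if missing:
            raise ValueError(f"DAP missing required parts: {missing}")
        tar.extractall(dest)                          # ready to baseline and compile
    return dest

staged = ingest_dap("mirs_dap_v1.tar", "/cm/dap_baselines")   # hypothetical paths
```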
99System Test Environment
Install Algorithm
Test Algorithm
CM Manager
Promote to Operations
Operations Environment
NDE Development Team
Integrated Product Team
Install Algorithm
Monitor Algorithm Performance
ESPC CCB
104Agenda
- 8:00 a.m. Welcome
- 8:15 a.m. Introduction/Background
- 9:00 a.m. Project Overview
- 10:45 a.m. Building Blocks and Hardware
- 11:15 a.m. Algorithm Development
- 12:00 p.m. Lunch
- 1:00 p.m. Data Handling System
- 2:30 p.m. Communications Study
- 3:30 p.m. Closing Remarks
- 4:00 p.m. PDR Board Review of RFAs
- Research to Operations M. McHugh
- SADIE K. Tewari
105Science Algorithm Development and Integration Environment (SADIE)
- Has components and configuration similar to the IT, System Test, and Operational environments
- Uses the same CM and defect tracking tools as the other environments
- Accessible locally and remotely to algorithm developers
- Supports development, integration with the DHS, and test of new algorithms and those delivered from the STAR Collaborative Environment (SCE)
- Supports a suite of software development and science tools (COTS, GOTS, etc.)
- Receives the Delivered Algorithm Package (DAP) from the SCE
106SADIE
- SCE users receive:
- Access to, and training in the use of, SADIE
- Defect reports
- Supports development of additional science tools and applications (e.g., validation of products, determination of coefficients)
- Supports troubleshooting of product generation failures in the System Test and Operational Environments
- Supports algorithm functional and regression testing
- Supports fine-tuning of algorithms to meet latency goals in the System Test and Operational Environments
107Candidate Science Development and Integration Tools
- ArcExplorer
- ArcInfo
- ArcView
- CDAT
- EDGEIS
- Geomatica
- GeoTIFF
- GrADS
- HDF4 Tools
- HDF5 Tools
- IDL/Envi
- IMSL
- MATLAB
- McIDAS
- netCDF
- OPeNDAP
- PV-WAVE
- SAS
- TeraScan
- WXP
108Science Algorithm Development, Delivery, Integration and Test Standards
- Algorithm Development Standards
- Example: Use similar development and test environments
- Algorithm Delivery Standards
- Example: Standardized algorithm delivery package
- CM and Defect Tracking Standards
- Example: Algorithm developers and NDE use the same CM and defect tracking tools
- Algorithm Input Standards
- Example: Production rules not hard-coded in science algorithms (see the sketch below)
- Algorithm Processing Environment Standards
- Example: Customized toolkit containing utilities
- Algorithm Output Standards
- Example: Common file format for all NOAA-unique products
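The "production rules not hard-coded" input standard means the trigger conditions for an algorithm live in configuration read at run time, not in the science code. A minimal sketch, with illustrative keys and an assumed all-inputs-present trigger:

```python
import json

# Illustrative rule file content; the keys are assumptions, not the NDE schema.
rule = json.loads("""
{
  "algorithm": "total_precipitable_water",
  "inputs": ["ATMS_SDR"],
  "trigger": "all_inputs_present",
  "timeout_minutes": 20
}
""")

def ready_to_run(rule, available):
    """Fire the algorithm once all declared inputs have arrived."""
    return rule["trigger"] == "all_inputs_present" and set(rule["inputs"]) <= available

print(ready_to_run(rule, {"ATMS_SDR"}))   # True
```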
109Design Challenges
Challenges | Related Requirement | Impact | Mitigations
Agreement on standards; adherence to standards | SE14 | Delays in research to operations | Collaboration with STAR
Provide low NDE latency times for products (minutes); NDE latency is largely determined by algorithm performance, and algorithms are developed by another part of NESDIS, sometimes in another environment | SRS 10, 3.3.3 | Failure to achieve a major goal of NDE | Integrate algorithms and the DHS as soon as possible; develop the Algorithm Development, Delivery, Integration and Test Standards jointly with STAR; make the STAR Collaborative Environment and SADIE as alike as possible to facilitate the Delivered Algorithm Package (DAP)
110Agenda
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
111Agenda
8:00 a.m. Welcome
8:15 a.m. Introduction/Background
9:00 a.m. Project Overview
10:45 a.m. NDE Context and Hardware
11:15 a.m. Algorithm Development
12:00 p.m. Lunch
1:00 p.m. Data Handling System
2:30 p.m. IT Security
3:00 p.m. Communications Study
3:40 p.m. Closing Remarks
4:00 p.m. PDR Board Review of RFAs
- External Interfaces E. Wilson
- DHS Architecture E. Wilson
- Common Services P. MacHarrie
- Subsystem Design A. Al-Jazrawi
112External Interface Requirements
ID Section Requirement Contract
SRS243 3.3.1 IDPS Data Acquisition - The System shall provide the capability to acquire all xDRs, Intermediate Products, SARSAT Telemetry, A-DCS, ancillary, and auxiliary data and metadata from IDPS. XF1
SRS244 3.3.2 External Ancillary Data Acquisition - The System will provide the capability to configure ancillary data acquisition streams from external sources. PG9
SRS10 3.3.3 Data Product Retrieval - The NOAA-Unique and Tailored Products generated by NDE shall be made available to customers by placement in locations where data can be extracted within a time not to exceed the time specified in the Data Product Latency Table. XF3
113External Interface Requirements
ID Section Requirement Contract
SRS137 3.3.4.1 Archive Data Used for Functional Testing - Metadata and ancillary data, as well as Intermediate, NOAA-Unique, and Tailored Products used in documented Functional Tests, will be sent to CLASS. DA5
SRS138 3.3.4.2 Retrieve Data From CLASS - The System will have the capability to retrieve data from CLASS. DA9
SRS140 3.3.5 MMC Interface Through ESPC - ESPC Operations shall provide an interface between NDE and the NPOESS Mission Management Center (MMC) such that 100% of the NDE inquiries to the MMC and NDE replies to MMC requests are received by the MMC in a time not to exceed that specified in the ICD, and that 100% of the notifications and inquiries from the MMC to NDE are received by NDE in a time not to exceed that specified by the ICD. XF9
114External Interface Requirements
ID Section Requirement Contract
SRS141 3.3.6