Slide 1 A Reference Model for Requirements and Specifications
Slide 2 Software Artifacts in the Real World Example from the ESA software engineering standard: User Requirements Specification (URS). Software Requirements Specification (SRS). Example from Lucent: Customer Specification Document. Feature Specification Document. Common scenario: just the code.
Slide 3 Reference Models Provide an idealized architecture: common concepts, standards, a reference implementation. Accommodate both variation and precision. Successful example: the ISO seven-layer reference model for networks. Benefits include: improved communication; a guide for good designs; a vehicle for analysis of tradeoffs.
Slide 4 Key Concepts for a Reference Model for Software Artifacts Kinds of artifacts. Informal vs. Rigorous vs. Formal. Verbal vs. Document vs. Code. Environment and System. Visibility and Control. Refinement Obligations.
Slide 5 Reference Model Artifacts WRSPM Domain (“World”) knowledge W provides presumed facts about the environment. [Diagram: overlapping Environment and System regions, on which the artifacts are placed in Slides 5–12.]
Slide 6 Reference Model Artifacts WRSPM Requirements R indicate what the customer needs from the system, described in terms of its effect on the environment.
Slide 7 Reference Model Artifacts WRSPM Specification S provides enough information for a programmer to build a system that satisfies the requirements.
Slide 8 Reference Model Artifacts WRSPM Program P implements the specification on the given programming platform.
Slide 9 Reference Model Artifacts WRSPM Programming platform (“Machine”) M provides a basis for programming a machine to satisfy the specification.
Slide 10 Reference Model Artifacts WRSPM [Diagram: W, P, and M are each marked “Given (Indicative)”.]
Slide 11 Reference Model Artifacts WRSPM [Diagram: R is marked “Wished For” (Optative).]
Slide 12 Reference Model Artifacts WRSPM [Diagram: the environment–system overlap is marked “Common Language” and “Refinement”.]
Slide 13 Designations The designated phenomena are classified by control and visibility: e_h – controlled by the environment, hidden from the system; e_v – controlled by the environment, visible to the system; s_v – controlled by the system, visible to the environment; s_h – controlled by the system, hidden from the environment.
Slide 14 Example: Patient Monitoring System Requirement: a system to warn a nurse if a patient’s heart stops beating.
Slide 15 Artifacts Domain knowledge W. A nurse is always at the nurse's station and can hear a bell. If the patient's heart has stopped, then a sensor on the patient's chest ceases detecting the sound of a heartbeat. Programming platform M. Sound sensor. Bell. Computer that can activate the bell based on information from the sensor.
Slide 16 Designations e_h – the nurse and the heart of the patient. e_v – the sound of the heartbeat. s_v – the ringing bell. s_h – internal variables used in calculations. All sets are distinct. Let e = e_h ∪ e_v and s = s_h ∪ s_v.
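To make the designations concrete, here is a minimal Python sketch (not from the slides) encoding a snapshot of the example with one boolean per designated phenomenon; the names heart, warned, sound, bell, and flag are invented for illustration, and the nurse's presence is folded into W.

```python
from itertools import product

# Hypothetical propositional snapshot of the patient monitoring example.
# e_h (hidden from the system):      heart  -- is the heart beating?
#                                    warned -- has the nurse been warned?
# e_v (visible to the system):       sound  -- does the sensor hear a heartbeat?
# s_v (visible to the environment):  bell   -- is the bell ringing?
# s_h (hidden from the environment): flag   -- an internal variable.

def W(heart, warned, sound, bell, flag):
    # Domain knowledge: a stopped heart silences the sensor, and the nurse
    # is always at the station, so a ringing bell warns her.
    return (heart or not sound) and (warned or not bell)

def R(heart, warned, sound, bell, flag):
    # Requirement: if the heart stops, the nurse is warned.
    return heart or warned

# e ranges over e_h ∪ e_v, s over s_h ∪ s_v.
ENV = list(product([False, True], repeat=3))  # (heart, warned, sound)
SYS = list(product([False, True], repeat=2))  # (bell, flag)

# Enumerate the complete behaviors that the domain knowledge allows.
for e in ENV:
    for s in SYS:
        if W(*e, *s):
            print(dict(zip(("heart", "warned", "sound", "bell", "flag"), e + s)))
```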
Slide 17 Anecdote Safety requirement in an aircraft: reverse thrusters should not activate while the plane is in the air or stopped on the ground. Domain knowledge assumption: the wheels are turning iff the plane is rolling on the ground. Hidden from the system (e_h): whether the plane is on the ground. Visible to the system (e_v): whether the wheels are turning. The fallacy is in the second statement, the domain assumption: a plane can be rolling on the ground while its wheels are not turning (e.g., hydroplaning on a wet runway).
Slide 18 Fundamental Refinement Obligations
FO1 Adequacy: ∀e s. W ∧ M ∧ P ⇒ R. Implementing the program P on the machine M implies that the requirements R are met, so long as the domain W behaves as expected.
FO2 Consistency: ∃e s. W. The domain assumptions W are consistent.
FO3 Relative Consistency: ∀e_v. (∃e_h s. W) ⇒ (∃e_h s. W ∧ M ∧ P). Allowed actions of the environment must be consistent with the system behavior.
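Continuing the invented toy encoding, the three obligations can be checked by brute-force enumeration; M and P are merged into a single predicate MP (the M′ = M ∧ P of the next slides), with the program "ring the bell exactly when no heartbeat sound is heard" as an invented stand-in implementation.

```python
from itertools import product

# Toy model as above: e = (heart, warned, sound), s = (bell, flag).
W  = lambda heart, warned, sound, bell, flag: (heart or not sound) and (warned or not bell)
R  = lambda heart, warned, sound, bell, flag: heart or warned
MP = lambda heart, warned, sound, bell, flag: bell == (not sound)  # invented program

EH  = list(product([False, True], repeat=2))  # e_h: (heart, warned)
EV  = [False, True]                           # e_v: sound
SYS = list(product([False, True], repeat=2))  # s:   (bell, flag)

# FO1 Adequacy: forall e s. W ∧ M ∧ P ⇒ R
fo1 = all(R(*eh, ev, *s) for eh in EH for ev in EV for s in SYS
          if W(*eh, ev, *s) and MP(*eh, ev, *s))

# FO2 Consistency: exists e s. W
fo2 = any(W(*eh, ev, *s) for eh in EH for ev in EV for s in SYS)

# FO3 Relative Consistency: forall e_v. (exists e_h s. W) ⇒ (exists e_h s. W ∧ M ∧ P)
fo3 = all(any(W(*eh, ev, *s) and MP(*eh, ev, *s) for eh in EH for s in SYS)
          for ev in EV if any(W(*eh, ev, *s) for eh in EH for s in SYS))

print(fo1, fo2, fo3)  # expect: True True True
```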
Slide 19 Relative Consistency How to Get it Wrong! Wrong version of FO3: ∃e_v. (∃e_h s. W) ∧ (∃e_h s. W ∧ M′), where M′ = M ∧ P. Some choice of environment events makes the system consistent: ∃e s. W ∧ M′. Too weak: the system doesn't get to pick what the environment will do.
Slide 20 Second Way to Get it Wrong Wrong version of FO3: ∀e_v. (∃e_h s. W) ⇒ (∀e_h s. W ⇒ M′). Every set of events consistent with the domain assumptions is allowed by the system: ∀e s. W ⇒ M′. Too strong: this defeats the point of the requirements, which say how the system should constrain possibilities.
Slide 21 Third Way to Get it Wrong Wrong version of FO3: ∀e_v. (∃e_h s. W) ⇒ (∃e_h s. W ⇒ M′). For some completion of the environment and some system behavior, the domain assumptions imply the system behavior. Too weak: given e, if there is even one s inconsistent with the domain knowledge, the inner implication holds vacuously, so the system can implement anything.
Slide 22 Fourth Way to Get it Wrong Wrong version of FO3: ∀e. (∃s. W) ⇒ (∃s. W ∧ M′). If there is a system behavior consistent with the domain assumptions, then the system must allow one for every environment behavior, hidden phenomena included. Too strong: consider the patient monitoring system. It is consistent with W alone for the patient's heart to stop beating without the nurse being warned (both in e_h), but a correct program rings the bell in exactly that case, so no s satisfies W ∧ M′ for such an e.
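On the same invented toy model, the fourth variant can be seen failing mechanically: the correct FO3 was checked above, while the universally quantified variant rejects exactly the environment behaviors the program is supposed to rule out.

```python
from itertools import product

# Toy model: e = (heart, warned, sound), s = (bell, flag).
W  = lambda heart, warned, sound, bell, flag: (heart or not sound) and (warned or not bell)
MP = lambda heart, warned, sound, bell, flag: bell == (not sound)  # invented program

EH  = list(product([False, True], repeat=2))
EV  = [False, True]
SYS = list(product([False, True], repeat=2))

# Wrong variant 4: forall e. (exists s. W) ⇒ (exists s. W ∧ M′)
for eh in EH:
    for ev in EV:
        if any(W(*eh, ev, *s) for s in SYS) and \
           not any(W(*eh, ev, *s) and MP(*eh, ev, *s) for s in SYS):
            print("variant 4 fails at (heart, warned, sound) =", (*eh, ev))
# Prints the failures (False, False, False) and (True, False, False):
# whenever the sensor is silent and the nurse unwarned, the program rings
# the bell, and a ringing bell warns the nurse -- contradicting that e.
```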
Slide 23 Role of the Specification The specification provides communication between the user and the developer expressed in the common vocabulary of the environment and system. It enables a factorization of responsibilities between user and developer. Users work with designations visible in the environment (viz. W and R). Developers work with designations visible in the system (viz. P and M).
Slide 24 Environment-Side Obligations
EO1 ∀e s. W ∧ S ⇒ R
EO2 ∃e s. W
EO3 ∀e_v. (∃e_h s. W) ⇒ (∃s. S) ∧ (∀s. S ⇒ ∃e_h. W)
Slide 25 System-Side Obligations
SO1 ∀e. (∃s. S) ⇒ (∃s. M ∧ P) ∧ (∀s. M ∧ P ⇒ S)
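With an invented specification S, "the bell rings exactly when no heartbeat sound is heard", written only in the shared vocabulary of sound and bell, all four factored obligations can be checked on the toy model in the same style.

```python
from itertools import product

# Toy model: e = (heart, warned, sound), s = (bell, flag).
W  = lambda heart, warned, sound, bell, flag: (heart or not sound) and (warned or not bell)
R  = lambda heart, warned, sound, bell, flag: heart or warned
MP = lambda heart, warned, sound, bell, flag: bell == (not sound)  # invented M ∧ P
S  = lambda sound, bell: bell == (not sound)                       # invented spec

EH  = list(product([False, True], repeat=2))
EV  = [False, True]
SYS = list(product([False, True], repeat=2))  # s[0] is bell

eo1 = all(R(*eh, ev, *s) for eh in EH for ev in EV for s in SYS
          if W(*eh, ev, *s) and S(ev, s[0]))
eo2 = any(W(*eh, ev, *s) for eh in EH for ev in EV for s in SYS)
eo3 = all(any(S(ev, s[0]) for s in SYS) and
          all(any(W(*eh, ev, *s) for eh in EH) for s in SYS if S(ev, s[0]))
          for ev in EV if any(W(*eh, ev, *s) for eh in EH for s in SYS))
so1 = all(any(MP(*eh, ev, *s) for s in SYS) and
          all(S(ev, s[0]) for s in SYS if MP(*eh, ev, *s))
          for eh in EH for ev in EV if any(S(ev, s[0]) for s in SYS))
print(eo1, eo2, eo3, so1)  # expect: True True True True
```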
Slide 26 How to Get it Wrong
EO3 ∀e_v. (∃e_h s. W) ⇒ (∃s. S) ∧ (∀s. S ⇒ ∃e_h. W)
FO3 ∀e_v. (∃e_h s. W) ⇒ (∃e_h s. W ∧ M′)
Replace EO3 with a weaker obligation formed by close analogy with FO3: ∀e_v. (∃e_h s. W) ⇒ (∃e_h s. W ∧ S). Fails to imply that the fundamental obligations are met.
Slide 27 The Key Objective Theorem: EO1 ∧ EO2 ∧ EO3 ∧ SO1 ⇒ FO1 ∧ FO2 ∧ FO3.
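A sketch of why the adequacy half goes through, reconstructed from the obligations above rather than quoted from the original proof:

```latex
\textbf{Sketch (FO1).} Fix $e$ and $s$ satisfying $W \wedge M \wedge P$.
Since $W(e,s)$ holds, the visible part $e_v$ satisfies $\exists e_h\, s.\, W$,
so EO3 yields $\exists s.\, S$ at this $e_v$. SO1 then yields
$\forall s.\, (M \wedge P \Rightarrow S)$; instantiating at the given $s$
gives $S(e,s)$, and EO1 ($\forall e\, s.\, W \wedge S \Rightarrow R$) gives
$R(e,s)$. FO2 is exactly EO2, and FO3 follows by combining EO3 (which also
supplies an $e_h$ consistent with $W$) with SO1 ($\exists s.\, M \wedge P$).
```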
Slide 28 Related Work Functional documentation model (Parnas and Madey). Feasibility similar to relative consistency. Obligations weaker than WRSPM. FD model allows implementations with unpredictable behavior. Domain descriptions (Jackson and Zave). Reactive Modules (Alur and Henzinger). Village Telephone System (Karthik, Obradovic, and authors).
Slide 29 Conclusions Reference model based on 5 artifacts: WRSPM. Four classes of variables (e_h, e_v, s_v, s_h) reflect visibility and control between environment and system. Proof obligations can be factored between environment and system by the use of a specification in the common language.
Slide 30 Functional Documentation Model [Diagram: monitored variables m and controlled variables c in the environment, related by NAT(m,c) and REQ(m,c); input variables i and output variables o in the system, connected by IN(m,i), SOF(i,o), and OUT(o,c).]
Slide 31 FD Proof Obligations
FD1 Acceptability ∀m i o c. NAT(m,c) ∧ IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c) ⇒ REQ(m,c)
FD2 Feasibility ∀m. (∃c. NAT(m,c)) ⇒ (∃c. NAT(m,c) ∧ REQ(m,c))
FD3 ∀m. (∃c. NAT(m,c)) ⇒ (∃i. IN(m,i))
Slide 32 FD Counterexample The following definitions satisfy all FD conditions, but cannot be realized.
NAT: (∀t. c(t) > 0) ∧ (∀t. m(t) < 0)
REQ: ∀t. c(t+3) = −m(t)
IN: ∀t. i(t+1) = m(t)
SOF: ∀t. o(t+1) = i(t)
OUT: ∀t. c(t+1) = o(t)
Composing IN, SOF, and OUT forces c(t+3) = m(t), while NAT demands m(t) < 0 < c(t+3); so NAT is inconsistent with the implementation, and FD1 holds only vacuously.
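A bounded-horizon sanity check of the counterexample (a sketch: the slide quantifies over all time and all integers, while here traces are truncated at horizon H = 4 with values from a two-point grid, both invented parameters):

```python
from itertools import product

H, VALS = 4, (-1, 1)  # invented horizon and value grid

def nat(m, c):   # (forall t. c(t) > 0) ∧ (forall t. m(t) < 0)
    return all(x > 0 for x in c) and all(x < 0 for x in m)

def req(m, c):   # forall t. c(t+3) = -m(t), where both sides are defined
    return all(c[t + 3] == -m[t] for t in range(H - 3))

def chain(m, i, o, c):  # IN ∧ SOF ∧ OUT: i(t+1)=m(t), o(t+1)=i(t), c(t+1)=o(t)
    return all(i[t + 1] == m[t] and o[t + 1] == i[t] and c[t + 1] == o[t]
               for t in range(H - 1))

m = (-1,) * H  # an environment trace with every m(t) < 0
Cs = list(product(VALS, repeat=H))

# NAT alone, and NAT with REQ, are both satisfiable for this m ...
print(any(nat(m, c) for c in Cs))                # True
print(any(nat(m, c) and req(m, c) for c in Cs))  # True

# ... but nothing satisfies NAT together with the implementation:
print(any(nat(m, c) and chain(m, i, o, c)
          for i in Cs for o in Cs for c in Cs))  # False
```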
Slide 33 FD to WRSPM The correspondence: m plays the role of e_v, c of s_v, and i, o of s_h (there is no e_h). W = NAT(m,c), R = REQ(m,c), M′ = IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c).
Slide 34 WRSPM Conditions for FD Artifacts
T1 Admissibility ∀m i o c. NAT(m,c) ∧ IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c) ⇒ REQ(m,c)
T2 Consistency ∃m c. NAT(m,c)
T3 Relative Consistency ∀m. (∃c. NAT(m,c)) ⇒ (∃i o c. NAT(m,c) ∧ IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c))
Theorem: T1 ∧ T2 ∧ T3 ⇒ FD1 ∧ FD2 ∧ FD3.
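Checking the T-conditions on the same bounded-horizon encoding (same invented truncation caveats): T1 holds vacuously, since NAT is inconsistent with the composed implementation, T2 holds, and T3, relative consistency, is exactly what fails.

```python
from itertools import product

H, VALS = 4, (-1, 1)  # invented horizon and value grid, as before
nat   = lambda m, c: all(x > 0 for x in c) and all(x < 0 for x in m)
req   = lambda m, c: all(c[t + 3] == -m[t] for t in range(H - 3))
chain = lambda m, i, o, c: all(i[t + 1] == m[t] and o[t + 1] == i[t] and
                               c[t + 1] == o[t] for t in range(H - 1))
T = list(product(VALS, repeat=H))  # all bounded traces

t1 = all(req(m, c) for m in T for i in T for o in T for c in T
         if nat(m, c) and chain(m, i, o, c))     # vacuously True
t2 = any(nat(m, c) for m in T for c in T)        # True
t3 = all(any(nat(m, c) and chain(m, i, o, c) for i in T for o in T for c in T)
         for m in T if any(nat(m, c) for c in T))
print(t1, t2, t3)  # True True False
```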