DFW-ASEE

Using Behavior Surveys for CMMI Process Deployment
Raytheon North Texas Software Engineering
Donna Freed

Overview
- PPQA support of CMMI Level 4 and 5
- N TX Software Engineering process journey
- SW steps to CMMI Level 5
- R6 process for achieving Level 5
- What is a behavior survey?
- Survey results
- Improvements made: statistical process control for design and code reviews

PPQA Focuses on Improvement
- Step 1: Determine how well we are executing
- Step 2: Determine how well the process is supporting programs
- Step 3: Identify and resolve gaps
- Work with process owners and R6 specialist projects to optimize execution, determine process gaps, and identify process improvements
- Provide feedback to program and process; perform causal analysis
- Improve and strengthen the process by closing gaps

N TX Software Engineering
- Few large programs, many small programs
- Typical program:
  - Customer/end user: military
  - Number of engineers: 10
  - Number of SW engineers: 5
  - Program duration: 18 months
  - PPQA support: 3 programs per SQE

N TX Software Engineering Process Improvement Journey
- Process assets: RTIS Integrated Product Development Process; RTIS Policies & Procedures; RTIS Software Operating Instructions
- 1989-1993: Baldrige Award; achieved Level 2 (Repeatable); formed the IPI team
- IPI: CMM-based Internal Process Improvement assessment
- RTIS: Raytheon/TI Systems
- CMMI: CMM Integrated

SW Steps to CMMI Level 5
- 2002: develop and deploy processes
- 2003: CMM-4 fixes; CMM-4 to CMMI-4; CMMI-5
- Achieve CMMI-level performance first, then go for the appraisal: Level-5 PBA, 6-12 months of execution, Level-5 SCAMPI

R6 Process: Life at Levels 4 & 5
- Set objective performance and quality goals
- Stabilize the process and establish a process capability baseline
- Select and prioritize improvements
- Pilot improvements
- Deploy improvements
- Measure improvements and re-baseline
- [Histogram: Process Capability for CPI. LSL = 0.975, Nominal = 1.0, USL = 1.15; Cp = 0.09, Cpk = -0.08, Cpk (upper) = 0.26, Cpk (lower) = -0.08, Cr = 10.96, Cpm = 0.09, K = -1.14]

Identify Improvement Opportunities
- Ideas are the inputs to this process and are gathered from many sources:
  - Raytheon Six Sigma visualization and assessments
  - Program-level metrics analysis meetings
  - Organization-level software improvement metrics root cause analysis, to make progress toward software improvement objectives
  - Improvement ideas from individuals and management reviews
  - Process appraisals
  - Behavior surveys
- Surveys are a feedback process based on the behavior rollouts

What Is a Behavior Survey?
- A checklist of behavior verification questions (24 questions, tagged by SEI level and process area) answered per program, with summary metrics; valid entries are N, Y, P, T, and N/A
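The survey's summary metrics roll the per-program responses (Y, P, T, N, N/A) up into counts and percentages per question. A minimal sketch of that roll-up, using made-up responses rather than data from the actual checklist:

```python
from collections import Counter

# Response codes from the checklist: Y (yes), P (partial),
# T (tailored/variance), N (no), N/A (not applicable yet).
# Hypothetical responses for one question across ten programs.
responses = ["Y", "Y", "N", "Y", "T", "Y", "N/A", "Y", "Y", "N"]

def summarize(responses):
    """Tally one question's responses and compute percentages
    over the programs actually assessed (N/A excluded)."""
    counts = Counter(responses)
    assessed = sum(n for code, n in counts.items() if code != "N/A")
    percents = {code: round(100 * n / assessed, 1)
                for code, n in counts.items() if code != "N/A"}
    return counts, percents

counts, percents = summarize(responses)
print(counts["Y"], percents["Y"])  # → 6 66.7
```

Excluding N/A from the denominator mirrors the checklist's "not time" category: a program that has not reached the relevant phase neither helps nor hurts the compliance percentage.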
- [Checklist spreadsheet residue: per-program response grids (Y/N/P/T/N/A) maintained by the SQE/SQE lead per project and business area]
- Example summary tallies for two programs, out of 23 questions each: Yes Behavior 19/19, Partial Behavior 0/0, Tailored or Variance 1/0, No Behavior 2/2, N/A (not time) 1/2, NR (not required) 0/0, Not Assessed 0/0
- Summary compliance per program: 90%, 90%, 100%, 100%
- Goal: deploy consistent behavior

Why Are Behavior Surveys Needed?
- Some programs execute compliant to the CMMI L4/L5 process; others do not
- Without objective feedback, adherence, consistent deployment, and organization improvement are lacking
- Objective feedback results in adherence to the CMMI process, supports consistent deployment, and enables organization improvement

This Is Not an Audit
- An audit is more formal than a behavior survey
- Audits: based on approved plans; identify non-compliances; attributable to programs; produce compliance packages
- Behavior surveys: based on communication and tailored requirements; identify gaps; not attributable to programs; inform process deployment
- Surveys take the pulse of process deployment

Developing Checklist Questions
- Survey questions are created by Software Quality Engineering (SQE) and stakeholders
- [Checklist spreadsheet residue: the same per-program response grid as above; further program tallies of 23 questions read 16 yes / 6 no / 1 N/A, 18 yes / 3 no / 2 N/A, and 17 yes / 5 no / 1 N/A]
- Results are reported for each question

Survey of CMMI Level Behavior
- [Bar chart, watermarked "EXAMPLE ONLY": for each of 18 survey questions, the percent Compliant, Partially Compliant, Tailored Out or Variance, and Non Compliant]
Survey Timeline
- IPI, CMM Level 4, and CMMI Level 5 PBA assessments (2Q01, 3Q01, 4Q01)
- [Repeats the process improvement journey: RTIS Integrated Product Development Process, RTIS Policies & Procedures, RTIS Software Operating Instructions; 1989 Baldrige Award; Level 2 (Repeatable); 1994 Level 3 (Defined); Level 3 baseline validation]
- [Baseline validation survey chart residue: questions include planning and assessment; legend: Tailored Out or Variance, Non Compliant Behavior]

Behavior Survey May 2003
- CMMI Level 5 behavior survey; question topics include artifact levels of control, individual submission of organizational improvement ideas, and risk training
- [Bar chart: for each of 18 questions, percent Compliant Behavior, Partially Compliant Behavior, Tailored or Variance, NR (not required), and No Behavior]

Levels of Control
- Levels of control provide change and configuration control for work in progress
- Artifacts progress from created, through work in progress, to release; control tightens from the engineering level through the development level to the formal level (with PBA correction)
- The emphasis is on earlier control of artifacts and on controlling more artifacts at lower levels
- Actions: communicate the levels-of-control concept to the program team; update the SCM plan and procedures to describe levels of control
- A small improvement with significant impact

Planning Improvement Proposal
- Triggers: consistent behavior survey findings; programs have difficulty meeting the 90-day planning requirement
- Incremental SW planning: define planning activities for each SW stage, with review and approval and stakeholder involvement, and reduce cycle time
- Lean planning PowerPoint slides; revised SWP350 Software Project Management training class
- A big improvement with significant impact

Defect Containment Improvement Proposal
- Defect containment: monitor peer review performance; verify defect classification
- Consistent use of standard tools: Synergy 6.2b and Defect Logger 3.1
- Survey on organization behavior; implement organizational improvement through defect containment
- Statistical process control for code reviews; developed and provided training to SWEC
- A big improvement leading to multiple improvement projects

Behavior Survey September 2003
- CMMI Level 5 behavior results:
  - 50% adoption of the planning improvement
  - 70% adoption of the SPC improvement
  - +22% increase in use of the organizational improvement database
  - +30% improvement in levels of control
- [Bar chart: per-question percent Compliant Behavior, Partially Compliant Behavior, Tailored or Variance, NR (not required), and No Behavior]

SW Process CMMI at Levels 4 and 5
- [Capability charts: Process Capability for CPI (LSL = 0.975, Nominal = 1.0, USL = 1.15; Cp = 0.09, Cpk (upper) = 0.26, Cpm = 0.09, K = -1.14) and Process Capability for Percent In Phase (defect containment)]
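The capability charts report Cp and Cpk for CPI against LSL = 0.975 and USL = 1.15. As a sketch of how such indices are computed, with CPI samples that are made up for illustration (not the program data behind the slide):

```python
import statistics

def capability(samples, lsl, usl):
    """Process-capability indices for a set of observations.

    Cp  = (USL - LSL) / (6 * sigma)  -- process spread vs. spec width
    Cpk = min(upper, lower)          -- also penalizes an off-center mean
    """
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk_upper = (usl - mean) / (3 * sigma)
    cpk_lower = (mean - lsl) / (3 * sigma)
    return cp, min(cpk_upper, cpk_lower)

# Hypothetical CPI observations with wide scatter
cpi = [0.92, 1.05, 1.30, 0.85, 1.10, 0.98, 1.45, 0.88]
cp, cpk = capability(cpi, lsl=0.975, usl=1.15)
```

A Cp far below 1.0, like the slide's 0.09, says the process spread is much wider than the specification band; a negative Cpk, like the slide's -0.08, additionally says the process mean sits outside one of the limits.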
Initial Survey for Metrics/QPM Consistent Behavior
- Questions pulled from March 2002: metrics defined, metrics reported, metrics verified, QPM plan, QPM/SWIP, metrics analysis held, metrics analysis content, metrics analysis communicated
- [Bar chart: per-question percent Compliant, Partially Compliant, Tailored Out or Variance, and Non Compliant]
- Actions taken following the baseline survey:
  - Communication of results with management, all hands
  - PPQA and SEPG provide consulting support
  - Quantitative Process Management roll-out, 6/2002
  - Software Improvement Program (SWIP) metrics roll-out, 8/2002
  - October survey

October 2002 Survey for Metrics/QPM Consistent Behavior
- Questions pulled from October 2002: metrics plan, metrics report, metrics verified, QPM, QPM-SWIP aligned, metrics analysis held, metrics analysis content, metrics analysis communicated
- Actions taken following the October survey:
  - Communication of results, continued support
  - "Move Process Capabilities Towards Our Goals" roll-out, 10/2002
  - November survey

November 2002 Survey for Metrics/QPM Consistent Behavior
- Questions pulled from November 2002: metrics verified, QPM, QPM-SWIP, metrics analysis held, metrics analysis content, metrics analysis communicated
- Actions taken following the November survey:
  - Communication of results, continued support
  - "Monitor Organizational Capability" roll-out, 12/2002
  - Statistical Process Control roll-out, 5/2003
  - June survey

June 2003 Survey for Metrics/QPM Consistent Behavior
- Questions pulled from June 2003: MA held, QPM-SWIP
- Actions taken following the June survey:
  - Communication of results with management, PPQA, all hands
  - SPC training, continued support
  - September survey

SPC Background
- What is SPC? Statistical process control is an engineering activity that measures key characteristics of a product, compares them to expected standards (based on statistical calculations), and takes corrective action as needed
- We have established control limits that define our expected performance
- Our main objective is to ensure we are performing within those limits and to reduce our overall variability
- Reduction in variation leads to better products and less rework

Selected Sub-processes
- Based on root cause analysis of our organizational goals, we chose two sub-processes to place under statistical control:
  - Peer reviews for design
  - Peer reviews for code and unit test
- The next slides show how we selected these sub-processes for SPC

SWIP Objectives
- Meet commitments (to the customer). Intent: meet the cost and schedule objectives of the programs we support. Quantification: cost and schedule indices (CPI and SPI)
- Software price. Intent: price software engineering products competitively. Quantification: $/DLOC (delivered lines of code)
- Delivery quality. Intent: deliver quality software engineering products. Quantification: in-phase defects and defect density
- Business objectives drive sub-process selection
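The SPC approach described above boils down to: establish control limits from stable baseline performance, then flag points that fall outside them for corrective action. A minimal Shewhart-style sketch with 3-sigma limits (the peer review data are hypothetical; a textbook individuals chart typically estimates sigma from the moving range, but the sample standard deviation is used here for brevity):

```python
import statistics

def control_limits(baseline):
    """3-sigma control limits from a stable baseline of observations."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def in_control(value, limits):
    """True if the observation falls within the control limits."""
    lcl, ucl = limits
    return lcl <= value <= ucl

# Hypothetical defects-per-review-hour from past design peer reviews
baseline = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 4.3, 3.7]
limits = control_limits(baseline)

print(in_control(4.4, limits))   # True: within expected variation
print(in_control(9.5, limits))   # False: investigate via causal analysis
```

The key design point, matching the slide's wording, is that limits come from established baseline performance, not from the points being judged: a single wild review then stands out instead of inflating its own limits.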
Characterize: Cause and Effect Diagrams
- [Fishbone-diagram residue; recoverable contributing factors include:]
  - People: subjectivity (differing points of view); limited domain experts; schedule constraints; systems and SW experts; overload; attrition; using the correct people in reviews; time to do prework; new people do not know how to classify defects; people concerned about defect counts being used against them; people don't know how to use the defect data; people need to know the concept of operations
  - Training: no organizational training; purpose not clear; continuous learning curve; peer review training (roles, types, source, just-in-time); effective-meeting training needed for peer reviews
  - Tools and process: multiple data sources; no common repository for shared code; round-trip engineering (i.e., integrated tooling); tools and process to automate defect capture; tools not user friendly; Defect Logger; DOORS
  - Defect counting: phase of origin; defect existence; type and reason codes not useful; inconsistent, varying types; criteria variation; double counts; project-to-project differences in counting; how to count defects in reused code; what counts (the definition of defects); clear guidelines for phases of origin; needs to include severity; meeting and follow-up; corrective action
  - Programs: mix of old and new programs; domain variation; customer care-abouts; not a priority on a program; too program focused, need a product-line and organizational view
- The defect containment metric has excessive variation (s > 28%) and always starts at 100%
- Diagrams were done for five SWIP metrics

Characterize: CPI Analysis
- [Box-and-whisker plot of CPI by development phase: design complete, maintenance, SW integration, system integration]
- Analysis of variance (ANOVA) on CPI by development phase shows that software integration and system integration contribute the most to CPI variation; there is a statistical difference between these phases and the others
- We compared this analysis with the defect containment analyses

Characterize: Defect Analysis
- [Chart: average percentage in phase for requirements, design, and code; blue lines show the confidence interval around the average]
- Supports the hypothesis that code peer reviews are inadequate to meet our goals
- Defects escaping from the requirements phase are addressed elsewhere
- ANOVA on defects found in phase indicates a statistical difference between phases; most escaping code defects are found in the software and system integration phases, where our greatest CPI variation is found

Peer Review Control Chart
- [Control chart of peer review performance]

Expectations
- Peer reviewers: implement SPC on your design and code peer reviews
- Project metrics team: add SPC data to your metrics analysis meetings and record minutes; execute causal analysis and resolution as part of metrics analysis meetings
- Project lead: use the new senior management review template

September 2003 Survey for Metrics/QPM Consistent Behavior
- Questions pulled from September 2003: org norms in SMR, org norms in metrics analysis meeting, org norms in Defect Logger
- [Bar chart: per-question percent Compliant Behavior, Partially Compliant Behavior, Tailored or Variance, NR (not required), and No Behavior]
- Actions taken following the September survey:
  - Communication of results, continued support
  - SCAMPI

Summary
- Behavior surveys have been successfully used as a deployment tool for metrics/quantitative process management
- They started with basic behaviors, which were refined as process deployment increased
- They evolved to address more sophisticated activities: Statistical Process Control (SPC)
- Surveys are a tool for deployment, but with further analysis they can also identify specific improvements: improvements in procedures and improvements in training
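The characterization slides used analysis of variance (ANOVA) to show that CPI differs by development phase. A pure-Python sketch of the one-way F statistic behind that kind of comparison (the phase groupings and values here are illustrative only, not the presentation's data):

```python
def f_oneway(*groups):
    """One-way ANOVA F statistic: ratio of between-group to
    within-group variance. A large F suggests the group means differ."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical CPI samples by phase: tight around 1.0 in design,
# scattered and shifted in software integration
design = [1.00, 1.02, 0.98, 1.01]
sw_integ = [1.30, 1.55, 1.10, 1.45]
f = f_oneway(design, sw_integ)
```

Turning the F statistic into a significance verdict still requires the F distribution; in practice this whole computation is `scipy.stats.f_oneway`, which also returns the p-value.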