View Graphs for CIS 679
Set Two
Prof. Murray Turoff

CONTENTS


  1. Development Process
  2. Justification
  3. Investigation
  4. Investigation Methods
  5. System Design
  6. Software
  7. Evaluation
  8. Professionals
  9. System Faults
  10. Acquisitions

DEVELOPMENT PROCESS View Graphs

* (c) Copyright 1992 Murray Turoff

RUNAWAYS

* BLACK HOLES OF COMPUTER BUDGETS
* RESULTS IN MANY NEGATIVES
* EXAMPLE: ALLSTATE INSURANCE
* 1982 PROJECT ESTIMATED TO COST 8 MILLION

* TO BE COMPLETED IN 1987
* LATER RE-ESTIMATED AT 100 MILLION AND A 1993 COMPLETION

* PEAT MARWICK SURVEY:

* 600 LARGEST FIRMS
* 35% HAVE MAJOR RUNAWAYS

DEVELOPMENT PHASES 1

* USER REQUEST
* APPLICATION SELECTION
* FEASIBILITY STUDY
* JUSTIFICATION STUDY
* INVESTIGATION, ANALYSIS & SYSTEMS DESIGN

DEVELOPMENT PHASES 2

* PROGRAMMING
* SYSTEMS TESTING
* DOCUMENTATION
* CONVERSION & IMPLEMENTATION
* MAINTENANCE
* EVALUATION & POST AUDIT
* UPGRADING

DEVELOPMENT PROCESS 1

* CONSIDERABLE OVERLAP BETWEEN PHASES
* CAN BE ITERATIVE AND CYCLIC
* AUTHORIZATION AND APPROVAL CHECKPOINTS
* CAN BE TAILORED TO ORGANIZATION & SITUATION

DEVELOPMENT PROCESS 2

* PROJECT INITIATION

* 1. USER REQUEST

* APPLICATION SELECTION
* PROBLEM SELECTION
* EXPLORATORY STUDY

* 2. FEASIBILITY STUDY

* INITIAL REQUIREMENTS DESIGN
* SOLUTION ALTERNATIVES
* EVALUATION OF ALTERNATIVES

DEVELOPMENT PROCESS 3

* PROJECT DEVELOPMENT

* 3. INVESTIGATION

* DATA GATHERING
* FACT FINDING

* 4. ANALYSIS
* 5. SYSTEM DESIGN

DEVELOPMENT PROCESS 4

* PROJECT IMPLEMENTATION

* 6. PROGRAMMING

* DESIGN, STRUCTURING
* CODING, DEBUGGING

* 7. SYSTEM TESTING
* 8. DOCUMENTATION

* ONGOING

* 9. CONVERSION

* USER ACCEPTANCE AND TESTS

DEVELOPMENT PROCESS 5

* POST PROJECT

* 10. MAINTENANCE & UPDATING
* 11. EVALUATION & POST AUDIT
* 12. UPGRADING

FEEDBACK STRUCTURE

* OPERATIONAL ANALYSIS
* APPLICATION ANALYSIS
* SYSTEMS ANALYSIS
* IMPLEMENTATION
* CONVERSION
* TESTING
* VERIFICATION
* VALIDATION
* EVALUATION

SOFTWARE DEVELOPMENT PROCESS 1

* USER REQUEST

* 1% OF EFFORT
* PRIORITY

* FEASIBILITY STUDY

* 10% OF EFFORT
* INITIAL DECISION

* INVESTIGATION, ANALYSIS & DESIGN

* 30% OF EFFORT
* FINAL DECISION

SOFTWARE DEVELOPMENT PROCESS 2

* PROGRAMMING, TESTING, DOCUMENTATION

* 40% OF EFFORT

* CONVERSION & IMPLEMENTATION

* 10% OF EFFORT
* ACCEPTANCE DECISIONS

* EVALUATION & MAINTENANCE

* 10% OF EFFORT
* ONGOING
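
A minimal illustrative sketch (not part of the viewgraphs) that applies the nominal effort split above to a hypothetical total budget; the phase names and percentages follow the two slides, everything else is assumed.

```python
# Illustrative only: apportion a total effort budget across the phases
# using the nominal percentages from the two slides above
# (the slides' shares sum to roughly 100%).
PHASE_EFFORT = {
    "user request": 0.01,
    "feasibility study": 0.10,
    "investigation, analysis & design": 0.30,
    "programming, testing, documentation": 0.40,
    "conversion & implementation": 0.10,
    "evaluation & maintenance": 0.10,
}

def split_effort(total_person_days: float) -> dict:
    """Return person-days per phase for a given total budget."""
    return {phase: total_person_days * share
            for phase, share in PHASE_EFFORT.items()}

if __name__ == "__main__":
    for phase, days in split_effort(500).items():
        print(f"{phase:38s} {days:6.1f} person-days")
```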

KEY DEVELOPMENT PROBLEMS

* AGE OF EXISTING SYSTEMS
* MANUAL PROCEDURES
* COGNITIVE VARIABILITY
* AGREEMENT ON OLD SYSTEM
* AGREEMENT ON NEW SYSTEM
* WHO GAINS BENEFITS
* WHO BEARS COSTS
* USER COOPERATION
* USER COMPUTER LITERACY
* STAFF CAPABILITY
* MANAGEMENT INVOLVEMENT

SOFTWARE MODELS I

* WATERFALL LIFE CYCLE

* TRADITIONAL LIFE CYCLE
* ANALYSIS, DESIGN, CODE, TEST & MAINTENANCE
* TOP DOWN RIGIDITY
* NO ITERATION BETWEEN PHASES
* DIFFICULTY ACCOMMODATING UNCERTAINTY & RISK
* BLACK BOX APPROACH

SOFTWARE MODELS II

* PROTOTYPING

* THROWAWAY APPROACH
* EVOLUTIONARY APPROACH
* INCREMENTAL APPROACH
* CHANCE TO EXPERIMENT
* CAN BE OVERSOLD

* RAISING EXPECTATIONS

* DIFFICULT FOR LARGE COMPLEX SYSTEMS
* SOMETIMES PROTOTYPE CONSIDERED FINAL

SOFTWARE MODELS III

* SPIRAL MODEL
* FOUR MAJOR ACTIVITIES
* (1) DETERMINATION OF OBJECTIVES, ALTERNATIVES, AND CONSTRAINTS

* OBJECTIVES: FUNCTIONALITY, PERFORMANCE, RELIABILITY, FLEXIBILITY
* ALTERNATIVES: DESIGN CHOICES, PURCHASE OR DEVELOP, ETC.
* CONSTRAINTS: TIME, BUDGET, ENABLING TECHNOLOGY, STAFFING, ETC.

SOFTWARE MODELS IV

* SPIRAL MODEL
* (2) RISK ANALYSIS AND PROTOTYPING

* BUDGET OVERRUNS, SCHEDULE SLIPPAGE, STAFFING PROBLEMS, REQUIREMENTS CHANGE, TECHNICAL CHANGES, ETC.
* RISK:

* IDENTIFICATION
* ASSESSMENT
* PRIORITIZATION
* MANAGEMENT STRATEGIES
* RESOLUTION
* MONITORING

SOFTWARE MODELS V

* SPIRAL MODEL
* (3) WATERFALL APPROACH TO NEXT LEVEL PRODUCT

* APPROPRIATE VALIDATION OR VERIFICATION

SOFTWARE MODELS VI

* SPIRAL MODEL
* (4) PLAN FOR THE NEXT PHASE
* CYCLE:

* RISK ANALYSIS
* OPERATIONAL PROTOTYPE

* SIMULATIONS, MODELS, BENCHMARKS

* DETAILED DESIGN
* CODE, UNIT TEST, INTEGRATION AND TEST, ACCEPTANCE TEST, IMPLEMENTATION
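
A rough, purely illustrative sketch of one pass through the four spiral-model activities above, written as a loop; the function bodies and the three-level stopping rule are hypothetical stand-ins, not part of the model itself.

```python
# Illustrative only: the spiral model's four activities as one repeating cycle,
# with trivial stand-in functions so the sketch runs end to end.

def determine_objectives(state):
    # (1) Determine objectives, alternatives, and constraints for this level.
    return {**state, "objectives": ["functionality", "performance", "reliability"]}

def analyze_risks_and_prototype(state):
    # (2) Identify, assess, and prioritize risks; prototype where risk is highest.
    return {**state, "top_risk": "requirements change"}

def develop_and_verify(state):
    # (3) Waterfall-style development of the next-level product,
    #     with appropriate validation or verification.
    return {**state, "product_level": state["product_level"] + 1}

def plan_next_phase(state):
    # (4) Plan the next cycle and decide whether to keep spiralling
    #     (three levels is an arbitrary stand-in for a real exit decision).
    return {**state, "continue": state["product_level"] < 3}

state = {"product_level": 0, "continue": True}
while state["continue"]:
    for activity in (determine_objectives, analyze_risks_and_prototype,
                     develop_and_verify, plan_next_phase):
        state = activity(state)
print("product levels completed:", state["product_level"])
```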

SOFTWARE DEVELOPMENT APPROACHES

* SOFTWARE LIBRARIES

* COMPUTER SYSTEM LEVEL

* FUNCTIONAL KERNELS

* INTERFACE FUNCTIONS, HELP LIBRARIES

* FOURTH GENERATION APPLICATION LANGUAGES

* DATABASES

* STRUCTURAL MODELING

* USER GENERATED CODE

REQUIREMENTS ANALYSIS I

* TASKS CHANGE
* DIFFICULT TO SPECIFY ALL
* INADEQUACIES MAJOR ROLE IN RUNAWAYS
* DESIGN FOR CHANGE RATHER THAN ABSOLUTE SPECIFICATION
* SUPPORT TOOLS

* DATA FLOW DIAGRAMS, DATA DICTIONARIES, DECISION TABLES, FLOW CHARTS, OBJECT ORIENTED ANALYSIS

* KNOWLEDGE-BASED AUTOMATIC SOFTWARE DEVELOPMENT

* REQUIREMENTS GENERATE CODE

REQUIREMENTS ANALYSIS II

* FUNDAMENTAL DESIGN PRINCIPLES

* ABSTRACTION

* USER TASKS

* COUPLING

* MINIMIZE INTERDEPENDENCE

* COHESION

* PIECES WORK TOGETHER

* ENCAPSULATION

* REUSE

* MODULARITY

* MAINTAINABLE, EXTENSIBLE
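
A small invented example (not from the viewgraphs) of the principles above in miniature: one cohesive, encapsulated class with a narrow interface, so callers stay loosely coupled to its internals.

```python
# Illustrative only: cohesion, abstraction, encapsulation, and loose coupling.
class PayrollCalculator:
    """Cohesive: everything in this class is about computing one pay amount."""

    def __init__(self, hourly_rate: float):
        self._hourly_rate = hourly_rate  # encapsulated internal detail

    def gross_pay(self, hours: float) -> float:
        # Abstraction: callers see a user-level task, not the arithmetic.
        overtime = max(0.0, hours - 40.0)
        regular = hours - overtime
        return self._hourly_rate * regular + 1.5 * self._hourly_rate * overtime

# Coupling stays low: callers depend only on gross_pay(), so the internal
# rate representation or overtime rule can change without touching them.
print(PayrollCalculator(20.0).gross_pay(45))  # 950.0
```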

REUSABLE DESIGNS

* REUSE OF DESIGN
* REUSE OF CODE
* CONSTRUCTION OF FRAMEWORKS
* ARTIFACTS OF SOFTWARE
* OBJECT ORIENTED DESIGN

CASE SUPPORT

* ANALYSIS TOOLS

* OOA DIAGRAMS, SPECIFICATION CHECKER, REAL-TIME ANALYSIS TOOLKITS, PROTOTYPING TOOLS, ETC.

* DATA DESIGN TOOLS

* TEXT, GRAPHICS, AND LAYOUT TOOLS FOR DESIGN DOCUMENTATION, AUTOMATED DESIGN ANALYZERS, HYPERTEXT DATABASES, ETC.

* STRUCTURE OF REUSE LIBRARIES, OPEN QUESTION

SOFTWARE DEVELOPMENT R&D I

* FACETED CLASSIFICATION
* SOFTWARE ARCHIVES
* DOMAIN ANALYSIS
* HYPERTEXT TECHNOLOGY
* KNOWLEDGE BASED SYSTEMS

SOFTWARE DEVELOPMENT R&D II

* OBJECT ORIENTED TECHNOLOGY
* DATA MODELING AND SEMANTIC DATA MODELING
* INTERFACE KERNELS
* STANDARDS
* PROGRAMMING LANGUAGE EXTENSIONS
* USER MENTAL MODELS & COGNITIVE PROCESSING

USER REQUEST 1

* BASED UPON BUSINESS PLAN

* BUSINESS ANALYSIS

* USER GENERATED
* INFORMAL SYSTEMS ANALYSIS REVIEW
* INFORMAL COOPERATION WITH I.S. GROUP
* SHORT DOCUMENT

* 2 TO 10 PAGES

* USUALLY STANDARDIZED (INFORMALLY)
* SUBMITTED TO STEERING COMMITTEE

USER REQUEST 2

* CONTENT:
* OBJECTIVES

* e.g. COST REDUCTION, MORE SERVICE, WORKLOAD INCREASE, PERFORMANCE GOALS, etc.
* PREFERRED QUANTIFICATION

* BOUNDARIES & CONSTRAINTS

* BUDGET LIMITS
* RESOURCE LIMITS
* SECURITY REQUIREMENTS

* LISTS

* MANDATORY DOCUMENTS

USER REQUEST 3

* TIMESCALE
* IDENTIFIED PROBLEMS
* SUGGESTED SOLUTIONS
* USER SHOULD KNOW :

* PROBLEM
* APPLICATION SPECIFICS
* USER'S ORGANIZATION

* SYSTEMS PEOPLE SHOULD KNOW:

* TECHNOLOGY
* IMPLEMENTATION EFFORT
* GENERALIZABLE APPROACHES

USER REQUEST 4

* STEERING COMMITTEE CONSIDERATIONS

* AVAILABLE RESOURCES
* OTHER PROJECT REQUESTS
* PREDICTABILITY BY PLAN

* STEERING COMMITTEE ALTERNATIVES

* DO IT (ADVANCE TO FEASIBILITY STUDY)
* PUT IT ON HOLD
* TURN BACK FOR MORE INFORMATION
* REJECT IT
* BUCK IT UP TO HIGHER LEVEL

* SOMETIMES FORMAL REQUIREMENT

MIDDLE LEVEL STEERING COMMITTEE 1

* REPRESENTATIVES OF USER UNITS

* MIDDLE MANAGEMENT / PROFESSIONALS
* COMPUTER LITERATE

* CHAIR IS ASSISTANT CIO
* A FEW I.S. PROFESSIONALS / MANAGERS

* MINORITY

MIDDLE LEVEL STEERING COMMITTEE 2

* SET PRIORITIES ON PROJECTS
* REVIEW I.S. RESOURCES

* EQUIPMENT
* SOFTWARE
* PEOPLE

* REVIEW LONG & SHORT TERM PLANS
* ENDORSE BUDGET

MIDDLE LEVEL STEERING COMMITTEE 3

* REVIEW MAJOR PROBLEMS
* MAKE RECOMMENDATIONS WHEN NEEDED
* TO REPRESENT USER INTERESTS
* YES / NO ON USER REQUEST
* YES / NO ON FEASIBILITY STUDY
* REPLACE CHARGING FUNCTION

UPPER LEVEL STEERING COMMITTEE

* TRANSLATE BUSINESS PLANS TO I.S. PLANS
* SET STRATEGIC I.S. OBJECTIVES
* APPROVE MAJOR I.S. PROJECTS
* OVER SOME DOLLAR AMOUNT
* APPROVE MAJOR I.S. EXPENDITURES

* GDBMS
* MAJOR NETWORK EXPANSION
* NEW MAIN FRAME

* HIRE NEW CIO
* RESOLVE MAJOR CONFLICTS

ROLE OF STEERING COMMITTEE 1

* PRIORITY SETTING FOR PROJECTS
* REVIEW OF PROJECTS
* POLICY RECOMMENDATIONS
* APPROVING STANDARDS
* REVIEW OF DP RESOURCES

* RECOMMENDING SIGNIFICANT CHANGES

* REVIEW OF EVALUATION STUDIES
* TRANSLATION OF STRATEGIC OBJECTIVES INTO TACTICAL CRITERIA

ROLE OF STEERING COMMITTEE 2

* DEVELOPING FUNCTIONAL DP REQUIREMENTS
* MEDIATING

* BETWEEN USERS GROUPS
* BETWEEN USERS AND DP SHOP

* RECOMMENDING ORGANIZATION-WIDE DP PROJECTS
* REVIEW OF DP BUDGET AND ENDORSEMENT
* SETTING OBJECTIVES FOR DP

ROLE OF THE DP MANAGER 1

* ALLOCATING RESOURCES TO ACCOMPLISH PRIORITIES
* TACTICAL AND OPERATIONAL DECISIONS

* BASED UPON TACTICAL CRITERIA SET UP BY STEERING COMMITTEE

* DECIDING ON TECHNICAL FEASIBILITY OF ANY PROJECT OR EFFORT
* PLANNING DP

* DEVELOPMENT
* STAFFING
* RESOURCE ALLOCATION

ROLE OF THE DP MANAGER 2

* ALLOCATION OF PEOPLE TO JOBS
* RECOMMEND EQUIPMENT AND SOFTWARE
* RECOMMENDING ORGANIZATIONAL STANDARDS

MANAGEMENT REALITIES

* NO EXCUSE FOR FAILURE
* NO EXCUSE FOR NOT HANDLING PROBLEMS
* RESOURCES REQUESTED HAVE TO BE ADEQUATE
* DECISION MUST HAVE SPECIFICITY
* DECISIONS MUST HAVE ACCOUNTABILITY
* PROMISES HAVE TO BE KEPT
* PUNISHMENTS MAY BE LONG TERM

* AT RIGHT TIME, AFTER COVER-UP
* USE OF SCAPEGOATS

* FORMAL & INFORMAL SYSTEMS
* THE HIGHER UP THE TRUER

CONFLICTING OBJECTIVES

* I.S. DEPARTMENT WANTS PERFORMANCE
* TECHNICAL TYPES WANT MODERN TECHNOLOGY
* VENDOR WANTS PROFIT
* BUDGET PEOPLE WANT LEAST COST
* USERS WANT TASKS AND JOBS DONE
* ORGANIZATION WANTS PROFITABILITY

CHECKPOINTS

* DECISIONS

* SCHEDULES, RESOURCES, ETC.

* AGREEMENT

* WHAT IS REALITY AND OBJECTIVES

* WITHIN USER GROUPS
* BETWEEN USER GROUPS
* WITHIN I.S. DEPARTMENT
* BETWEEN I.S. & USERS

FEASIBILITY / JUSTIFICATION STUDY 1

* NOT A DESIGN STUDY
* ENOUGH TO CAREFULLY ESTABLISH

* RESOURCES REQUIRED
* UNCOVER HIDDEN PROBLEMS
* FEASIBLE ALTERNATIVES

* TOP LEVEL DESIGNS
* FIVE TO TEN PERCENT OF EFFORT
* MULTI DISCIPLINARY TEAM

FEASIBILITY / JUSTIFICATION STUDY 2

* TERMS OF REFERENCE

* ROLE IN PLAN
* SCOPE OF STUDY
* METHODS OF STUDY
* OBJECTIVES, BOUNDARIES, & CONSTRAINTS

* EXISTING SYSTEM

* SUMMARY DESCRIPTION
* GROWTH AND FUTURE TRENDS
* PROBLEM ANALYSIS

FEASIBILITY / JUSTIFICATION STUDY 3

* REQUIREMENTS SPECIFICATION

* DATA REQUIREMENTS
* PROCESSING REQUIREMENTS
* CONSTRAINTS: TIME & SECURITY

* ALTERNATIVE SOLUTIONS

* OUTLINE DESCRIPTION
* OUTLINE DEVELOPMENT PLAN
* COST/BENEFIT ANALYSIS
* ADVANTAGES/DISADVANTAGES

* RECOMMENDATIONS & SOLUTIONS
* RELATED DOCUMENTS, REPORTS, DATA

FEASIBILITY / JUSTIFICATION STUDY 4

* SIGN OFF BY I.S. DEPARTMENT
* SIGN OFF BY USER DEPARTMENTS
* SIGN OFF BY STUDY TEAM
* SIGN OFF BY AFFECTED DEPARTMENTS
* LAST "REAL" CHANCE TO KILL PROJECTS

FEASIBILITY / JUSTIFICATION STUDY 5

* STEERING COMMITTEE ACTIONS
* ACCEPT

* SET PRIORITY, STARTING DATE
* CHOOSE SOLUTION

* PUSH IT HIGHER UP / PUT ON HOLD

* CONDITIONALLY ACCEPT
* SHORTCOMINGS

* SEND BACK FOR MORE WORK
* TABLE (USUALLY POLITICAL)
* CONDITIONALLY REJECT (REASONS WHY)
* REJECT (REASONS WHY)

FEASIBILITY / JUSTIFICATION STUDY 6

* SOLUTION MUST BE:

* TECHNICALLY FEASIBLE
* OPERATIONALLY SUITABLE
* ECONOMICALLY VIABLE

* USER COOPERATION MUST BE ASSURED
* THE MORE EFFORT EXPENDED:

* HARDER TO KILL

* IF ACCEPTED:

* DETAILED PLAN LAID OUT FOR:

* INVESTIGATION, ANALYSIS, DESIGN

EVALUATION CRITERIA 1

* COSTS & OBJECTIVES

* ECONOMIC VIABILITY

* IS IT WORTH THE MONEY?

* HARDWARE & SOFTWARE

* TECHNICAL FEASIBILITY

* CAN IT BE PUT TOGETHER?

EVALUATION CRITERIA 2

* PEOPLE & ORGANIZATIONS

* OPERATIONAL SUITABILITY

* CAN IT BE USED?
* WILL IT CAUSE LEARNING?

* PROFITABILITY

* STRATEGIC VIABILITY

* WILL IT MAKE MONEY?

ECONOMIC VIABILITY 1

* AUTOMATION

* REDUCE PEOPLE TIME
* SPECIALIZE WORK
* DISPLACE WORK
* ELIMINATE JOBS

* PRODUCTIVITY

* OBTAIN SOONER
* REDUCE COST
* INCREASE QUANTITY
* INCREASE QUALITY

ECONOMIC VIABILITY 2

* OPPORTUNITY

* DO NEW THINGS
* DO DIFFERENT THINGS
* DO IMPOSSIBLE THINGS

DEVELOPMENT EFFORT

* USER REQUEST 1-2%

* UNCERTAINTY IN EFFORT 100%

* FEASIBILITY STUDY 5-10%

* UNCERTAINTY IN EFFORT 50%

* DESIGN STUDY 20-40%

* UNCERTAINTY IN EFFORT 25%

* IMPLEMENTATION 30-60%

* UNCERTAINTY IN EFFORT 10%

* CONVERSION 5-30%
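
A minimal sketch, assuming a hypothetical 1000 person-day project, that turns the effort shares and uncertainty bands above into low/high phase estimates; the midpoint figure is an added convenience, not from the slide.

```python
# Illustrative only: the phase shares and estimate uncertainties from the slide.
PHASES = [
    # (phase, low share, high share, uncertainty in the estimate)
    ("user request",      0.01, 0.02, 1.00),
    ("feasibility study", 0.05, 0.10, 0.50),
    ("design study",      0.20, 0.40, 0.25),
    ("implementation",    0.30, 0.60, 0.10),
    ("conversion",        0.05, 0.30, None),  # no uncertainty given on the slide
]

def estimate(total_days: float) -> None:
    for name, lo, hi, unc in PHASES:
        midpoint = total_days * (lo + hi) / 2
        band = f"+/- {unc:.0%}" if unc is not None else "n/a"
        print(f"{name:18s} {total_days*lo:5.0f} - {total_days*hi:5.0f} days "
              f"(midpoint {midpoint:5.0f}, estimate uncertainty {band})")

estimate(1000)
```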

RUNAWAY DANGERS 1

* SATURATED HARDWARE
* NEW TYPE OF APPLICATION
* NEW TECHNOLOGY
* NEW OPERATING SYSTEM
* NEW LANGUAGE
* LOW COST REVIEW CRITERIA
* CONCEPTUAL REQUIREMENT ERRORS

RUNAWAY DANGERS 2

* IGNORING EXCEPTIONS
* IGNORING MANUAL OPERATIONS
* LONG DELAYS
* LARGE PROJECTS
* LACK OF ACCOUNTABILITY
* LACK OF COORDINATION
* LACK OF USER COOPERATION

INVESTIGATION 1

* CURRENT AND NEW SYSTEM
* DATA:

* USES, VOLUME & CHARACTERISTICS

* PROCEDURES:

* WHAT IS DONE, WHERE, WHEN & HOW

* ERROR AND EXCEPTIONS CASES
* PEOPLE:

* WHO DOES WHAT, WHEN AND HOW
* APTITUDE & ATTITUDES

INVESTIGATION 2

* FUTURE:

* GROWTH RATES, PROJECTIONS ON WORKLOADS

* REPORTS AND OUTPUT
* REQUIREMENTS: ACCURACY, TIMELINESS, SECURITY

INVESTIGATION 3

* MAJOR PREDICTABLE PROBLEMS:

* AGE OF EXISTING SYSTEM
* COGNITIVE VARIABILITY
* LORE
* INFORMAL PROCESSES

* MAJOR CHECKPOINT:

* AGREEMENT ON EXISTING SYSTEM
* INVESTIGATORS, USERS

AGREEMENT ROLES

* REQUESTERS
* USERS
* DESIGNERS
* OPERATORS
* SUPPLIERS

AGREEMENT ISSUES

* WHAT WAS THE OLD SYSTEM?
* WHAT IS THE NEW SYSTEM?
* WHO ACCEPTS IT?
* HOW IS IT CHANGED?
* WHAT IS MANUAL PROCEDURE?
* WHAT ARE THE EXCEPTIONS?
* WHAT ARE THE CONSTRAINTS?
* WHO BEARS COSTS?
* WHO GAINS BENEFITS?

DECISION & AGREEMENT POINTS 1

* SOFTWARE DEVELOPMENT PROJECTS
* USER REQUEST

* INFORMAL AGREEMENT ON COMPUTER
* APPROACH APPROPRIATENESS

* SA/BA AND USER

* INITIAL FEASIBILITY

* COMPUTER GROUP

* INITIAL PRIORITY AND GO AHEAD

* STEERING COMMITTEE

DECISION & AGREEMENT POINTS 2

* FEASIBILITY STUDY

* AGREEMENT ON WHAT THE CURRENT SYSTEM IS AND DOES
* AGREEMENT ON WHAT NEW SYSTEM SHOULD DO
* AGREEMENT ON MEANINGFUL ALTERNATIVES

* USERS AND COMPUTER GROUP

* AGREEMENT ON CHOSEN ALTERNATIVE

* STEERING COMMITTEE, USERS AND COMPUTER GROUP

DECISION & AGREEMENT POINTS 3

* FEASIBILITY STUDY

* FINAL PRIORITY FOR PROJECT, GO AHEAD TO NEXT STEP

* STEERING COMMITTEE

* INVESTIGATION, ANALYSIS AND DESIGN

* DETAILED AGREEMENT ON WHAT CURRENT SYSTEM DOES, INCLUDING MANUAL PROCESSES
* DETAILED PERFORMANCE CRITERIA FOR NEW SYSTEM

DECISION & AGREEMENT POINTS 4

* INVESTIGATION, ANALYSIS AND DESIGN

* EFFORTS REQUIRED ON PART OF USER GROUP
* RESOURCES NEEDED FROM OTHER SOURCES (AUXILIARY GROUPS)

* COMPUTER GROUP AND USERS (MAYBE BAS)

* FINAL DECISION TO GO AHEAD

* STEERING COMMITTEE, COMPUTER GROUP AND USERS

DECISION & AGREEMENT POINTS 5

* PROGRAMMING, SYSTEM TESTING, DOCUMENTATION

* ACCEPTANCE OF THE SYSTEM AS WORKING

* USERS, COMPUTER GROUP, MAINTENANCE GROUP, AUDIT GROUP

* CONVERSION & IMPLEMENTATION

* ACCEPTED AS AN OPERATIONAL SYSTEM

* USERS, COMPUTER GROUP, MAINTENANCE GROUP, AUDIT GROUP

DECISION & AGREEMENT POINTS 6

* EVALUATION OF OBJECTIVES MET

* SA/BA & USERS, STEERING COMMITTEE

* CHANGING REQUIREMENT

* USERS

INVESTIGATION / INQUISITION

* COLLECT INFORMATION

* MAXIMUM AMOUNT
* CORRECT
* RELEVANT

* IN MINIMUM TIME
* WHILE DOING "PUBLIC" RELATIONS

INVESTIGATION METHODS

* STUDY OF DOCUMENTS
* INTERVIEWS
* OBSERVATIONS
* PARTICIPANT OBSERVATION
* PROTOCOL ANALYSIS
* QUESTIONNAIRES
* GROUPS (E.G. FOCUS)
* DELPHI
* KEY: STRUCTURE FOR INFORMATION

SAMPLE INTERVIEW CHECKLIST 1

* FOR DOCUMENT

* IDENTIFICATION
* WHAT IS ENTERED
* WHEN, HOW OFTEN RECEIVED?
* HOW MANY ARE RECEIVED?

* PEAKS & TROUGHS?

* WHERE DOES IT COME FROM?

SAMPLE INTERVIEW CHECKLIST 2

* FOR DOCUMENT

* WHO DOES IT COME FROM?
* HOW IS IT DELIVERED?
* WHO DELIVERS IT?
* WHERE IS IT RECEIVED?
* WHO RECEIVES IT?
* WHAT IS IT USED FOR?
* WHAT HAPPENS TO IT?

* LEADS INTO PROCEDURES

SAMPLE INTERVIEW GUIDE

* FOR FUNCTIONS:

* PLACE EACH ON 3 x 5 CARD
* LAY OUT TABLE OF:

* DESIRABILITY
* FEASIBILITY

* REQUEST USER JUDGEMENT
* ASK FOR EXPLANATIONS AND NOTE
* ASK TO FILL IN BLANK CARDS
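
A minimal sketch of the desirability/feasibility layout the slide describes; the candidate functions and the 1-5 rating scale are hypothetical.

```python
# Illustrative only: lay candidate functions out against the two judgement
# scales from the slide, as a user might rate each 3 x 5 card.
ratings = {
    # function               (desirability, feasibility), both rated 1-5
    "order status inquiry":  (5, 4),
    "exception reporting":   (4, 2),
    "automatic reordering":  (3, 3),
}

print(f"{'FUNCTION':24s} {'DESIRABILITY':>12s} {'FEASIBILITY':>12s}")
for func, (desire, feasible) in sorted(ratings.items(),
                                       key=lambda kv: kv[1], reverse=True):
    print(f"{func:24s} {desire:12d} {feasible:12d}")
```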

INTERVIEW GUIDE 1

* DO:

* KEEP PHRASING SIMPLE
* KEEP A LOGICAL SEQUENCE
* LEAVE ROOM FOR NOTES
* KEEP QUESTIONS RELEVANT TO THE PERSON BEING INTERVIEWED
* USE SHORTHAND CODES

* ++ IMPORTANT AGREEMENT
* ? MISSING INFORMATION

* HAVE BACK UP CHECKLISTS FOR

* DOCUMENTS, FILES, PROCEDURES

INTERVIEW GUIDE 2

* DO NOT:

* BE VAGUE IN PHRASING
* CRAM FOR SPACE OR BE UNTIDY
* BE PAROCHIAL

* PROCESS:
* 1. THINK OF MAJOR QUESTION
* 2. ASK IT
* 3. LISTEN
* 4. ANALYZE
* 5. NOTE
* 6. FORM SUPPLEMENTARY QUESTION

USER ALTERNATIVES I

* WHAT HE WOULD LIKE ANALYST TO THINK
* WHAT HE THINKS ANALYST WOULD LIKE TO HEAR
* WHAT HE THINKS ANALYST WOULD LIKE TO KNOW
* WHAT ACTUALLY HAPPENS
* WHAT HE WOULD LIKE TO HAPPEN
* WHAT HE HAS BEEN TOLD HAPPENS

USER ALTERNATIVES II

* WHAT HIS SUPERVISOR WOULD LIKE HIM TO SAY
* WHAT IT SAYS OFFICIALLY OCCURS
* WHAT HE HAS BEEN TOLD TO SAY
* WHAT IS GOOD FOR HIM
* WHAT IS GOOD FOR THE ORGANIZATION
* WHAT IS GOOD FOR THE UNIT

USER STRATEGY

* DEVELOP USER COMMUNITY
* ACQUISITIONS TIED TO GOALS
* USE STEERING COMMITTEE
* ENCOURAGE COMPUTER LITERACY
* DEVELOP USER REQUIREMENTS CONTINUOUSLY
* DEAL WITH QUALITATIVE & QUANTITATIVE IMPACTS
* EMPHASIZE: TIME SAVED & NEW OPPORTUNITIES

SA BEHAVIOUR

* BE OBJECTIVE
* VALIDATE
* BE PROFESSIONAL
* DO NOT GIVE UP (POLITE TENACITY)
* FOLLOW THROUGH
* BE DIPLOMATIC

FACTORS INCREASING RESISTANCE

* APPREHENSION ABOUT CHANGE
* UNANSWERED CONCERNS
* CONFLICTS WITH BELIEFS
* PAST NEGATIVE CHANGES
* IRRITATION WITH MANNER OF CHANGE

FACTORS DECREASING RESISTANCE

* PERSONAL SECURITY
* TRUST IN MANAGEMENT, UNION, WORK GROUP
* CONFIRMED POSITIVE EXPECTATIONS
* PAST POSITIVE CHANGES
* SATISFACTION WITH MANNER OF CHANGE

USER QUESTIONS 1

* WILL IT:

* AFFECT MY EARNINGS?
* BLOCK MY PROMOTION PROSPECTS?
* LIMIT MY FREEDOM OF ACTION?
* MEAN I LOSE MY JOB?
* "TAKE THE FUN" OUT OF THE JOB?
* MEAN MORE SUPERVISION?
* CUT THE NUMBER OF MY STAFF?
* ERODE MY AUTHORITY?
* INCREASE MY WORKLOAD?

USER QUESTIONS 2

* WILL I:

* JUST BECOME A 'NEW BOY'?
* BE ABLE TO COPE WITH EVERYTHING?

* IS THIS JUST THE FIRST STEP, WHAT IS NEXT?

ACCEPTANCE/REJECTION SPECTRUM 1

* ACCEPTANCE

* ENTHUSIASTIC SUPPORT
* ACTIVE COOPERATION
* WILLING ACCEPTANCE
* COOPERATION UNDER PRESSURE
* PASSIVE ACCEPTANCE
* TOLERANCE UNDER SUFFERANCE

* INDIFFERENCE

* APATHY
* MINIMUM EFFORT TO RUN SYSTEM
* REGRESSIVE, NON-LEARNING BEHAVIOUR

ACCEPTANCE/REJECTION SPECTRUM 2

* RESISTANCE

* MINIMUM EFFORT ON WORK GENERALLY
* WITHDRAWAL
* PROTESTS
* BENDING THE SYSTEM
* DELIBERATE SABOTAGE

JUSTIFICATION ANALYSIS 1

* WHAT NEW SYSTEM WILL DO
* TRAP:

* AUTOMATING WHAT WAS DONE BEFORE

* ITEMIZATION OF:

* FUNCTIONS
* DATA
* PROBLEMS TO BE SOLVED

* MAJOR CHECKPOINT:

* AGREEMENT ON WHAT NEW SYSTEM WILL DO
* AGREEMENT ON JUSTIFICATION FOR SYSTEM
* INVESTIGATORS, USERS

JUSTIFICATION ANALYSIS 2

* EFFICIENCY

* FUNCTIONS DONE QUICKER
* MORE VOLUME
* SAVE PEOPLE TIME

* ELIMINATE JOBS
* DON'T HIRE

* CHEAPER

* EFFECTIVENESS

* STANDARDS
* STATE OF THE ART (LEARNING)
* INCREASE QUALITY

JUSTIFICATION ANALYSIS 3

* OPPORTUNITY

* DO OLD THINGS, NEW WAYS
* DO NEW THINGS, NEW WAYS
* DO IMPOSSIBLE THINGS

JUSTIFICATION ANALYSIS 4

* PRODUCTIVITY =

* QUALITY x QUANTITY / COST

* EFFICIENCY =

* EFFECTIVENESS / COST

* OPPORTUNITY
* STRATEGIC RELEVANCE
* PROFIT ENHANCEMENT
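
The two ratios above, worked with invented numbers; the figures only show the arithmetic and are not meant as typical values.

```python
# Illustrative only: PRODUCTIVITY = QUALITY x QUANTITY / COST,
#                    EFFICIENCY   = EFFECTIVENESS / COST.
def productivity(quality: float, quantity: float, cost: float) -> float:
    return quality * quantity / cost

def efficiency(effectiveness: float, cost: float) -> float:
    return effectiveness / cost

# A hypothetical new system raises quality from 0.8 to 0.9 and volume from
# 1000 to 1400 units while cost rises from 50 to 60 (arbitrary units):
print(productivity(0.8, 1000, 50), "->", productivity(0.9, 1400, 60))  # 16.0 -> 21.0
print(efficiency(0.8, 50), "->", efficiency(0.9, 60))                  # 0.016 -> 0.015
```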

JUSTIFICATION ANALYSIS 5

* PROBLEMS:
* MEASURES OF PERFORMANCE

* DRUNKARD'S PARADOX
* MEASURING REALITY, NOT POTENTIAL

* MULTIPLE MEASURES (RELATIONSHIPS)

* INDIVIDUAL CONFLICTS
* ARTISTIC COMPROMISES

* SUBJECTIVE ESTIMATIONS
* GROUP AGREEMENT PROCESSES
* BASIS FOR FINAL COST JUSTIFICATION

SYSTEM DESIGN 1

* CREATE SYSTEM SPECIFICATION

* BUSINESS SYSTEM DESIGN
* TECHNICAL SYSTEM DESIGN

* BASIS FOR

* FINAL COST/BENEFIT ASSESSMENT
* PLAN FOR PROJECT IMPLEMENTATION

DESIGN METHODS

* COMPARISON / DIFFERENTIATING
* DESIGNING / REQUIREMENTS

* TASK UNDERSTANDING / MACRO
* COGNITIVE UNDERSTANDING / MICRO

* ENHANCEMENTS / EVOLUTION
* VISIONING / NORMATIVE

* SOCIAL ENGINEERING
* GOAL SETTING

USER INVOLVEMENT

* DEFINING & AGREEING ON PROJECT AIMS
* LIAISON OFFICERS
* DECISION CHECKPOINTS
* STAFF CONTRIBUTIONS TO INVESTIGATION
* STAFF TRAINING
* STAFF PARTICIPATION DURING CONVERSION
* MOCKUP EVALUATION
* EVALUATION
* FEEDBACK

SYSTEM DESIGN 2

* SYSTEM SPECIFICATION DOCUMENT
* FOR OLD AND NEW SYSTEM
* STILL NOT SUBROUTINE LEVEL
* FUNCTIONAL LEVEL
* 1. SYSTEM SUMMARY
* 1.1 MANAGEMENT SUMMARY
* 1.2 SYSTEM ORGANIZATIONAL CHART
* 1.3 NARRATIVE DESCRIPTION
* 1.4 ORGANIZATIONAL RESPONSIBILITIES
* 1.5 COST/BENEFIT STUDY

SYSTEM DESIGN 3

* 2. DATA SPECIFICATIONS
* 2.1 FILES AND DATA STRUCTURES
* 2.2 INPUTS
* 2.3 OUTPUTS
* 3. PROCESSING SPECIFICATIONS
* 3.1 MANUAL
* 3.2 COMPUTER
* 4. INTERFACE SPECIFICATIONS
* 4.1 METAPHOR
* 4.2 SCREEN MOCKUPS
* 4.3 EVALUATION

SYSTEM DESIGN 4

* 5. INTEGRATION SPECIFICATIONS
* 5.1 DATA AND FILE TRANSFER
* 6. CONSTRAINT SPECIFICATIONS
* 6.1 SECURITY
* 6.2 TIME LIMITS
* 7. SYSTEM TEST PLAN
* 7.1 TEST ORGANIZATION
* 7.2 TEST SCHEDULE

SYSTEM DESIGN 5

* 8. CONVERSION PLAN
* 8.1 CONVERSION TASKS & RESPONSIBILITIES
* 8.2 FILE DATA SOURCES & PREPARATION METHODS
* 8.3 CONVERSION SCHEDULE & COSTS
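
A minimal sketch that arranges the specification headings from the preceding four slides as a checklist an analyst might tick off; the dictionary layout is just one possible representation.

```python
# Illustrative only: the system specification outline as a simple checklist.
SPEC_OUTLINE = {
    "1 system summary": ["management summary", "system organizational chart",
                         "narrative description", "organizational responsibilities",
                         "cost/benefit study"],
    "2 data specifications": ["files and data structures", "inputs", "outputs"],
    "3 processing specifications": ["manual", "computer"],
    "4 interface specifications": ["metaphor", "screen mockups", "evaluation"],
    "5 integration specifications": ["data and file transfer"],
    "6 constraint specifications": ["security", "time limits"],
    "7 system test plan": ["test organization", "test schedule"],
    "8 conversion plan": ["conversion tasks & responsibilities",
                          "file data sources & preparation methods",
                          "conversion schedule & costs"],
}

for section, items in SPEC_OUTLINE.items():
    print(section.upper())
    for item in items:
        print("   [ ]", item)
```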

SYSTEM DESIGN 6

* STEERING COMMITTEE

* FINAL DETERMINATION

* GO \ NO GO
* DELAY
* SEEK REVISION
* PASS UP

* AMOUNT
* CONTROVERSY

* USUALLY A FORMALITY
* INFORMAL COMMITMENTS CRUCIAL
* BATTLES SETTLED BEFORE

CONVERSION

* FILE CONVERSION
* USER TRAINING
* DOCUMENTATION REVIEW
* OPERATIONAL ACCEPTANCE TRIALS
* MAINTENANCE REVIEW
* SOFTWARE REVIEW
* PROTOTYPING
* HARDWARE DELIVERY & INSTALLATION
* SERVICES EXPANSION

SOFTWARE EXPERIMENT

* 3-PERSON TEAMS

* EXPERIENCE & TALENT

* 3 DAY EFFORT ROUGHLY
* OBJECTIVES

* CREATE IN SHORTEST TIME
* CREATE SHORTEST CODE
* MINIMUM CORE USAGE
* MINIMUM EXECUTION TIME
* EASY TO MAINTAIN CODE
* EASY TO READ OUTPUT

* SOFTWARE AS AN ART FORM

SOFTWARE COST OBSERVATIONS 1

* COST OF ERRORS = 50%
* COST OF CONCEPTUAL & DESIGN ERRORS = 40%

* ERROR IN ESTIMATION OF EFFORT

* BEFORE REQUIREMENTS 100%
* BEFORE DEVELOPMENT 50%
* BEFORE OPERATIONS 25%


* IF STANDARD/FREQUENT JOB 1/2 ABOVE

SOFTWARE COST OBSERVATIONS 2

* COST OF SOFTWARE LINE OF CODE

* LINEAR UP TO 70% MACHINE UTILIZATION.
* TWICE AS MUCH AT 80%
* FOUR TIMES AT 90%
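
A small sketch of the cost-per-line multipliers quoted above (1x up to 70% machine utilization, 2x at 80%, 4x at 90%); straight-line interpolation between the quoted points is an added assumption.

```python
# Illustrative only: relative cost per line of code versus machine utilization.
def cost_multiplier(utilization: float) -> float:
    points = [(0.70, 1.0), (0.80, 2.0), (0.90, 4.0)]  # figures from the slide
    if utilization <= points[0][0]:
        return 1.0
    for (u0, m0), (u1, m1) in zip(points, points[1:]):
        if utilization <= u1:
            # assumed straight-line interpolation between quoted points
            return m0 + (m1 - m0) * (utilization - u0) / (u1 - u0)
    return points[-1][1]  # beyond 90%: no figure given on the slide

for u in (0.60, 0.75, 0.80, 0.85, 0.90):
    print(f"{u:.0%} utilization -> {cost_multiplier(u):.1f}x cost per line")
```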

PROGRAMMING, TESTING, DOCUMENTATION 1

* FINAL DECOMPOSITION OF SPECIFICATIONS

* TAILORED TO PEOPLE RESOURCES
* TAILORED TO HARDWARE & SOFTWARE

* CONSIDERABLE OVERLAP
* CASE AND SOFTWARE ENGINEERING
* LIVE TESTING
* INTERFACE REFINEMENT

PROGRAMMING, TESTING, DOCUMENTATION 2

* VARIABILITIES

* MANAGEMENT STYLE
* DEGREE OF TALENT
* STATE OF HARDWARE & SOFTWARE
* MORALE & COMMITMENT
* ATTITUDE OF TECHNICAL PEOPLE

PROGRAMMING, TESTING, DOCUMENTATION 3

* VARIABILITIES

* RELATIONSHIPS TO USER GROUPS
* DEGREE OF SPECIALIZATION
* TEAM STRUCTURES

* STANDARD
* EGOLESS
* SURGICAL
* APPRENTICESHIP
* BALANCED

SYSTEM DEVELOPMENT CONCERNS I

* NOT WHAT YOU DO BUT WHO DOES IT
* NEVER INSTALL WITHOUT TESTING
* IN CASE OF FAILURE, STOP AND THINK, DO NOT RUN
* MAKE SURE COMPETITION DOES NOT WIPE OUT GAINS

* PROPRIETARY HARDWARE AND SOFTWARE TRAP

SYSTEM DEVELOPMENT CONCERNS II

* EFFICIENT IF YOU PRODUCE MORE WITH LESS
* EFFECTIVE IF YOU POCKET THE GAINS
* MAKE ONLY ONE MAJOR CHANGE AT A TIME

* SYSTEM FAULTS ARE MULTIPLICATIVE

* ANTICIPATE CHANGE
* INVESTIGATE EXCEPTIONS

MAINTENANCE

* USED TO DISGUISE RUNAWAYS

* COST YEAR ONE EQUALS DEVELOPMENT COST

* ALL SYSTEMS SUBJECT TO CHANGE
* SUPPORT SOFTWARE CHANGES
* SOURCES OF APPLICATION CHANGES

* RESIDUAL BUGS
* UNEXPECTED EXCEPTIONS
* CHANGING USER REQUIREMENTS
* REDUCTIONISM ERRORS

* TRADEOFFS AND CONFLICTS

* APPLICATIONS CONCEPTION ERRORS

SOFTWARE MAINTENANCE PROBLEMS 1

* IN ORDER OF IMPORTANCE:

* USER KNOWLEDGE
* LACK OF USER UNDERSTANDING
* INADEQUATE TRAINING
* PROGRAMMER EFFECTIVENESS

* PRODUCTIVITY, MOTIVATION & SKILLS
* PRODUCT QUALITY
* ADEQUATE DESIGN SPECIFICATIONS
* QUALITY OF PROGRAMMING
* QUALITY OF DOCUMENTATION

SOFTWARE MAINTENANCE PROBLEMS 2

* MACHINE REQUIREMENTS
* STORAGE & PROCESSING TIME
* SYSTEM RELIABILITY
* HARDWARE & SOFTWARE
* DATA INTEGRITY

EVALUATION

* ACHIEVE WHAT USERS WANTED?
* HAVE REQUIREMENTS CHANGED?
* EASY TO LEARN?
* TRAINING AND HELP ADEQUATE?
* USER EFFORT REQUIRED
* HARDWARE & SOFTWARE PERFORMANCE?
* ERROR TYPE AND FREQUENCY ANALYSIS?
* FREQUENCY AND TYPE OF EXCEPTIONS?
* EVOLUTION?
* USER SATURATION?
* INTEGRATION?

SUPPORT FUNCTIONS

* STANDARDS
* TRAINING
* SYSTEMS SOFTWARE
* GENERAL APPLICATION SYSTEMS

* DATABASES
* COMMUNICATIONS
* TRANSACTION TRACKING

* DOCUMENTATION, LIBRARY, CLERICAL
* FORWARD RESOURCE PLANNING
* FORWARD SYSTEM PLANNING
* DISTRIBUTED COMPUTING & SYSTEMS


EVALUATION View Graphs

* (c) Copyright 1992 Murray Turoff

EVALUATION OF SYSTEMS 1

* COMPUTER SCIENCE

* EXPERT JUDGMENT
* WISDOM
* FEATURES COMPARISON
* NETWORK & FLOW MODELS

EVALUATION OF SYSTEMS 2

* INDUSTRIAL ENGINEERING

* ERGONOMICS
* ERROR ANALYSES
* ACTIVITY MONITORING
* HUMAN FACTORS
* MOTOR ABILITIES
* SIMULATIONS & MODELS

EVALUATION OF SYSTEMS 3

* PSYCHOLOGICAL AND SOCIOLOGICAL

* MENTAL ABILITIES
* PROTOCOL ANALYSIS
* FOCUSED GROUPS
* CONTROLLED EXPERIMENTS
* FIELD TRIALS
* INTERVIEWS
* SURVEYS
* LONGITUDINAL STUDIES

EVALUATION OF SYSTEMS 4

* MANAGEMENT SCIENCES

* SYSTEM MONITORING
* USER SATISFACTION
* COST - BENEFITS / PRODUCTIVITY
* EFFICIENCY
* QUALITY
* OPPORTUNITIES

EVALUATION OF SYSTEMS 5

* COGNITIVE PSYCHOLOGY

* COGNITIVE PROCESSES
* TASK ANALYSIS
* HUMAN PROBLEM SOLVING

* SOCIOLOGICAL

* GROUP PROCESSES
* ENVIRONMENTAL INTERACTION

EVALUATION OF SYSTEMS 6

* ANTHROPOLOGICAL

* METAPHORS
* PARTICIPANT OBSERVATION

* ARCHAEOLOGICAL

* ARTIFACTS & TOOLS

* PHILOSOPHICAL

* VIRTUAL REALITY

EVALUATION

* "IMPLEMENTATION WITHOUT A FORMAL ASSESSMENT IS LIKE TRYING TO NEGOTIATE A MINEFIELD BLINDFOLDED." JAMES BAIR IN THE OFFICE OF THE FUTURE

FORMATIVE EVALUATION

* FEEDBACK PERMITS CORRECTIVE ACTION IF PERFORMANCE IS SUB OPTIMAL AND SUGGESTIONS FOR IMPROVEMENT (GUIDES SYSTEM EVOLUTION)
* PRODUCES IMPLEMENTATION GUIDELINES
* PREVENTS INCORRECT & SURFACE INTERPRETATIONS

SUMMATIVE EVALUATION

* MEASURE AND "PROVE" SIGNIFICANT BENEFITS AND COSTS IN TERMS OF IMPACTS
* USUALLY NECESSARY TO ASSURE CONTINUED FUNDING OF USE OF THE SYSTEM AFTER A PILOT

EVALUATION MODEL 1

* ENVIRONMENTAL FACTORS

* INFLUENCE SYSTEM
* E.G. JOB MARKETPLACE

* CONTROL VARIABLES

* CAN CHANGE SYSTEM
* E.G. SALARY POLICY

EVALUATION MODEL 2

* INTERVENING VARIABLES

* CAN BE MEASURED
* E.G. JOB SATISFACTION

* OUTPUT VARIABLES

* OF INTEREST
* E.G., JOB TURNOVER
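
A minimal sketch that records the four variable classes, with the slides' own examples, as one small measurement-plan structure; the layout is illustrative only.

```python
# Illustrative only: the evaluation model's variable classes and examples.
evaluation_model = {
    "environmental factors": {"role": "influence the system",  "example": "job marketplace"},
    "control variables":     {"role": "can change the system", "example": "salary policy"},
    "intervening variables": {"role": "can be measured",       "example": "job satisfaction"},
    "output variables":      {"role": "of interest",           "example": "job turnover"},
}

for name, info in evaluation_model.items():
    print(f"{name:22s} {info['role']:22s} e.g. {info['example']}")
```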

EVALUATION GUIDELINES

* SHOULD BE OVER TIME
* MEASUREMENT POINTS

* BEFORE USAGE
* SHORT TERM
* LONG TERM

* SHOULD USE SEVERAL METHODS OF DATA COLLECTION

QUESTIONNAIRES

* MAIN SOURCE OF QUANTIFIABLE DATA

* SUBJECTED TO MULTI-VARIATE ANALYSIS
* TO UNTANGLE CAUSALITY

* REPEAT SOME QUESTIONS

* TWO OR MORE POINTS IN TIME
* OBSERVE CHANGES

EVALUATION METHODS 1

* SURVEYS
* PROTOCOL ANALYSIS
* INTERVIEWS
* FOCUSED GROUPS
* ACTIVITY MONITORING
* PARTICIPANT OBSERVATION

EVALUATION METHODS 2

* MENTAL MODELS
* COGNITIVE WALKTHROUGHS
* PILOT & PROTOTYPE TESTING
* CONTROLLED EXPERIMENTS
* LONGITUDINAL STUDIES
* FIELD TRIALS
* ARTIFACT & TOOL ANALYSIS

PERFORMANCE OF I.S. 1

* QUANTITATIVE

* THROUGHPUT
* RELIABILITY
* TIMELINESS
* ACCURACY
* QUANTITY
* AVAILABILITY
* APTITUDE
* USE

PERFORMANCE OF I.S. 2

* QUALITATIVE & SUBJECTIVE

* QUALITY
* OPPORTUNITY
* RELEVANCE
* SELECTIVITY
* COOPERATION
* ATTITUDES
* CHANGE
* MORALE & COHESION
* STRESS & EFFORT

VIEWS OF THE WORLD

* REAL WORLD

* OUTCOMES / VALIDATION

* REQUIREMENTS MODEL

* SYSTEMS ANALYSIS / VERIFICATION

* IMPLEMENTATION MODEL

* SYSTEM DESIGN / TESTING

* INTERFACE MODEL

* METAPHOR / SYSTEM OPACITY

* MENTAL MODEL

* EXPERIENCE / FUNCTIONAL OPACITY

INFORMATION DOMAINS OF USERS I

* COMMON I.S. SUPPORT LEVELS

* SIMPLE INQUIRY / CALCULATIONS / MESSAGING
* ORGANIZING / FILTERING / SUMMARIZING
* STATUS BRIEFING / REPORT GENERATION
* TRACKING / MONITORING

INFORMATION DOMAINS OF USERS II

* CURRENT INDIVIDUAL CHALLENGES

* EXCEPTION REPORTING
* MODELING / STRUCTURING
* DIAGNOSIS
* DISCOVERY
* HYPOTHESIS TESTING AND ANALYSIS

* CURRENT GROUP CHALLENGES

* PLANNING AND DECISION ANALYSIS
* DECISION IMPLEMENTATION
* COMMAND AND CONTROL

USER ROLES AND TYPES

* NOVICE, CASUAL
* INTERMEDIARY
* FREQUENT
* OPERATOR, SPECIALIZED
* EXPERIENCED, ROUTINE
* PROBLEM SOLVER
* POWER
* RESULTS:

* DIFFERENT ROLES IN ONE SYSTEM
* MULTIPLE INTERFACE METHODS

USER RESPONSE TO INADEQUATE SYSTEM 1

* DIS-USE:

* TURN TO OTHER SOURCES

* MIS-USE:

* USING INAPPROPRIATE WAYS

* PARTIAL USE:

* USE OF WRONG SUBSET

* DISTANT USE:

* USE OF INTERMEDIARY

USER RESPONSE TO INADEQUATE SYSTEM 2

* MODIFICATION OF TASK:

* CHANGE TASK TO FIT SYSTEM

* COMPENSATORY ACTIVITY:

* USER HAS TO DO MORE

* DIRECT PROGRAMMING:

* USER MODIFIES SYSTEM

* NON-USE:

* AVOIDING THE SYSTEM


PROFESSIONALS View Graphs

* (c) Copyright 1992 Murray Turoff

PEOPLE STRUCTURE 1

* PROGALYST
* ONE PERSON, ALL TASKS

* FROM USER REQUEST TO EVALUATION

* HISTORICAL APPROACH
* TYPICAL IN

* SMALL BUSINESS
* UNDER 15 TECHNICAL PEOPLE
* USER ORGANIZATIONAL UNIT

* TECHNICAL STAFF AS GENERAL WORKERS
* GOOD MORALE

PEOPLE STRUCTURE 2

* USER REQUEST

* BUSINESS ANALYST

* FEASIBILITY STUDY

* BUSINESS ANALYST
* APPLICATION SYSTEMS ANALYST

* INVESTIGATION, ANALYSIS

* BUSINESS ANALYST
* APPLICATION SYSTEMS ANALYST
* TECHNICAL SYSTEMS ANALYST

PEOPLE STRUCTURE 3

* SYSTEM DESIGN

* APPLICATION SYSTEM ANALYST
* TECHNICAL SYSTEMS ANALYST
* SYSTEMS DESIGNER
* INTERFACE DESIGNER
* SENIOR PROGRAMMER

PEOPLE STRUCTURE 4

* PROGRAMMING DOCUMENTATION

* SYSTEM DESIGNER
* SENIOR PROGRAMMER
* APPLICATION PROGRAMMERS
* SYSTEM PROGRAMMERS
* DOCUMENTORS

PEOPLE STRUCTURE 5

* SYSTEM TESTING

* SYSTEM ASSURANCE OFFICERS
* MAINTENANCE ANALYSTS
* AUDITORS
* TESTERS
* TRAINERS

PEOPLE STRUCTURE 6

* CONVERSION

* SYSTEM ANALYST
* APPLICATION SYSTEM ANALYST
* MAINTENANCE ANALYST
* MAINTENANCE OPERATORS
* SYSTEM ASSURANCE OFFICERS
* AUDITORS
* TRAINERS
* USERS

PEOPLE STRUCTURE 7

* MAINTENANCE, EVALUATION

* MAINTENANCE STAFF
* USERS
* TRAINERS
* BUSINESS ANALYSTS
* APPLICATIONS SYSTEMS ANALYSTS

I.S. CENTRALIZED STAFFING 1

* LARGE ORGANIZATIONS:

* BUSINESS ANALYSTS
* SYSTEMS ANALYSTS
* SYSTEMS DESIGNERS
* SYSTEM PROGRAMMERS
* APPLICATION PROGRAMMERS

I.S. CENTRALIZED STAFFING 2

* LARGE ORGANIZATIONS

* SYSTEMS ASSURANCE SPECIALISTS
* MAINTENANCE SPECIALISTS
* DATA BASE SPECIALISTS
* AUXILIARY SPECIALTIES

* HUMAN FACTORS, COMMUNICATIONS
* EVALUATION, AUDITING
* TRAINING, DOCUMENTATION

PEOPLE STRUCTURE 8

* PROGALYST ADVANTAGES

* SIMPLIFIED COMMUNICATIONS

* BETWEEN PEOPLE
* BETWEEN PHASES

* TECHNICALLY REALISTIC SYSTEMS
* CONSISTENCY OF QUALITY

PEOPLE STRUCTURE 9

* PROGALYST DISADVANTAGES

* LIMITED NUMBER OF PROJECTS
* WORKLOAD IMBALANCE
* ONE SA CAN KEEP MANY BUSY
* MAINTENANCE BUILDS UP
* DOCUMENTATION OFTEN NEGLECTED
* WEAK-LINK-IN-THE-CHAIN TALENT SITUATION
* PEOPLE SKILLS AND TECHNICAL TALENT?

PEOPLE STRUCTURE 10

* SPECIALIZATION ADVANTAGES

* BALANCED WORKLOADS
* HIGH DEGREE OF PROFESSIONALISM
* CLEAR ADVANCEMENT ROUTES
* PRODUCTIVITY
* ACCOUNTABILITY

PEOPLE STRUCTURE 11

* SPECIALIZATION DISADVANTAGES

* ARRANGING WORKABLE HANDOVER POINTS
* ENFORCING RIGOROUS DOCUMENTATION
* POST MORTEM ANALYSIS
* COORDINATING PARALLEL ACTIVITIES
* CREATING MEANINGFUL JOBS
* PREVENTING IMBALANCES OF:

* POWER AND AUTHORITY
* STATUS

TECHNICAL TALENTS 1

* SA'S AND PROGRAMMERS
* ABILITIES

* COMMUNICATION
* ANALYSIS
* DISCIPLINE
* SOCIAL SENSITIVITY
* DETAIL
* COORDINATION

TECHNICAL TALENTS 2

* KNOWLEDGE & EXPERIENCE

* HARDWARE & SOFTWARE
* DATA HANDLING
* MANAGEMENT PROBLEMS & TECHNIQUES
* COMMUNICATIONS & ANALYTICAL TECHNIQUES
* BUSINESS PROBLEMS & TECHNIQUES
* INDIVIDUAL & GROUP PROCESSES

SA SPECIAL TALENTS

* MAINTAINING CUSTOMER RELATIONS
* MAINTAINING CUSTOMER COMMUNICATIONS
* ASSESSING CUSTOMER NEEDS
* CONDUCTING PRESENTATIONS
* TRAINING OTHERS
* PROVIDING SUPERVISION & LEADERSHIP

LOSING OBJECTIVITY

* 'HALO' EFFECT
* DONE IT BEFORE
* PERSONAL RELATION TO USER
* DESIGN IN DETAIL TOO EARLY
* WITHDRAW IN CONFUSION
* FRUSTRATION WITH PROGRESS
* TOO CLOSE TO USERS
* TOO DISTANT FROM USERS

IMPORTANT TECHNICAL TALENTS 1

* TRAINING OTHERS
* DOCUMENTATION
* PROVIDING SUPERVISION & LEADERSHIP
* TECHNICAL KNOWLEDGE
* MAINTAINING CUSTOMER RELATIONS
* SYSTEM MODIFICATIONS

IMPORTANT TECHNICAL TALENTS 2

* MAINTAINING COMMUNICATIONS
* ASSESSING CUSTOMER NEEDS
* JOB COMMITMENT & EFFORT
* DEBUGGING & TESTING
* CONDUCTING PRESENTATIONS
* PLANNING, ORGANIZING & SCHEDULING

FACTORS AFFECTING SYSTEMS ANALYSIS 1

* QUALITY OF MANAGEMENT
* MANAGEMENT CLIMATE
* STAFF RELATIONS CLIMATE

* TRUST, CONFIDENCE, FAIR DEALING, DELEGATION OF AUTHORITY
* MISTRUST, MOCKERY, ANGER, INTERFERENCE

FACTORS AFFECTING SYSTEMS ANALYSIS 2

* PROCESS OF CREATING PROJECTS

* "THIS COMPUTER PROCESS IS TYPICAL. MY BOSS GETS A BEE IN HIS BONNET ABOUT A NEW SYSTEM. PROBABLY HE WAS SOLD THE IDEA BY SOMEONE ELSE. OUT OF THE BLUE, WITHOUT CONSULTATION, HE ANNOUNCES THAT A NEW SYSTEM MUST GO IN BY THE END OF THE YEAR".

MANAGEMENT STYLES

* CONCERN FOR PEOPLE / CONCERN FOR PRODUCTION: JARGON NAME

* HIGH / HIGH: TEAM MANAGEMENT
* MEDIUM / MEDIUM: CONSERVATIVE COMPROMISE
* HIGH / LOW: COUNTRY CLUB
* LOW / HIGH: TASK MANAGEMENT
* LOW / LOW: IMPOVERISHED (WIB)

MANAGEMENT STYLES

* ROLES OF THE SYSTEMS ANALYST BY STYLE:

* TEAM MANAGEMENT: SA CATALYST
* CONSERVATIVE: SA COMMITTEE MEMBER
* COUNTRY CLUB: SA TOLD
* TASK MANAGEMENT: SA TELLS
* IMPOVERISHED: SA IGNORES

HERZBERG'S MOTIVATION/HYGIENE THEORY

* MOTIVATORS (JOB CONTENT)

* ACHIEVEMENT: PRIDE IN A JOB
* RECOGNITION: PRAISE
* WORK: INTERESTING & CHALLENGING
* RESPONSIBILITY: CARRYING OUT TASKS
* ADVANCEMENT: MOBILITY

* HYGIENE FACTORS (JOB CONTEXT)

* COMPANY POLICY & ADMINISTRATION
* INTERPERSONAL RELATIONS
* SUPERVISION
* SALARY AND WORKING CONDITIONS

MOTIVATION

* NEEDS

* CAUSE

* DRIVES

* WHICH GENERATE

* BEHAVIOUR

* TO REACH

* GOALS

* THAT LEAD TO

* REDUCTION OF TENSION

MASLOW'S HIERARCHY OF NEEDS

* 1. PHYSIOLOGICAL

* STARVING, FREEZING

* 2. SAFETY

* PHYSICAL RISK
* FREEDOM RISK

* 3. LOVE & BELONGING

* SOCIAL, TRIBAL

* 4. ESTEEM

* RECOGNITION

* 5. SELF ACTUALIZATION

* DOING YOUR BEST

HAWTHORNE EXPERIMENTS 1

* PRODUCTIVITY INCREASES AS A RESULT OF ATTENTION

* 1. FEELING OF IMPORTANCE
* 2. TEAM SPIRIT
* 3. COMMON OBJECTIVE
* 4. PLEASANT ENVIRONMENT
* 5. FRIENDLY SUPERVISION

HAWTHORNE EXPERIMENTS 2

* MAFIA WORK GROUPS:

* 1. INFORMAL
* 2. LIMIT ON INDIVIDUAL PRODUCTIVITY
* 3. TRANSGRESSORS PUNISHED
* 4. INFORMAL LEADER/REPRESENTATIVE
* 5. CLANNISH

CLIMATE CONDITIONS

* QUALITY OF MANAGEMENT
* MANAGEMENT CLIMATE
* STAFF RELATION CLIMATE
* QUALITY OF I.S. STAFF
* QUALITY OF USERS

TURNOVER MODEL 1

* DEMOGRAPHIC VARIABLES

* AGE -
* ORGANIZATIONAL TENURE -
* EDUCATION +

* ROLE STRESSORS

* ROLE AMBIGUITY (EXPECTATIONS) +
* ROLE CONFLICT (EVALUATION) +

* CAREER RELATED

* SALARY -
* PROMOTABILITY -
* CAREER OPPORTUNITIES +

TURNOVER MODEL 2

* INTERVENING VARIABLES

* JOB SATISFACTION -
* CAREER SATISFACTION -
* ORGANIZATIONAL COMMITMENT -

* OUTPUT VARIABLE

* TURNOVER RATE
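
A toy, purely illustrative scoring sketch of the turnover model above; the +/- signs follow the slides, while the equal weights and sample values are invented.

```python
# Illustrative only: sum signed factor scores as a crude "turnover pressure" index.
FACTORS = {
    # factor: direction of association with turnover, as on the slides
    "age": -1, "organizational tenure": -1, "education": +1,
    "role ambiguity": +1, "role conflict": +1,
    "salary": -1, "promotability": -1, "career opportunities": +1,
    "job satisfaction": -1, "career satisfaction": -1,
    "organizational commitment": -1,
}

def turnover_pressure(scores: dict) -> float:
    """Equally weighted signed sum of factor scores (each normalized to 0..1)."""
    return sum(FACTORS[factor] * value for factor, value in scores.items())

sample = {factor: 0.5 for factor in FACTORS}   # a "neutral" employee
sample["role ambiguity"] = 0.9                 # high ambiguity raises the index
print(round(turnover_pressure(sample), 2))
```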

TALENT ISSUE

* WHAT ARE THE PRACTICES AND POLICIES THAT WILL MAINTAIN OUTSTANDING TECHNICAL TALENTS IN YOUR ORGANIZATION?

FAVOURABLE MANAGEMENT CLIMATE

* MANAGERS ALLOWED TO MANAGE
* ACHIEVEMENT WILL BE RECOGNIZED AND REWARDED
* ENTERPRISE AND NEW IDEAS ACCEPTED
* SENIOR MANAGERS LISTEN TO THOSE BELOW
* ACCOUNTABILITY IS CLEAR
* RESPONSIBILITY CAN BE DELEGATED

LEADERSHIP CHARACTERISTICS IN I.S.

* INDUSTRY AND ORGANIZATIONAL KNOWLEDGE
* RELATIONSHIPS IN THE FIRM AND INDUSTRY
* REPUTATION AND TRACK RECORD
* ABILITIES AND SKILLS
* PERSONAL VALUES
* MOTIVATION


SYSTEM FAULTS View Graphs

* (c) Copyright 1992 Murray Turoff

MAJOR SYSTEM FAULTS 1

* REPORTS INADEQUATE/REDUNDANT

* LACK OF RECALL

* INFORMATION OVERLOAD

* LACK OF PRECISION

* LOW DATA ACCURACY
* MISSED SCHEDULES

MAJOR SYSTEM FAULTS 2

* EXCEEDED CAPACITY
* EXISTING SYSTEMS RETAINED
* NOT ALL PROCESSING/DATA INCLUDED
* COGNITIVE & TASK MISMATCHES
* RESTRICTIVE OR INFLEXIBLE
* STAFF UNDER PRESSURE
* DIFFICULT TO LEARN

SYSTEM FAULT CONSEQUENCES

* HIGHER COSTS

* EXCESSIVE MAINTENANCE
* MORE HARDWARE
* MORE STAFF

* LOWER BENEFITS

* PROBLEMS NOT FULLY SOLVED
* USER DISSATISFACTION
* POOR QUALITY OUTPUTS

* NEW PROBLEMS CREATED

* MANAGEMENT INFORMATION NOT USED
* FULL POTENTIAL OF SYSTEM UNUSED

INADEQUATE REPORTS

* MANAGERS

* DID NOT KNOW WHAT TO ASK
* DID NOT KNOW WHAT TO EXPECT
* WERE NOT COMPUTER LITERATE
* NOT QUANTITATIVE ORIENTED

* ANALYST

* THOUGHT THEY KNEW BEST
* PRODUCE EASY REPORTS
* REPORTS EVERYTHING

LOW DATA ACCURACY

* INPUT PROCEDURE TOO COMPLEX
* EXCEPTIONS NOT PROPERLY REPRESENTED
* ANOTHER RESPONSIBLE FOR CORRECTION

PERSON OR MACHINE

* EFFICIENT USE OF PRINTER
* READABLE REPORTS
* SPUDS, MURPHY'S
* MINIMIZE INPUT ERRORS
* USER COSTS IGNORED

REPORT AND DATA SOLUTIONS

* USER DEFINED REPORT CAPABILITIES
* DATA SOURCE RESPONSIBLE FOR INPUT
* DATA SOURCE GAINS BENEFIT FROM SYSTEM

SYSTEM FAILURE CAUSES

* USERS DO NOT PARTICIPATE
* ANALYST THINKS S/HE KNOWS BEST
* POOR INVESTIGATION
* NO OBJECTIVES SET
* NO CHECKPOINTS
* MACHINES PUT FIRST
* PEOPLE IGNORED
* PURE AUTOMATION APPROACH
* NO PROBLEM ANALYSIS

ORGANIZATIONAL LEARNING

* INVESTMENT AND INITIATION

* FAILURE: STAGNATION

* TECHNOLOGY LEARNING & ADAPTATION

* FAILURE: NARROW FOCUS & AUTOMATION

* RATIONALIZATION & MANAGEMENT CONTROL

* FAILURE: EFFICIENCY DOMINATED

* MATURITY & WIDESPREAD USE

* FAILURE: RIGIDITY


ACQUISITION View Graphs

* (c) Copyright 1992 Murray Turoff

ACQUISITION TIME

* SMALL ACQUISITIONS (MICROS)

* 2 MONTHS (1 WEEK TO 6 MONTHS)

* MEDIUM ACQUISITIONS (PC UPGRADE)

* 5 MONTHS (2-18 MONTHS)

* LARGE ACQUISITIONS (NEW CP)

* 22 MONTHS (6 MONTHS TO 6 YEARS)

CRITICAL FACTORS IN ACQUISITION

* DP PLAN RELATING ACQUISITIONS TO BUSINESS OBJECTIVES
* KNOWLEDGE OF MANAGEMENT AND PROFESSIONALS ABOUT COMPUTERS
* USER GENERATED REQUIREMENTS
* PRE-SCREENING OF VENDORS
* SPECIAL EXPERTISE IN LEGAL, CONTRACT AND PROCUREMENT AREA.
* PROCUREMENT GUIDELINES AND STANDARDS

COSTING METHODS 1

* FREE USE CENTER

* DEVELOPMENT ENVIRONMENT
* SINGLE USE APPLICATIONS

* RESOURCE CENTER

* PRIORITIES
* PLANNED ALLOCATION

* COST CENTER

* FULL, MARGINAL
* AVERAGE, ACTUAL
* TRANSACTIONS

COSTING METHODS 2

* PROFIT CENTER

* FREE MARKET
* NON COST RELATED

* CONSIDERATIONS

* SHARED RESOURCES
* MAKE OR BUY
* SUPPLY OR USE
* PREDICTABILITY
* UNDERSTANDABLE

GOVERNMENTAL DIFFICULTIES

* BUDGET AGRICULTURAL CYCLE

* LACK OF PLANNING RELIANCE

* EFFICIENCY EMPHASES
* AUTOMATION JUSTIFICATION

* NO DECISIONS

* STAFFING

* EXPERTS AS MANAGERS
* SALARIES

* LOW COST AWARDS
* PUBLIC EXPOSURE

GOVERNMENTAL COUNTERS

* CAMOUFLAGE (Word Processing)
* NUMBER TOYING & FUDGING
* CRISES OPPORTUNITIES

* CREATE IT

* STICKING ONES NECK OUT
* CO-OPT IMPORTANT USERS
* USER TECHNOLOGY
* TURN TAKING
* RESOURCE SHARING

VENDOR PLOYS 1

* WE DON'T NEED TO WRITE THAT DOWN!
* YOU'RE GETTING OUR BEST PRICE!
* PRICE INCREASE COMING!
* RIGHT TO SUBSTITUTE AND OTHER SHELL GAMES!
* FORM CONTRACTS!
* TRY IT, YOU'LL LIKE IT!
* WE CAN'T DO IT BECAUSE GSA WON'T LET US
* WE WOULD BE SETTING A PRECEDENT!

VENDOR PLOYS 2

* WE WILL ANNOUNCE A MODIFICATION SOON TO HANDLE THAT!
* YOU CAN BE ASSURED WE WILL OFFER THAT IN THE FUTURE
* OUR LANGUAGE MATCHES THE STANDARD AND THEN ADDS MORE
* IF YOU SIGN FOR FIVE YEARS MAINTENANCE WE CAN LOWER THE BASE PRICE

COMMON MYTHS

* MY PRODUCT HAS TO BE EASY TO USE -- MY SECRETARY USES IT
* IT MUST BE GOOD -- I NEVER HEARD COMPLAINTS
* WE WILL MAKE IT EASY TO USE -- JUST QUANTIFY IT FOR US
* I HAVE ADDED EVERY EASY TO USE FEATURE I COULD THINK OF
* IT IS EASY TO USE -- I USE IT MYSELF

SYSTEM SEXY

* ALL FUNCTIONS
* COMPATIBLE WITH EVERYTHING
* EASY TO LEARN
* EASY TO USE
* INFINITELY FLEXIBLE
* WILL HELP ANYONE FOR ANYTHING
* WILL SAVE A FORTUNE
* SOFTWARE WORKS PERFECTLY
* ULTIMATE RELIABILITY
* HAS GAMES

CONCERNS ABOUT USE OF MICROS

* UNDETECTED ERRORS IN PROGRAMS
* HIDDEN MODELS
* LACK OF ACCOUNTABILITY
* CONTROL OF DATA ACCURACY
* LOSS OF KEY PERSONNEL
* DUPLICATION OF RESOURCES
