|
|
|
|
|
|
|
Goals: |
|
Identify the behavior of KM systems over time |
|
Link behaviors into a causal model
|
Create formal simulation |
|
Identify possible policy levers |
|
Site |
|
Two of the top ten IT consulting firms worldwide
|
KM Experts, senior managers, staff |
|
Approach |
|
Case studies and interviews
|
Formal Simulation |
|
Structured Model Review |
|
|
|
|
Knowledge growth from experience, decay from
turnover and obsolescence |
|
Successful knowledge management increases demand
for knowledge |
|
Increasing demand for knowledge increases costs |
|
Incremental contributions have less value than
fundamental ones |
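The growth/decay dynamic above can be written as a minimal stock-and-flow sketch: knowledge grows through experience and decays with turnover and obsolescence, while success feeds back into demand and cost. All names and parameter values below are illustrative assumptions, not figures from the study.

```python
# Minimal stock-and-flow sketch of the dynamic hypothesis above.
# Every parameter value here is an illustrative assumption.

def simulate(months=60, half_life=33.0):
    decay = 0.693 / half_life        # monthly decay fraction implied by half-life
    knowledge = 100.0                # firm knowledge stock (arbitrary units)
    learning = 3.0                   # monthly gain from project experience
    history = []
    for _ in range(months):
        demand = 0.02 * knowledge    # successful KM raises demand for knowledge
        cost = 1.5 * demand          # rising demand raises support costs
        knowledge += learning - decay * knowledge
        history.append((knowledge, demand, cost))
    return history

trace = simulate()
```

Under these assumed values the stock climbs toward the equilibrium learning/decay and demand and cost rise with it, reproducing the qualitative shape of the hypothesis rather than any calibrated result.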
|
|
|
|
Model of the knowledge processes of a firm with
well-defined domain boundaries
|
Structures and behaviors from literature and
interviews |
|
|
|
|
|
|
|
|
|
|
|
|
Initial Conditions |
|
Knowledge decay rate constant (~33-month half-life)
|
KM Start (time 10) |
|
5% senior staff time diverted to OKR |
|
Small seed of highly relevant documents into the OKR
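Taking the ~33-month figure as a half-life, the corresponding decay-rate constant follows from r = ln 2 / t½. A quick check (the half-life interpretation is an assumption):

```python
import math

# Convert an assumed 33-month knowledge half-life into a
# per-month exponential decay-rate constant.
half_life_months = 33.0
decay_rate = math.log(2) / half_life_months   # ~0.021 per month

# Sanity check: after one half-life, half the stock should remain.
remaining = math.exp(-decay_rate * half_life_months)
```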
|
|
|
|
Percentage Change in Junior Staff Knowledge |
|
Repository Size |
|
Repository Relevance |
|
Repository Coverage |
|
Percentage Change in Senior Staff Effort
|
|
|
|
|
Faster Knowledge Decay |
|
Initial Underfunding |
|
Unmet User Expectations |
|
Unmet Management Expectations |
|
|
|
|
|
|
|
Sustainable KM programs: |
|
Achievable if user and management expectations are
met in the face of endogenous change
|
Effects may rise then fall over time |
|
Apparently unstable equilibrium |
|
Unsustainable KM programs: |
|
May start off similarly to sustainable programs |
|
Tip into failure |
|
|
|
|
|
|
Rests on several difficult-to-quantify factors
|
KM satisfaction must be constantly monitored in
light of changing requirements |
|
Short-term gains and effects must be balanced
against longer-term expectations
|
Resources shift from development to review |
|
|
|
|
|
Getting time and attention was very difficult |
|
Two-hour phone interviews
|
Description of microworld and causal model |
|
Scenario assumptions defined |
|
Dynamics of metrics described (without causal
process) |
|
Expert evaluated behavior and provided causal
interpretation |
|
Investigator provided alternative interpretation |
|
Expert critiqued alternative interpretation |
|
|
|
|
|
Suggestions for changes to behaviors |
|
Faster reactions by firm |
|
More effort on revision
|
Suggestions for changes to structures |
|
Influence of quality |
|
Relevance is really timeliness |
|
Feedback for changes in expectations |
|
Feedback from attempts at process improvement |
|
Insights into their own experiences |
|
|
|
|
Access was very difficult |
|
Limited live parameterization for reference
modes |
|
Model review sample small |
|
Open questions about structure |
|
|
|
|
Integration of Model Review into Dynamic
Hypothesis |
|
Use of model for policy exploration |
|
Mapping of live metrics to synthetic ones |
|
|