QUERY "Flawless Consulting" and SD

This forum contains all archives from the SD Mailing list (go to http://www.systemdynamics.org/forum/ for more information). It is here as a read-only resource; please post any SD-related questions to the SD Discussion forum.
Bill Harris <bill_harris@faci
Senior Member
Posts: 51
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Bill Harris <bill_harris@faci »

Posted by Bill Harris <bill_harris@facilitatedsystems.com>

[ OD Organizational Development
SSM Soft Systems Modeling
SD System Dynamics ]

I've been thinking about this for a while, but recent threads (perhaps especially comments from Fred, for he and I and perhaps others of you are on an OD mailing list, too) have prompted me to see what others think.

Classic OD interventions, as typified by the process in Peter Block's _Flawless Consulting_, suggest one start with contracting and then move to data collection. Data collection usually consists of interviewing people to get raw data and then providing it back to the people who were interviewed to ask them to interpret it, to make sense of it. Out of that data feedback and interpretation meeting comes a decision by the client as to what action to take and how to do it.

I've used that approach, and it can be powerful. I am curious, though, how people blend that, if they do, with SD.

One can conceive of SD being a potential tool to be applied in the action phase of that consulting process. That gives ownership to the group for deciding to move forward with SD as part of the path to a solution. In a way, I think SSM sometimes adopts that sort of approach; as I understand it, SD can be used as a natural adjunct to SSM somewhat late in the process.

Yet SD has a role to play in understanding data, too. As an example, Barry's strategic forum seems to start with data collection and then proceeds to feed back processed information to the group in the form of a sequence of models. I've done that sort of thing, too.

I very much like the power of the message one sends when one gives the group the responsibility to interpret their data. Except in rare cases, I suspect groups don't have the ability to apply SD as part of the interpretative process, though, so bringing SD in early risks sending the message that they don't really own the interpretation given to the data -- it has to be done with the help of an expert. Yet waiting until late risks people getting stuck on certain interpretations an SD model might show to be naive.

How do you handle that issue? Or do you ignore it or bypass it (those might be different approaches)? If you consider yourself more of a client in this regard, how have people handled it when they've worked with you?

Bill
- --
Bill Harris
Posted by Bill Harris <bill_harris@facilitatedsystems.com>
posting date Thu, 13 Sep 2007 21:13:16 -0700 _______________________________________________
""Kim Warren"" <Kim@strategyd
Member
Posts: 36
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by ""Kim Warren"" <Kim@strategyd »

Posted by ""Kim Warren"" <Kim@strategydynamics.com>

My take on this is that SD provides a rigorous up-front architecture for how things actually work, so that data collection seeks the appropriate information. My impression is that others who use SD also place heavy emphasis on the early mapping - which itself requires discussion with participants in the situation. A common consequence of this is the discovery that most of the data actually available is of no relevance, and that some small key fraction of essential data does not exist - so that data collection often has to include originating information the organisation has never previously tracked.

I can't speak for how consultants in general approach the challenge of embarking on data collection, but have come across everything from [a] "give us everything you have got, and we will see what it tells us", to [b] "we have this 'issue tree' of possible cause-effect structure of the situation, so give us the data on the items in this picture", to [c] "our hypothesis is that the causes of the issue are X, Y, Z ... so give us data on that so we can show we are right".


Kim Warren
Posted by ""Kim Warren"" <Kim@strategydynamics.com> posting date Mon, 17 Sep 2007 09:07:50 +0100 _______________________________________________
Bill Harris <bill_harris@faci
Senior Member
Posts: 51
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Bill Harris <bill_harris@faci »

Posted by Bill Harris <bill_harris@facilitatedsystems.com>

""SDMAIL Kim Warren"" <Kim@strategydynamics.com> writes:
>> My take on this is that SD provides a rigorous up-front architecture
>> for how things actually work, so that data collection seeks the
>> appropriate information. My impression is that others who use SD also
>> place heavy emphasis on the early mapping - which itself requires
>> discussion with participants in the situation. A common consequence
>> of this is the discovery that most of the data actually available is
>> of no relevance, and that some small key fraction of essential data
>> does not exist - so that data collection often has to include
>> originating information the organisation has never previously tracked.


Kim,

Thanks for your response. I do think you're right about the contribution SD makes to finding the key data needed to address a problem, and that describes what I try to do.


>> I can't speak for how consultants in general approach the challenge
>> of embarking on data collection, but have come across everything from
>> [a] "give us everything you have got, and we will see what it tells
>> us", to [b] "we have this 'issue tree' of possible cause-effect
>> structure of the situation, so give us the data on the items in this
>> picture", to [c] "our hypothesis is that the causes of the issue are
>> X, Y, Z ... so give us data on that so we can show we are right".


If you're not familiar with it, Peter Block wrote "Flawless Consulting,"
which many consider to be the classic organization development consulting guide. He gives a rather clear description of data collection and interpretation as he sees it, which is quite open-ended in collection (interviews) and quite participatory in interpretation (facilitated, deep group discussion among all those who provided the data plus perhaps others). I don't think the flawless consulting approach fits any of your [a], [b], or [c].

Having used that process, I can say it can be quite powerful in its impact on an organization. The people who provide the data are quite likely to end up owning its interpretation in a very intellectual and emotional way, since they had primary responsibility for both the data and its interpretation. That (presumably) leads to more ownership in the process of resolving the problem. It also is broad-based; people tend to look at logical problems in the organization and emotional issues that are holding them back.

What it may not do is address the challenge of needed or useful external expertise, whether it be SD or something else. What I'm wondering is if there are better ways to incorporate (SD or other) expertise into a participatory process to keep the ownership while improving the quality.

Look at it from the perspective of sociotechnical systems (STS) design.
STS assumes that there are business, technical (process), and social
(people) issues inherent in organizational problems, and effective, sustainable solutions need to address all three areas appropriately. OD seems to have its kernel centered on the business and social aspects, with perhaps more qualitative approaches to the technical side. SD seems to have its kernel centered on the business and technical aspects, with less attention (relatively) to the social side. I'm trying to create synergy between the two types of approaches to bring the best of the participatory and expertise approaches to bear on problems.

One easy answer is to say that the process for addressing serious organizational issues starts with STS, and SD becomes a component pulled in as needed to address specific issues. I sense most SD practitioners don't see it that way, though; I sense most of us see SD as a (largely) complete, overarching approach in itself.

I've got an approach I think looks promising, but I'm curious what others think.

Bill
Posted by Bill Harris <bill_harris@facilitatedsystems.com>
posting date Mon, 17 Sep 2007 10:16:26 -0700 _______________________________________________
Roy Greenhalgh <rgreenh@attgl
Junior Member
Posts: 3
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Roy Greenhalgh <rgreenh@attgl »

Posted by Roy Greenhalgh <rgreenh@attglobal.net>

My experience of data collection over 25 years or so has been that I'm invariably offered data when I mention it. It is the stuff that clients believe will help them meet targets.

The very fact that the client is failing to meet those targets is why they invited me in. So we start again, and it is only at the end of the initial work, of understanding what the demand for the service is, the value that customers place on that service, and how their processes have been "designed" and actually work (usually in ignorance of that demand), that we can agree we need different data, related to customer demand and process capability.

The SD maps are an important component in that understanding stage, and of course, when we come to redesign based on demand, value and flow.

Roy Greenhalgh
Posted by Roy Greenhalgh <rgreenh@attglobal.net> posting date Mon, 17 Sep 2007 13:34:36 +0100 _______________________________________________
""Jack Homer"" <jhomer@comcas
Member
Posts: 21
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by ""Jack Homer"" <jhomer@comcas »

Posted by ""Jack Homer"" <jhomer@comcast.net>

Kim Warren writes:

> I can't speak for how consultants in general approach the challenge of
> embarking on data collection, but have come across everything from [a]
> "give us everything you have got, and we will see what it tells us",
> to [b] "we have this 'issue tree' of possible cause-effect structure
> of the situation, so give us the data on the items in this picture",
> to [c] "our hypothesis is that the causes of the issue are X, Y, Z ...
> so give us data on that so we can show we are right".
>

Data collection is very important to me as a modeler and consultant. Useful data, like the model's structure itself, emerge iteratively over the course of a project. We start in some projects by casting the net wide in an exploratory fashion, when there is no established set of key variables. But in other projects the client may want to start with an established set of variables, and so initially we pursue only those. However we start, though, I always find that iterative model development also leads to the need for more data, sometimes far afield from where we started. I find that this process is essential for converging on a model that is both insightful and persuasive.

I once described my experience with data collection for SD models as follows ("Why we iterate: scientific modeling in theory and practice", SDR 12(1), 1996):
"Data collection requires persistence, patience, and immersion in messy details.
Clients know more and have access to more information than they initially think they do. The modeler often ends up getting involved as a data detective, compiler, and analyst... Some important data will be in error, irrelevant, missing, or slow to emerge... One may need to look for other sources for data, or consider using different variables or levels of aggregation to capture the same phenomena."

Those words still ring true for me today, with another decade-plus of consulting under my belt.

- Jack Homer
Posted by ""Jack Homer"" <jhomer@comcast.net> posting date Mon, 17 Sep 2007 08:13:31 -0400 _______________________________________________
Richard Stevenson <rstevenson
Member
Posts: 37
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Richard Stevenson <rstevenson »

Posted by Richard Stevenson <rstevenson@valculus.com>



On 17 Sep 2007, at 11:01, SDMAIL Kim Warren wrote:
> My take on this is that SD provides a rigorous up-front architecture
> for how things actually work, so that data collection seeks the
> appropriate information.


Data! Now that's the most contentious issue in management science!
And, I think, the most destructive issue in rational management debate.

As practitioners (and by the way Kim... also as consultants) it
seems we need always to depend on "data" to support our arguments.
Rational argument, we apparently believe, must be supported by the right data. Because that's the way we have been brought up to think.

But data has two vast imperfections. First, by definition, it is historical. Second, it is always skewed in favour of the data collector and the client.

So screw data, I say. Every successful project I ever was involved
in either ignored - or better, overturned - historical data.
Acknowledge it - then throw it away. Burn it. Think better, instead.
That's much harder.

Richard Stevenson
Valculus Ltd
Posted by Richard Stevenson <rstevenson@valculus.com> posting date Mon, 17 Sep 2007 21:24:15 +0100 _______________________________________________
""John Morecroft"" <jmorecrof
Junior Member
Posts: 10
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by ""John Morecroft"" <jmorecrof »

Posted by ""John Morecroft"" <jmorecroft@london.edu>

I found Jack Homer's comments to be very helpful and insightful. His view rings true - that data, like the model's structure, emerges iteratively. I must re-read 'Why we iterate ....' SDR 12(1) 1996.

John Morecroft
Adjunct Associate Professor
Management Science and Operations
London Business School
Posted by ""John Morecroft"" <jmorecroft@london.edu> posting date Tue, 18 Sep 2007 14:13:14 +0100 _______________________________________________
Richard Stevenson <rstevenson
Member
Posts: 37
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Richard Stevenson <rstevenson »

Posted by Richard Stevenson <rstevenson@valculus.com>

My last note clearly didn't come out right! Thanks, Kim.

My intention was to convey the reality that often the SD practitioner needs to simplify and/or massage what the client sees as "real data", in order to generate a useful model.

My best example of this: modelling BP's Forties oilfield to evaluate "whole of field" investment options over a 20-year horizon. Clearly we could not model the reservoir in all its detail - the reservoir engineers had detailed simulators running on Cray computers!
Ultimately we modelled the entire reservoir as a single stock with an outflow governed by the volume of the stock itself. The engineers agreed that this solution was 85% as accurate as the Cray computer - and "fit for purpose" in the context of the strategy model.
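
To make that structure concrete, here is a minimal sketch in Python of a single-stock depletion - not the actual BP model, and with a depletion fraction and volumes invented purely for illustration - showing how an outflow governed by the remaining stock produces a first-order exponential decline:

# Single-stock reservoir sketch: production outflow depends on the
# remaining volume, giving exponential decline. Values are illustrative.
DT = 0.25                    # time step, years
HORIZON_YEARS = 20
DEPLETION_FRACTION = 0.15    # fraction of remaining reserves produced per year (assumed)

reservoir = 1000.0           # initial recoverable volume, arbitrary units
steps = int(HORIZON_YEARS / DT)

for step in range(1, steps + 1):
    production = DEPLETION_FRACTION * reservoir   # outflow governed by the stock itself
    reservoir -= production * DT                  # Euler integration of the stock
    if step % int(5 / DT) == 0:                   # report every five years
        print(f"year {step * DT:4.0f}: reservoir {reservoir:7.1f}, production {production:6.1f}/yr")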

My point is - data has to be chosen or fitted to the purpose of the model - and not the other way around. I have seen too many "strategy" models that set out to model existing data - and ended up at a far greater level of detail than was appropriate. The good SD practitioner needs to be firm in demanding simplification of data that may be familiar to the client.


Richard Stevenson
Valculus Ltd
Posted by Richard Stevenson <rstevenson@valculus.com> posting date Wed, 19 Sep 2007 11:00:09 +0100 _______________________________________________
Roy Greenhalgh <rgreenh@attgl
Junior Member
Posts: 3
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Roy Greenhalgh <rgreenh@attgl »

Posted by Roy Greenhalgh <rgreenh@attglobal.net>

Richard wrote
>""So screw data, I say. Every successful project I ever was involved in
>either ignored - or better, overturned - historical data.
> Acknowledge it - then throw it away. Burn it.""

Richard,
Would you please explain how you worked if you "overturned" historical data? What method did you adopt?

Roy Greenhalgh
Posted by Roy Greenhalgh <rgreenh@attglobal.net> posting date Tue, 18 Sep 2007 14:33:25 +0100 _______________________________________________
Bob Eberlein <bob@vensim.com>
Member
Posts: 26
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Bob Eberlein <bob@vensim.com& »

Posted by Bob Eberlein <bob@vensim.com>

A short comment on this. There is a lot of discussion of data in this thread, but as I understand from the original query, data was intended to mean all information about the problem domain, the people involved, what is at stake, how information is exchanged, and everything else. The question posed was whether using a style of front-end interaction in a project, such as "Flawless Consulting", leads to more effective engagements.

Actually I don't think the intention was to restrict such methodologies to the front end, and that means there are two different questions to be considered. First, are there methodologies that increase the effectiveness or efficiency of information gathering? Second, are there methodologies that increase the likelihood of implementation success?

In my limited experience the answer to the first question is no. It is best not to completely alienate the problem stakeholders, but beyond that the key to getting good information in a timely manner is knowing what to look for and being persistent. I don't know the answer to the second question. It seems like it should be yes, I want it to be yes, but I have never seen anything that is both methodical and effective.

Bob Eberlein
Posted by Bob Eberlein <bob@vensim.com> posting date Wed, 19 Sep 2007 07:33:25 +0100 _______________________________________________
Richard Stevenson <rstevenson
Member
Posts: 37
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Richard Stevenson <rstevenson »

Posted by Richard Stevenson <rstevenson@valculus.com>

One more thought, to clarify my comments on data. I was, indeed, being somewhat contentious - but largely to make an important point.

My particular business and interest is concerned with strategy modelling - not with detailed operational modelling.

Most often, when seeking to create a truly insightful strategy model, the 'data' simply doesn't exist at the level of aggregation required.
Indeed, it is inevitably the case that some data relationships won't exist at all - such as important (non-linear) relationships between
(say) rewards, motivation and productivity. If such relationships are strategically important, then their effects might overwhelm other relationships in the model for which good data exists.
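
When such a relationship matters but no measured data exist, the usual SD device is an assumed, explicitly documented lookup (table) function whose shape can at least be debated with the client and tested for sensitivity. A minimal sketch in Python, with a purely hypothetical shape and numbers:

# Assumed nonlinear "soft" relationship: relative reward -> effect on productivity.
# The table values are illustrative only; in a real model they would be debated
# with the client and tested for sensitivity.
import bisect

REWARD_EFFECT_TABLE = [(0.0, 0.6), (0.5, 0.8), (1.0, 1.0), (1.5, 1.15), (2.0, 1.2)]

def lookup(table, x):
    # Piecewise-linear interpolation with clamped ends, as SD tools do.
    xs = [p[0] for p in table]
    ys = [p[1] for p in table]
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x)
    frac = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + frac * (ys[i] - ys[i - 1])

normal_productivity = 100.0   # output per person per month (assumed)
for relative_reward in (0.25, 1.0, 1.75):
    productivity = normal_productivity * lookup(REWARD_EFFECT_TABLE, relative_reward)
    print(f"relative reward {relative_reward:4.2f} -> productivity {productivity:6.1f}")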

Of course we should use best efforts to build models with appropriate data. But an absence of detailed data does not invalidate the impact of an insightful model. Indeed, some of the most insightful models of all time are not based on real data at all (e.g. Forrester's "Market Growth as Influenced by Capital Investment").

It all depends on the purpose for modelling. Clearly, if building a detailed operational model of (e.g.) a supply chain or production line, then data is important. But although SD can be applied at that level, there are probably better modelling approaches. I have often been asked if SD models can be "hard-wired" to a company's ERP system. Indeed, Powersim provides a means to do so with SAP. But in a strategy model, why would you bother?

I repeat my contention. The data has to be chosen or fitted to the purpose of the model - and not the other way around.

Richard Stevenson
Valculus Ltd
Posted by Richard Stevenson <rstevenson@valculus.com> posting date Wed, 19 Sep 2007 14:32:18 +0100 _______________________________________________
Jean-Jacques Laublé <jean-jac
Senior Member
Posts: 61
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Jean-Jacques Laublé <jean-jac »

Posted by Jean-Jacques Laublé <jean-jacques.lauble@wanadoo.fr>

Hi everybody.

The question about the utility of using past data is interesting.
In many models the impression is often that the modeller is modelling his beliefs more than a real world expressed by real data.
Now, is that a reason for always choosing to refer to past data?
When I have to choose between two solutions, I always choose the simpler one, the simpler solution often being not to do anything.

I do that because, instead of rushing to a solution, doing nothing leaves me plenty of time to think about the problem itself. And finding the simpler solution obliges one to think hard about what one really wants to do.
I try to apply that solution, and will engage in more complicated studies only after having considered the added utility of doing so.
In the case of past data, I will consider the utility of taking it into consideration.
This reflection is sometimes more instructive than the use of the data themselves.
Finding the added utility is not just a quick consideration, but can take some time and a lot of effort.

I talked last weekend with my niece's husband, who works for Sodexho, the world leader in contract catering.

One of their problems is that the employees think that because they follow the ISO 900x rules, everything must become perfect. In fact they follow the rules so well that they forget the customer.
The employees think that because they use a very sophisticated system, the customer is automatically satisfied. The company is now obliged to organize courses to urge them to consider the client and to understand that the purpose of the system is not to be applied for its own sake but to make the customer satisfied.

I think that today there are too many methods and not enough thinking, and that people forget to consider the problem as a whole, using simple considerations and common sense.
Giving the responsibility for decisions to all sorts of tools is also a signal of disengagement and of a fear of owning decisions.
I do not claim that tools are not useful, but one should always know why one uses them and how they relate to the problem to be solved.
Regards.
Jean-Jacques Laublé. Eurli Allocar
Strasbourg France.
Posted by Jean-Jacques Laublé <jean-jacques.lauble@wanadoo.fr> posting date Wed, 19 Sep 2007 15:51:07 +0200 _______________________________________________
""John Voyer"" <voyer@usm.mai
Junior Member
Posts: 5
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by ""John Voyer"" <voyer@usm.mai »

Posted by ""John Voyer"" <voyer@usm.maine.edu>

I have found this discussion's emphasis on data to be interesting. I have taught from Peter Block's "Flawless Consulting" for many years, and one of the things he emphasizes is the emotional aspect of consulting.
He talks a lot about how clients have issues with consultants because of concerns about control and vulnerability. He goes so far as to say that "emotion is data," meaning that if you sense unease as you deal with a client, it may be the same feeling that the client's co-workers get as they deal with her/him, and it may therefore be part of what has created the problem.

Block puts a lot of stock in data, as people on this list have discussed it, but he puts equal or maybe greater stock in the emotional aspects of the work. In doing so, he gives an answer, of sorts, to Bob's second question below. Implementation will be more successful if clients feel that they have (1) valid data, (2) free choice of solutions, and (3) commitment to the change process. Items two and three on that list come more from the emotional side, less from the "rational" side that we so often emphasize in the SD community.

John Voyer

----------------------------------------------------------

John J. Voyer, Ph.D.
Associate Dean and
Professor of Business Administration
School of Business
University of Southern Maine
Posted by ""John Voyer"" <voyer@usm.maine.edu> posting date Wed, 19 Sep 2007 10:05:23 -0400 _______________________________________________
""Kim Warren"" <Kim@strategyd
Member
Posts: 36
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by ""Kim Warren"" <Kim@strategyd »

Posted by ""Kim Warren"" <Kim@strategydynamics.com>

It may be my fault that the thread went down the 'data' path - apologies. Bob's questions below are dead right.
I noticed afterwards that Bill's original post was angled towards SD's contribution to consulting specifically on organisational development, and replied to him off-line.
That reply to Bill included an observation that others may share: once a model is working [or even a diagrammatic picture] that situation-owners can clearly see is showing real-world behaviour, it can sometimes contribute to improved organisational functioning by defusing disagreements and disputes amongst the audience. It is hard to cling to an assertion that 'A causes B' when extensive, connected, time-based information shows no evidence that it does so, and shows instead that C and D have persistently been the driving factors. Has anyone else found the same?

Kim
Posted by ""Kim Warren"" <Kim@strategydynamics.com> posting date Thu, 20 Sep 2007 11:16:45 +0100 _______________________________________________
Bill Harris <bill_harris@faci
Senior Member
Posts: 51
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Bill Harris <bill_harris@faci »

Posted by Bill Harris <bill_harris@facilitatedsystems.com>

""SDMAIL Bob Eberlein"" <bob@vensim.com> writes:
>> A short comment on this. There is a lot of discussion of data in this
>> thread, but as I understand from the original query, data was intended
>> to mean all information about the problem domain, the people
>> involved, what is at stake, how information is exchanged, and
>> everything else. The question posed was whether using a style of
>> front-end interaction in a project, such as "Flawless Consulting",
>> leads to more effective engagements.


Bob,

Thanks; you got it!


>> Actually I don't think the intention was to restrict such
>> methodologies to the front end, and that means there are two different
>> questions to be considered. First, are there methodologies that
>> increase the effectiveness or efficiency of information gathering?
>> Second, are there methodologies that increase the likelihood of
>> implementation success?
>>
>> In my limited experience the answer to the first question is no.


I agree that SD can provide deep information gathering, and so I'd mostly agree with your answer to the first question.

The part we may miss from an SD perspective is the emotions people have wrapped around situations and any power imbalances that may inhibit the development of sustainable solutions. It's not that we can't model emotions or power; we can. If we can explain the presenting problem without calling on them, we may apply Occam's razor to eliminate both, leaving people frustrated and less likely to help with implementation.
That's a call for watchfulness on our part.


>> It is best not to completely alienate the problem stakeholders, but
>> beyond that the key to getting good information in a timely manner is
>> knowing what to look for and being persistent. I don't know the
>> answer to the second question. It seems like it should be yes, I want
>> it to be yes, but I have never seen anything that is both methodical
>> and effective.


We could probably all agree with the first clause of your first sentence. :-)

That second part of your response to the second question is the one I think I'm most focused on. Sure, we can produce great insights, but can we find better ways to have the solution become the clients' when we walk away? Does the OD approach offer more ownership because it has less outside expertise and more client group involvement up front and all along the way?

I do wonder if there's another alternative view, but I don't sense it's one we hold. That view would be that OD (or perhaps another, often qualitative, approach such as soft systems) is the overarching approach to improving organizations including power, structure, emotion, performance, and the like, while SD is the precision tool that comes in to make sense of specific issues. That doesn't feel quite right, for SD can get pretty overarching itself (World3, anyone?).

Here's a real-life, published example from years ago. I helped an organization reduce their overspending by 95%; the insights came from an SD model ("Pipeline Inventory: The Missing Factor in Organizational Expense Management," National Productivity Review, Summer 1999). The organization's manager had been an engineer active with feedback control systems; when I showed him a slide of the model (about slide 5 in a deck of about 20, as I recall), he understood immediately what might be causing the overspending and said, "Fix it!". I said I had 15 more slides to show, but he said he understood already; just go do it.
(Hence my suspicion that it may be easier to persuade managers who were engineers of the utility of SD.)

Incidentally, that paper also shows why credit cards may be risky to
your financial health; the same structure applies.
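
The structure in question is the pipeline (supply line) of commitments already made but not yet recorded. A minimal sketch in Python - a simplified illustration of that idea, not the model from the paper, with all numbers invented - shows how a spender who watches only recorded expenses overshoots a budget, while one who also tracks the pipeline does not:

# Hedged sketch of the pipeline-inventory idea (not the paper's actual model).
# Commitments sit in a "pipeline" stock until they show up as recorded expenses.
# A spender who looks only at recorded expenses commits against stale figures
# and overshoots the budget - like charging a credit card before the statement
# arrives. All parameter values are invented for illustration.

def simulate(consider_pipeline, months=12):
    budget = 1200.0          # total budget for the period
    recorded = 0.0           # expenses that have appeared in the reports
    pipeline = 0.0           # commitments made but not yet recorded
    recording_delay = 2.0    # months from commitment to recorded expense
    adjustment_time = 1.0    # months over which perceived headroom gets committed

    for _ in range(months):
        perceived_spend = recorded + (pipeline if consider_pipeline else 0.0)
        headroom = max(budget - perceived_spend, 0.0)
        new_commitments = headroom / adjustment_time
        pipeline += new_commitments
        newly_recorded = pipeline / recording_delay
        pipeline -= newly_recorded
        recorded += newly_recorded
    return recorded + pipeline            # total that will eventually be payable

print("ignoring the pipeline:", round(simulate(False), 1))   # overshoots the budget
print("tracking the pipeline:", round(simulate(True), 1))    # lands on the budget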

At his request, I wrote the software to add to the expense management systems to get the required information, but that was only half the job.
An admin assistant who also had OD / change management experience and I developed a plan to involve key people (mostly the other admin assistants and the finance department representative) in the implementation that made the result theirs, not mine. We did a number of specific, OD-ish things that worked quite well. After a few hiccups in the process, we achieved our goal of reducing variance by 95%, making spending highly controllable (the organization was in an expense-cutting mode), and getting expense management off the table as a management issue. It was used continuously and successfully until the organization was disbanded and its remnants folded into another division. (Okay, we solved the expense management issue, not _all_ the business issues.)

What are the lessons from that? One might look on it as SD up front followed up with common sense and human courtesy, but specific OD / change management approaches were used to design the follow-on part, and I think those were both key in its success and not an obvious part of common sense or human courtesy.

So that was a successful SD intervention that followed up with OD. At the front, I had all the data I needed given to me in meetings for other purposes, and I developed the model on my own (I think the manager regarded the problem as simply a fact of life in that organization at that time). I guess I may have answered the question for myself: it may, at least sometimes, be appropriate to apply OD / change management at the end of the SD work to increase implementation success. It may be that participatory data collection (OD) and model-driven data collection (SD) are simply two design choices, each potentially leading to somewhat different results and each potentially better suited to somewhat different situations.

I think I'm feeling better; thanks for helping me think through this.
Other ideas (and reactions to my comments here) are more than welcome.

Thanks,

Bill
- --
Bill Harris
Posted by Bill Harris <bill_harris@facilitatedsystems.com>
posting date Wed, 19 Sep 2007 08:26:34 -0700 _______________________________________________
BalaporiaZ@schneider.com ""SD
Newbie
Posts: 1
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by BalaporiaZ@schneider.com ""SD »

Posted by BalaporiaZ@schneider.com

""SDMAIL Bill Harris"" <bill_harris@facilitatedsystems.com> wrote:
>>>> The part we may miss from an SD perspective is the emotions people
>>>> have wrapped around situations and any power imbalances that may
>>>> inhibit the development of sustainable solutions.

During a six-year stint in the external consulting world (supply chain, transportation network design, etc.), my most memorable moment was when a new VP at the client asked us to contact a manager to get some general information. After I introduced myself, the manager's first statement was, and I quote: "Zahir, I know how this works. You ask for my watch and tell my boss what time it is. So, let's get on with it." I am thankful we did not get that project and that this was not the norm. But I thought it spoke well to Bill's point about understanding the emotional climate.

In an effort to create the right climate, I always focused on building trust and understanding as early in the process as possible. We built that relationship with the following guiding principles that we asked the client to buy into before we started.

- The only thing more important than customer satisfaction is analytical integrity and quality. The answer is what the answer is.
- We don’t tell you what to do. We build models that help you decide what to do.
- The “WHAT” is important, but the “WHY” is invaluable.

The first point was about clearly articulating a bias towards the integrity of the analysis. If the client was using us to sell their pet idea to senior management, then they ran the risk that we might prove their pet idea wrong.
The second point was about not "shifting the burden" to us. They had to own the problem and they had to own the recommendation. Finally, most clients only wanted to know the "what", but the real power was in the "why", and you really needed the why to get buy-in from those who were not involved with the process.

No secret sauce in all of this. Just another perspective. It seemed to work for us.

Zahir Balaporia
Schneider National, Inc.
Posted by BalaporiaZ@schneider.com
posting date Thu, 20 Sep 2007 15:05:11 -0500 _______________________________________________
richard.dudley@attglobal.net
Newbie
Posts: 1
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by richard.dudley@attglobal.net &l »

Posted by richard.dudley@attglobal.net
<richard.dudley@attglobal.net>

Jack Homer said that "Useful data, like the model's structure itself, emerge iteratively over the course of a project."

This points out a slightly different use of SD in what I would call scientific SD modeling: the application of SD to, for example, coupled natural-human systems (as in health or natural resources, etc.). In establishing "research projects", I think it is very useful to model early and often. This iterative approach can help define what data are necessary to understand the issues being examined, and will help define the sub-areas of research necessary to obtain those data. Modeling in this sense is not (only) an end product, but a tool to define the research agenda.

____________________________

Richard G. Dudley


Posted by richard.dudley@attglobal.net
posting date Fri, 21 Sep 2007 22:19:59 -0700 _______________________________________________
Bill Harris <bill_harris@faci
Senior Member
Posts: 51
Joined: Fri Mar 29, 2002 3:39 am

QUERY ""Flawless Consulting"" and SD

Post by Bill Harris <bill_harris@faci »

Posted by Bill Harris <bill_harris@facilitatedsystems.com>

""SDMAIL BalaporiaZ"" <BalaporiaZ@schneider.com> writes:
>> information. After I introduced myself, the manager's first statement
>> was, and I quote: "Zahir, I know how this works. You ask for my watch
>> and tell my boss what time it is. So, let's get on with it." I am
>> thankful we


Zahir,

That's about as much fun to hear as the (presumably made up) phrase at http://www.despair.com/consulting.html (actually, I do find the one at Despair.com funny -- not what I do, I certainly hope, but funny). That statement the manager made was important data, though.

While I think Kim is partially right -- we can defuse some emotion through analysis -- I'm not sure showing that manager a model of how his emotional reaction was hurting his organization would have been well received. If he was upset by having his manager bring in an outsider, and if he felt that was a slight to his ability, trying to trump his views with the outsider's technical virtuosity wouldn't likely have fixed much -- it would have been more like proof of his fears.


>> - The only thing more important than customer satisfaction is
>> analytical integrity and quality. The answer is what the answer is.
>> - We don't tell you what to do. We build models that help you
>> decide what to do.
>> - The 'WHAT' is important, but the 'WHY' is invaluable.


I like those.

Bill
- --
Bill Harris
Facilitated Systems
Posted by Bill Harris <bill_harris@facilitatedsystems.com>
posting date Fri, 21 Sep 2007 07:03:10 -0700 _______________________________________________