Earlier this month, I spent some time in London, where I attended a conference organised by the Operational Research Society. It was, in short, an eye-opening and refreshing experience; it was great to see how analytics and scientific thinking are being used across a wide range of subjects and industries to support better decision making. However, it was also all too apparent that across the profession we’re facing similar issues: too often analysis and OR are brought in after the fact, to fix problems, rather than to drive intelligent strategy from day one. So why are we still seeing “gut feel” driving decision making?
First of all, a disclaimer: I cannot answer that question in one article! This piece, I hope, will open a conversation with both OR practitioners and the support engineering community about what we do, how we do it and why. To that end, this is just one piece in a series in which I’ll be setting out my ideas on how we can influence decision making (for the better).
What is Operational Research?
From the first session onwards, one question kept recurring: what do we mean by Operational Research? I saw a variety of definitions of OR across presentations and plenary speakers, ranging from the concise to the complex; some focused on techniques and others on its aims. A useful way to compare them is to look at the words they use: four that came up repeatedly were “decisions”, “science”, “analysis” and “better”. The OR Society’s tagline, “the science of better”, is a good example, although perhaps a little too short to be considered a definition. My attempt at a short definition, based on this conference, would be:
“the use of scientific methods, analysis and reasoning to aid decision making and strategy development”
A point that came up in many discussions was “what is not OR?”, and I don’t believe there was (or will be) any consensus on that question. One issue the profession may face is that different terms and definitions are used depending on the industry. I was surprised by the lack of presentations on engineering subjects (although there was great representation from the defence sector), as, to me, OR is intrinsic to engineering and manufacturing management. Working somewhere on the border between OR and engineering myself, I wonder whether the subject is simply not referred to as OR in our domain; indeed, I had been practising OR for quite a while without realising it had this name, having instead heard (and used) terms such as analytics and decision support. A similar discussion addressed the difference between OR, Data Science and Analytics; I would describe the latter two as areas within OR, although that opinion is certainly open to debate (and I could spend an article on that subject alone).
Where does Operational Research fit?
Hopefully, from the definitions above, it’s clear that OR does not sit in isolation as a business function, but must be integrated into an organisation’s operations and processes. It was interesting to hear from other members how their organisations deploy analysts and analytical capabilities. Some had teams split along disciplines and techniques (simulation, data, business intelligence, etc.) and deployed these capabilities depending on which technique was required. Others organised themselves around the domains and problems they worked on (e.g. utilities, engineering, security).
Is either approach necessarily better than the other? The most effective examples of OR I have seen take an approach somewhere in the middle. We have to strike the right balance between domain knowledge and technical capability; this is why you’ll rarely see projects delivered by a single person. Effective analysis is delivered by teams covering a range of techniques, capabilities and domain experience.
As Operational Researchers, scientists, analysts, whatever we want to call ourselves, we should think of ourselves as problem solvers. We shouldn’t ignore a problem, or say we can’t solve it, just because it doesn’t fit neatly into our niche; neither should we apply the same technique to every problem simply because it’s the one we’re most familiar with. The quote we should all be familiar with is:
“if all you have is a hammer, everything looks like a nail”
This quote barely needs explaining, though it should be drummed into all aspiring analysts, scientists and engineers. It is good practice to think about what we do in terms of two categories, techniques and problems; by doing so we can focus how we develop our capabilities and avoid the trap described by this quote.
So how do we get OR working for us?
I described a problem at the start of this article: that too often “gut feel” directs early decisions and OR is brought in later to clean up the mess. I’ve then gone on to not really address it, instead simply trying to define what Operational Research is and how we do it. In all honesty, we can’t solve that problem in one article, and I’d be very surprised if it were ever solved completely. The first step, I believe, is to figure out what we actually do, why organisations should care and, most importantly, how we talk about it.
Over the next few weeks I’ll be releasing a few more articles on this subject, through which I aim to explore these points in more detail and figure out how we can convince organisations to focus on intelligent, evidence-based decision making over instinct and guesswork. To keep up to date with this series, follow us on LinkedIn here or contact us directly here.
For those of you interested in the support engineering side of things, our next webinar will discuss the importance of conducting analysis in the early stages of the system life cycle, and how that can be facilitated. You can register here.
To learn more about The Operational Research Society, you can find their website at www.theorsociety.com