Good enough is the new perfect

A blog by Martin Erasmuson.

As a nation, New Zealand prides itself on its innovative reputation, a country that regularly punches above its weight on the international stage.  And indeed, in areas of creativity, innovation, research and development, New Zealand’s track record is impressive by any standard: Ernest Rutherford, the father of nuclear physics (1); the first nation in the world to grant women equal voting rights (19 September 1893 (2)); Richard William Pearse, aviation pioneer, possibly flying nine months before the Wright brothers (3); Sir Edmund Hillary, first to summit Mount Everest (4); Sir William Hamilton, who invented the modern jet-boat (5); John Britten, designer and builder of a world-record-setting motorcycle (6); Peter Jackson, film producer and director of The Lord of the Rings trilogy; to name just a few.  Many of these endeavours were achieved on a shoe-string budget and, in some instances, literally in a garage or back shed.

While the word ‘innovation’ frequently appears in corporate mission statements, is it still a valid description of the New Zealand psyche, our modus operandi?  Is this alleged innovation making a difference?  How does NZ compare on the world stage?

As it turns out, not that well. 

In 2011 Forte Management (7) suggested that: “New Zealand’s problem is not a lack of innovation but rather [its] inability to convert [that] legendary inventiveness into productivity, profitability and prosperity”.  Indeed, in a 2003 report the OECD stated: “The mystery is why [a country that] seems so close to best practice in most of the policies that are regarded as the key drivers of growth is nevertheless just an average performer.”  In its 2013 report ‘Productivity by the numbers: The New Zealand experience’, the NZ Productivity Commission points out that ‘New Zealanders work about 15% longer than the OECD average to produce about 20% less output per person’.  That’s working harder, not smarter!

Many of us experience the symptoms of this in our companies and organisations, where even the most routine aspects of day-to-day business are encumbered by a bewildering assortment of policy and process.  But what are the strategic drivers for this approach?  Is it a natural aversion to risk?  The changed business environment itself?  This blog explores these questions.

On the eve of the Fourth Industrial Revolution (9), today’s business environment has changed markedly from even a decade ago.  The third industrial revolution brought us modern computers and automation, with previously undreamed-of information creation and analysis capabilities.  It defined a generation of IT and Information Management professionals.

An organisation’s capability, comprising people, technology, data and processes, was built up over several decades, as were the policy and project approaches supporting it.  These approaches, which we now call ‘Best Practice’, typically rely on quality data, strong cause-and-effect relationships, and high levels of certainty and agreement about the nature of the problem, the intended actions and the expected outcomes.

So why does it seem so difficult to pivot those capabilities to support this changing era?  There are several factors at play here.

In his book ‘The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail’, Clayton Christensen (10) proposes that an organisation’s capabilities also define its disabilities.  Consider those 20th-century capabilities: people (expert, highly trained), technology (on-premise, single-purpose enterprise applications), data (structured, tightly coupled) and processes (rigid, clearly defined, frequently with an overarching Quality Management System).  Christensen explains how processes define the way resources are combined to create value.  While resources can be bought and sold, retrained, hired and fired, it is the processes and the overarching culture and values that define an organisation’s capabilities or, he suggests, its Achilles heel.

How does that play out in practice?  In considering alternative approaches, I often hear organisations reporting “we tried agile but it didn’t work”.  As the philosopher Alfred Korzybski suggested, ‘the word is not the thing’.  Looking at these failed attempts at agile, you typically see that existing resources (people, data, technology) were rebranded ‘agile’ and then plugged into fundamentally unchanged values and processes, with the same outcomes as before.

This emerging 4th Industrial Revolution comprises cyber-physical systems, the Internet of Things and the Internet of Systems, with an omnipresent, constantly evolving business environment and ever-emerging information.  Before the turn of the 20th century it was possible and practical for a company or industry to capture the information it needed to be successful.  By 2013, human knowledge was doubling about every 13 months (11).  But with the emerging 4th Industrial Revolution, that rate will increase exponentially; IBM theorises it could someday double every 12 hours!  We are already seeing the symptoms: undifferentiated problems present themselves, with disagreement and uncertainty about how to proceed.  How to handle that?  More on-premise or cloud-based storage?  Faster internet?  More training?  By the time an organisation runs its traditional linear approach, the situation has changed and there are new challenges.
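To put those two doubling rates side by side, a quick back-of-the-envelope sketch (the 13-month and 12-hour figures are the estimates quoted above; the time horizons chosen here are purely illustrative):

```python
# Exponential growth under a fixed doubling period:
# growth = 2 ** (elapsed time / doubling period)
def growth_factor(elapsed_hours: float, doubling_hours: float) -> float:
    """How many times a quantity multiplies after `elapsed_hours`."""
    return 2 ** (elapsed_hours / doubling_hours)

MONTH = 30 * 24  # hours in a month, approximate

# Doubling every 13 months: growth over five years (60 months).
print(round(growth_factor(60 * MONTH, 13 * MONTH), 1))  # ~24.5x

# Doubling every 12 hours: growth over a single week.
print(growth_factor(7 * 24, 12))  # 16384.0x
```

Five years of the 2013 rate multiplies knowledge about 25-fold; at IBM’s theorised rate, a single week does over 16,000-fold. That is the scale of change a linear planning approach is up against.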

Knowledge is power, even more so in a Knowledge Economy, but only if relevant information can be identified and accessed quickly.  The problem is not a lack of information but rather our ability to separate the relevant information from the chaff.  How important is solving that dilemma going to be?  In his book ‘The Fourth Industrial Revolution’ (12), Professor Klaus Schwab suggests that “[These] changes [will be] so profound that, from the perspective of human history, there has never been a time of greater promise or potential peril.”

This tsunami of information is impacting every organisation, and while many are at least beginning the conversation, they are understandably put off by the risk such uncertainty represents.  So we see many sticking doggedly to their 20th-century approach: ignore the uncertainty, assume the future is knowable and will look like the past.  That approach is failing organisations around the world and is, I believe, behind New Zealand’s poor productivity.  So what to do?

Our 20th-century model relied on ‘knowing stuff’.  It was our security blanket: accurate data, good processes, stable technology and an unshakeable confidence that we could predict and be ready for the future.  In this emerging Knowledge Economy, Christensen suggests that the ‘information required to make [] decisions does not exist. Failure and iterative learning are required’.  So how do change-leaders respond to such a paradoxical situation?  In her book ‘Rookie Smarts’, Liz Wiseman has a single sentence which I believe represents the approach every organisation must adopt: ‘When there is too much to know, the only viable strategy is to know where and how to find the information you need, when you need it’.

As discussed earlier, resources can be bought and sold, particularly data and applications, with today’s plethora of XaaS offerings.  Rather than the tightly-coupled, accurate, structured gold-standard data we came to expect in the past (the veritable silk purse), we need a more ‘good enough’, heuristic culture: a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals.  How does that work in practice?

Case Study – Canterbury Earthquake Recovery Authority (CERA) (13)

Christchurch, New Zealand was rocked by devastating magnitude 7.1 and 6.3 earthquakes in 2010 and 2011.  The Christchurch CBD was virtually destroyed, along with 1,000 km of roads and 500 km of sewer pipe; 100,000 houses were damaged and 8,000 written off.  In April 2011 the NZ Government created CERA to coordinate response and recovery efforts.  StratSim partners Martin Erasmuson and Stephen Ferriss were behind the design and implementation of the CERA spatial data infrastructure (SDI) – download the whitepaper free (14).  In the initial weeks and months of CERA, such was the chaos on the ground and the complexity of overlapping demands from multiple agencies, NGOs, the private sector and the public that no-one could tell you their information requirements for tomorrow, let alone in a month or two.  Suffice it to say that once they knew what they wanted, they’d want it almost immediately.  THAT statement itself was the requirement: ‘an information infrastructure (people, data, technology, processes) that could support on-demand discovery, access and use of any ‘potentially’ relevant information to the earthquake recovery effort’.

In the initial months in Christchurch there was anecdotal evidence of significant population movement, with people relocating to other parts of the city, the region and even further afield.  Policy folk were desperate to understand what was happening, but how?  The SDI team sourced two disparate datasets: postal redirections and power meter readings.  By identifying properties that had a postal redirection and a zero or low power meter reading, we could extrapolate suburb-level population movement to within plus or minus 20%.  The work was completed in just four days.  By traditional standards the entire exercise seems rough-and-ready: the data was loosely coupled, the process made up on the fly, the outcome imprecise.  But when it was provided to the CERA policy folk, they were all over it like E. coli on week-old Chicken Kiev!  It turns out George S. Patton was correct when he said: ‘a good plan violently executed right now is far better than a perfect plan executed next week’.
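The heuristic itself is simple enough to sketch in a few lines.  Here is a minimal, hypothetical illustration of the join described above; the addresses, field names and the 10 kWh threshold are all invented for the example, not taken from the actual CERA datasets:

```python
# Dataset 1 (hypothetical): properties with an active postal redirection.
postal_redirections = {"12 Avon St", "7 Brougham St", "3 Lyttelton Rd"}

# Dataset 2 (hypothetical): last monthly power meter reading per property.
power_readings_kwh = {
    "12 Avon St": 0.0,
    "7 Brougham St": 310.5,   # mail redirected but power still in use
    "3 Lyttelton Rd": 4.2,
    "9 Riccarton Rd": 290.0,  # occupied, no redirection
}

LOW_USAGE_KWH = 10.0  # made-up threshold for 'effectively vacant'

# A property is assumed vacated if its mail is redirected AND its power
# usage is zero or near-zero -- loosely coupled data and an on-the-fly
# rule, but good enough for a suburb-level estimate.
likely_vacated = sorted(
    addr for addr in postal_redirections
    if power_readings_kwh.get(addr, 0.0) <= LOW_USAGE_KWH
)

print(likely_vacated)  # ['12 Avon St', '3 Lyttelton Rd']
```

Aggregating `likely_vacated` by suburb gives the rough movement estimate; no gold-standard census data, just two datasets nobody designed to be joined, combined in an afternoon.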

What are the organisational elements behind this capability?  At StratSim we work with organisations to create their own agile capability to survive and thrive in the emerging Knowledge Economy, the so-called 4th Industrial Revolution.  This involves understanding and establishing a range of adaptive strategies, tactics and approaches, not to replace existing capability but to augment it.  The goal is an adaptive capability with people, systems and processes that can respond quickly to change; discovery-based planning that acknowledges the complex and uncertain nature of today’s business environment, that at the start of the endeavour little is known and much is assumed, and that ‘the answer’ must be ‘discovered’ as the journey unfolds; practical approaches encompassing nimble tools for snapshotting the information ecosystem and quickly working out what to pay attention to and what to ignore; plus a ‘no-blame’ culture that encourages, as Christensen put it, ‘failure and iterative learning’ in a ‘fail-fast’ environment.

While there will still be situations where quality data is required and there is time to acquire it, for the most part we must acknowledge that, as Christensen suggests, ‘the data does not exist’, or at best is loosely applicable, and adopt a heuristic, ‘good enough is the new perfect’ culture: getting it done today, because tomorrow there will be a new challenge.  Most important is the acknowledgement that this won’t end.  There is no ‘there’ to get to.  Exponential change means just that.

Yes, that is a scary prospect if you are responsible for a bunch of 20th-century infrastructure and people.  But you can do this.  I choose to embrace the first part of Professor Klaus Schwab’s statement: “There has never been a time of greater promise”.

Your comments are welcome and the StratSim team would welcome the opportunity to help.  Please give us a call.