Opinion & Humour

Pushing back the pendulum

Looking back over time shows how the cost-risk pendulum has swung back and forth between clients and service providers in technology services; new commercial thinking is now needed to meet the challenges of 21st-century technology delivery

If you look back over the vast vista of time at how technology services have been delivered, you can see the evolution of risk, with the pendulum of doom swinging back and forth between clients and suppliers. In the very early years, most organisations did everything themselves, but eventually realised they didn’t need to. So did the suppliers, who set themselves up to take advantage of the growing market, and of the unwary clients…

1. Evolution of IT cost risk over the years

Before the “noughties”, there was something of a “wild frontier”, as clients had little experience of outsourcing technology services and suppliers made their fortunes selling their wares. The deals back then were often not well constructed and frequently client-unfriendly (even “gouging”). Often all of IT was thrown to the suppliers, leaving a very thin retained organisation, with most of the “brain” outsourced as well as the arms and legs. Clients were at a significant disadvantage, and suppliers could be like a fox in the chicken coop, writing business cases for their own new services to be signed off by client managers who had no bandwidth to challenge them! Being virgin territory, services were also poorly defined and relatively unstructured, accompanied by opaque or just plain bad commercial models (e.g., pure T&M or lop-sided ARC/RRC models) that dumped the risk firmly in the clients’ lap – the commercial nadir…

In the 2010s, the pendulum started to swing back in clients’ favour, as they took back control of strategy and architecture, business case development and other key aspects of the “intelligent client” model (think of the previous model as the “hostage client”!). Other developments in service and commercial structures also helped as lessons were learnt from the first-generation experience. For example, the service “tower” model became better established, along with the beginnings of Service Integration/SIAM disciplines to manage multi-vendor setups. Pricing models improved with the introduction of transparent PxQ “utility” pricing, aligned and integrated with well-defined performance management and incentives. This was probably the (first) zenith of the art of technology service outsourcing.
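To make the pricing-model contrast concrete, here is a minimal sketch of a PxQ charge with ARC/RRC (Additional Resource Charge / Reduced Resource Credit) bands. The rates and volumes are illustrative inventions of mine, not taken from any real deal:

```python
def monthly_charge(actual_qty, baseline_qty, base_price, arc_rate, rrc_rate):
    """PxQ charge with ARC/RRC bands (all figures illustrative).

    base_price -- unit price applied to the contracted baseline quantity
    arc_rate   -- price per unit consumed *above* the baseline (ARC)
    rrc_rate   -- credit per unit *below* the baseline (RRC)
    """
    charge = baseline_qty * base_price
    if actual_qty > baseline_qty:
        charge += (actual_qty - baseline_qty) * arc_rate
    else:
        charge -= (baseline_qty - actual_qty) * rrc_rate
    return charge

# A "lop-sided" model: dear ARC, stingy RRC -- demand drops but the bill barely does
lopsided = monthly_charge(900, 1000, 10.0, 12.0, 4.0)
# A transparent PxQ "utility" model: charge and credit at the same unit rate
balanced = monthly_charge(900, 1000, 10.0, 10.0, 10.0)
print(lopsided)  # 9600.0
print(balanced)  # 9000.0
```

In the lop-sided variant the client pays almost the full baseline despite using 10% less, which is exactly where the cost risk lands in their lap; symmetric rates make the charge track actual consumption.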

As we move through the 2020s, you can see things falling apart from that peak of perfection, as client “digital” demands and the technology landscape change, with technology teams looking for new ways of working to match. Probably the most significant trend is client organisations bringing the management of technology services, and indeed some execution and delivery, back in-house. There are no doubt many reasons for this drive, including disappointment with, and the inflexibility of, previous outsourced arrangements, and a perception that direct control is needed to increase agility. To be honest, the problem doesn’t always sit with the service providers, and reorganising doesn’t often solve systemic problems, but there you go.

Apart from the swinging “in/out” door, there are other drivers. In particular, SaaS and IaaS are eroding “traditional” infrastructure and application management services, reducing the scope for outside services (you can read more about that here). This erosion significantly thins out the service management layer required on top of the “Cloud” services compared to old-style hosting, which impacts the implicit business case. Automation also changes the landscape, increasing the reach and operational leverage of the people in the driving seat; a fully automated DevOps/SRE-type mode needs no IT Ops people (well, that’s the philosophy). Demand Management (including FinOps) is now a key skill, with cloud bloat replacing the VM bloat of yesteryear. Demand management is logically a client-side function, in terms of the link to the business, although you can outsource or delegate the supply-side matching.

Looking to the future, we can expect the incursion of GenAI to further erode the traditional opportunities for outsourcing, further blurring the once clear and simple lines.

2. What does the future hold?

The actual effect of GenAI in terms of cost risk is probably one for the crystal ball just now. GenAI and its siblings might improve the profile as automation replaces yet more labour costs, but flaky implementation of expensive systems with unclear benefits and increasing supplier lock-in could easily drive the pendulum the wrong way. For suppliers, it drives a further shift of revenue from people to technology; the more fleet of foot will probably win either way.

Coming back to the here and now, as client organisations swing back to a more in-sourced model, you can see repercussions in, for example, sub-optimal (re)sourcing when using third parties. Resourcing often reverts to unstructured staff augmentation and body-shopping to fill specific gaps in the in-house teams, forgoing the benefits of a more coordinated approach to third parties. Whilst previous models may have had carefully crafted off-shore resourcing to benefit from labour cost arbitrage, the in-house models carry the hidden costs of expensive on-shore day-rate contractors back-filling vacancies.

3. Buying “Bodies” with exacting specifications is hard

The “bodies” are often requested in an ad-hoc manner and to an exacting specification that significantly constrains what can be provided, artificially limiting supply and so driving up cost and risk. For example, the role to be filled in the patchwork structure will have a very tight skill specification, must work in a specific location and can only be quite senior/experienced (the logic being “we’re not paying to train junior supplier people on our projects”). In this model, the client organisation carries all the risk on cost, productivity and quality.

So the unintended consequence of these well-meaning strategic changes in the service resourcing model is the loss of some of the good stuff that went before. As well as the fragmented supplier deployment and inefficient resourcing, another major impact is the loss of clear service structure and definition, and unclear or limited performance management. So the cost risk pendulum is swinging significantly against clients, who are once again bearing the risk.

4. Reintroducing third party service structure

The way out of the quandary is to reintroduce some sort of structure with a clearer and somewhat wider scope that can be resourced more flexibly as a service responsibility (even a small one), including the innate ability to refresh and improve itself. One of the challenges is that suppliers have, quite understandably, aimed to limit their risk by walling themselves inside tightly negotiated scope, with exceptions disclaiming anything that goes wrong outside it.

As a concept, it is one that lawyers and commercial managers like, and in which they can wrap themselves up warmly when they go to bed at night. However, it is somewhat dinosaur thinking, and is unsustainable in a modern world where business-technology performance is better and more meaningfully measured by end-to-end experience service levels (e.g., the much-touted XLAs – “experience level agreements”) or even more holistic business outcome metrics that are actually linked to the performance of the client business. Employee bonuses are commonly linked to such “One Team” metrics; supplier agreements rarely are.

Some of that lack of performance-benefit linkage is historical: previous attempts to connect end-to-end have often failed dismally, be it due to poor definition, poor data, or because the grand ideas of partnership formed with a handshake on the golf course just didn’t pan out in the cold light of implementation (golf courses are really not good venues for making major strategic decisions!).

Some of the challenge, however, is also down to old-fashioned thinking about how the boundaries of responsibility are defined and how the benefits of supporting a winning business can be shared (or, of course, losses taken). You probably can’t blame the lawyers: they still think using Latin in contracts is smart, are mired in centuries of historic case law and outdated legislation, and just don’t think along commercial service lines!

5. Service structure design decision

Without attempting to solve the entire conundrum of aligning client and supplier incentives and risk-reward sharing, we can push back against the reversion to bad (re)sourcing behaviours by imposing some structure on the service requirements. That could mean moving back to the comfort blanket of the older “tower” models. Perhaps, however, it means moving forward to a micro-sourcing model, with small service components plugged together to fit the client need, framed with more innovative commercial and performance management structures with a “One Team” twist. There will be nay-sayers who declare “but it’s not market standard”; the appropriate response to that is “stasis is death” and “average is for losers”, so I say: think up, think harder and imagine the commercial possibilities!!!


When “Smart” is Stupid

As a technology descriptor, “Smart” waxes and wanes with the various cycles of hype and marketing overreach, but looking at the “Smart Home”, some of the real-life examples are pretty Stupid…

If you look at the definition of Smart technology in general, you will see the sort of key attributes that look like this…

1. Defining “Smart” technology – key attributes

Those being:

  • Connected, by Wi-Fi or whatever, with some Cloud services and remote access to your devices
  • Friendly UI, typically connected to a mobile app
  • Automation of some sort, often by linking to Alexa (or similar, other virtual home assistants are available), or IFTTT
  • Using data analytics and processing to drive some level of intelligent behaviour
  • Being part of an eco-system of devices

In the home, central heating is one of the more mature uses of Smart technology, offering the possibility of managing your home heating efficiently and effectively, using a mix of distributed WiFi/Zigbee thermostatic valves on the radiators, controlled centrally and accessible by voice command via your home assistant. It’s a beefing-up of the older analogue thermostatic controls.

2. Smart Central Heating – a good application of Smart technology

Look in the kitchen or utility room, however, and you will see another story: the sorry tale of woe that is “Smart” home laundry.

3. Home Laundry – not Smart Technology

Yes, the big names will sell you a “Smart” Wi-Fi enabled washing machine or tumble dryer, but the marketing doesn’t match the reality. For example:

  • Cloud connected. Generally yes, the remote diagnostic feature can be useful…
  • Friendly UI. I suppose having a cute mobile app to programme the machine might be nice, but you need to be standing in front of the machine to load it, so just twiddle the knobs… Pointless.
  • Remote access. What is the point of being able to programme your washing machine when you are 20, 30, 50 or 10,000 miles away – you are not there to put the washing in. If you forgot to put it in before you went out, then you are royally stuffed. (It’s not like a Wi-Fi enabled cooker, which you could use to check if you left the gas on – very helpful, especially if you could command it to shut the gas off.) Also pointless.
  • Automation. Having an alert when your washing cycle is finished might be useful if your house is sooo big that you can’t hear the washing machine when it finishes, but if the house is that big, you are probably in the demographic that doesn’t do their own washing anyway. Again, no use if you are far away: the washing will just have to stay in the machine and get wrinkled and fusty-smelling. More pointlessness.
  • Analytics. OK, this part might work in some way, in that the machine can work out how much you put in and then adjust the water level and wash and spin cycles to match. Otherwise it’s not going to give you much useful advice like “last time you turned me on at this time you used the cotton cycle programme, would you like to do that again?” or “55,345 people are watching this cycle now”, or “your friends are currently relaxing and watching TV whilst you are doing the washing”. Annoying.
  • Ecosystem. This is the killer issue. The home laundry process is almost entirely manual and labour-intensive, so there is no automated continuous flow of washing to which to apply Smart technology. Showstopper!

So, there you have it; it’s Stupid.

You can envisage some ways of changing the Home Laundry paradigm:

  1. Don’t wash your clothes – the Null solution, but loses you friends very quickly
  2. Outsource and send your clothes to a central laundry, which is continuous-flow, maybe picked up and delivered by a Johnny-cab auto-taxi
  3. Truncate the whole process with self-cleaning clothes – these sort out the bio-stink with little copper wires in the fabric, but I am sure they would still need the dirt washed off them, or maybe you just recycle them. You could try the HercLeon Apollo Self Cleaning T-Shirt which, to quote the sales blurb, “can be comfortably worn for days, weeks, and even months without having to be washed with soap” [my bold]
  4. Build the Laundry Jet laundry collection systems into your house
  5. Buy a Panasonic Laundroid laundry robot if they ever launch (apparently they invested $60m in this)

Time will tell what innovations will arise…

You can consider some other examples of Stupid and work out what differentiates Smart from Stupid like this, to the left of the donkey…

4. When “Smart” Technology is Stupid

Stupid technology shares a lot of attributes with Z-list celebrities: like a showroom dummy, it has a pretty face and all the intelligence of pond life, needs a handler, and has nothing to say worth listening to.

  • Coffee machine. A coffee machine would only be Smart if it were part of a caffeine-delivery flow system: ordering fresh capsules, a cup-management robot, free-flowing water and liquid waste pipework, and a disposal/recycling system for the capsules and grounds. But they aren’t – just a pretty UI and a pointless Wi-Fi connection, like the Delonghi Primadonna Soul Bean-to-Cup Coffee Machine, a snip at £1299
  • Toaster. The crew of Red Dwarf had issues with the Talkie Toaster, so maybe a full continuous-flow toast-making eco-system could be an issue, but the current generation of Smart Toasters are just a pretty face, like the Revolution InstaGLO R180B Touchscreen Smart Toaster, yours for £366 on Amazon, and it doesn’t even have Wi-Fi
  • Pressure Washer. As my family would tell you, I have a love-hate relationship with pressure washers. Even allowing for that bias, in my humble opinion, the use case for Smart pressure washers is pretty well non-existent. I suspect, however, that the actual purpose is a dark, spooky objective to gather customer data (somehow). The Karcher K7 Premium Full Control Pressure Washer has a Bluetooth connection linking to the Karcher mobile app – why? I installed the mobile app, couldn’t see the point and deleted it…

The key insight from this analysis is that to be properly Smart, technology has to be part of a largely automated continuous-flow system; otherwise it is just lipstick on a pig.

So we can revise the first chart at the top of this article, and add that continuous flow requirement to the key attributes, thus…

5. Defining Smart Technology – key Attributes (Revised)

So there you have it: now we can spot when Smart is Stupid, and we also have the signpost on the road to making things properly Smart.


Digital, Phygital, Fiddlesticks

Digital is a rather abused term that has been round the block a few times, and now we have “Phygital”, which is a load of bull…

I was prompted to think about the meaning of “Digital” recently by the unlikely conjunction of two disparate events, viz:

The first is a great step forward for a brand that has up to now been firmly “bricks and mortar”, and the second is apparently something “phygital” with the incursion of technology into actual clothing for reasons.

I get the commercial, consumer-driven logic of the first, but the second is somewhat more puzzling and perplexing. However, I don’t really care about clothing and fashion, so it is a market logic that I would have to work hard to understand; we’ll see how that business model succeeds over time.

Anyway, it set me thinking about words…

Digital has been around for many years, but “phygital” is a much more recently coined term, attributed to Chris Weil, Chairman of Momentum Worldwide, in 2007 (Thanks, Chris), picking up momentum c.2017. You can look at the frequency of some key technology terms in Google NGram Viewer…

NGram frequency of key technology terms by year

PCs were obviously quite a thing back in 1985, and gave mainframes a little bump at the same time. I tried “minicomputer”, but that barely features at this scaling, so apparently it was not something people talked about so much back then. Whilst departmental computing was a big wave of change versus the mainframe in the 1970s and 80s, it was only in the business domain, so general awareness and interest were lower, I suppose.

Web and Internet were clearly also big talking points in 2000-ish, and beat down the Microcomputer Revolution in volume. But throughout you can see “Digital” growing steadily until it has actually overtaken the former leaders, “Web” and “Internet”, with Web taking a sudden downturn.

Most of the other newer terms like “AI”, “blockchain” and “metaverse” still bumble around at the bottom of awareness at this scale, so had not hit it by the NGram corpus’s current 2019 end date. “Fintech” is also a relatively low scorer, even though it has now spawned a constellation of new digital “<ANYthing>Tech” neologisms (like “InsureTech”, “PropTech”, “FemTech”, “EdTech”, “LegalTech”, “FoodTech”, “AgriTech” and so on). These are also probably more business-vertical specific than broad-based, so don’t get the volume of attention.

And don’t bother looking for “phygital” which also dribbles along the bottom of the chart if you add it to the query.

Before around 2015, “Digital” used to mean stuff related to computers generally. However, from then onwards it started to acquire jazzy new meanings related to exciting things like customer experience, digital marketing, mobile apps and otherwise being a “Digital” business, and with “digitalisation”, the process of becoming that thing. McKinsey had a go at defining it which you can read at your leisure.

What got lost is that many businesses have been digital for years, and that technology rubbed up against the real world in many places, often not so glamorous: manufacturing, supply chain, vending machines, door locks in hotels, the kitchen systems at KFC…

To get to grips with this you can draw up a simple gameboard that maps out business typology against its manifestation.

Business classification – Typology vs manifestation

The business typology separates the places (“venues”) where people interact (e.g., actually trade, or just get together to do people stuff, like throwing sheep) from the actual trading businesses themselves, i.e., those that generally exchange some value for some thing or benefit. These can be actual products, services and money, but in the wider context could also be social kudos, environmental benefit or other non-monetary value. For these purposes, broker-type businesses fit in the “trading” slot as they facilitate other people’s trading.

By the way, for the bankers reading this, we shall deliberately ignore where the trading transactions (financial, social, emotional, environmental, or otherwise) are cleared and "payments" handled, let's keep things simple for the purpose of this treatise.  

The manifestation dimension separates the real from the non-real. Physical covers what you expect (to be construed according to context, as the lawyers say): buildings made of straw, sticks and bricks in actual geographic locations, or cars, or books made of paper. The virtual covers everything that isn’t that – a nicely mutually exclusive definition. So it can include virtual assets like photos, videos, software and financial products, and virtual businesses that provide places for people to connect and trade.

You can map out some businesses onto the landscape to see how the Pickup Sticks fall.

Digital business classification – some examples

What you can see (obviously) is that those which fall into the virtual column are heavily technology-based (indeed, by construction, since we have excluded ectoplasmic spirit-world businesses, wyverns, harpies, vampires, magic wand shops and other virtual manifestations of a more mystical sort). Whilst some of the virtual venues like Facebook support virtual interactions, a virtual platform like Uber facilitates real-world transactions between car drivers and their passengers. And Utility Warehouse is a virtual business that, loosely speaking, brokers people-energy trading.

In this classification, the Metaverse is just another venue, and it could yet be a three-star Michelin restaurant experience or just a greasy spoon, as we shall see. But like the financial exchanges of today, the venues (exchanges) make a dribble of money in comparison with the eye-watering value that flows in the trades they facilitate. It’s largely what you do that makes the money, rather than where you do it (whether you have Meta-legs or not…).

The caveat to that is that a business with a captive supply base, and monopolistic channel control, like the Apple App store, can make shed-loads of money at its 30% transaction tax. Similarly, Facebook as a venue makes lots of money by selling access to its users for advertisers compared to the unfathomable value of the social interactions that take place upon it.

The key point here is that the businesses in the right-hand Physical columns also use technology, and often extensively, although not so visible to the untuned eye. Even the Louth Livestock Market, a very physical place with real farm animals and open outcry selling round the ring, also has a website and online auction trading. In other words, they are Digital businesses too.

So Digital is embedded in both Physical and Virtual manifestations and forms a solid and critical substrate on which almost all businesses run today. Like a seam of gold running through quartz…

Digital substrate embedded in most businesses

What does a “Digital” business actually look like these days? Well, it would undoubtedly include, internally, solid chunks of systems for Customer, Product & Operations and Performance & Control, and externally, multiple channels, non-linear supply chains and the like. But that is a story for another day.

We used to see businesses sprout silo’d business units separate from the mainstream and built on electronic channels (oh yes, Digital channels) back in the early 2000s. This is less xenogenesis to birth something new and quite unlike its parent, than it is temporary firewalling to incubate a new way of doing things in the same business. Consequently, these offshoots have long been absorbed back into mainstream business models as they matured.

Many businesses have been omni-channel for years; it is no longer a rocket-scientist-level insight to suggest, for example, that you should have common stock management between an online store and a physical shop. However, the wave of reworked “Digital” businesses in the last 5-7 years regurgitated the concept as something new, when indeed it is not.

The upshot of all this is that the newer Virtual businesses were called Digital by their over-enthusiastic and imprecise evangelists, in thrall to a form of cognitive bias, and so Virtual has been confused with Digital. This created the misbegotten conflation of two terms to describe an omni-channel experience across Physical and Virtual.

So we got “Phygital”. However, Digital embraces both Virtual and Physical, so “Phygital” should really be “Phyrtual”, or “Virtical”, or some other bull.

Digital is perfectly good… we don’t need Phygital. Let it wither and die, like the eCommerce business units of old.


What the Bell?

I was rather interested to see a post on LinkedIn recently about “The Myth of the Bell Curve”, which was saying (relatively) recent research had shown that human performance is more like a Power law distribution than a Normal distribution.

The consequence of this is that a cherished HR sacred cow needs slaughtering. Anyway, you can read the post yourself; what tickled my interest, however, was what the two distributions would look like when laid next to each other.

There is an image in the publicity material that attempts to show this…


…but that must be mathematically wrong, surely!

Nurse, bring the oxygen!

The Normal distribution and the Power law are both types of probability density function. However, as far as I can see from the published links, they have different axes:

  • Normal Distribution: X = performance metric, Y = probability of that performance metric
  • Power Law: X = some indicator of population; Y = performance metric of some sort

The problem with comparing the two is that you need to rework the data to get both onto the same axes. My hypothesis is that the x-axis of the Power Law is the performance rank of an individual – a Zipf curve equivalent.

So X is not the size of the population, ’cos that would be absurd: the curve would then show that any population of 1 is really brilliant, whereas the bigger it gets, the more stupid it is… mmmm, weelllll, depends on who is counting themselves as the One, and how many of the rest read the Daily Mail/Mirror/Express/Sun/Star…

So if you work the data on that basis (modelling an arbitrary population size of 100 people) then the curves actually look like this…

Power law

…so they are curves with quite different shapes.  And if you re-plot them the other way round, then they look like this…


…which might superficially look like the picture at the top, but is actually showing the population of the long tail as the tall spike, not top performers.

Still a rather scary picture, as it indeed suggests that most of the people in the “team” are rather serious under-performers, hanging on the coat-tails of the many fewer high-flyers!
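For the curious, that rank-based reworking can be sketched in a few lines of Python. This is a toy model: the population of 100 comes from the text above, but the performance scale (mean 50, σ 15) and the power-law exponent of 1 are my own illustrative choices, not from the research:

```python
from statistics import NormalDist

N = 100  # arbitrary population size, as in the text

# Normal model: the person at rank r (1 = best) sits at the
# (1 - (r - 0.5)/N) quantile of a Normal performance distribution.
norm = NormalDist(mu=50, sigma=15)  # illustrative performance scale
normal_perf = [norm.inv_cdf(1 - (r - 0.5) / N) for r in range(1, N + 1)]

# Power-law (Zipf-style) model: rank r performs C / r**alpha,
# scaled so both models share the same top score.
alpha = 1.0
C = normal_perf[0]
power_perf = [C / r ** alpha for r in range(1, N + 1)]

# Middle of the pack: near the mean under the Normal model,
# but a tiny fraction of the top performer under the power law.
print(round(normal_perf[N // 2 - 1], 1))      # 50.2
print(round(power_perf[N // 2 - 1] / C, 3))   # 0.02
```

So under the Normal model the median person performs at about the mean, whilst under the power law the median person delivers roughly 2% of what the top performer does – which is exactly the scary long-tail picture above.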

This may be somewhat a figment of the example data, and, taking a probably unsubstantiated analytical leap, we can readjust the power law chart to align the median performance figures, coming up with a chart like this…

Normal (power adjusted)

…which still suggests that there are a load of sub-middle slackers sitting on their hands, and they should really get moving and DO SOMETHING!

My general theory is that if, when leaving the house on the way to work in the morning, you harbour the thought “today, I will not make a difference”, you should go back indoors and get back under the duvet.

So I have scratched my itch – not sure it was so much fun for you – so here is another useful framework to help guide thinking and action, one that considers the destination of projects…

Thinking is… a wasted opportunity swirling round the Plug-Hole of Life, by way of a path of good intentions. Implementation is…


Unbalanced reporting, or just unbalanced?

Jonathan Swift was on to something when he picked out in Gulliver’s Travels the pointless and apparently irreconcilable Big-Endian / Little-Endian debate on the island of Lilliput (a metaphor for religious schism in 16th and 17th century England, as it happens).

The question “Are you a Morning Lark or Night Owl?” is another of those that has its merry bi-band of quarrelsome, bifurcated and dichotomous disputants, bickering and unable to arrive at any accommodation, mutual agreement or consensus view.

So the recent study by Dr. PK Jonason was, of course, like oxygen to those people who live to stir things up a bit, leading to headlines such as:

Night owls have more ‘evil’ personality traits: study (Business Standard)

That’s not what the study actually says: it focuses on just three personality characteristics (the Dark Triad) and their distribution amongst chronotypes, and so does not equally point to any irksome, venal or other unpleasantness of the early risers. However, when did balance ever come into the equation in getting a headline?

Well, in chemistry, actually, where you definitely have to balance your equations…

Didn’t Mark Twain say “Never let the facts get in the way of a good story”?

Did he or didn’t he? I don’t know. Does it really matter? Is it something people fight over? Well, let them!

And he was a story teller, not a scientist…

So the study may have been a bit narrow in scope, but still science (assuming the peer reviewers also agreed about this), and sadly abused by a rather unbalanced headline.

In the Larks/Owls debate you find, of course, that things are more complex than the simple binary.

There are some key consulting frameworks that are designed to help people solve problems more complex than pure binary thinking allows.

The Boston Grid is a good example that expands thinking to at least two (binary, smoothed, averaged) dimensions and has four outcomes (or more if you start sub-dividing the individual boxes, but that gets hard to read, and clarity of thinking is, of course, the whole point, not “clever” smart aleck chart drawing).

Note in particular the national difference showing that, in the analysis, Spaniards are more Owl-ish than the Larky Italians (well, relatively), and Machiavelli was Italian, so put that in your pipe and smoke it…

Binary decision constructs are generally (*see footnote) grossly over-simplified and massively averaged “big picture” ways of thinking about stuff; most situations are actually formed from a spectrum of factors, which the human mind reduces ad absurdum to “are ye wi’me or agin me”.

Someone once said that I “hoover up complexity” which was, I think, a compliment overall, on balance, and in the general scheme of things, but also a cogent warning to avoid rocket-scientist gibberish, too!

Even a well-meaning spectrum view can still present a one-sided and possibly biased picture, and the balancing aspects need to be added.

The Autism spectrum is an example of what can be considered a one-sided spectrum, since “Normal” sits at one end, not in the middle like a properly balanced continuum.

I rather liked this view Psychosis and Autism as Diametrical Disorders of the Social Brain: converging evidence!! that describes a wider spectrum from Schizophrenia through “Normal” to Autism.

The real story is probably nearer to a 2D surface, or in fact many more dimensions, but that does start to hurt a little bit.

And so, maybe an enigmatic spider-web diagram to finish…

Spider web chart



* OK, so putting “generally” against any assertion is one of those averaging and simplifying devices used to smooth over the roughness of real-life situations. But what the heck, this is rhetoric, and I stand my ground, Sir!


Technology and the Zone of Uselessness

They let me out for a short trip to the shops today, and whilst I was waiting to pay, I watched an old geezer struggling to put his chip'n'pin card in the right way round. Which set me off thinking about what happens when you get old, and at what point the pace of technology evolution overtakes you and you are left in the dust: a crumbly, fumbling, useless old curmudgeon, no longer able to function properly nor interact sensibly with the environment.

To further the analysis we can consider this table of the evolution of user interfaces (keeping a fairly tight scope to cover mainly electronic means)…

Primary Mode of Interaction | Examples | Era of invention
Tap | Telegraph key (button) | Late Georgian
Shout | Candlestick phone | Victorian
Rotate | Rotary phone, wireless with Bakelite knobs, steering wheel (drive-by-wire) | Victorian
Bash / Prod | QWERTY keyboard, keypad | Victorian
Look | Eye tracking | Early Miss-Marple
Wiggle | Joystick | Wilson-WhiteHeatian for electrical (Early Edwardian/La Belle Époque for mechanical)
Blow | Typing aids, blow-controlled mobile phone (ignoring the Captain's speaking tube…) |
Waggle | Mouse | Engelbarto-Xerox PARCian
Scribble | GridPad, Apple Newton, Palm, iPaq, Tablet PC | Yuppie-time
Fondle & Stroke | Smartphone, tablet | SonyEricssonian-Jobsian
Wave | Nintendo Wii, Xbox Kinect, data glove | TomCruisian
Shout 2 | Speech recognition | Rock and Roll (although it hasn't really happened properly yet; maybe JeremyClarksonian when it does – JC is famously unable to use any voice-operated equipment)
Think | Emotiv EPOC neuroheadset | Yuppie-time

…and whilst you can see that a lot of this stuff was actually invented a long time ago, having been around for over 100 years in some form, there has been quite a rush of invention in more recent years, hanging on the coat-tails of the primary evolution of computing technology; no surprise there, I suppose.

One of the more interesting insights, for me as an analyst and connoisseur of number crunching, is that whilst many of the newer inventions have been for various methods of computer control,  there is a paucity of newly invented data entry methods, beyond the humble and ancient keyboard.  

With the dominant design of the QWERTY keyboard to the fore, there have been really no successful disruptive plays, and most inventions have focussed on just reworking the layout (e.g., DVORAK, frogpad, FITALY and their kin).  Chord keyboards made a bid, but, of course, like any shorthand method you need to learn a new language, and they never took off.

The FITALY keyboard is a nice design that fits well with modern joy-pad units like the Xbox and smartphone touch interfaces, as it minimises the number of clicks, or the amount of finger movement, needed to type a letter, so it is quite fast.  However, at $49 for a tablet computer it is never going to amount to much.

Extending the idea of chord keyboards and the use of non-verbal language, there is undoubtedly some scope for non-keyboard data-entry devices using gesture control to recognise sign language (and that hopefully avoid the “gorilla arm” that afflicted early vertical touch-screen users).  Although the new “language” learning problem still exists, and Babel will always be an issue, unless we all adopt Ameslan or Microsoftlan, or AppleJobsLan.

Now I believe that I can rightly consider myself  pretty well up on the world of technology and there is very little that fazes me.

In fact, many pieces of broken equipment will just fix themselves in my presence, or so it seems: when my family call the DadHelpdesk, I just lean over languidly and, in my calming presence, the recalcitrant kit just bursts into life (maybe with a judicious key press or two).

But don't ask me about *&^$*^%ing plumbing – compression joints, meh!

So I do think that my threshold of uselessness is likely to be pretty high (or do I mean low), and, consoling me today, my elder son told me that “people don't get dumb, they just get old” (i.e., if they were stupid to start with, they will be stupid old people), so maybe there will be some hope…

However, like VCRs, which kids can programme with ease whilst their parents just fumble, the evolution of new technologies, and UIs in particular, is much influenced by the volume of fluent, capable users, which itself flows with the generations.

Added to this, one area of technology that I do not really bother with is computer games: beyond a half-finished PC version of Dune in 1992, I'm just not interested in playing them (I can feel my life slipping away).  Therefore I am not particularly adroit when it comes to using a joypad, and have not built up great dexterity and flexibility in my hands and fingers (unlike most teenage boys) for that type of device.  The one time I played Castle Wolfenstein, I spent the whole game bumping into walls whilst staring at the floor or sky!  And Second Life, oh so bad!

What's more, I have never been able to make the three-fingered boy scout sign – I never was a boy scout, also just not interested – my hands just don't bend that way.

And finally, I have a very highly tuned embarrassment inhibitor which tries to stop me doing things that would cause a red face (it doesn't always work, even with my personality type…)

So what is my old-age technology nightmare scenario?

  • having to visit Castle Wolfenstein to get my pension…
  • …electronically bruised after a long, slow, meandering (virtual) walk from the entrance of the Cyberspace Business Park…
  • …inputting my data by waving my arms wildly whilst holding my walking stick trying not to fall over…
  • …and making complex mudra with my crippled and twisted old hands.

Ye gods!  Build me a Bluetooth neural uplink, and make it snappy!



Spring is busting out all over up here in Lincolnshire, spurred on by the lovely weather recently.  The swallows are back in the barn, always good to see that they made it back from South Africa (where the RSPB tells me British swallows over-winter).  The trees and flowers are all blooming, not quite into the Bluebell season yet, but plenty of colour… Lincolnshire Spring 2011

Whilst enjoying the sunshine and in the full flush of the other joys of Spring, one of the topics on my mind recently has been Service Integration, an important ingredient for delivering excellent IT services.  The nub of the issue that Service Integration looks to solve is this:

In recent years, the trend for contracting IT Services has been to push beyond the big-bang mega-deals of old to selective outsourcing of like groups of IT services, dubbed “towers” by industry pundits such as Gartner and their ilk, thus:

IT Service Towers

I won’t bore you with the detail here as to why this model doesn’t work that well.

However, user services are often a combination of pieces from each tower, so to make users happy, avoid incident “ping-pong” and other good things, services really need to be managed in a joined-up way, orthogonally to the towers, like this:

End to end service model

This glues, or integrates, if you like, the different elements of a complete service; hence, this joining-up is “Service Integration”.

You could abbreviate Service Integration to SI, but this is ripe for confusion with the older usage of SI, as Systems Integration, all about gluing together bits of software and hardware to make new systems, i.e., Building systems rather than Running services.

There have been a number of landmark deals espousing the Service Integration model, from ABN AMRO and the “Guardian” model back in 2005, through to the most recent state of the art at National Grid with the recently penned deal with HP (Computing article on National Grid / HP SMI deal and HP Press Release).

One of the usual suspects in building such a model is dear old ITIL, now ISO/IEC 20000.

Not to be confused with Tyltyl and Mytyl, characters from Maeterlinck’s Blue Bird, a well-known children’s classic.

ITIL is a worthy model and has been around for many years since penned by the CCTA, and is now at version 3.  Version 3 is quite good, as it has finally acknowledged that services have a life-cycle and that, gosh, this sort of stuff is iterative (what a buzz).  V1 and V2, in contrast, had rather static views of the world.

I was astonished recently in one of those rare, but memorable, jaw-dropping, goggle-eyed moments when a sales guy from some other organisation opined in a meeting (to paraphrase) “Why all this fuss about V3, there’s some really good stuff in V2”.  Everybody in the room looked at the poor unfortunate in deadly silence as he swallowed his foot and half his leg up to the knee and lower thigh, and from that moment he became nobody, a nebbish, a zero.  Ouch!

Nebbish – a Yiddish word meaning “an insignificant, pitiful person; a nonentity”, very effectively characterised in a book I once read whose title I can no longer recall, so cannot name-check or credit the author (sorry), as a person who, when they walk into a room, is like someone just walked out.

ITIL V3 was published in 2007, but only just really made it into the 21st century with its iterative, nay, agile, flavouring, yet there are a number of “elephants in the room”, major topics not covered that are essential components in the full business architecture of modern IT service delivery…

Not just a few elephants, but a thundering herd in fact, in the form of (not exhaustively):

  • Innovation
  • Managing Technology investments
  • Multi-vendor service integration
  • Deal Structure  and Partnership Management
  • Pricing, Billing & Charging
  • People, Culture & Structure (at least 3 elephants in this line alone)

COBIT makes a much broader sweep in its attempt to embrace the whole entity that is a living, breathing IT organisation and seems to fill many of the gaps not covered by ITIL.

Yes, I know I bang on about Innovation quite a bit in these posts, but it is a common current complaint I hear that innovation has been squeezed out in deals struck in the 2000s, and now the demand is to find ways to enable it again, even to the extent of considering paying an Innovation “premium”.

COBIT, founded in GRC, and providing an excellent check-list with which to herd most of the elephants, is as blind as ITIL when it comes to Innovation.  If you search through the text of the COBIT 4.1 framework definition document, you will find the word “innovation” not once in its 197 pages!

GRC = Governance, Risk Management & Compliance, in case you wanted to know

Risk management is as central to innovation and agility as it is to the philosophy of COBIT with its focus on control.  Yet there is a classic schism between the COBIT GRC-based shibboleths and the ways of innovation and agility, and one might cynically draw the conclusion that COBIT is about stopping things getting done, whereas innovation and agility are the polar opposite – “skunkworks” innovation is the antithesis of the GRC mindset, even anathema.  And, of course, COBIT is process-oriented, just like ITIL.  Good, but rather 1990s Hammer & Champy, with still further to go to get into the 21st century.

The COBIT 4.1 Executive Summary contains a hugely contentious and flawed headline on page 13, vide “PROCESSES NEED CONTROLS”.  To justify this you need to follow this syllogism:

  • Processes are risks
  • Risks need controls
  • Therefore, processes need controls
    (yes, this is, indeed, nonsense)

This does not compute; the base premise is wrong.  Yes, high risks, whether processes or otherwise, do need controls, but don’t waste time putting controls on low-risk processes.

“Quick, evacuate the building, we’ve had a slightly embarrassing failure to detect a root cause in Problem Management”

Also, processes can be controls, so do we add meta-control processes to control the control processes? – Quis custodiet ipsos custodes, ad nauseam.

You may think that I am being rather harsh on the solid works of the people who have created ITIL, COBIT et al.  These frameworks are all useful check-lists of best practice, but need to be used with some care, lest the medicine kill the patient.  And they help with the standardisation of service descriptions, making like-for-like comparisons somewhat easier in the sourcing and procurement process.  However, they are inevitably behind the leading edge: for example, getting joined-up in the customer experience dimension (orthogonal to both towers and process orientation) is yet another step to go.

On the other hand, getting up the curve to build Innovation into modern IT service deals, well, that can be done right now (give me a call!)

Elephants…

Aristotle and all that

I have been away from my desk quite a lot recently cavorting around the motorways of England, racking up the miles on my poor hard-worked steed, but now I have a few minutes to sit down and pass on an interesting observation….

Just a momentary tangent before we head into the main meat, so to speak: there is another blog post that I have been meaning to write about Broadband Britain, Cloud Computing, the Innovators Dilemma, passing by the new statistic that the number of old people in the UK now exceeds the number of young, and arriving finally at some as yet unthought pithy comment about Silver <read, Grey> Surfers.  However, it is really just an excuse to create a comic juxtaposition alluding to the alleged practice of North American ethnic peoples (no longer Eskimo) of abandoning their old folk on ice floes, whereas I have observed over the long miles I have travelled in the last few months that we British seem to abandon them at Cherwell Valley Services on the M40… so let’s move on.

Anyway, my recent revelation is related to this framework below plucked from the world of transformation consulting and change management as relayed to me some years ago by one of my erstwhile consulting chums.  The blobs relate to managing communication with people during significant changes on three dimensions: Rational, Political and Emotional.

rpe balls (web)

The ‘sweet spot’ is in the centre when all communications are most compelling as they appeal to all these three.

Coincidentally, whilst trying to be a useful parent and reviewing a Classics essay, I prodded Google about some topic to draw back the veil of my ignorance, and it popped up with Aristotle’s three modes of persuasion:

  • ήθος – Ethos
  • λόγος – Logos
  • πάθος – Pathos

Thus, in seasonal form…

aristotles baubles (web)

Whilst equating Ethos to the Political dimension somewhat turns my stomach when I think of the more venal and self-aggrandising aspects of the political world, the three blobs of the R…P…E model are a pretty good match for what Aristotle laid down.

So there you go….


Beware of BS Benchmarks & Krap KPIs

Recently our esteemed Green Knight, Sir Jonathan Porritt, was credited with saying “Overweight people are ‘damaging the planet’”.  Of course it turns out that he said something like this in about 2007, in fact building on a comment by the then Secretary of State for Health, Alan Johnson.  But somebody else unearthed it again for some typically twisted reason – nothing can be more topical than mixing global warming with a bit of “fatty slapping”.

The hypothesis behind the hype is that fat people use more resources because they eat more food, but why not then include teenage boys (unfillable, as empty fridges around the country can testify), people with very high metabolic rates, and other such big eaters?  Ah, well, the logic goes that fat people also drive everywhere and so contribute more CO2 than thin people who, of course, walk or cycle everywhere.  Well, maybe it applies in towns, but it is certainly not true in the countryside, so drawing a different intersection in the Venn diagram I am sketching out here in hyperspace, maybe the headline should have read “Teenage boys and country people with very high metabolic rates are ‘damaging the planet’” – not quite so catchy, or right-on, eh?

But, of course, there is a secondary thesis which is that obese people can be “cured”, especially if they all got out of their cars, walked and cycled, and stopped scarfing all the pies, whence their weight would magically drop away and they would join all the normal people in the happy mean.

When you look at whole populations analytically then of course you usually see some sort of distribution (Normal or otherwise) of whatever factor (weight, in this case) that you might be measuring.   So the theory is that by thinning down the fatties, the shape of the distribution will be changed. However, there are flies in this particular ointment, and if you look around you can find suggestions that obesity is actually a structural feature of a/the/any human population, that everybody has got fatter and that you need to treat the population as a whole, not just focus on the upper tail.

All in all, an example of woolly, loose thinking gussied up to serve a political agenda.

BMI is one of the weapons in the “fatty slapping” armoury, a metric with some very well documented short-comings, yet standard (mis-)guidance would label people like Lawrence Dallaglio, Jonah Lomu & Mel Gibson as over-weight or obese.  Whilst BMI might have some trivial diagnostic uses, some lard-brained fat-heads try to use it as a decision-making metric, vide ‘Too fat’ to donate bone marrow – the 18-stone, 5’10” sports teacher with a technical BMI of 36.1 who was ejected from the National Bone Marrow Register.  To make a proper health assessment, you need to have a more detailed look at structural features, like waist size, percentage of body fat and so on, before pronouncing.

Just pausing a moment to dissect BMI further: it has units of kg/m², which is not unlike the metric used to define paper weight (grammage, in g/m²).

Many organisations these days use 80gsm printer paper, which is more environmentally friendly than the more sumptuous 100gsm paper of old.  And even less rich-feeling than the 120gsm paper that Tier 1 consultants use to create a table-thumping report – the dollars are in the loudness of the thump.

As Marshall McLuhan told us, the medium is indeed the message, thickness = quality, and just feel that silky china clay high white finish. Oooohhh…

Sorry, started to get rather indented there, must coach self, control tangents…

So a person who has a BMI of, say, yeah, like 25, is like a piece of 25000gsm paper, no really… equally a piece of 80gsm A4 paper might have a BMI of about 0.08…
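For the pedants, the unit-juggling can be checked in a few lines of Python; a quick sketch (the teacher's vital statistics are taken from the bone-marrow story above, converted to metric):

```python
# BMI (kg/m^2) and paper grammage (g/m^2) differ only by a factor of 1000.

def bmi(mass_kg: float, height_m: float) -> float:
    """Body Mass Index in kg/m^2."""
    return mass_kg / height_m ** 2

def bmi_as_gsm(bmi_value: float) -> float:
    """Express a BMI (kg/m^2) in paper-grammage units (g/m^2)."""
    return bmi_value * 1000

print(bmi_as_gsm(25))        # a BMI of 25 is "25000gsm" paper
print(80 / 1000)             # 80gsm printer paper has a "BMI" of 0.08
print(bmi(114.3, 1.78))      # the 18-stone (114.3kg), 5'10" (1.78m) teacher: ~36.1
```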


Thus BMI is a prime example of a benchmark ratio or KPI that is NOT a good basis for making decisions, as it fails to take account of significant structural factors.

This parable provides an important lesson for practitioners in the world of Information Technology Economics, where many a ratio is measured and analysed by pundits including Gartner et al, a classic being “IT Costs as percentage of Revenue”, one of their IT Key Metrics.

It is defined quite simply as:

IT Costs as % of Revenue = (Total IT Costs ÷ Total Revenue) × 100

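As a trivial worked illustration (the figures below are invented, not anybody's benchmark data):

```python
# Invented example: a firm with $120m revenue spending $6m a year on IT.
it_costs = 6_000_000
revenue = 120_000_000

it_cost_ratio = it_costs / revenue * 100
print(f"IT Costs as % of Revenue: {it_cost_ratio:.1f}%")   # 5.0%
```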
If you dig into the typical drivers of the top and bottom parts of this formula as below, say,

MicroEconomic Drivers – Typical Examples

IT Costs:
  • Business configuration, e.g., Channel/Distribution infrastructure
  • Organisation structure (e.g., headcount)
  • IT Governance & Policies (e.g., Group standardisation)
  • IS architecture and legacy (complexity)
  • IT Service definitions and service levels
  • Development methods & productivity
  • Sourcing/procurement strategy & execution
  • Supplier market diversity

Revenue:
  • Market structure
  • Competitive environment
  • Market share
  • Product design
  • Consumer behaviour
  • Sales & Marketing performance
  • Customer Service (retention)

then you might surmise that the Revenue denominator has significant elements that are certainly outside the direct control of the IT organisation, and indeed outside the control of the company, whereas the IT Costs are defined largely by the structure of the organisation, its distribution channels, and internal policies and practices.  The Revenue side is also, I conjecture, more volatile than the cost side, and being mostly outside the control of IT, it makes a very unfair stick to beat the IT donkey with.  So in qualitative, logical terms this metric certainly appears to be a very poor ‘apples and oranges’ comparator.

If you stretch the analysis further, you can ask the question “what does it mean?”  Is the ratio intended to show the importance of IT? or IT leverage/gearing (bang for the buck)?

Well, if it is some level of importance we are trying to assess, then we should analyse the relationship between this benchmark ratio and true measures of business value, such as Operating Margin.  Looking across a range of industries, the curve looks like this:


OK, it is a deliberately silly chart, just to make the point that this is clearly a wobbly relationship.
If you do a linear regression analysis of the relationship between Operating Margin % and the IT Cost/Revenue ratio, and a sibling ratio, “IT Cost as a %age of Total Operating Costs” (or “Systems Intensity” to its friends), then you get these results for R²:


IT Costs as %age of Revenue vs Operating Margin%


IT Costs as %age of Op. Costs vs Operating Margin%


What this shows is that there is no particularly significant linear relationship between these two key metrics and Operating Margin, so quantitatively, the ratios do not really tell you anything about how IT costs/investment drive overall business performance at all.
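The regression itself is no black art; here is a minimal sketch in plain Python of the R² calculation for a simple linear fit. The data is randomly generated (and deliberately unrelated) purely to illustrate the mechanics, not a reproduction of the benchmark dataset:

```python
# R^2 of a simple linear fit between two invented, unrelated series:
# an "IT cost ratio %" and an "operating margin %" for 50 pretend companies.
import random

random.seed(42)
it_ratio = [random.uniform(1, 10) for _ in range(50)]
op_margin = [random.uniform(2, 25) for _ in range(50)]

n = len(it_ratio)
mean_x = sum(it_ratio) / n
mean_y = sum(op_margin) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(it_ratio, op_margin))
sxx = sum((x - mean_x) ** 2 for x in it_ratio)
syy = sum((y - mean_y) ** 2 for y in op_margin)

r_squared = sxy ** 2 / (sxx * syy)  # squared Pearson correlation = R^2 of the fit
print(f"R^2 = {r_squared:.3f}")     # typically near zero for unrelated data
```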

Even within an industry, ratio comparisons are fairly meaningless.  For example, in the past UK Banks had an average Systems Intensity of around 20%.  If you were to calculate the Systems Intensity for Egg, the Internet bank, at its height, you would come out with a number ranging from about 17% to 25%, depending on how you treat the IT cost component of outsourced product processing and some other structural factors.  And I do recall having a conversation with one Investment Bank CIO who declared, “Yes, of course, we do spend 20% of our operating costs on IT, it’s how we set the budget!”

The whole averaging process loses information too.  Look at the four distributions below: they all have the same mean (i.e., average) but are wildly different in shape.


Without more detail on their parameters than just the mean value of the curves, you cannot make a sensible comparison at all.
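By way of illustration, a short Python sketch (invented data) generating four samples that share a mean of 10 but have wildly different shapes:

```python
# Four samples with (essentially) the same mean but very different shapes,
# showing how much information a bare average throws away.
import random
import statistics

random.seed(1)
n = 20_000
target = 10.0

samples = {
    "normal":  [random.gauss(target, 1) for _ in range(n)],
    "uniform": [random.uniform(0, 2 * target) for _ in range(n)],
    "bimodal": [random.gauss(target - 5, 1) if random.random() < 0.5
                else random.gauss(target + 5, 1) for _ in range(n)],
    "skewed":  [random.expovariate(1 / target) for _ in range(n)],
}

for name, xs in samples.items():
    print(f"{name:8s} mean={statistics.mean(xs):6.2f} "
          f"stdev={statistics.stdev(xs):6.2f}")
```

The printed means all sit at about 10, whilst the standard deviations range from about 1 to about 10: same average, very different animals.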

So all these ratios give is some rather weak macro illumination of the differing levels of IT spending between industries, like saying to a Bank “Did you know that, on average, Banks spend 7.3 times more on IT than Energy companies” to which the appropriate response is “YEAH, SO WHAT?”…

…Oh, and maybe, some vague diagnostic indication that there may (or may not) be something worth looking at with a more detailed structural review.  So, why not just go straight there, and dig out the real gold!

And so the morals of this story, O Best Beloved, are that just because you can divide two numbers, it doesn’t mean that you should, and that you should be prepared to dig into the detail to truly understand how cost and performance could be improved.

Just so.


Just Words

So it has been a torrid couple of weeks for MPs outed having been caught with their hands in the cookie jar.  Schadenfreude, Epicaricacy, aighear millteach and their ilk are good words to roll around the tongue and savour whilst we lob cabbages and rotten tomatoes at those in the pillory: all the more unattractive being that their “misfortune” was brought about by their own actions and a display of lower moral standards than is clearly desirable in our political representatives.

Auto-Epicaricacy: a term I just made up, applying some word logic, would mean taking pleasure in your own misfortune. Definitely an unhealthy and paradoxical mental state, but I suppose optimistic, in that every cloud has a silver lining…

I was particularly fascinated, and driven to ask “how does that work, then?”, by the declaration of one misadventurer that “Of course I feel that my reputation is tarnished, but my integrity is intact”.

Integrity: the unimpaired state of anything : uprightness : honesty : purity – Chambers 20C 

What logic system do you have to apply, what set of axioms must one have, how must one deconstruct common sense to be able to make this statement?  For a start, you would have to look at redefining some core words: unimpaired, anything, upright, honest, pure – take your pick.

Words are a key part of a consultant’s stock-in-trade, and pictures too.  One of my favourite satirical sites, now sadly defunct, was SatireWire, which ran a series of bizarre and entertaining statistical charts like this…

Madrigals By Freshness
… a hearty lampoon of opaque and confusing “Management by Cartoon” Powerpoint presentations (one step up, though, from “Management by In-flight Magazine” which is significantly more dangerous).

And of course every industry has its buzz-words and jargon, which can be useful short-hand for many forms of communication, but often quite poisonous when they leak into other places.

Note the use of the word “key” in the preceding paragraph – a consultant-y sort of word if ever there was one: it means important, significant, stands out from the crowd.  Non-key things are not interesting… now go back and carry on reading here.

The recent attempt by the Local Government Association to proscribe some logofluvial jargon-words was a valiant attempt to stop etymological pollution in Local Government communication with the rest of us.  I am certainly a fan of Plain English, and of keeping things short and sweet with some sharp Anglo-Saxon monosyllables replacing loquacious, logorrheic verbal peregrinations, but equally a devotee of the precision and conciseness which some longer words can bring to a sentence, by conceptual elision, perhaps.

So I was interested to see some words on the list that I have used myself, as have many of my colleagues.  These are words from the consulting domain that do have proper, surgically precise and correct meanings in the right hands, but are indeed deadly in the wrong.  Other words on the list would be poisonous in any context:

  • “Baseline” is a word I know well that has meaning both in project planning and also in procurement – in both areas being the datum from which you measure some sort of progress or achievement.
  • “Predictors of Beaconicity”, however, is never going to win any prizes for clarity….

The list is also very good material for Buzz-word Bingo…

And talking of words, and in an interesting juxtaposition of neurons firing, I noticed that the BBC were having a Poetry Season.  Being a self-professed iconoclast and fact-based sort of person, I have a completely tin ear for poetry, which is just a form of “talking funny” (in an unfunny way, unlike puns).

So to finish, I have constructed a Boston grid attempting to make some sense of, and classify, some of the odder behaviours of my fellow humans, viz….

“Talking Funny”, dressed normally:
  • Poets
  • Committee meetings
  • Consultants (some)
“Talking Funny” and “Dressing Funny”:
  • Morris Dancers
  • Mickey Mouse
  • Street Mimes
  • Opera Singers
  • Michael Jackson
  • Klingons
Normal on both counts:
  • Most people
Talking normally, “Dressing Funny”:
  • Cycle couriers
  • Bee-keepers
  • Sports-people (most)
  • Customer service agents in uniform
  • Builders
