The constructal law of organization in nature: tree-shaped flows and body size | Journal of Experimental Biology – Bejan (2005)

 


The constructal law of organization in nature: tree-shaped flows and body size – Adrian Bejan

SUMMARY

The constructal law is the statement that for a flow system to persist in time it must evolve in such a way that it provides easier access to its currents. This is the law of configuration generation, or the law of design. The theoretical developments reviewed in this article show that this law accounts for (i) architectures that maximize flow access (e.g. trees), (ii) features that impede flow (e.g. impermeable walls, insulation) and (iii) static organs that support flow structures. The proportionality between body heat loss and body size raised to the power 3/4 is deduced from the discovery that the counterflow of two trees is the optimal configuration for achieving (i) and (ii) simultaneously: maximum fluid-flow access and minimum heat leak. Other allometric examples deduced from the constructal law are the flying speeds of insects, birds and aeroplanes, the porosity and hair strand diameter of the fur coats of animals, and the existence of optimal organ sizes. Body size and configuration are intrinsic parts of the deduced configuration. They are results, not assumptions. The constructal law extends physics (thermodynamics) to cover the configuration, performance, global size and global internal flow volume of flow systems. The time evolution of such configurations can be described as survival by increasing performance, compactness and territory.

Source: The constructal law of organization in nature: tree-shaped flows and body size | Journal of Experimental Biology
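The 3/4-power relation in the summary above can be made concrete with a small numerical sketch. This is my illustration, not code from the paper; the function name `heat_loss` and the constant `k` are placeholders chosen only to show the shape of the relation q ∝ M^(3/4).

```python
# Illustrative sketch (not from the paper): the 3/4-power allometric
# scaling deduced from the constructal law, q = k * M**0.75, where q
# is body heat loss (metabolic rate) and M is body mass. The constant
# k is an arbitrary placeholder.

def heat_loss(mass_kg: float, k: float = 1.0) -> float:
    """Heat loss predicted by the 3/4-power law, q = k * M**0.75."""
    return k * mass_kg ** 0.75

# Doubling body mass raises predicted heat loss by 2**0.75 ≈ 1.68x,
# not 2x: larger bodies lose proportionally less heat.
ratio = heat_loss(2.0) / heat_loss(1.0)
print(round(ratio, 2))
```

The sub-linear exponent is the whole point: doubling body mass raises predicted heat loss by only about 68%, the proportionally smaller heat leak that the counterflow-of-two-trees configuration is invoked to explain.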

 

 

Systemic design: examples of current practice – Design Council – Medium

Systemic design: examples of current practice

Cat Drew
Jan 2 · 10 min read

At the beginning of December, Design Council worked with The Point People to host an event on systemic design. Jennie Winhall and Cassie Robinson spoke about their work to create and move towards new systems, and Alistair Parvin, Ilishio Lovejoy and Nick Stanhope spoke about specific elements of their work to design systemically. We had 120 people sign up in less than 24 hours and a large waitlist. There is interest and intrigue. What is systemic design and why is it important?
There is a longer version of this blog here and the slides are here.

via Systemic design: examples of current practice – Design Council – Medium

Beyond reductionism – systems biology gets dynamic

Steven Hertzberg on LinkedIn https://www.linkedin.com/posts/stevenhertzberg_beyond-reductionism-systems-biology-gets-activity-6619799222780866561-4C_b/

quoted:

“Real biological systems – in the wild, as it were – simply don’t behave as they do under controlled lab conditions that isolate component pathways. They behave as systems – complex, dynamic, integrative systems. They are not simple stimulus-response machines. They do not passively process and propagate signals from the environment and react to them. They are autopoietic, homeostatic systems, creating and maintaining themselves, accommodating to incoming information in the context of their own internal states, which in turn reflect their history and experiences, over seconds, minutes, hours, days, years, and which even reflect the histories of their ancestors through the effects of natural selection.”

and

“Enactivism sees organisms as creating their own reality through dynamic interaction with their environment, assimilating information about the outside world into their own ongoing dynamics, not in a reflexive way, but through active inference, such that the main patterns of activity remain driven by the system itself. This perspective is well described by Varela, Thompson and Rosch, and developed by Evan Thompson in his 2007 book Mind in Life, and by others, including Alicia Juarrero (Dynamics in Action) and Andy Clark (Surfing Uncertainty), for example.”

 

Beyond reductionism – systems biology gets dynamic
By Kevin Mitchell – September 14, 2019

Is biology just complicated physics? Can we understand living things as complex machines, with different parts dedicated to specific functions? Or can we finally move to investigating them as complex, integrative, and dynamic systems?

For many decades, mechanistic and reductionist approaches have dominated biology, for a number of compelling reasons. First, they seem more legitimately scientific than holistic alternatives – more precise, more rigorous, closer to the pure objectivity of physics. Second, they work, up to a point at least – they have given us powerful insights into the logic of biological systems, yielding new power to predict and manipulate. And third, they were all we had – studying entire systems was just too difficult. All of that is changing, as illustrated by a flurry of recent papers that are using new technology to revive some old theories and neglected philosophies.

The central method of biological reductionism is to use controlled manipulation of individual components to reveal their specific functions within cells or organisms, building up in the process a picture of the workings of the entire system. This approach has been the mainstay of genetics, biochemistry, cell biology, developmental biology, and even neuroscience. When faced with a system of mind-boggling complexity, it makes sense to approach it in this carefully defined, controlled manner. In any case, in most of these fields it was technically only possible to manipulate one or a few components at a time and only possible to measure their effects on one or a few components of the system.

The productivity of reductionist methods, and the lack of viable alternatives, brought with it a widespread but often tacit commitment to theoretical reductionism – the idea that the whole really is not much more than the sum of its parts. Appeals to holism seem to many biologists not just out of reach technically, but somehow vague, fuzzy, and unscientific. We are trained to break a system down to its component parts, to assign each of them a function, and to recompose the systems and subsystems of organisms and cells in an isolated, linear fashion.

We can see this in genetics, with the isolation of a gene for this or a gene for that. Or in signal transduction, with the definition of linear pathways from transmembrane receptors, through multiple cytoplasmic relays, to some internal effectors. Or in neuroscience, with the assignment of specific and isolated functions to various brain regions, based on lesion studies or activation in fMRI experiments.

The trouble is that this is not how cells and organisms work. Defining all these isolated functions and linear pathways has been productive, but only from a certain perspective and only up to a point. This enterprise has mostly depended on analysing responses to strong experimental manipulations – a trusted method to perturb the system but one that is inherently artificial (what Francis Bacon, the so-called father of empiricism, called “vexing nature”)*. And it has mostly analysed effects on limited, pre-defined readouts.

via Beyond reductionism – systems biology gets dynamic

shared work — news + insights — the outside

via shared work — news + insights — the outside

 

Why Shared Work?

Tuesday and Tim developed this foundational model for use in their systems change work. Time and time again, it has helped us to get unstuck when working in very diverse groups. We must identify our Shared Work together to move forward collaboratively. In this first session, Tim and Tuesday lay the groundwork for understanding this pragmatic model for systems change.

These links are tagged ‘shared work’ and give an overview – they also offer an online course (paid) at https://onlinecourses.findtheoutside.com/courses/shared-work

Helpfully subversive about frameworks | Agendashift

via Helpfully subversive about frameworks | Agendashift

Cybernetics: The Macy Conferences 1946-1953: The Complete Transactions (2016) — Monoskop Log

Cybernetics: The Macy Conferences 1946-1953: The Complete Transactions (2016)
16 December 2019, dusan
Filed under book, proceedings | Tags: · cybernetics, information, information theory, mathematics

“Between 1946 and 1953, the Josiah Macy, Jr. Foundation sponsored a series of conferences aiming to bring together a diverse, interdisciplinary community of scholars and researchers who would join forces to lay the groundwork for the new science of cybernetics. These conferences, known as the Macy conferences, constituted a landmark for the field. They were the first to grapple with new terms such as information and feedback and to develop a cohesive and broadly applicable theory of systems that would become equally applicable to living beings and machines, economic and cognitive processes, and many scholarly disciplines. The concepts that emerged from the conferences came to permeate thinking in many fields, including biology, neurology, sociology, ecology, economics, politics, psychoanalysis, linguistics, and computer science.

This book contains the complete transcripts of all ten Macy conferences and the guidelines for the conference proceedings. These transcripts are supplemented with an introduction by Claus Pias that charts the significance of the Macy conferences to the history of science.”

Edited and with a Foreword by Claus Pias
Publisher Diaphanes, Zürich, 2016
ISBN 9783037345986, 3037345985
734 pages


via Cybernetics: The Macy Conferences 1946-1953: The Complete Transactions (2016) — Monoskop Log

Trailer videos for the Systemic Leadership Summit #SLS2020, January 12-19, 2020 (and available online after that)

via 07. Benjamin Taylor – Paradoxes, polarities and paradigm shifts in systems work – Trailer – YouTube

Oh hey – it’s me! And Jennifer Campbell made me sound smart 🙂

sampler for #SLS2020, the systemic leadership summit – book now https://sls2020sp01.krtra.com/t/PYBIDA2rkjYf
(affiliate link) – you can book for live access as they go out, access plus recordings, or both plus transcripts!

And get the likes of:
Dave Snowden https://www.youtube.com/watch?v=PiL70pgK6Gs
Ed & Peter Schein https://www.youtube.com/watch?v=a_pCOdFXf9c

Dr Glenda Eoyang
Joan Lurie
Dr Louis Klein
Nora Bateson
Dr Mette Boll
Patrick Hoverstadt
Dr Orit Gal

…and many many more!

APM – Developing the practice of governance

Patrick Hoverstadt on LinkedIn says:
https://www.linkedin.com/feed/update/urn:li:activity:6620727231649988608

Project X report on the governance of major projects for UK government has VSM at its core.

Developing the practice of governance | APM Research

Source: Developing the practice of governance | APM Research

Developing the practice of governance
About the research
This research explores academic literature combined with expert input from practitioners to understand what is known about project governance and where gaps in the knowledge base lie, highlighting that good governance is key to establishing a successful project. The research focused on governance of large public-sector projects and the report aims to provide guidance to project professionals. The research is part of Project X, a broader research programme seeking to generate insights into major government projects and programmes.

The review has three purposes:

To synthesise and summarise the knowledge base on project governance and assurance
To identify, from the academic literature, gaps in the existing knowledge base
To provide guidance from both knowledge of practice and academic research.

Why is the research important?
It has been identified that a project needs to be governed from concept all the way through to delivery in order to be successful. Despite this, the literature review undertaken as part of this study has shown that there are significant gaps in the knowledge base and that the literature does not agree on the structure of a robust project governance model, only that it should be based around four key principles.

This research looked at different types of projects, fixed-goal and moving-goal, and has endeavoured to give professionals guidance for the governance of each. It also looked at how governance changes during the different phases in the project lifecycle.

Intended audience
The study should be of interest to experienced project professionals in both the public and private sector and anyone with an interest in the governance of major projects.

What did we discover?
The review found that:

There is a considerable amount of literature available, either directly based on project governance or in areas of importance for governance. Despite this, the review found that some areas of governance have very little research, highlighting areas in which no firm guidance has been identified within the knowledge base. These areas include complexity, assurance, the informal phase, avoiding excess optimism, benefits realisation and maturity models.
The research has highlighted that an assurance system is an integral part of governance and therefore the assurance system needs to be developed alongside the governance system.
The research has found that two types of project exist (fixed-target and moving-target) and that the governance and assurance system will be very different for these projects because they are fundamentally different entities.
The research has identified that there have been cases of metric manipulation reported.
Many problems within major public investment projects have their origins before the final decisions to go ahead, which means that there are opportunities to re-scope and improve the projects. Soft analysis methods are important in helping to ‘see through complexity’ and inform major decisions. Soft analysis methods should be applied to all major projects to identify the most critical issues and risks ahead of time and ahead of the final decision to go ahead.
Acknowledgements
APM and the authors would like to acknowledge the support of the Infrastructure and Projects Authority (IPA), along with colleagues within the Project X research initiative. They are also grateful for the important contributions of the participating organisations, individuals and access to data to enable this research to take place. For more information on Project X, please visit www.bettergovprojects.com

Source: Developing the practice of governance | APM Research

Systems thinking and startups – any ideas

Someone on twitter asked me about the application of systems thinking to startups, specifically:

any case study of applying systems thinking for developing a new product on a small scale within a short time frame
Or simply systems thinking applied by startups

It’s a fair question, and setting out to look for some answers reminded me why people often castigate systems thinking as simplistic and limited: some half-regurgitated system dynamics, Senge, vague links to Deming, ‘service design’, and Theory of Constraints, mentions of Ackoff and maybe a token ‘wilder’ systems thinker (in this case, funnily enough, John Gall). It’s not that there isn’t value there – the links below often present some solid ideas well. But I can’t find anything really neat and solid in this space.

What I would recommend is a good look at the core underlying dynamics of organisation, as presented by the VSM – http://scio.org.uk/sites/default/files/VSM_ph_db.pdf and http://www.scio.org.uk/resource/vsmg_3/screen.php?page=home

And (explicitly covering some of the above as well as systems thinking, and only partially systems thinking, with a strong focus on public services), look for what is interesting in the RedQuadrant reading list – https://drive.google.com/open?id=1zHUp52IRdoiQYoQTdsQDiqFH-xymURV3&authuser=benjamin.taylor@redquadrant.com&usp=drive_fs

All contributions on this specific topic welcome!

Articles I found:

Applying basic systems dynamics thinking: https://www.techinasia.com/talk/apply-systems-thinking-startup (another version of same https://www.jotform.com/blog/systems-thinking/)

And more: https://blog.teamweek.com/2019/09/5-advantages-of-systems-thinking/

Ackoff and Fifth Discipline-inspired ‘organic, social system’ (then Deming and H. Thomas Johnson) https://www.strategy-business.com/article/10210?gko=cf094

A little bit of modelling and organisation/environment fit (framed as ‘customer’) https://blog.leanstack.com/your-business-model-is-a-system-and-why-you-should-care-a9c3164c5d3a

Another one which says Lean Startup is insufficient, and gestures at ‘systems’ as holistic thinking: https://innov8rs.co/news/lets-get-real-lean-startup-not-right-everyone/

Simple opinion piece on non-linearity and looping thinking: https://nextconf.eu/2019/02/why-we-need-systems-thinking/#gref

Think about the ‘wider systems’ impact of your startup https://medium.com/maria-01/time-to-burst-techs-bubble-systems-thinking-in-tech-7e60855958a

Five recommended systems thinking books (a couple of nice surprises in there) https://hackernoon.com/5-books-that-ramp-up-your-systems-thinking-ability-74fa76f86dce

‘Two hands are a lot’ — we’re hiring data scientists, project managers, policy experts, assorted weirdos… – Dominic Cummings’s Blog

Well, this has been a bit controversial on twitter – however I thought it might be of interest.

 

Source: ‘Two hands are a lot’ — we’re hiring data scientists, project managers, policy experts, assorted weirdos… – Dominic Cummings’s Blog

‘Two hands are a lot’ — we’re hiring data scientists, project managers, policy experts, assorted weirdos…

‘This is possibly the single largest design flaw contributing to the bad Nash equilibrium in which … many governments are stuck. Every individual high-functioning competent person knows they can’t make much difference by being one more face in that crowd.’ Eliezer Yudkowsky, AI expert, LessWrong etc.

‘[M]uch of our intellectual elite who think they have “the solutions” have actually cut themselves off from understanding the basis for much of the most important human progress.’ Michael Nielsen, physicist and one of the handful of most interesting people I’ve ever talked to.

‘People, ideas, machines — in that order.’ Colonel Boyd.

‘There isn’t one novel thought in all of how Berkshire [Hathaway] is run. It’s all about … exploiting unrecognized simplicities.’ Charlie Munger, Warren Buffett’s partner.

‘Two hands, it isn’t much considering how the world is infinite. Yet, all the same, two hands, they are a lot.’ Alexander Grothendieck, one of the great mathematicians.

*

There are many brilliant people in the civil service and politics. Over the past five months the No10 political team has been lucky to work with some fantastic officials. But there are also some profound problems at the core of how the British state makes decisions. This was seen by pundit-world as a very eccentric view in 2014. It is no longer seen as eccentric. Dealing with these deep problems is supported by many great officials, particularly younger ones, though of course there will naturally be many fears — some reasonable, most unreasonable.

Now there is a confluence of: a) Brexit requires many large changes in policy and in the structure of decision-making, b) some people in government are prepared to take risks to change things a lot, and c) a new government with a significant majority and little need to worry about short-term unpopularity while trying to make rapid progress with long-term problems.

There is a huge amount of low hanging fruit — trillion dollar bills lying on the street — in the intersection of:

  • the selection, education and training of people for high performance
  • the frontiers of the science of prediction
  • data science, AI and cognitive technologies (e.g. Seeing Rooms, ‘authoring tools designed for arguing from evidence’, Tetlock/IARPA prediction tournaments that could easily be extended to consider ‘clusters’ of issues around themes like Brexit to improve policy and project management)
  • communication (e.g. Cialdini)
  • decision-making institutions at the apex of government.

We want to hire an unusual set of people with different skills and backgrounds to work in Downing Street with the best officials, some as spads and perhaps some as officials. If you are already an official and you read this blog and think you fit one of these categories, get in touch.

The categories are roughly:

  • Data scientists and software developers
  • Economists
  • Policy experts
  • Project managers
  • Communication experts
  • Junior researchers one of whom will also be my personal assistant
  • Weirdos and misfits with odd skills

We want to improve performance and make me much less important — and within a year largely redundant. At the moment I have to make decisions well outside what Charlie Munger calls my ‘circle of competence’ and we do not have the sort of expertise supporting the PM and ministers that is needed. This must change fast so we can properly serve the public.

A. Unusual mathematicians, physicists, computer scientists, data scientists

You must have exceptional academic qualifications from one of the world’s best universities or have done something that demonstrates equivalent (or greater) talents and skills. You do not need a PhD — as Alan Kay said, we are also interested in graduate students as ‘world-class researchers who don’t have PhDs yet’.

You should have the following:

  • PhD or MSc in maths or physics.
  • Outstanding mathematical skills are essential.
  • Experience of using analytical languages: e.g. Python, SQL, R.
  • Familiarity with data tools and technologies such as Postgres, Scikit Learn, NEO4J.

A few examples of papers that you will be considering:

You should be able to explain to other mathematicians, physicists and computer scientists the ideas in such papers, discuss what could be useful for our projects, synthesise ideas for other data scientists, and apply them to practical problems. You won’t be expert on the maths used in all these papers but you should be confident that you could study it and understand it.

We will be using machine learning and associated tools so it is important you can program. You do not need software development levels of programming but it would be an advantage.

Those applying must watch Bret Victor’s talks and study Dynamic Land. If this excites you, then apply; if not, then don’t. I and others interviewing will discuss this with anybody who comes for an interview. If you want a sense of the sort of things you’d be working on, then read my previous blog on Seeing Rooms, cognitive technologies etc.

B. Unusual software developers

We are looking for great software developers who would love to work on these ideas, build tools and work with some great people. You should also look at some of Victor’s technical talks on programming languages and the history of computing.

You will be working with data scientists, designers and others.

C. Unusual economists

We are looking to hire some recent graduates in economics. You should a) have an outstanding record at a great university, b) understand conventional economic theories, c) be interested in arguments on the edge of the field — for example, work by physicists on ‘agent-based models’ or by the hedge fund Bridgewater on the failures/limitations of conventional macro theories/prediction, and d) have very strong maths and be interested in working with mathematicians, physicists, and computer scientists.

The ideal candidate might, for example, have a degree in maths and economics, worked at the LHC in one summer, worked with a quant fund another summer, and written software for a YC startup in a third summer!

We’ve found one of these but want at least one more.

The sort of conversation you might have is discussing these two papers in Science (2015): ‘Computational rationality: A converging paradigm for intelligence in brains, minds, and machines’ (Gershman et al.) and ‘Economic reasoning and artificial intelligence’ (Parkes & Wellman).

You will see in these papers an intersection of:

  • von Neumann’s foundation of game theory and ‘expected utility’,
  • mainstream economic theories,
  • modern theories about auctions,
  • theoretical computer science (including problems like the complexity of probabilistic inference in Bayesian networks, which is in the NP-hard complexity class),
  • ideas on ‘computational rationality’ and meta-reasoning from AI, cognitive science and so on.

If these sorts of things are interesting, then you will find this project interesting.

It’s a bonus if you can code but it isn’t necessary.

D. Great project managers

If you think you are one of a small group of people in the world who are truly GREAT at project management, then we want to talk to you. Victoria Woodcock ran Vote Leave — she was a truly awesome project manager and without her Cameron would certainly have won. We need people like this who have a 1 in 10,000 or higher level of skill and temperament.

The Oxford Handbook on Megaprojects points out that it is possible to quantify lessons from the failures of projects like high speed rail projects because almost all fail so there is a large enough sample to make statistical comparisons, whereas there can be no statistical analysis of successes because they are so rare.

It is extremely interesting that the lessons of Manhattan (1940s), ICBMs (1950s) and Apollo (1960s) remain absolutely cutting edge because it is so hard to apply them and almost nobody has managed to do it. The Pentagon systematically de-programmed itself from more effective approaches to less effective approaches from the mid-1960s, in the name of ‘efficiency’. Is this just another way of saying that people like General Groves and George Mueller are rarer than Fields Medallists?

Anyway — it is obvious that improving government requires vast improvements in project management. The first project will be improving the people and skills already here.

If you want an example of the sort of people we need to find in Britain, look at this on CC Myers — the legendary builders. SPEED. We urgently need people with these sort of skills and attitude. (If you think you are such a company and you could dual carriageway the A1 north of Newcastle in record time, then get in touch!)

E. Junior researchers

In many aspects of government, as in the tech world and investing, brains and temperament smash experience and seniority out of the park.

We want to hire some VERY clever young people, either straight out of university or recently out, with extreme curiosity and capacity for hard work.

One of you will be a sort of personal assistant to me for a year — this will involve a mix of very interesting work and lots of uninteresting trivia that makes my life easier which you won’t enjoy. You will not have weekday date nights, you will sacrifice many weekends — frankly it will be hard having a boy/girlfriend at all. It will be exhausting but interesting and if you cut it you will be involved in things at the age of ~21 that most people never see.

I don’t want confident public school bluffers. I want people who are much brighter than me who can work in an extreme environment. If you play office politics, you will be discovered and immediately binned.

F. Communications

In SW1 communication is generally treated as almost synonymous with ‘talking to the lobby’. This is partly why so much punditry is ‘narrative from noise’.

With no election for years and huge changes in the digital world, there is a chance and a need to do things very differently.

We’re particularly interested in deep experts on TV and digital. We also are interested in people who have worked in movies or on advertising campaigns. There are some very interesting possibilities in the intersection of technology and story telling — if you’ve done something weird, this may be the place for you.

I noticed in the recent campaign that the world of digital advertising has changed very fast since I was last involved in 2016. This is partly why so many journalists wrongly looked at things like Corbyn’s Facebook stats and thought Labour was doing better than us — the ecosystem evolves rapidly while political journalists are still behind the 2016 tech, hence why so many fell for Carole’s conspiracy theories. The digital people involved in the last campaign really knew what they were doing, which is incredibly rare in this world of charlatans and clients who don’t know what they should be buying. If you are interested in being right at the very edge of this field, join.

We have some extremely able people but we also must upgrade skills across the spad network.

G. Policy experts

One of the problems with the civil service is the way in which people are shuffled such that they either do not acquire expertise or they are moved out of areas they really know to do something else. One Friday, X is in charge of special needs education, the next week X is in charge of budgets.

There are, of course, general skills. Managing a large organisation involves some general skills. Whether it is Coca Cola or Apple, some things are very similar — how to deal with people, how to build great teams and so on. Experience is often over-rated. When Warren Buffett needed someone to turn around his insurance business he did not hire someone with experience in insurance: ‘When Ajit entered Berkshire’s office on a Saturday in 1986, he did not have a day’s experience in the insurance business’ (Buffett).

Shuffling some people who are expected to be general managers is a natural thing but it is clear Whitehall does this too much while also not training general management skills properly. There are not enough people with deep expertise in specific fields.

If you want to work in the policy unit or a department and you really know your subject so that you could confidently argue about it with world-class experts, get in touch.

It’s also the case that wherever you are most of the best people are inevitably somewhere else. This means that governments must be much better at tapping distributed expertise. Of the top 20 people in the world who best understand the science of climate change and could advise us what to do with COP 2020, how many now work as a civil servant/spad or will become one in the next 5 years?

H. Super-talented weirdos

People in SW1 talk a lot about ‘diversity’ but they rarely mean ‘true cognitive diversity’. They are usually babbling about ‘gender identity diversity blah blah’. What SW1 needs is not more drivel about ‘identity’ and ‘diversity’ from Oxbridge humanities graduates but more genuine cognitive diversity.

We need some true wild cards, artists, people who never went to university and fought their way out of an appalling hell hole, weirdos from William Gibson novels like that girl hired by Bigend as a brand ‘diviner’ who feels sick at the sight of Tommy Hilfiger or that Chinese-Cuban free runner from a crime family hired by the KGB. If you want to figure out what characters around Putin might do, or how international criminal gangs might exploit holes in our border security, you don’t want more Oxbridge English graduates who chat about Lacan at dinner parties with TV producers and spread fake news about fake news.

By definition I don’t really know what I’m looking for but I want people around No10 to be on the lookout for such people.

We need to figure out how to use such people better without asking them to conform to the horrors of ‘Human Resources’ (which also obviously need a bonfire).

*

Send a max 1 page letter plus CV to ideasfornumber10@gmail.com and put in the subject line ‘job/’ and add after the / one of: data, developer, econ, comms, projects, research, policy, misfit.

I’ll have to spend time helping you so don’t apply unless you can commit to at least 2 years.

I’ll bin you within weeks if you don’t fit — don’t complain later because I made it clear now.

I will try to answer as many as possible but last time I publicly asked for job applications in 2015 I was swamped and could not, so I can’t promise an answer. If you think I’ve insanely ignored you, persist for a while.

I will use this blog to throw out ideas. It’s important when dealing with large organisations to dart around at different levels, not be stuck with formal hierarchies. It will seem chaotic and ‘not proper No10 process’ to some. But the point of this government is to do things differently and better and this always looks messy. We do not care about trying to ‘control the narrative’ and all that New Labour junk and this government will not be run by ‘comms grid’.

As Paul Graham and Peter Thiel say, most ideas that seem bad are bad but great ideas also seem at first like bad ideas — otherwise someone would have already done them. Incentives and culture push people in normal government systems away from encouraging ‘ideas that seem bad’. Part of the point of a small, odd No10 team is to find and exploit, without worrying about media noise, what Andy Grove called ‘very high leverage ideas’ and these will almost inevitably seem bad to most.

I will post some random things over the next few weeks and see what bounces back — it is all upside, there’s no downside if you don’t mind a bit of noise and it’s a fast cheap way to find good ideas…

Microservices and Biological Systems

 

Source: Microservices and Biological Systems


[Image: Mallard with six ducklings swimming]

In 2010, several researchers at Yale compared biological systems with computer software design. As one would expect, biological systems, which evolved over millions of years, are far more complex, carry a considerable amount of redundancy, and lack the direct top-down control architecture found in software like the Linux kernel1. While these comparisons aren’t entirely fair, given the complexity of biology2, they are a fun thought experiment. Microservices are a newly emergent phenomenon in the software engineering world, and in many ways microservice architectures evolve in environments that are much closer to a biological model than to carefully architected, top-down monolithic software.

In 2007, I was introduced to the concept of Service Oriented Architecture. The general concept behind SOA is that in a large company with many teams and a lot of data, the departments that were the source of record for certain types of data would also provide services to access and modify that data. This initially took the form of shared libraries, but eventually evolved into network services, often web services using SOAP as their transport. At least, that was the idea in concept; in reality, many different teams would often write similar services for the same parts of the database and have direct access to many of the same data stores. Some teams would open services up to others, and services would require multiple versions to be maintained concurrently in order to transition from one version to another. This led to a system that was complex, coupled in odd ways it probably shouldn’t have been, and it would eventually lead organizations into the current era of microservices.
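The SOA idea described above can be sketched in a few lines: one team "owns" the data store of record and every other team reads or modifies it only through that team's service interface. This is a minimal in-process sketch; the class and field names (`CustomerStore`, `CustomerService`, etc.) are illustrative, not from the original article.

```python
class CustomerStore:
    """The source-of-record data store, owned by one department."""
    def __init__(self):
        self._rows = {}

    def _get(self, cid):
        return self._rows.get(cid)

    def _put(self, cid, record):
        self._rows[cid] = record


class CustomerService:
    """The only sanctioned way for other teams to read or modify the data."""
    def __init__(self, store):
        self._store = store

    def get_customer(self, cid):
        rec = self._store._get(cid)
        if rec is None:
            raise KeyError(f"no customer {cid}")
        return dict(rec)  # hand out a copy, never the live row

    def update_email(self, cid, email):
        rec = self._store._get(cid)
        if rec is None:
            raise KeyError(f"no customer {cid}")
        rec["email"] = email
        self._store._put(cid, rec)


store = CustomerStore()
store._put("c1", {"name": "Ada", "email": "ada@example.com"})
svc = CustomerService(store)
svc.update_email("c1", "ada@new.example.com")
print(svc.get_customer("c1")["email"])
```

The failure mode the article describes is exactly when other teams bypass `CustomerService` and poke at the store directly.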

“…So here’s a graph of [Uber’s] service growth. You’ll note it doesn’t end at a thousand, even though I said a thousand. This is because we don’t have a reliable way of sorta tracking this over time. It’s like bizarre, but it’s somehow really hard to get the exact number of services that are running in production at any one time because it’s changing so rapidly. There are all these different teams who are building all these different things and they’re cranking out new services every week. Some of the services are going away, but honestly, tracking the total count is like this sort of weird anomaly … it’s sort of not a thing people care about.” -Matt Ranney, GOTO 20163

The speed at which software engineers can write and deploy services has grown significantly in the last couple of years. It is still possible to quickly write a lot of bad software (services without unit tests, committed without code reviews, and riddled with security issues that contribute to technical debt). However, it has also become easier to develop well-written software, with good test coverage and continuous integration pipelines. In a mid- to large-sized company, this can cause rapid growth in many interdependent services, tooling pipelines and third-party integrations.

Environments where microservices start to thrive are much more akin to biological processes. The traditional waterfall approach requires that components be laid out in a linear fashion, often with rigorous design documentation, with each component being completed and tested before dependent tasks can begin. This type of process was essential to ensuring quality and minimizing delays.

Back in the era of AS/400s and mainframe computers, small mistakes were not easy to undo, and could lead to delays of tens of thousands of man-hours and millions of dollars. Even today, hardware still needs to meet strict requirements. In 1994, the Pentium FDIV bug, which affected floating point division, led to a recall with an estimated cost of $475 million USD4. In more recent years, Intel has been hit by numerous security concerns over side-channel attacks, including Spectre and Meltdown. Mitigating these attacks in software can lead to considerable performance issues in some workloads5. In the world of physical engineering, adhering to strict approaches, and heavily testing any new ones, is essential. Any oversight can be potentially disastrous, as with the 2018 pedestrian bridge collapse at Florida International University6, or the ongoing grounding of the Boeing 737 MAX 8.
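To make the FDIV bug concrete, the widely circulated test for it divided two specific constants and checked the remainder identity x − (x / y) · y, which should be essentially zero. This is a sketch of that check, not code from the original article:

```python
# Widely circulated test division for the 1994 Pentium FDIV bug.
# On a correct FPU, x - (x / y) * y is essentially zero; flawed Pentiums
# returned a division result that was wrong in the affected bit positions,
# leaving a large nonzero residue.
x, y = 4195835.0, 3145727.0
residue = x - (x / y) * y
print(residue)  # ~0.0 on correct hardware
```

On affected chips the division itself came back visibly wrong, which is why a pure-software check like this was enough to detect the flawed silicon.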

There is a lot of software today written for business cases and non-critical systems. Software outages and failures could lead to a loss of money or convenience for some, but people won’t die if they can’t reach Instagram or YouTube for a day (although I’m sure some people wouldn’t shut up about it, and would become incredibly annoying to be around). These are the types of environments where microservices tend to incubate, grow and thrive.

From the Ground Up

Microservices tend to be built around organizational realities. Products are built at different speeds throughout a company, so it’s not uncommon for a service to handle multiple versions of a given message schema, ready for the time when other teams can transition. Political motivations and pet projects are things microservices usually work around, but they can also be used to introduce new technologies without interfering with current workflows. Sometimes services provide a level of redundancy, or at the very least, immutability. Some teams choose to never change a service once it’s deployed, instead simply deploying a new version and telling everyone to get off the old one. If a bad service loads two million records incorrectly, a team can often fix the service and re-queue messages in order to back-fill the data.
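The "handle multiple schema versions at once" pattern mentioned above can be sketched as a small normalizing shim at the edge of a service, so consumers keep working while producers migrate at their own pace. The message shapes and field names here are invented for illustration:

```python
def normalize_order(msg: dict) -> dict:
    """Accept both old and new message schemas, emit one internal shape."""
    version = msg.get("schema_version", 1)  # v1 messages predate the field
    if version == 1:
        # v1 carried the customer as a flat "name" field
        return {"order_id": msg["id"], "customer": msg["name"]}
    if version == 2:
        # v2 split the customer out into its own object
        return {"order_id": msg["id"],
                "customer": msg["customer"]["display_name"]}
    raise ValueError(f"unsupported schema_version {version}")


old = {"id": 7, "name": "Ada"}  # a v1 message, no version field
new = {"schema_version": 2, "id": 8,
       "customer": {"display_name": "Grace"}}
print(normalize_order(old), normalize_order(new))
```

Once every producer is on v2, the v1 branch can be deleted; until then the service tolerates both.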

“Everything’s a tradeoff … You might choose to build a new service, instead of fixing something that’s broken … that doesn’t maybe seem like a cost at first … maybe that seems like a feature. I don’t have to wade into that old code and risk breaking it. But at some point the costs of always building around problems and never cleaning up the old problems … starts to be a factor. And another way to say that is, you might trade complexity for politics … instead of having to have a maybe awkward conversation with some other human beings … this is really easy to avoid if you can just write more software…that’s a weird property of this system…“ -Matt Ranney, GOTO 20163

It’s not that microservices can’t be built to be resilient, but in an environment filled with services, where everything is now a remote procedure call into a complex system, they have to be built well. Resiliency, handling bad data and monitoring all must be built into each service in order for the entirety of the business process to be reliable.
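Since every call is now a remote procedure call, each service needs its own resilience plumbing. A minimal sketch of one such piece, retry with exponential backoff around a flaky call (`call_remote` is a stand-in for a real network call, not an API from the article):

```python
import time


def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying transient failures with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if i == attempts - 1:
                raise  # out of attempts; surface the failure
            time.sleep(base_delay * (2 ** i))  # back off: 10ms, 20ms, ...


calls = {"n": 0}

def call_remote():
    """Simulated flaky dependency: fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"


print(with_retries(call_remote))  # succeeds on the third attempt
```

In practice this sits alongside timeouts, circuit breakers and monitoring; retries alone just move the failure around if the downstream service is genuinely down.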

In all these situations, microservices evolved around systems that are developed very quickly. They tend to grow from breaking down larger monolithic applications into their core components. You should never start with microservices. When a newly hired principal engineer comes in from a microservice shop, they may be tempted to immediately build small modules in individual repositories with empty stubs everywhere. This is a terrible idea.

You should always start with a monolith. Ensure that it is well tested, well designed and developed through several iterations by the core developers. Trying to start with microservices instead leads to changes being merged into dependent projects just to increment a version number, so that those features become available to downstream projects. It turns into a fragile mess of empty stubs, missing documentation and inconsistent projects, instead of the strong, independent (yet potentially redundant) series of systems that comes from a more natural evolution of the software.
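One way to keep the later split cheap is to draw the boundary as an interface inside the monolith, so extraction only swaps the implementation for a remote client without touching callers. A hedged sketch of that idea, with invented names (`BillingApi`, `checkout`, etc.):

```python
class BillingApi:
    """The boundary: callers depend on this contract, not a transport."""
    def charge(self, user_id: str, cents: int) -> bool:
        raise NotImplementedError


class InProcessBilling(BillingApi):
    """Today: lives inside the monolith as an ordinary module."""
    def charge(self, user_id, cents):
        return cents > 0  # stand-in for real billing logic


class RemoteBilling(BillingApi):
    """Tomorrow: a microservice client with the same contract."""
    def __init__(self, call):
        self._call = call  # e.g. an HTTP client; stubbed here

    def charge(self, user_id, cents):
        return self._call("POST /charge", {"user": user_id, "cents": cents})


def checkout(billing: BillingApi) -> bool:
    """Caller code is identical either way."""
    return billing.charge("u1", 500)


print(checkout(InProcessBilling()))
print(checkout(RemoteBilling(lambda path, body: body["cents"] > 0)))
```

The iterations inside the monolith are what tell you where boundaries like `BillingApi` actually belong; guessing them up front is the fragile-stub trap described above.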

Caveat

I want to make the disclaimer that I’m not equating the complexity of microservices to actual biological systems. I’m simply using biology to the extent it was used in the aforementioned PNAS paper, which made comparisons between cell regulation and the monolithic Linux kernel1. In Ierymenko’s article on artificial intelligence, he makes the argument that neurons can’t be modeled as simple circuits or closed-form equations2. The following image shows the gene regulatory network of E. coli (left), a literal poop microbe. Compare that to a partial gene regulatory network of a human cell (right), which is important for understanding variability in cancer7.

Regulatory network of E. coli (left) compared to a subset of a regulatory network in a human cell (right)

Actual biological systems are insanely complex. Despite decades of work, we’ve barely scratched the surface of understanding gene regulatory systems. In this context, the comparison I’m showing simply makes for a fun, and hopefully useful, analogy.

Conclusions

Microservices can be done right, or rather, after systems evolve at an organization, some teams can have really good, well-thought-out services with large numbers of unit and integration tests. Yet they are still the ultimate product of an often weird evolutionary process, one that tends to be muddled with technical debt, company policies, legal requirements and politics. People who try to start with a microservice model are asking for a world of pain and hurt. Good microservices come from using the foundation of well-written monoliths as a template for splitting out and creating smaller components. You don’t build a city out of molecules. You have several layers of abstraction in place so you can build with bricks, structures and buildings.

Critical software is like building a bridge, where engineers attempt to think out each component and its integrations in their entirety. Mistakes in the software of someone’s pacemaker, or in the safety system of a vehicle, are bugs that can literally never be recovered from. In contrast, biological organisms tend to have static and unchanging components that perform a discrete set of tasks. Although susceptible to random mutations, the individual parts of an organism are vying for the best fitness in a given environment.

Microservices are more akin to biological evolution, often more resilient to change and inconsistency, and built to handle interference from the outside world. But like biological organisms, they are also complex, susceptible to environmental changes, disease and outside factors that can cause them to fail. Like a degenerative disease or cancer, they may have failures that propagate slowly or silently, in ways that are incredibly difficult to track down, diagnose and fix.

  1. Yan, Fang, Bhardwaj, Alexander and Gerstein. “Comparing genomes to computer operating systems in terms of the topology and evolution of their regulatory control networks.” PNAS, 18 May 2010.
  2. Ierymenko. “On the Imminence and Danger of AI.” 16 March 2015. (Archived)
  3. Ranney, Matt. “What I Wish I Had Known Before Scaling Uber to 1000 Services.” GOTO 2016, 28 Sept 2016. (Video)
  4. Nicely. “Pentium FDIV flaw FAQ.” 19 August 2011. (E-mail / Archived)
  5. “Meltdown and Spectre.” Graz University of Technology, 2018. Retrieved 23 December 2019.
  6. Marshall. “The Ordinary Engineering Behind the Horrifying Florida Bridge Collapse.” Wired, 16 March 2018.
  7. “Vast New Regulatory Network Discovered in Mammalian Cells.” Bioquick News, 14 Oct 2011.

Source: Microservices and Biological Systems

London Space Launch – Systems Innovation – March 18 6:30-9:00pm Free (registration required)

 

Source: London Space Launch – Systems Innovation

This event is going to be a very special date in the development of our community as we launch the first of our localized groups here in London, UK. This space will be home to regular events including presentations and networking for those interested in systems thinking and systems change.

At this opening event in mid-March, Joss Colchester will give a presentation on the foundations of systems thinking, complexity theory, systems innovation and their relevance to the complex challenges faced by organizations today. This will be a “getting to know you” event where members will have the opportunity to introduce themselves and say a little about their interest in systems thinking.

More details about the event will be posted here closer to the date. If you wish to receive email notifications about events at this location, simply become a member by choosing it as an option during the registration process, or update your profile if you are already a member.

Source: London Space Launch – Systems Innovation
