Jacob Welsh: Where do you think the Worse is Better approach falls between the two poles you describe? The current state of things seemed to me largely the inevitable result of the subsequent two decades' continuing dominance of exactly Gabriel's recipe, and for the reasons he described. Perhaps on the first take I confused it with "being practical"; and I found it repulsive although I could not refute it. Kind of like the "network effects" thing (looks like the continuation of that thread was shortly after the Young Hands closure, perhaps I should bridge it in).
Diana Coman: Reading that Gabriel piece you linked, I admit that.... hm, I found it quite painful to read because I can't really stand *either* of the approaches he describes. I suppose it can be simply because I'm actually more extreme than all extremists (only I care less to shout/argue about it really) but I suspect the difference really is one of more fundamental approach. I might need to give it some more thinking and a write-up to make it clear.
This is part of the write-up promised above. That suspicion of mine turns out to be quite true (and the supposition possibly not entirely false either!) - only it took first of all a lot more (re)reading of Richard P. Gabriel's books and references1 before I could start to get even as little as a glimpse of what that difference might be, as it goes deep indeed and it spreads also into a whole web of roots before any of it shows up in the open, directly exposed to be looked at.
In a very compressed nutshell that threatens anyway to spill and grow not into a single tree but into a whole jungle at the very least, I think that software is essentially a cultural artefact rather than an industrial product, and this means that both its impact and its generative process are intertwined and very far indeed from being adequately captured by any linear (whether iterative or not, piecemeal or not) model of mere patterns, recipes, methodologies or even neatly predictable at every step frameworks. Essentially, the generative process is fractal in nature. As a result, approaches don't have just a straight line on which to position themselves, and steps that might seem wildly different when looked at from close up or over a short-term interval can nevertheless converge predictably to results that share crucial characteristics, to the point where their differences - such as might still be found at a very close look - are in fact irrelevant.
Software being a cultural artefact goes both ways and in quite the precise manner, too - just as software impacts and transforms the wider social environment of its time, it is also influenced and constricted by it so that the observed patterns or shapes of successful software or even those of its development process are more likely to reflect and be restricted by wider patterns than merely those of the domain itself, let alone even finer grained ones such as those supposedly imposed through direct management. The exact way in which the broader patterns get reflected and find their expression in the software domain is not random either, nor is it based on either direct copy or just a broad and imprecise, seemingly haphazard similarity. Instead, the similarity is that of the fractal nature2 and as such, any attempt to convincingly model (and thus predict correctly) software development in a lasting way will have to find the correct mix of chaos and order or perhaps, at the very least, some sort of clever distillation of its core into something useful, a la DS.
There's a big gap indeed between patterns, methodologies and even complex industrial processes on one side and fractal modelling (such as it exists currently) on the other. Perhaps more of a chasm even rather than a mere gap - it is after all nothing less than moving from the neatly intuitive world of low integer dimensions to the bewildering world of non-integer dimensions. This is at the very root of why I perceive both "the right thing" and the "worse is better" approaches3 as merely variations of the same failure mode rather than alternatives in any meaningful sense: they both have in common an ultimately sterile, too narrow, at times even too linear focus, be it on some illusory ideal correctness in the first case or on some equally illusory ideal simplicity in the second case. The existing attempts at improvement, even when they go beyond the strict linearity of early models, still fail to address the root cause because their informing worldview conceives only of integer dimensions. And all the while, life teems instead beyond the decimal point.
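For readers who haven't met non-integer dimensions before, here is a small, self-contained illustration (my own sketch, not part of the original article): the "chaos game" generates the Sierpinski triangle from random moves under a fixed contraction rule - precisely a mix of chaos and order - and a box count then recovers its non-integer dimension, theoretically log 3 / log 2, roughly 1.585.

```python
import math
import random

def chaos_game(n_points=200_000, seed=42):
    """Generate points on the Sierpinski triangle via the 'chaos game':
    repeatedly jump halfway toward a randomly chosen vertex. Random
    choices (chaos) under a fixed rule (order) settle onto a precisely
    structured fractal attractor."""
    rng = random.Random(seed)
    vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
    x, y = 0.25, 0.25
    pts = []
    for i in range(n_points):
        vx, vy = rng.choice(vertices)
        x, y = (x + vx) / 2, (y + vy) / 2
        if i > 100:  # discard the transient before the point reaches the attractor
            pts.append((x, y))
    return pts

def box_dimension(pts, k):
    """Box-counting estimate: cover the unit square with a 2**k by 2**k
    grid, count the occupied boxes N and return log N / log 2**k."""
    side = 2 ** k
    boxes = {(min(int(x * side), side - 1), min(int(y * side), side - 1))
             for x, y in pts}
    return math.log(len(boxes)) / math.log(side)

pts = chaos_game()
# At moderate grid resolutions the estimate lands between 1 and 2,
# i.e. strictly beyond the decimal point as dimensions go.
est = box_dimension(pts, 6)
```

The point of the sketch is merely that neither pure order (an integer-dimensional grid) nor pure chaos (random noise fills the plane, dimension 2) produces such a structure; only their combination does.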
From a more practical point of view, the narrow focus on an ideal, theoretical correctness fails in the marketplace for the very reasons that Gabriel identifies from his direct experience4 and I'll add to that only one piece of my relatively recent experience with just one instance of why and how theoretically wonderful ideals that are however practically untenable can literally turn even a valuable concept into a worse than useless software product. Note though that the trouble doesn't stem from *correctness* itself but from a narrow and too rigid focus on it that fails first of all to distinguish what truly matters from what doesn't and then to take into account the much more complex and less ideal constraints that real world use imposes nevertheless. This is why I consider this approach simply misguided at core - it's correctness wrongly applied that ends up sterile, not correctness per se.
Essentially, the narrow focus on a theoretical correctness, sometimes dubbed "the right thing" approach, chases the chimera of perfection as if perfect can ever apply to a live (hence, evolving) artefact existing in the real world. Like all products of the imagination though, such a chimera can indeed be very attractive to contemplate or even to work towards, since it has, for as long as it lasts, the powerful allure of the ideal compounded by the unlimited freedom of imagination to pick and choose what it likes, to ignore what it doesn't want to handle. And it can certainly be made to seem entirely real and worth pursuing for a while, usually while there are still resources coming in from outside to support the chase. Sooner or later though, chaotic reality with its various messy parts including all sorts of different and even conflicting requirements and priorities catches up with it and the ideal turns out to be indeed neither possible in reality nor quite that easily adaptable at this juncture to it. As a result, whatever comes out of this approach tends to fail to thrive and survive (meaning further evolve rather than merely existing), dying perhaps as a result of an overdose of order and structure or, looking at it from the other side, starved for lack of chaos.
If, however, one is so dispirited by such failure or so hungry for the sheer vital force that is indeed plentiful in chaos to abandon order and structure entirely and let chaos take full hold, the result is but a primitive beast, the most pedestrian interpretation of the "worse is better" argument. Such beast as might come out of chaos without much structural guidance at all is undeniably alive and even quite fascinating perhaps, especially for as long as it's still small enough so that some amount of taming is still possible and thus some use can be extracted out of it. But its chaotic nature means first of all that, as it gets bigger, it becomes increasingly difficult to control or handle at all, let alone grow or even merely steer in some specific direction that doesn't happen to match its inclination5.
As internal control becomes increasingly (and even non-linearly so) difficult, the focus necessarily shifts to controlling the environment instead and this is indeed more easily achieved at this juncture, never mind that such control spells death in the longer term6 as it stifles and drains any value out of the environment instead of adding to it. The money accumulates just as the vitality drains7 and there is then no choice but to press forward and keep getting bigger through destruction of anything and everything else, through the consumption of anything that exists or that still finds a way to grow on its own8. Instead of creating anything, the overgrowth of internal chaos and external control merely sterilizes and otherwise consumes - currently this consumption process is generally dubbed "acquisition" and it usually targets startups that supposedly win whatever is paid for them but lose in fact not just any value that they had but even the very possibility of having a value or just a different approach of their own at all. Perhaps it even is a good deal for the startups or at least for those that never thought or stopped thinking that they had any other chance to matter, anyway.
Looking again at my notes from reading the "Patterns of Software" book and the set of "worse is better" essays by Gabriel, what strikes me most is how often my observations and questions either directly predict what turns out to follow a couple of paragraphs further or otherwise keep pushing for the broader context and at least a glimpse of a model of what is talked about, instead of just a collection of patterns. Arguably this is part and parcel of that fundamental difference of approach that I was noting above: I am always and forever apparently aiming to model something (meaning to understand and integrate it fully to the point that I can then reliably *predict* it, in other words generate it) and never fine with merely describing and cataloguing it, no matter in how much detail. At the same time, both Richard P. Gabriel and Christopher Alexander, the original architect of the patterns approach, seem to state repeatedly that it's precisely the "generative" approach that they are aiming for and fail to achieve satisfactorily in practice.
Perhaps I'm missing something (and especially for Alexander's work, there is still a lot of reading that I need to do before I have something more specific to say about it) but to my eye there's quite a big and obvious missing link between describing something as observed (even across many different instances in various environments) and modelling successfully its underlying, generative process. Extracting the common observed core is one approach useful for generalization, certainly, but I don't think it's all that often enough for making that next step of modelling the process that resulted in the observed artefacts. Possibly the focus on descriptive patterns identification, definition and matching is in itself a mark of Gabriel's background in AI but to me it seems yet again uncomfortably close to that overspecialisation and overfitting, too narrow redefinition of success even, that I railed against and refused to go with, not so very long ago, despite any and all "common consensus" or "how things are done" or even apparent simplicity considerations. That choice of mine was extremely satisfying, very fruitful both in terms of concrete results and as a learning endeavour9 and the parts publicly documented on this blog are just that, only parts of a much bigger, working whole that exhibits precisely the sort of quality that proves so elusive (for a good, underlying reason, I'd say) when following the pattern-matching branch: it's generative and fully capable of carrying meaning too, when embedded in an environment designed on the same fundamental principles. This is neither an accident nor specific to computer graphics or online games but flows directly from what modelling is and how it works.
Coming back to the question that prompted this article, if neither the chimera of perfection nor the beast of worse truly appeals, the unnamed other is yet to fully spring out from that very nutshell I mentioned earlier. Perhaps a more palatable (hence, practical) description of what that nutshell holds is that software requires a discerning, integrative and generative approach rather than a narrow focus. Discernment is crucial as it's the only way to identify what matters and without such a correct identification, everything that follows is already doomed to fail. Integration is required for growth, properly speaking, and it should be a flexible rather than fixed approach, bringing together different parts that maintain their character and their own capability to further evolve, quite a different thing altogether from the rigid and constricting approach of separated objects, containers and the like, while also, at the higher level, very different indeed from that devouring approach of "acquisition". Generative means that the whole maintains at all times that balance between sufficient structure to support and enough chaos to enable new and even unexpected things to appear. These are not simple requirements and they don't work in isolation either. This is why I think that what is needed is not just some new development method as such but an environment to match, with a guiding, principled structure that supports and even causes as wild a variety as possible, followed then by selection and filtering. Note though that such selection and filtering are not meant to be restricted to the usually binary choice of yes/no, dies/lives. Selection and filtering should allow and even mean the much more variety-friendly choice of continuous readjustment of relative positions and relationships.
On the very bright side, I think that the necessary parts for all the above are in fact already available and what it takes as next step is putting them all together and to practical use. Hence my previously unexamined calling of that unnamed other approach that I've been already using in practice for quite some time as simply... being practical10.
Yes, I was already aware of his most famous writings but the start of that awareness dates back to more than 10 years ago, when I was much younger and knew way less - and as I had encountered his writing mainly in a sort of second-hand way, through experience with approaches and results that claimed to follow his direction but that I found to be misguided at best (those catalogues of software patterns come first to mind and I don't even want to go further with that recollection), I never really saw the point in digging deeper to the roots that apparently yielded such rotten fruit. Quite the example of why and how exactly the wrong publicity really is worse than no publicity at all, because I think that his life experience is certainly valuable and despite being clearly, neatly and even entertainingly written and discussed in his books, it actually got more or less lost in the shadow of the sound bite that became... viral, indeed, quite ironically given its very topic. ↩
In any case, after a full read of his books on patterns and on open source, I'd say that one can perhaps disagree with his conclusions, interpretations and even approach (from what I gather even he still disagrees at times with himself!), as well as dismiss his advice or direction but only after taking in fully the solid and very relevant account of his experience in the field that both informs and makes his writing read as it does.
As possibly always and everywhere, the sound bite is neither enough nor useful by itself and trying to pluck the results of experience without digesting at least the available account of that experience is just another form of misappropriation that results inevitably in sad and rotten fruit. The responsibility for such rotten fruit though is not with the original writing but with those who picked only the viral part out of its context and ran then blindly with it.
It is "the" and not "a" fractal nature. A breadcrumb to mark the entrance to this particular rabbit hole is in a footnote to a very compressed review of a very compressed film aptly called La Grande Bellezza. And yes, I was indeed referencing exactly this, back when last summer ended. ↩
In fairness, from my reading of some of Richard P. Gabriel's books and essays as published on his website (Innovation Happens Elsewhere: Open Source as Business Strategy, Patterns of Software: Tales from the Software Community, Worse Is Better, Worse is Better is Worse, Is Worse Really Better?), it seems to me that he has been aiming and even advocating for exactly a less narrow approach to software development, not at all for what turned up in practice even while quoting his essay as inspiration or source. In other words and in reply perhaps to Jim Waldo's statement that "The classic essay on "worse is better" is either misunderstood or wrong", I think that the classic essay is mostly misunderstood -and even misappropriated, I would add- rather than wrong. ↩
Read the last two parts of his "Patterns of Software" book and you'll find it all there. ↩
I'm talking from direct experience with fully taming such a chaotic beast, not that I ever actually wanted to gain this sort of experience nor that it was what I had signed up for or even imagined programming might turn into. It's still quite real and if you think that it isn't so, I'm quite curious to hear about it so go ahead and leave a comment in that box below this article, please. ↩
There's a lot begging to be said on this point alone but for now I'll merely add a single breadcrumb picked more by serendipity than anything else: vitality can be very easily perceived as "too much to handle" and it's again a matter of and telling about the wider environment as a whole. ↩
This is how Microsoft ended up turning personal computers into a sort of semi-interactive television sets and Google ended up turning a vast library of knowledge into a giant advertisement panel and discussion of ideas into a mere popularity contest that bots are winning, too. ↩
Most of the past year's articles on this blog would fit as reference for this and they don't even capture it all, not even by far. ↩
On examination though, there certainly seems to be quite a lot more to this being practical of mine, since its attempted description doesn't fit even by far in all the 3000+ words of this article (without even counting the references) and there are also quite a few directions that I didn't even mention here, just to try and keep that nutshell reasonably compact and clear, hopefully. ↩