Links to all five parts of this piece — not all of which are out yet, but all will be available at these links as they are published ::: Part 1 ::: Part 2 ::: Part 3 ::: Part 4 ::: Part 5
As a reminder, I said the following in the previous introductory post, where I signposted the next few weeks’ worth of material that’ll be surfacing on these pages. Feedback and comments are extremely welcome!
…I wanted to briefly introduce the next few posts. Last autumn, after many months of stewing on ideas and/or wildly procrastinating, I finally sat down to write a dirty first draft of a lengthy new article taking a critical look at the epistemological arc of the sciences (my original home territory) through the memoiristic lens of autotheory. It’s going to be included in the second issue of the wonderful Ars Scientia, a new journal of scientific arts, which takes “axiomatics” as its theme. Since the article is a weird mix of history and philosophy of science, historical recollections, and post-disciplinary meaning-making, and also because it still needs a little something extra according to some critical feedback I received and agree with, I thought it might be a good place for us to start. It’s pretty long — the first draft was over 10k words, and the journal editors originally asked for 3-5k — so I’ll be splitting it into 3 or 4 digests and saving a final one for some additional rumination and riffing.
I also wanted to include an excerpt of the great motivational text that the editors of Ars Scientia posted on their website, to give a sense of where my piece is coming from.
Ars Scientia Issue 2 on Axioms
“It had been tactless of me to prove something on the topic of man – mathematically!”
Stanislaw Lem, His Master’s Voice, 1968

Axioms are “self-evidently true without proof.” In axiomatic systems, truth is preserved throughout the system as statements derive directly from the axioms. While the axiomatic approach can seem like a top-down monolith, in 1931 Gödel famously challenged the claims of metaphysical access through axiomaticity by demonstrating that within any such system there are true statements that cannot be proved. In addition to Gödel’s challenge, the history of axiomatic thinking also presents a multitude of traditions, concepts and crises.
For instance, automated theorem proving is a field of mathematics which attempts to automate mathematical proof-writing with computing. Confirming the validity of this method and achieving automation beyond first-order logic has been greatly challenging, whereas the methods of writing proofs by hand or with computerized (but not automated) proof assistants do not face the same scrutiny of “viability.” How does this history of mathematics relate to other interests more broadly?
In the first issue of Ars Scientia, we took the lead from Goethe's Metamorphosis of Plants as a study in the systematization of the relation between observer and phenomena, a relation that makes up a blueprint of the simultaneity of aesthetic practice and scientific inquiry. In the upcoming second issue of Ars Scientia, we are interested in what happens when, as Thomas Moynihan writes, “this plenty [of the organic world] and progressivism becomes divorced, decoupled from, devoid of purpose? What happens when the artisan recedes from the picture—as the century following Goethe witnessed unfold?” (2021, Philosophical Life of Plants). We want to extend this conflict beyond the discipline of mathematics or the illustration of nature and towards synthetic practices between the artistic and the scientific.
Are our axioms so silent as they might appear? What is formalization, systematization when devoid of purpose? What does it mean to reintroduce such a purpose? What are the possible axiomatics that speak from below and beyond the mathematical impetus of abstract proof? Where is the hand? Can the silent speak for the quivering?
Axiomatic Realism I: on the Fence of the Pure Image
The Foreclosure of Subjectivity
Fractals don’t care about your feelings
Systems disregard their externalities
Formalisms cannot reason about the real.
Cryptographic Necroprimitivism, 0xSalon, 2020
As the project of veracity lurches from crisis to crisis, we might ask ourselves: was it not always so? It seems uncontroversial to posit that, today, we are woefully mired in relativisms of all flavours, ranging from the phenomenological, to the moral, to the aesthetic. That said, the ‘existential threat to truth’ is ultimately and necessarily epistemic in nature, caught in a pincer movement between the most unusual of allies.
Whereas one flank of this unholy amalgam is the incumbent doctrine of the scientific method — concerned with the processes and relationships of hypothesis, simulation, measurement, repetition, verification, and peer validation — the other is that which does not seek legitimacy in abstract formalisms or evidence at all, instead finding refuge in the extra-mundane, divine, and esoteric: spiritual believers, conspiracy theorists, “___ truthers”. These days, even fervent believers in the potency of the academy are somewhat obliged to accept the limits of knowledge, most noticeably in ‘pure scientific disciplines’, which have for the longest time gained their identity, gravitas, and relevance by way of a dogmatic absolutism. Here it will be argued that axiomatic thinking is a false God — a demiurge in the Gnostic sense — providing materialistic sensations of substance and sustenance, but only capable of delivering a pale and banal virtual senescence: a place where systematic thought goes to die, yet can never escape purgatory. Caught in Dante’s Entfernung, in anticipation of the fulfilment of the prophecy of prophetic wisdom itself. Some sorrow for saṃsāra, tomorrow.
Pax Scryptographica
نَظير ⇌ 𝝖 ⇌ आकाश ⇌ 𝝮 ⇌ سمت الرأس

Angelic stria intersect at heavenly incidences, mapping the frontiers of virtuous topologies. A Euclidean sacrament that transmutes algebra into liturgy. Formal expressions are derived, within which the principal components of faith are (k)not constant. Illuminated homeomorphisms: the preaching of sanctum bellum; the execution of symbolic closures. Heavenly trigonometriarchs elicit cosmognostic conjugations of constellations: divining nadir and zenith of the fault lines of desire. Before everything, Alpha and Omega beget Akasha.
At the risk of overly speedrunning the formalities of formality, there is a well-worn path from the earliest surviving traces of axiomatic reasoning, which typically proceeds as follows. Ostensibly as a continuation of the Platonic school, Euclid’s Elements posited a compact set of fundamental axioms describing the geometric relationships of points, lines, and planes in rectilinear space of varying dimensionalities. Euclid lived in approximately the third century BCE, teaching in Alexandria during the Ptolemaic Kingdom in present-day Egypt. Elements was, according to varying accounts, part original synthesis and part compilation of the works of earlier Greek thinkers, including Hippocrates of Chios and Theudius, bringing about the enduring motifs of the so-called Platonic Solids. It is uncertain how much credit should be given to Euclid for the concretisation of this framework, as some scholars, including Árpád Szabó, suggest that pre-Euclidean mathematicians — including the Pythagoreans — were already employing formal deductive reasoning as a counter-proposal to the Eleatic dialectics of Parmenides and Zeno. As time passed, and technologies for the mass-production of literature proliferated, Elements was reissued countless times (in the hundreds at the very least), and doubtless the work of later scholars was also included in these ever-expanding compendiums.
The Dawnward Spiral
1 ⇌ ∥ ⇌ ☩ ⇌ 𒀪 ⇌ ☪ ⇌ ∅ ⇌ ٥ ⇌ 0

In the beginning, the binding of light and the bending of space heralded the corruption of cubiquity. Deviancy from the Chosen Way: the straight and narrow path charted by Descartesans. The Absolute, fracturing into horizons with no exit. The solvation of salvation, precipitating newfound nonlinear equilibria with the canonical Christallographic forms.
Many thinkers doubtless proposed, tested, and otherwise contributed to a continuous current of axiomatic inquiry from the Euclidean era until ‘modernity’, taken here as emerging between the European Renaissance and the Enlightenment. In our haste, we must not leave al-Khwarizmi out of the picture: born in lands which constitute present-day Uzbekistan and Turkmenistan, and based at the House of Wisdom (بَيْت الْحِكْمَة) in Baghdad in the early 9th century CE, Muhammad ibn Musa al-Khwarizmi compiled Al-Jabr — the etymological origin of today’s algebra — providing the first systematic treatment of the solution of linear and quadratic equations.
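To give a flavour of that systematisation, here is the worked example most often quoted from Al-Jabr, rendered in modern notation (the original was entirely rhetorical, stated in words rather than symbols). For the ‘squares and roots equal numbers’ case, completing the square gives:

$$x^2 + 10x = 39 \;\;\Rightarrow\;\; (x+5)^2 = 39 + 25 = 64 \;\;\Rightarrow\;\; x = 3,$$

with the negative root set aside, since al-Khwarizmi’s geometric method admitted only positive magnitudes.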
In the midst of the seventeenth century, René Descartes brought geometry and algebra together into the rectilinear analytic framework of what we now refer to as Cartesian coordinates. In addition, Descartes is credited with an attempt to axiomatise subjectivity itself, encapsulated in the phrase most commonly rendered as je pense, donc je suis or cogito ergo sum: I think, therefore I am. We should take heed that the final form of this proposal, included in his posthumously published The Search for Truth by Natural Light, is in fact dubito, ergo sum, vel, quod idem est, cogito, ergo sum: I doubt, therefore I am, or, what is the same, I think, therefore I am. This equivalence between thought and doubt unwittingly heralded the beginning of the end of the axiomatic project. Over time, the doubt that Descartes expressed has washed away, leaving only thought: criticality superseded by an ostensible rational neutrality.
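The force of the Cartesian move is easy to state: geometric figures collapse into algebraic statements, and vice versa. A circle of radius $r$ centred at the origin, for instance, is nothing more or less than the solution set of

$$x^2 + y^2 = r^2,$$

so that questions about curves become questions about equations, tractable by the symbolic machinery that al-Khwarizmi and his successors had built.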
The interplay between rigid formalisms providing certitude at the expense of verisimilitude, and the fuzzy, incommensurable, and contingent, would also be the later battleground between the 20th century paradigm of quantum mechanics and its predecessors. Quantum mechanics was developed in order to provide a system of knowledge that would address the ontological insecurity of classical (“Newtonian”) and relativistic (“Einsteinian”) physics in the domains within which they were demonstrably incapable of providing reasonable predictions or explanations for phenomena at the photonic, atomic, and molecular scales. Quantum systems have a restricted set of possible states of existence, and moving between those states requires particular ‘quantised’ transitions. In the atomic frame of reference, newfound insight into electronic structure from theories such as the Pauli Exclusion Principle led to a long-overdue systematic understanding of how the fundamental characteristics of the quantum states of atomic constituents (protons, neutrons, and electrons) gave rise to the elements which constitute complex matter. In essence, the chaotic mess of chemistry became explainable through the variation of a handful of quantum properties. Another early success of quantum theory, this time concerning light, was an improved understanding of black body radiation, as Max Planck sought to overcome the so-called Ultraviolet Catastrophe: the failure of classical theory, whose predictions diverge from measured radiation at the short wavelengths of the ultraviole(n)t region of the electromagnetic spectrum.
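In modern notation, the contrast Planck resolved can be written down directly. The classical Rayleigh–Jeans law and Planck’s quantised replacement for the spectral radiance of a black body at temperature $T$ read:

$$B_\lambda^{\mathrm{RJ}}(T) = \frac{2 c k_B T}{\lambda^4}, \qquad B_\lambda(T) = \frac{2 h c^2}{\lambda^5}\,\frac{1}{e^{hc/\lambda k_B T} - 1}.$$

The classical form diverges as $\lambda \to 0$ (hence ‘catastrophe’), while Planck’s expression, obtained by assuming that energy is exchanged only in discrete quanta $E = h\nu$, remains finite and matches observation across the spectrum.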
The Perversion of the Curve
A debased, naked derivative. A future, with nothing underlying. A promise of endless renewal. Prescience without presence. An acceleration beyond finitude, surpassing luminal and libidinal limits. A proselytism without prophylaxis. Sickness of the cyclical, nonlinear kink, hyperbolic seduction. The subsumption of creation and destruction into symbolic relativistics. The Denizens of the Dysgraphsful, the Fallen Angles, the Godsphereing. Some sorrow for saṃsāra, tomorrow.
In the mid-1920s, the first generation of quantum theory was superseded by wave-mechanical formulations made possible by the Schrödinger equation: a partial differential equation capable of describing the form of the probability waves of a quantum system. de Broglie’s theories of duality sought to rationalise why light and matter could seemingly possess wave-like or particle-like characteristics, depending on the context. Heisenberg’s Uncertainty Principle posited limits of precision in the simultaneous measurement of paired quantities such as the position and the momentum of a particle. With this, the nature of unknowability at the most fundamental scales became apparent. The Planck time and Planck length also enforced apparent limits on the resolution of the knowable. Quantum entanglement appeared to strain a bedrock assumption of our understanding of the Universe (that the speed of light is inviolable), while the observer effect suggested that the subjectivity of a conscious agent could cause ripple effects far beyond their comprehension.
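For concreteness, the two statements at the heart of this paragraph can be written compactly. The time-dependent Schrödinger equation and the Heisenberg uncertainty relation are:

$$i\hbar\,\frac{\partial}{\partial t}\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t), \qquad \Delta x\,\Delta p \geq \frac{\hbar}{2}.$$

The first governs how the probability wave $\Psi$ evolves under a system’s Hamiltonian $\hat{H}$; the second places a hard floor on the joint precision with which position and momentum can ever be known, independent of instrument quality.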
In the post-quantum age, the fuzziness of unknowability metastasised beyond the realm of particle physics. Kurt Gödel’s 1931 incompleteness theorems demonstrated that there are limits to provability in formal axiomatic theories: most concretely, that no consistent axiomatic system capable of expressing basic arithmetic can prove all of the true statements expressible within it. Furthermore, such a system cannot even prove its own consistency. Building upon Georg Cantor’s work with set theory and the transfinite, Gödel was in turn followed by Alfred Tarski's theorem on the formal undefinability of truth, Alonzo Church's proof that David Hilbert's ‘decision problem’ is unsolvable, and Alan Turing's demonstration that no algorithm can solve the halting problem, i.e. decide whether an arbitrary program will ever finish running. The lunatics had taken over the epistemic asylum.
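Turing’s result, at least, compresses into a few lines of deliberately self-defeating code. What follows is a minimal sketch in Python; the halts function here is hypothetical by construction, since the theorem is precisely that no correct, always-terminating implementation of it can exist.

```python
def halts(program_source: str, program_input: str) -> bool:
    """Hypothetical halting oracle: returns True iff the program given by
    `program_source`, run on `program_input`, eventually halts.
    Turing's theorem: no total, correct implementation can exist."""
    raise NotImplementedError("impossible by the halting theorem")


def spoiler(source: str) -> None:
    """Diagonal gadget: does the opposite of whatever `halts` predicts
    when a program is fed its own source code."""
    if halts(source, source):
        while True:   # predicted to halt, so loop forever
            pass
    else:
        return        # predicted to loop, so halt immediately


# Let SPOILER_SRC denote the source code of `spoiler` itself. Then
# halts(SPOILER_SRC, SPOILER_SRC) is contradicted by either answer:
#   True  -> spoiler(SPOILER_SRC) loops forever, so it does not halt;
#   False -> spoiler(SPOILER_SRC) returns, so it does halt.
# Hence no implementation of `halts` can be both total and correct.
```

The same diagonal manoeuvre animates Gödel’s and Tarski’s results: any system expressive enough to talk about its own sentences can always be handed a sentence that outruns it.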
Concurrently with Gödel’s work on incompleteness, epistemology and philosophy of science re-entangled as Popper developed his theories of falsification, intended to spur improvements in experimental design and, ultimately, to distinguish between ‘scientific’ and ‘non-scientific’ theories. Popper’s position was that no amount of confirmatory testing could verify a universal theory; only disconfirmatory tests carry logical force, since a single counterexample suffices to falsify. In essence, verification through provable veracity was no longer an option, even as a mere desideratum. The very notion of truth had exited the building. Later in his career, Popper strove to go beyond Descartes in his attempt to systematise both thought and the products of it — institutions, inventions, art, laws, and so on — in a totalising, if crude, sort-of-object-oriented-ontology: his Three Worlds.
Responses to falsification theory came from several directions, most interestingly for us in treatments by Thomas Kuhn and Paul Feyerabend. Their respective texts ‘The Structure of Scientific Revolutions’ and ‘Against Method’ are both to be revisited later in this text. Kuhn’s book was published in 1962 and Feyerabend’s thirteen years later, and though there are many commonalities, the differences in position are more pertinent for our purposes. Kuhn introduced the now-ubiquitous notion of the ‘paradigm shift’ as the mechanism by which progress occurs, as successive generations of incumbent scientific theories are overthrown and replaced by ideas which were initially considered fringe, gauche, or even heretical. It’s important to focus on the social aspect of this process: ultimately, the scientist must also be an influencer, not only performing the scientific method but also campaigning for the acceptance of the upstart paradigm. Kuhn saw competing paradigms as incommensurable with each other, requiring revolutions rather than the incremental or reformist intra-paradigmatic ‘normal science’ to make meaningful progress. For Kuhn, paradigms are the lenses through which researchers see the world and, perhaps more importantly, the limits of what can be seen. The paradigm was conceived as the default epistemological tool of world-building within the institutions of science.
Feyerabend’s ‘Against Method’ was conceived as a counterpoint to his contemporary Imre Lakatos’ planned defence, ‘For Method’, with both to be published together in a single volume. Lakatos died after Feyerabend wrote his part but before producing his own, so ‘Against Method’ was ultimately published by itself. In it, Feyerabend’s performative and provocative approach emboldened him to adopt extreme positions, which he might not necessarily have held with any real conviction, but which were nonetheless interesting to explore. Just as an epistemic trickster might. Feyerabend called for the ousting of the scientific method itself, advocating epistemological anarchy and a return to subjectivity. There was no longer a place for universalism or dogma, and all knowledge was contingent and incommensurable: in his words, “anything goes”. Feyerabend drew from the Dada movement which, in some senses, advocated a violent return to anarchy.
Where Kuhn minimised the scientific method, Feyerabend wanted to abandon it altogether, instead centring scientific genius as the driving force of progress within this deep and radical subjectivity. Feyerabend attacked Kuhnian ‘normal science’ as the championing of drudgery, as opposed to creating the conditions for the radicals and geniuses to flourish. If axioms and paradigms were the doctrinal scaffolds — the timber frames and brick walls, if you will — of the Church of Science, Feyerabend was an epistemic arsonist. If there is no paradigm, then the Kuhnian notion of progress seems limiting at best, and actively harmful at worst. Toxic positivity…or should that be toxic positivism?
Here concludes our review of some two thousand three hundred years of certitude, ensconced between the logical incompleteness identified by Gödel and the epistemological inadequacies identified by Feyerabend, in a benign but seemingly inescapable subjectivity coated with the thinnest veneer of rationalist worlding. Some time later, around the turn of the millennium, your unwitting (and perhaps unreliable) narrator enters the scholarly stage as a fresh-faced, naïve, and overly idealistic young buck/fool in the stale and starchy worlds of photophysics and spectroscopy. These two connected analytical fields straddle theory, experiment, and computational simulation, spanning physics, chemistry, the life sciences, engineering, and astronomy.
That’s it for the introductory section; next up is a story from my PhD years, starring section headings such as Hyperformalisation: is X a branch of pure mathematics?, Worse Things Happen at c, and A Gödel That Does Not Play Dice.