Thursday, November 20, 2025

Bishop George Berkeley – What, in God’s name, is an infinitesimal?

“These quantities seemed like ghosts, phantasms that were somehow necessary to calculate derivatives but vanished the moment they had served their purpose. Berkeley coined the phrase, ‘ghosts of departed quantities,’ and it stuck.”

Dr Colin WP Lewis

For a realistic introduction to the great philosopher of science, and mathematician, George Berkeley, see my (Damien Mackey’s) article:

Common sense philosophy of Irish Bishop George Berkeley

Meanwhile, Dr. Colin Lewis wrote this on George Berkeley in 2024:

The Bishop Who Hated Infinitesimals (and Why It Matters)

Bishop Berkeley and why we should fix AI’s Black Box

….

Bishop Berkeley’s work is a strong reminder for AI developers and society. Many of us know about the Newton-Leibniz feud, that bitter quarrel of the calculus pioneers: who invented it first, who borrowed from whom, and who ultimately deserved the glory. But behind this main act was a peculiar side story, one that features neither Newton's gravity-defying apple nor Leibniz's elegant notation. This is the story of George Berkeley, a philosopher-priest who … took direct aim at the very core of mathematics and may have asked a simple yet devastating question: What, in God’s name, is an infinitesimal? The answer, he implied, was that there really wasn't one, not in any meaningful, provable way.

Who Was This Bishop, Anyway?

George Berkeley was not, by any stretch, your typical critic of mathematics. He was a philosopher, first and foremost, and he wore the robe of an Anglican bishop in a corner of Ireland not exactly famous for producing math prodigies. Berkeley lived in a world shaped by the Enlightenment, an era when reason and empirical science were beginning to challenge religious and traditional authorities. Newtonian physics was rapidly becoming a kind of intellectual gospel, a framework through which one might explain both falling apples and the movement of stars. The rise of empirical methods and a mechanistic worldview presented a direct challenge to metaphysical and theological explanations, creating a cultural backdrop in which Berkeley's critiques took on profound significance.

Yet Berkeley's own interests were less about the celestial and more about the nature of knowledge. He was deeply concerned with how we know what we know, not just the content of our knowledge. And when he took a long, hard look at Newtonian calculus, he saw a big, glaring problem.

To Berkeley, calculus appeared to rest upon shaky conceptual ground. The mysterious infinitesimals, those infinitely small quantities that appear in the numerator or denominator only to vanish without a trace, bothered him. For example, in early calculus one might consider the derivative of a function as the ratio of two infinitesimally small changes in the variables, such as Δy/Δx, with Δx approaching zero. These quantities seemed like ghosts, phantasms that were somehow necessary to calculate derivatives but vanished the moment they had served their purpose. Berkeley coined the phrase, “ghosts of departed quantities,” and it stuck. He wanted mathematicians to explain, coherently and rigorously, how they justified their reliance on such specters. It wasn't until the introduction of the rigorous (ε, δ) definition of limits by mathematicians like Cauchy and Weierstrass that calculus found a solid foundation.
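To see concretely what Berkeley was objecting to, here is the standard early-calculus computation of the derivative of y = x², written out with the increment kept visible (a worked illustration added here; it is not part of Lewis's article, and the particular function and the symbol o for the increment are arbitrary choices):

Δy = (x + o)² − x² = 2xo + o²
Δy/Δx = (2xo + o²)/o = 2x + o
and then, at the final step, the increment o is declared to be zero, leaving the answer 2x.

Berkeley's point was that o is treated as nonzero when we divide by it, yet as zero when we discard it; the increment must somehow be both a quantity and no quantity at all, a “ghost of a departed quantity.” It is exactly this keep-it-then-drop-it manoeuvre that the (ε, δ) limit definition mentioned above was designed to eliminate.

Dr. Colin Lewis continues: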
This approach allowed mathematicians to precisely define what it means for a function to approach a value, addressing Berkeley's concerns about the logical inconsistencies of infinitesimals. To Berkeley, these ghostly quantities were no better than a magician's sleight of hand, a trick where mathematicians pretended rigor while actually cutting corners.

Damien Mackey’s comment: I doubt if George Berkeley would have been any more impressed by the epsilon-delta (ε, δ) ‘solution’ of Cauchy and Weierstrass. More mental artefacts. The question is asked at:

real analysis - Does the epsilon-delta definition of limits truly capture our intuitive understanding of limits? - Mathematics Stack Exchange

…. I've been delving into the concept of limits and the Epsilon-Delta definition. The most basic definition, as I understand it, states that for every real number ε > 0, there exists a real number δ > 0 such that if 0 < |x − a| < δ then |f(x) − L| < ε, where a is the limit point and L is the limit of the function f at a.

While I grasp the formal definition, I'm grappling with the philosophical aspect of it. Specifically, I'm questioning whether this definition truly encapsulates our intuitive understanding of what a limit is. The idea of a limit, as I see it, is about a function's behavior as it approaches a certain point. However, the Epsilon-Delta definition seems to be more about the precision of the approximation rather than the behavior of the function.

In the book "The Philosophy of Mathematics Today" by Matthias Schirn, on page 159, it is stated that: "At one point, Etchemendy asks: 'How do we know that our semantic definition of consequence is extensionally correct?' He goes on to say: 'That [this question] now strikes us odd just indicates how deeply ingrained is our assumption that the standard semantic definition captures, or comes close to capturing, the genuine notion of consequence' (Etchemendy 1990, 4-5). I do not think that this diagnosis is correct for some people: for some logicians, the question is similar to: How do we know that our epsilon-delta definition of continuity is correct?"

This quote resonates with my current dilemma. Does the Epsilon-Delta definition truly capture the essence of what we mean by a 'limit'? Though the epsilon-delta definition is a mathematical construct, what evidence do we have that it accurately reflects our intuitive concept of a limit? How can we be sure it is not merely a useful formalism, but a true representation of the limit as a variable approaching some value? Are there alternative definitions or perspectives that might align more closely with our intuitive understanding of limits? I would appreciate any insights or resources that could help me reconcile these aspects of the concept of limits. Thank you in advance for your help.

________________________________________

Dr. Colin Lewis continues:

How to Roast a Mathematician

To be fair, Berkeley wasn’t against mathematics. He was against bad epistemology. He published his arguments in The Analyst (1734), ostensibly as a critique of calculus, but also, perhaps even primarily, as a jab at atheists who used calculus to boast about the superiority of scientific over religious reasoning. His work challenged both mathematicians and those who saw themselves as the new intellectual priests of a secular age.
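To make the (ε, δ) definition quoted in the Stack Exchange excerpt above less abstract, here is one worked instance (added for illustration; the function and the limit are arbitrary choices). To verify that f(x) = 2x + 1 approaches L = 7 as x approaches a = 3:

Given any ε > 0, choose δ = ε/2.
Then whenever 0 < |x − 3| < δ, we have |f(x) − 7| = |2x − 6| = 2|x − 3| < 2δ = ε.

Nothing infinitely small appears anywhere in this argument: the definition quantifies only over ordinary real numbers ε and δ, which is precisely why it answered Berkeley's complaint, even if, as Damien Mackey suggests above, Berkeley might still have dismissed it as another mental artefact.

Dr. Colin Lewis continues: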
Mathematicians like Colin Maclaurin and others attempted to counter Berkeley's arguments by defending the practical success of calculus, even if the foundational rigor was lacking. Maclaurin, for instance, worked to justify Newton's methods and demonstrated that the results of calculus were not merely coincidental but systematically reliable, despite Berkeley's philosophical objections.

Berkeley did what any good philosopher does: he got under everyone's skin. He asked, in essence, ‘If you’re so smart, why are you still relying on ideas you can’t even define properly?’ The result was irritation, sure, but also introspection.

What Berkeley managed to do was shake the faith in the calculus as it then stood. Not because he proved it wrong, the answers calculus produced were undeniably correct, but because he revealed that no one was quite sure why they were right. Mathematicians had powerful tools but lacked a firm philosophical foundation. The power of Newton's fluxions or Leibniz's differentials was there for all to see, but Berkeley's critique revealed that the scaffolding underneath was less than sturdy. How could they talk about “infinitely small quantities” and act as though these unseeable, untouchable entities had real, measurable existence? They couldn't, at least not without a clearer articulation of the concepts in play.

The Mathematicians Respond

Berkeley may have lacked mathematical training, but his philosophical training was sharp enough to provoke a substantial response. The very discomfort he caused led to a slow but monumental shift in mathematics. His critique highlighted the need for rigor. Mathematicians like Augustin-Louis Cauchy and later Karl Weierstrass set out to reframe calculus in terms that even a skeptic could accept. As I mentioned above, enter the (ε, δ) definition of limits, a notion that allowed mathematicians to rigorously define what they meant when they said something was approaching zero, but not quite there.

It took roughly a century for calculus to shed the phantoms Berkeley had identified, but it did. The ‘rigorization’ of calculus became one of the crowning achievements of 19th-century mathematics. Moreover, in the 20th century, infinitesimals themselves were given a rigorous foundation in non-standard analysis, a branch of mathematics developed by Abraham Robinson. This new approach provided a formal way to work with infinitesimals, addressing Berkeley's concerns in a modern context. So, while Berkeley's critiques were valid in his time, mathematics has since evolved to address these issues comprehensively.

The Irony of Berkeley's Victory

There’s a twist in this story, though, which Berkeley might have appreciated. In questioning calculus, he paved the way for a stronger, more resilient mathematical framework, even if it meant giving legitimacy to the very secular science he so often combated. His critique indirectly led to developments that would ultimately make calculus indispensable in the very scientific worldview that was undermining religion's intellectual authority.

He never got to see this outcome; Berkeley died in 1753, long before the mathematicians took his rebuke to heart. But his role as the gadfly of early calculus is, if not celebrated, certainly acknowledged by those who understand the history of mathematics. He forced math to grow up, to address its own inconsistencies, and to become what it is today: a field rooted in careful, precise definition. ….
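A brief postscript on the non-standard analysis mentioned above (an informal sketch added for illustration, not part of Lewis's article): in Robinson's hyperreal number system there exist genuine nonzero infinitesimals, and the “discarding” step that Berkeley attacked becomes a legitimate, well-defined operation, the standard part st(·), which rounds any finite hyperreal to the nearest ordinary real number. For y = x² and a nonzero infinitesimal increment dx:

(x + dx)² − x² = 2x·dx + dx², so the difference quotient is 2x + dx,
and the derivative is defined as st(2x + dx) = 2x.

Here dx remains nonzero throughout the calculation; nothing is quietly set to zero, and only at the very end is the infinitesimal remainder rounded away by the standard-part function. In that precise sense, Robinson's framework rehabilitates the very ghosts Berkeley tried to exorcise.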
