[Reader-list] IFS 3rd Posting: A Genealogy of the Code - II

Dwaipayan Banerjee dwaipayanbanerjee at yahoo.co.in
Thu Jun 7 23:56:22 IST 2007


INTRODUCTION TO THE SECOND POSTING

In this posting I continue my effort to uncover a
genealogy of the symbolic code that is at the heart of
computing and programming today.  By this I mean both
the languages of programming and the design of logical
systems such as the computer itself - the binary
system.  In this post, we shift focus to symbolic
algebra in the 16th and 17th centuries - and look at
the works of those who preceded Leibniz and Boole. 
While Leibniz and Boole are credited (and rightly so)
with a definitive role in the history of thought that
made computing possible, we push the chronology
slightly further back and begin to uncover the
conditions of thought that made their work possible. 
We argue that - along with the quest for universal
languages discussed in the first posting - the
development of symbolic algebra was absolutely crucial
to the development of mathematical and computing
logic.  

Nor is this mere historical quibbling over origins. 
Tracing the genealogy and strengthening our
understanding of it allows us to make certain
arguments about code and mathematical languages as a
certain kind of symbolic arrangement in our modern
times.  By following the transformations and the
incredible variations that have occurred in the
relationship between such language-systems and the
world, we stand to gain much knowledge of how the
human subject is configured in relation to the field
of representations in our times.

The next posting will follow this debate into Leibniz
and Boole and attempt a theory of the changing status
of symbolic languages in changing epistemes.  The
posting after that will follow these debates into the
20th century and the status of universal languages of
logic (the direct precursors of modern code) in
mathematical disciplines.  The posting after that will
address directly issues pertaining to code as we
understand it today, strengthened by a thorough
genealogical survey.







SECOND POSTING

In the works explored in the preceding posting, we
were confronted with a relationship between the world
and language that had yet to experience a schism. 
Whether in the Renaissance episteme or the Classical
that followed, language was thought wholly capable of
expressing the world that it existed in.  The unity
between language and the world (representation and
being) was supported ably by a unity across what are
now a multitude of disciplines.  As Foucault argues,
the classical configuration of a universal order
enabled a kind of ‘general science’ practiced by
polymaths like Leibniz and Descartes.  

With the transition into modernity, however, the
reordering of the episteme (the questioning of
absolutes, the birth of a new positivism) fractured
both these kinds of unities.  The appearance of
positivism and the centring of the human subject saw
the loss of the transparent, visible and univocal
tables of the classical order.  The birth of new kinds
of invisibles (the metaphysic, the unconscious) as the
twins of the new positivism only reinforced the loss
of the classical unity.  This division resonated in
the divisions that then occurred in the disciplines –
the pure formal sciences were separated from the
empirical sciences.  

What happened then to language?  Separated from a
mathesis, it lost its ability to speak for the
universal.  Philology turned in upon itself, concerned
only with its own density, its internal patterning –
it became one object among many others.  Most
interesting to us here is how Foucault thinks this
profound loss was compensated for.  No longer sharing
a simple correspondence with reality, the focus
shifted onto how language could be best internally
structured to serve as an expression system of
mathematical and logical thought; how it could be
stripped of ‘singularity and accidents’, how it could
be formalised, how it could again aspire to the
privileged status it possessed in the classical
episteme.  One notable result was the invention of
even newer forms of expression (primarily logical)
that sought to represent thought and at the same time
provide the basis for a new kind of universal
mathesis.  

So we see that the quest we were following in the
first posting continues, albeit in a radically altered
way.  In this posting we shall be interested in
following the continuation of this altered quest in
the form that it survived – the birth and development
of symbolic algebra and symbolic logic.  We shall
simultaneously be interested in charting the exact
nature of transformations that occurred in their
character.  To briefly anticipate this, as we move
into modernity the two chief and radical differences
will be:
1)	A movement from a nominalist to a syntactical
conception of language, and then finally from the
syntactical to the purely symbolic.
2)	A reordering of the conception of infinity and the
limits of mathematics in light of the new positivities
of knowledge.

In this and the next posting then, the focus shifts
from language in general to the specific language of
mathematics and logic.  This narrowing is made
inevitable by the fracturing of disciplines that
occurred in the 17th and 18th centuries.  However, we
are still interested in seeing how the quest for
universality and perfect duplication of thought is
continued, and the precise transformations that take
place in the nature of this search.





Vieta’s ‘logistice speciosa’

Francois Vieta is known to us now as the father of
modern algebra.  Even such a generous epithet perhaps
does not do justice to his achievements.  Writing in
late 16th century France, Vieta inaugurated a symbolic
and syntactic conception of algebra that – flourishing
under Descartes and Leibniz – would provide the
foundations for mathematical thought and logic for
centuries to come.  Together with the Flemish
mathematician Simon Stevin – who provided a theory of
the continuity of numbers and posited them as a level
of representation analogous to objects – Vieta
provided the tools that would flourish in use
under the classical episteme.  To put it very
simplistically, while Vieta inaugurated a symbolic
language in mathematics that allowed an internal
syntactic referentiality, Stevin connected this system
to being and object – allowing and retaining the
process of naming that is the fundamental task of
classical representation.  The conditions for the two
defining characteristics of the classical episteme
were slowly being forged – a level of representation
separate from objects, one that was syntactically
consistent internally while not losing touch with its
power to name the objects from which it stood apart (a
partial symbolic).

To understand this more precisely, let us retrace our
steps and examine Francois Vieta in more detail. 
Jacob Klein’s excellent work on Vieta’s sources
demonstrates his grasp of the ancients and the
influence of Diophantus and Pappus on his algebraic
work.  While we are not concerned with the details of
his sources here, we are interested in a novel
bringing together of two different strands in Greek
mathematical thought – Diophantine arithmetic and the
geometry of Pappus.  Vieta’s definitive contribution
to modern thought can be briefly summed up as the
understanding of algebra as geometric – as essentially
a theory of proportions.  To put it differently,
for the ancient mathematicians, arithmetic was a
computation on determinate numbers.  It was strictly a
manipulation of definable magnitudes.  Vieta’s
comparison with geometry and a general theory of
proportions undermined this nominalist tendency that
thought of numbers as just counting.  It inaugurated
an internal proportion, a theory of the equation that
privileged a consistency between terms that were
internally defined.  

In establishing a plane of reference different from
objects (a plane of internal syntax), a problem is
certain to arise.  In what way can numbers as
determinate magnitudes (as the ancients conceived
them) be represented as
indeterminate and hence open to syntactical
manipulation?   Vieta’s solution to this problem gives
us perhaps the most valuable mathematical contribution
of the time, the systematic presentation of a system
of algebraic notation that establishes a symbolic
conception in mathematics.  To move numbers from the
determinate to the indeterminate, he uses letter signs
to represent not a specific number, but the general
character of being a number.  This generality in the
syntactical unit allows a similar generality in the
analytical and problematical procedure.  Certain rules
of comparison and operations between these signs are
postulated, allowing a systematic manipulation of
these letter units.
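
To make this concrete for the modern reader, here is a
minimal sketch in Python, using the SymPy library.  It
is my own anachronistic illustration and nothing found
in Vieta: the letters stand only for the general
character of being a number, and the relations now
known as Vieta’s formulas fall out of purely internal,
rule-governed manipulation, without any determinate
value ever being assigned.

from sympy import symbols, solve, simplify

# The letter signs b, c, x denote not particular numbers but the
# general character of being a number (species, not magnitudes).
b, c, x = symbols('b c x')

# A general quadratic equation, stated entirely over indeterminate
# species, solved by internally defined rules of operation.
roots = solve(x**2 + b*x + c, x)

# Purely syntactic manipulation of the letter units recovers the
# relations between roots and coefficients without counting anything:
print(simplify(roots[0] + roots[1]))   # prints -b
print(simplify(roots[0] * roots[1]))   # prints c

The point of the sketch is only that every step is an
operation on signs defined in relation to other signs;
no magnitude is ever computed.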

We are entering thus a new domain of mathematical
analysis in which objects begin to be defined purely
by the syntactic contexts, a far cry from numbers as
pure magnitudes.  As Jacob Klein understands it, this
moment marks a strong break from the fundamental
ontological science of the ancients, replaced by a
symbolic discipline whose ontological presuppositions
are left unclarified.  The structure of the world is
now to be understood as a symbolic calculus.  As Vieta
himself put it, the numeral reckoning of the ancients
(logistice numerosa) that operated with numbers is now
replaced by the reckoning by species (logistice
speciosa) that operates with the species or forms of
things.  At the same time, it would be over-reaching
to say that the symbolic and syntactical conception of
mathematics was established at this point once and for
all.  In every way, this was just a
beginning.  The internal systemisation was rudimentary
and far from complete.  Additionally, the symbolic
letter signs that represented the general character of
being a number are just a first step in a symbolic
understanding in mathematics.  We shall find that in
later years virulent debates will rage around the
nature of numbers themselves, their internal
consistency and determinateness, and their
relationship to the world.  These debates will even
give rise to a hapless abandonment of numbers in
favour of a yet more fundamental system needed to
define numbers themselves.  

While these debates will be discussed in more detail
in the next posting, they are alluded to briefly here
to remind us that while revolutionary in mathematics,
Vieta and his contemporaries’ understanding of
symbolism is an early and curtailed one, linked
strongly to the world that it seeks to represent.  

In other words, it is absolutely vital to remember
that while a general internal syntax has been brought
about, it is a syntax that transparently connects to
the ordering of thought and being, to the general
theory of proportions that was in pre-eminence during
the classical episteme.  The continuous, tabular,
grid-like arrangement of Vieta’s symbolic algebra
mimics the similar arrangements of representation of
language, biology and economics that Foucault unearths
in his archaeology of the same period.  The crucial
link between the two is that mathesis, as we present
it here, was an ordering of simple natures, and
taxinomia (as we came across in the first posting) was
an ordering of complex natures.  While mathesis was a
calculation of equality, taxinomia dealt with
differences.  Both, in consonance, established the
grid that transparently represented knowledge.  The
underlying
conditions, rules and philosophies of different
representations and disciplines were the same: a
system of identities and differences (as discussed
before).  Since there was no difference imagined
between how the world was and how it was represented
(although at a different plane), it was only natural
that different representations would obey the same
principles of ordering, which in turn would directly
illuminate the principle of ordering inherent in the
real world.  This, in brief, is the classical notion
of order.  These are also the conditions of emergence
of Vieta’s symbolic algebra, and the main point of
difference from the symbolism that will overtake the
discipline in modernity.





Stevin and Zero

Brian Rotman – in his comparison of Stevin and Vieta –
thinks of the latter as more limited in his symbolic
conception of the algebraic variable sign.  For him,
Vieta’s linking of letters to sets – which are
strongly constituted as collections of real ‘things’ –
ties him to a nominalist understanding of the
algebraic sign.  To him, Stevin’s introduction of the
zero does something far more drastic to the structure
of mathematical entities, something that posits a far
stronger break from the classical arithmos, the status
of numbers as mere magnitudes.  In Jacob Klein’s
discussion of Stevin’s work, the implications of his
mathematical innovations are interpreted in almost the
contrary way to Rotman’s.  Klein’s belief is that
there is a conflation of first and second intentions,
of object and concept, of being and thought, one that
does not treat numbers as free-floating signs but as
distinct materialities.

Let us first examine Stevin’s work briefly here.  The
principal contribution that Stevin made to the theory
of numbers was his understanding of zero as the arche
(the principle of construction) of the numerological
system.  He was borrowing this idea from the Arabic
scholars, rejecting the Greek arithmos that posited
the unit as the central principle and hardly
recognised zero.  Both Klein and Rotman agree that a
conception of numbers that has zero as its fundamental
basis made possible a continuous system of numbers,
one that had a place not only for rational
integers but also for irrationals, quadratics and so
forth.  In other words, it made possible indeterminate
numbers for which the actual numerical value was not
known.  Since these were not determinate, Vieta’s
symbolic letter signs would be necessary to represent
them.  
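
A small illustration of my own may help fix the point
(again in Python, and certainly not in Stevin’s
notation): once zero is admitted as a digit like any
other, a single positional procedure approaches
integers, rationals and irrationals alike, one decimal
place at a time.  The number line becomes continuous
in practice, and an indeterminate magnitude is simply
whatever such a procedure converges towards.

from fractions import Fraction

def sqrt_by_places(n, places):
    """Approximate the square root of n by fixing one decimal place
    at a time - the gesture of indefinite continuation that a
    zero-based positional notation makes possible."""
    approx = Fraction(0)
    for p in range(places + 1):
        step = Fraction(1, 10 ** p)      # value of the p-th decimal place
        digit = 0
        while (approx + (digit + 1) * step) ** 2 <= n:
            digit += 1
        approx += digit * step           # zero is a digit like any other
    return approx

print(float(sqrt_by_places(2, 6)))   # prints 1.414213, six places of an unending expansion
print(float(sqrt_by_places(4, 6)))   # prints 2.0, the same procedure with terminating digits

The same loop handles the determinate and the
indeterminate without any change of principle, which
is precisely the continuity at stake.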

From this initial agreement Rotman and Klein take two
different courses.  Klein believes that, in
conjunction with geometry, Stevin was attempting to
reinforce the materiality of the number by
articulating a continuity as clear as that already
given to geometric formulations.  The failing of the
Greek mathematicians – their inability to conflate
arithmetic with geometry – lay in their inability to
understand zero.  Once this difficulty was surmounted
and a continuous theory of number made possible, the
conflation of arithmetic and geometry followed. 
Algebraic variables
such as these did not break the conflation of object
and thought, of being and representation but
reinforced it by drawing parallels between a general
theory of geometrical proportions and its relations to
arithmetic through algebraic equations.

Rotman on the other hand argues that the introduction
of zero breaks away from numbers as units of definite
things.  He believes that Stevin is claiming for all
numbers the status of free-floating signs,
inaugurating a structural semiotic that gives them
meaning only in a potentiality and in reference to
other signs.  Signs are privileged over things in this
conception of algebra, and their terms gain meaning
only in reference to each other.

  

While Rotman’s argument has a certain persuasiveness,
a reading of the original text, Stevin’s
L’arithmetique, pushes one here towards agreement with
Klein.  Stevin’s reasoning consistently conflates
concepts
with objects, language with things.  The very analogy
that goes into defining a number uses a reasoning that
ignores the distinction between the thing and its
symbolic referent.  In understanding algebraic
variables in Stevin (and indeed even in Vieta), it is
too early to talk, as Rotman does, of free-floating
symbolic
systems.  The conditions for a complete dissociation
of representation from being have yet to come about. 
Under the classical episteme, it is much more probable
to conflate object and concept (given that the latter
is a transparent mirror of the first in any case) than
it is to think of a domain of structural semiotics
with a pure referentiality to other signs and only to
other signs.

Following Klein, then, we are able to think of
Stevin as providing the crucial theoretical link
between the new syntactical domain inaugurated by
symbolic algebra and the world of objects that it
seeks to represent.  This is made possible by the
introduction of the zero as the fundamental principle
of arithmetic and the consequent understanding of
numbers as continuous and not discrete phenomena. 
Once this is achieved, the classical episteme with its
emphasis on the twin operations of visible identities
and differences lends itself easily to a conflation
with the geometrical world of algebraic notations made
visible and continuous (hence operable upon to
calculate identity and difference).  In other words,
the eventual linking of algebraic and geometrical
proportions – since it was thought to be a transparent
representation of the ontological status of the
classical episteme – enabled symbolic algebra to
emerge as a new kind of pure representation, a
privileged language of enquiry and analysis. 
Manipulations of algebraic terms through internally
defined operations would enable access to truth and an
answer to every question that ever had been or could
be posed.  Hence Vieta’s proud and famous
contention:

‘Analytical art appropriates to itself by right the
proud problem of problems, which is: TO LEAVE NO
PROBLEM UNSOLVED.’




