# Popular Science Monthly/Volume 17/August 1880/Algebras, Spaces, Logics

*Popular Science Monthly,* Volume 17, August 1880. By George Bruce Halsted.

AN UNTECHNICAL ILLUSTRATION OF DEVELOPMENT IN PURE SCIENCE.

WHEN at the making of a new university a lot of specialists were thrown together, I was impressed by their lack of information in regard to the progress of the eldest of the family of sciences, mathematics. One fellow, a graduate of the University of Virginia, said that, from what had been taught him, he had come to believe mathematics finished by Newton, and now he was puzzled by a talk of progress. Another, an engineer thoroughly grounded in what the previous one had considered all possible mathematic, asked what it could mean—this turning out of new algebras, this new geometrizing? He had heard that metaphysics was interminable, and knew that a pseudo-philosopher could spin out metaphysic by the yard; was this new mathematic something of the same sort, or was it worth his looking into?—and so on. Let me, then, try to give an untechnical illustration of the fact that mathematic, though with a safe start of perhaps a thousand years over the other sciences, may now lay claim to be more than ever fundamentally and rapidly advancing, developing. From the vast field of choice, let us, to fix the attention, confine ourselves simply to what is involved in the addition of a single letter, *s,* to three common words, algebra, space, logic; that is, implied in getting a plural to the ideas embodied in these words.

Algebra has been and still is defined as universal arithmetic, and is most commonly thought of as simply a generalized statement of the truths about natural numbers. And historically such it was; as such it started, and was indeed a very gradual growth. In the first known treatise on the subject by Diophantus, in the third or fourth century, the few symbols employed are mere abbreviations for ordinary words. The Arabians, who obtained their algebra from the Hindoos, did little or nothing toward its extension, though it retains in its name an Arabic touch, and the word *algorithm,* always, and now more than ever, associated with it, has the Arabic *al.* It was after their treatises had been carried into Italy by a merchant of Pisa, about 1200, that important improvements began. About 1500 the first problem of the third degree is said to have been solved. After that, Cardan first gave the general solution of a cubic equation, and employed letters to denote the unknown quantities, the given ones being still mere numbers. Toward the middle of the sixteenth century algebra was introduced into Germany, France, and England, by Stifel, Peletarius, and Robert Recorde, respectively. Recorde endowed it with the symbol of relation, =, and Stifel with the far more important symbols of operation, +, −, √. In the same century Vieta introduced letters as symbols for known as well as for unknown quantities, and by this great advance not only laid the foundation for the general theory of equations, but rendered possible the birth of new algebras, children of the first.

The next step, a vast one, was definitely accomplished, when, in 1637, Descartes published his "Coördinate Geometry," involving an algebra of form. Sprouting from a numerical stem, this soon transcends merely metrical limits with a beautiful power of giving demonstrations projective, positional, descriptive. It matters not whether you prefer to think of this as a new algebra or as a new application of the first algebra of natural number. But, if you take the second opinion, you should know that you do so because the child is almost identical with the parent in formal algorithm. And there is a word coming into general use in pure science, yet whose present meaning is scarcely to be gained from dictionaries. It is an interesting word both in its birth and growth. When the Greek learning passed to the Arabs, so did the word ἀριθμός, as it has come to us in arithmetic. When the Arab and Moorish learning passed into Europe, the *al* was confounded with the following word, and from the Spaniards came the *g* between them. Thus, when the Indian numerals were introduced, this word came with them, and the new figures were denominated (by Chaucer, for example) *augrime* (or algorithm) figures; and rightly enough as being used according to an algorithm, for the old mathematical dictionaries give it in probably its real imported sense, as meaning the great rules of arithmetic. So Johnson in his old dictionary gives *algorithm,* or *algorism,* as the six operations of arithmetic; and the "Edinburgh Encyclopædia" has it as the rules of arithmetic, or the art of computing in some special way, and, finally, as the principles and notation of any calculus. Here we see it has sprouted and come very nearly into its present acceptation, in which I would define it as the fundamental operations of an algebra with their assumed laws and notation. In the algebra of natural number there are seven such, for we put in one more since the days of Samuel Johnson.
As illustrations of simplicity and seeming insignificance, let me call your attention a moment to the three direct operations, which you have always known.

Suppose in counting we make a mark for each thing and connect them by Stifel's sign of addition, 1 + 1 + 1 + 1. Then, if we go over them one by one, we have a mark to register our result. But, even without taking the trouble to count them, we can say they will amount to some number and call it *a.* But suppose we have to count a lot of the same sort of rows all equal; we know that an actual count will give for each the same number which we have called *a,* and we will get *a* as many times as we have rows; that is, a number of times, say *b* times, and the grand total will be *a* taken *b* times, or *ab.* But suppose the number of rows should be equal to the number of columns; then we would have *a* times *a,* or *aa;* and in the same way we might have *a* times *aa,* or *aaa,* etc. But why write all the *a*'s? Put one down, and a number above and to the right to tell how many; call the number *b,* and we have *a^b*.
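The building-up just described, from counting by marks to addition, from addition to multiplication, and from multiplication to the power *a^b,* can be put in a few lines of modern program text; the sketch below is an illustration added to the article's argument, with the function names chosen only for clarity.

```python
# The three direct operations built up exactly as the text describes:
# counting by single marks gives addition, counting equal rows gives
# multiplication, and counting equal factors gives the power a^b.

def add(a, b):
    # addition as repeated counting on by one
    total = a
    for _ in range(b):
        total += 1
    return total

def multiply(a, b):
    # multiplication as b rows of a marks each
    total = 0
    for _ in range(b):
        total = add(total, a)
    return total

def power(a, b):
    # the power a^b as b equal factors of a
    total = 1
    for _ in range(b):
        total = multiply(total, a)
    return total

print(add(4, 3), multiply(4, 3), power(4, 3))  # 7 12 64
```

Each operation is nothing but a repetition of the one before it, which is the "seeming insignificance" the next paragraph remarks upon.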

These are the three direct operations, seemingly mere devices to spare a little trouble. You could hardly believe the conquest of the thought-world was lying dormant in them. Yet their undoing or inversion leads to the four inverse operations, and the seven, together with their working laws, are the algorithm of your algebra. So are they also of Descartes's application of algebra to form, and even Newton's fluxional calculus to a certain extent presupposes them, so that it was looked upon rather as an extension, a generalization, than as a new algebra of infinitesimals formulating its own working algorithm.

Therefore, much as we prefer Newton's character, and believe in his prior invention of the calculus, it is to Leibnitz that we assign the high honor first to have grasped the plural whose growth we are illustrating. After two of the most extraordinary of modern algebras were discovered and published, it was found that the possibility of each had been indicated by Leibnitz more than a century and a half before.

Toward the modern deep study of the formal laws involved in a pure science, Lagrange and Laplace led on also by the conclusion that theorems proved to be true for symbols representing numbers are also true for all symbols subject to the same laws of combination. Hence followed the principle of the separation of symbols of operation from those of quantity, with the "calculus of operations." The world of mind had now developed sufficiently to appreciate the definition of *an* algebra, though when it was first given I do not know. An algebra is an abstract science or calculus of symbols combining according to defined laws. There may be an indefinitely large number of sets of such defined laws—that is, of distinct, different, and independent algebras.

In the history of science it is a worthy illustration of the rhythmic character of great advance that, as if by an irruption of genius, the same year (1844) published three of the most fundamentally new and interesting modern algebras, and stamped for immortality the names of Rowan Hamilton, Hermann Grassmann, and George Boole.

Among the first men to systematically consider symbols combining according to laws more complicated than those of natural number was Sir Rowan Hamilton. After a struggle of ten years from 1833, his genius enabled him to escape from the rut of common thought by casting away the commutative principle in multiplication, which in numbers formulates the fact that twice three gives precisely the same result as thrice two. So, in 1843, he presented to the Irish Academy the principles of the algebra of quaternions, and published an article on the subject in the "Philosophical Magazine" in 1844. At the same time had appeared in Germany Grassmann's "Ausdehnungslehre," a more extraordinary algebra, which contains quaternions as a special case. But let me pause here. We have sufficiently shown our plural without even mentioning Cayley and Sylvester's invariantive algebra; Riemann's theory of a complex variable; the algebra of polar elements; or any of the many others that have sprung or are springing into being.
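Hamilton's casting away of the commutative principle can be exhibited numerically. In the sketch below, added by way of illustration, a quaternion *w* + *x*i + *y*j + *z*k is written as the tuple (w, x, y, z), and the product rule follows Hamilton's relations i² = j² = k² = ijk = −1.

```python
# Quaternion multiplication: the order of the factors matters.

def qmul(p, q):
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

print(qmul(i, j))  # (0, 0, 0, 1)  -> k
print(qmul(j, i))  # (0, 0, 0, -1) -> -k: ij and ji differ in sign
```

Whereas twice three and thrice two are the same number, ij and ji are not the same quaternion; this is the rut of common thought from which Hamilton escaped.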

As for pluralizing the idea of space, that would follow very briefly if only I might talk in terms of the "Ausdehnungslehre." Quaternions, as Professor Tait has said, is content with one flat space; but Grassmann, in a little appendix of only two pages, has shown the ability of his extensive algebra to cope with the modern double plural of the old idea of space. Before this idea had germinated, while therefore there was no real use for the word "spaces," the parsimony of language applied it to mean *pieces of space;* but in the fullness of time it has received its heritage, and by spaces I mean an aggregate of which the space hypothetically infinite and containing the material universe is but one. A statement in the technical terms of analysis would probably tend very little toward clearing up this matter to one not already familiar with it. Let us, then, use rather the historical method—attack in the light of history.

As an eternal treasure and model to the world the Greeks bequeathed the synthetic science of a space. This is the particular space in which you believe, and are sure you and the stars are inhabiting. You will be glad to know that it has been made a fitting monument to the writer of the greatest classic, and inscribed with the name of Euclid. This Euclidean space is a tridimensional homaloid, and so, in distinction from it, spaces with positive or negative curvature are called non-Euclidean.

Through all the centuries up to the present Euclid's space contained at least the thought-world. The space analyzed in Euclid's "Elements" was supposed to be the only possible form, the only non-contradictory sort of space. And, after more than twenty centuries, it is to a little point in that same book that the new idea attaches itself and sprouts into being. This slender link is one of Euclid's postulates, misplaced in the English editions as the twelfth axiom. As the last of his six αἰτήματα (requests) Euclid says: "Let it be granted that if a straight line meet two other straight lines, so as to make the two interior angles on the same side of it, taken together, less than two right angles, these straight lines being continually produced shall at length meet upon that side on which are the angles, which are together less than two right angles." This somewhat complicated so-called axiom is only the converse or inverse of proposition seventeen, that "any two angles of a triangle are together less than two right angles," a theorem readily demonstrated from the preceding postulates and axioms. An inverse is usually exceedingly easy to prove. Then why not remove this inverse from among the postulates, place it after seventeen, and demonstrate it? This obvious way to improve on Euclid suggested itself to numerous geometers throughout the centuries. Hundreds tried it, and failed. As in squaring the circle, some claimed to have accomplished it; but against each one all the rest decided.

It now seems queer that no one during all this time systematically developed the results obtainable when this postulate is denied, is negatived, is thrown overboard. Euclid's method, the *reductio ad absurdum,* would have led them on to this if only it had ever entered their heads to suspect a plural to space. But the perfect originality of this step required genius, and has given a permanent rank in the history of science to two names of which otherwise we should probably never have heard, Bolyai and Lobatchewsky. Their publication of a non-Euclidean geometry gave the entire question a totally new aspect, and from that moment everything previously printed on the subject became antiquated; everything else became moribund, and the world of geometries was dualized into Euclid and non-Euclid. Like Columbus, they discovered and opened a new continent, into which for the last forty years geometers have been swarming, rewarded by many gold-mines. On non-Euclidean spaces and the kindred subject, hyperspaces, I have given in the "American Journal of Mathematics" a list of about one hundred and eighty publications since 1844. In dividing spaces with reference to the parallel-postulate, those in which through one point outside of a straight line can be drawn more than one parallel to that line are called hyperbolic spaces; that space in which through the point we can draw one and only one parallel is called parabolic; those spaces in which we can draw no parallel straight lines are called elliptic. In hyperbolic spaces the sum of the three angles of any triangle is less than two right angles, in parabolic equal to, in elliptic greater than, two right angles. Elliptic spaces are positively curved spaces, hyperbolic are negatively curved spaces, while the parabolic has no curvature, is a flat or homaloidal space.
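The threefold classification just stated can be condensed into a single modern formula, not in the original text: for a geodesic triangle of area *A* in a space of constant curvature *K,* the angle sum obeys

```latex
% Angle-sum law for a geodesic triangle of angles a, b, c and area A
% in a space of constant curvature K (modern statement, assumed here):
\alpha + \beta + \gamma = \pi + K A
% K > 0 : elliptic space    (sum exceeds two right angles)
% K = 0 : parabolic space   (sum equals two right angles; Euclid)
% K < 0 : hyperbolic space  (sum falls short of two right angles)
```

so that the excess or defect of the angle sum over two right angles measures at once the curvature of the space and the size of the triangle.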

This pluralization of the idea of space is independent of dimensionality and came synthetically. But about the same time came analytically a plural having reference to dimensions. Our perceptions, intuitions, imagings, are confined to a flat space of three dimensions, and this gives us a strong prejudice in favor of the belief that our bodies and the stars are also confined in a tridimensional homaloid. But this is simply a question of fact in the domain of physical experimentation.

How this belief might be negatived is easily illustrated. In 1872 Clifford said before the British Association: "Suppose that three points are taken in space, distant from one another as far as the sun from α Centauri, and that the shortest distances between these points are drawn so as to form a triangle. And suppose the angles of this triangle to be very accurately measured and added together: this can at present be done so accurately that the error shall certainly be less than one minute, less therefore than the five-thousandth part of a right angle. Then I do not know that the difference of the sum of the three angles of this triangle from two right angles would be less than ten degrees, or the ninth part of a right angle." This says that it is within the power of our astronomers to discover that our space is not flat. And already spiritualists claim to have experimentally demonstrated that our space has more than three dimensions. As for myself, I admit I am prejudiced just as you are. I do not think it probable that astronomers will prove that we are living in a curved space, and everything connected with spiritualism seems to me disgusting bosh. But it is not the probability that I want. I am simply illustrating the possibility, and this is enough to bring the matter into the domain of simple external reality.

You have the meaning of a fourth dimension strikingly put before you every time you look into a mirror. There you see yourself so turned around that your right hand has become your left. If you were to step straight out of the looking-glass every one would think you left-handed. Such a change could be accomplished by revolving you in the fourth dimension, and in no other way. Therefore a mirror will show you at any moment exactly the effect of a fourth dimension. Then why is this not a proof of the actual existence of a fourth dimension? I answer that here, as in the case of the spiritualists, there is deception.

It would be proof if there were no deception. The straight rays of light break against the mirror and are turned back. Our eyes give us no account of this break and turn, and so deceive us, putting before us, like the spiritualists, the effect of a fourth dimension. These are not questions which can be decided by reference to our space intuitions, for our intuitions are confined to Euclidean space, and even there are insufficient, approximative. For instance, you suppose you can imagine a curve on a plane, and so in physics curves are taken to represent functions. In reality you can not get any closer to it than what the Germans call a stripe. The analytical copy of the curve is not the function but the stripe.

But you may say, How can we ever go better and deeper than our intuitions? If I answer, "Logic," you are apt to feel soothed. It is wonderful what a strong though often unconscious distinction exists in the general English-speaking mind between logic and metaphysics. Metaphysics is always scorned and scouted; but if you say logic, ah! that is a very different matter. Again, I must acknowledge for myself sympathy with the general feeling. I think most metaphysics ought to be scorned; and I am glad that in English logic means formal logic, a pure science, and is rarely mixed up with a metaphysical Erkenntnisslehre or *ken-lore.* To be sure, formal logic was for ages the most fixed of all things, and so fell into some disrepute, since to be stationary and unprogressive is to be so far unscientific. But at last came the awakening. In 1847 two mathematicians, Boole and De Morgan, published works on logic. Thenceforth was no longer applicable the latter's reproach: "First, logic is the only science which has made no progress since the revival of letters; secondly, logic is the only science which has produced no growth of symbols." Among De Morgan's many gifts to advancement, perhaps we should select as most important his founding a logic of relatives. But even he thought Boole's calculus of inference the most extraordinary advance ever made in logic. At a single stroke of genius, unheralded, sprang forth an algebra of logic. This stroke shattered the imprisoning magic circle of Aristotle; and in the last few years four or five new algebras of logic, differing more or less from Boole's, have come into being; another is now being published, and I know of two more preparing. I will not attempt here any explanation or eulogy of what has been thus accomplished for logic. This can already be found in English, French, and German. Modern logic will date from Boole.

But I wish to call attention to the fact that here we find the best, the most satisfactory introduction to the study of modern algebras, modern mathematics. When told that in these systems a product may not vary with each of its factors; that a product may vanish without either of its factors vanishing; that subtraction and division may be indefinite; that, in fact, any system, e. g., quaternions, where the products and powers of the units are themselves linear functions of the units, excludes the ordinary assumption that a product shall vary with each of its factors; that from qq′ = 0 it does not follow that either q = 0 or q′ = 0; that a quadratic equation, e. g., in quaternions, besides its sixteen roots proper, may have an indefinite number of roots which arise from the fact that the process of division is not a definite one; when told these, and very many more such, the beginner is only too sure to think, "This is a hard saying," and may give up the subject in hopeless confusion. If, however, he will start with Schroeder, "Der Operationskreis des Logikkalkuls," he will find the clearest explanation and illustration of these things contained in his own every-day thoughts about the commonest objects; and, while learning an elegant logic, will be mastering, perhaps, the most exquisite dual algebra.
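The "hard sayings" listed above lose their strangeness in just the every-day setting Schroeder recommends. In the sketch below, added for illustration, classes of objects are modeled as sets, with "multiplication" taken as intersection and "addition" as union, which is one standard reading of Boole's dual algebra.

```python
# Boole's algebra of logic modeled with sets: product = intersection,
# sum = union, zero = the empty class.

A = {1, 2}
B = {3, 4}
C = {2, 3}

# A product may vanish without either factor vanishing:
print(A & B)          # set(): the "zero", though neither A nor B is empty

# Multiplication is idempotent, unlike the algebra of number: x.x = x.
print((A & A) == A)   # True

# The dual distributive law, false for numbers, holds here:
# A + BC = (A + B)(A + C).
print((A | (B & C)) == ((A | B) & (A | C)))  # True
```

A product vanishing without a vanishing factor is simply two non-empty classes with nothing in common, and the beginner meets such classes in his commonest thoughts.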