Math Every Day
Transcript of Math Every Day (steveyegge2)
8/13/2019 Math Every Day Steveyegge2
I just read a book called John von Neumann and the Origins of Modern Computing. Every few years I read a book that causes a discontinuity in my thinking -- a step function that's a lot larger than the little insights that most books or articles produce. For me, this was one of them.
Incidentally, I don't expect it will have the same impact on everyone. I think there's a right time for
every book. You have to struggle with some set of issues for a while before the right book can really
hit home. You might like it, but I'm not necessarily going to recommend it. It's more of a history or a
biography than a technical book.
I thought I'd talk about some of the realizations, probably already obvious to you, that really "clicked"
when I read it.
What Math Really Is
I finally realized how important math is. Innumeracy, the mathematical version of illiteracy, is crippling
me and most of the rest of the programming world.
Von Neumann was originally "just" a mathematician (and chemical engineer), but he made lasting and
often central contributions to fields other than pure mathematics. Yes, he invented the computer and
computer programming as we know it, which is a fine thing to have on your resume, but it's only a
tiny part of his work. The list is so long that the book I read couldn't begin to mention it all, let alone
discuss it. I'll list a handful of his accomplishments here, but they don't begin to paint the full picture.
Johnny co-invented game theory, made critical contributions to the field of economics, and extended
He created a sequential-instruction device with a fast calculation unit but limited memory and slow
data transfer (known as the infamous "von Neumann bottleneck", as if he's somehow responsible for
everyone else being too stupid in the past 60 years to come up with something better. In fact,
Johnny was well on his way to coming up with a working parallel computer based on neuron-like
cellular automata; he probably would have had one in production by 1965 if he hadn't tragically died
of cancer in 1957, at age 54.)
Von Neumann knew well the limitations of his sequential computer, but needed to solve real problems
with it, so he invented everything you'd need to do so: encoding machine instructions as numbers,
fixed-point arithmetic, conditional branching, iteration and program flow control, subroutines,
debugging and error checking (both hardware and software), algorithms for converting binary to
decimal and back, and mathematical and logical systems for modelling problems so they could be
solved (or approximated) on his computing machine.
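Those conversion routines are simple enough to sketch. Here's a minimal version in Python, purely as an illustration of the classic algorithms (von Neumann's originals were, of course, worked out at the machine level):

```python
def to_binary(n):
    # Decimal -> binary by repeated division: the remainders,
    # read in reverse order, are the bits.
    bits = []
    while n > 0:
        bits.append(str(n % 2))
        n //= 2
    return "".join(reversed(bits)) or "0"

def from_binary(s):
    # Binary -> decimal by Horner's rule: double the accumulator
    # and add each incoming bit.
    value = 0
    for bit in s:
        value = value * 2 + int(bit)
    return value
```

So `to_binary(13)` gives `"1101"`, and `from_binary` inverts it exactly.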
He did much of the hardware and materials engineering himself, working closely with engineers to
come up with solutions for memory, secondary storage, input and output (including hooking up an
oscilloscope to the computer to visually check results that were too complex for punch cards). He
secured the funding for building the computer and computing lab from government and other sources, developed training programs to teach people how to program computers, and worked with domain
experts to find problems that were too hard for mathematical analysis but tractable to numerical
solutions on his computer.
Of course he knew what was computable on his machine, because he worked out new definitions of
elegance and efficiency, came up with the mathematical models for analyzing the complexity of
algorithms being executed on his device, and with his staff, ran hundreds of experiments to learn
what worked well and what didn't.
John von Neumann invented our universe.
Then he died at the depressingly early age of 54, robbing the world of perhaps the greatest genius of
the 20th century. "Those who know" generally seem to rank Albert Einstein ahead of von Neumann,
but Johnny always gets a solid #2 vote. Frankly, though, I think Johnny had a far bigger impact on my
life, and not just because I'm a programmer. What did Albert do, really? Dashed all our hopes of
faster-than-light travel, that's what he did. Whined a lot about not agreeing with quantum
mechanics, that's what he did. To the best of my knowledge, Einstein didn't even know EJB, which
according to many Amazon folks makes him a retard.
When I say that von Neumann invented our universe, I'm not trying to be poetic or rhetorical. What
I'm saying is that his first attempt at a computing machine, one that he didn't really like all that much
Discrete math, data structures, algorithms: they've all been refined since he died, sure, but he
started it all. Virtually every discipline that we think of as "computer science" is like Euclidean
geometry: useful, sure, but far from the only kind out there.
Operating systems, threads, and processes are just a pathetic attempt at fooling you into thinking
you have a parallel computer, right? Gosh, computers are so darn fast that you can have the
processor zing around like Feynman's hypothetical "only electron in the universe", and it almost looks as
if we're smart. But we're not. All the world at large truly understands is serial execution, which is
precisely why we're so lost in the whole distributed computing space. Everyone talks about agents
and crawlers and web services and all this other crap we don't understand; Johnny would have taken
one look at it and invented tractable mathematical solutions.
Compilers: now there's one discipline where Johnny clearly didn't have much influence. He was a big
old-school proponent of doing everything in machine code. He might have changed his mind if he'd
lived another few decades; hard to say. But ordinary mortals realized they needed shortcuts: higher-
level languages that would be translated into Johnny's machine instructions.
So people came up with a bunch of crap-ass languages that still had the exact same abstractions as
the underlying machine: a global memory that you update by issuing statements or instructions,
expressions that can be computed by the arithmetic-logic unit, conditional branching and loops,
subroutines. Everything you need to be "Turing-complete", which is equivalent to von Neumann-
complete.
The new languages simply added various shortcuts: named storage locations ("variables"), syntax for
dealing with memory addresses, "types" for telling the compiler that a variable comes from a particular
set of valid values and has a particular set of legal operations, "scopes" and "namespaces" for
organizing your code a little better, "objects" for anthropomorphizing your data so it can be happy or
sad (mostly sad), and other cruft.
But it's all crap. Why? Because it's all just sugaring for the capabilities of assembly language on von
Neumann serial machines, plus a smattering of support for calling into the operating system code so
you can pretend your program is performing truly parallel operations. And none of that parallel stuff
works very well, since we don't understand it, because we don't know math.
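You can actually watch the sugaring happen. This sketch uses Python's standard `dis` module to disassemble an ordinary loop; what comes out is pure Johnny: loads and stores to storage slots, a comparison, and conditional jumps (the exact opcode names vary by Python version):

```python
import dis

def accumulate(n):
    # High-level syntax: named variables, a condition, a loop body.
    total = 0
    i = 0
    while i < n:
        total += i
        i += 1
    return total

# Print the sequential instruction stream the compiler produced.
for ins in dis.get_instructions(accumulate):
    print(ins.opname)
```

Every "variable" becomes a storage slot, and the `while` becomes a compare plus a jump -- exactly the vocabulary of the underlying serial machine.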
What about data structures, you ask? Surely that's one island of purity, something that exists outside
of the von Neumann universe? After all, we worked so hard to understand them. (Well, some of us
did. Plenty of folks didn't even do that; they just call an API and it all works like magic.)
Sorry to disappoint you, but most of our data structures are fundamentally based on Johnny's
sequential machine. Think of all those pointers: they're just memory addresses. The best you can do
for a sorting algorithm, complexity-wise, assuming radix sort isn't possible, is n*log n. Or so you
thought; there are parallel algorithms that run in linear time. All your cherished data structures are
simply the best that clever people could do at organizing data on a Turing Machine. If someone
created a set of data structures whose pointers were URLs, that would be a step away from von
Neumann.
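For the curious, here's what escaping the comparison bound looks like when radix sort is possible. This is a standard least-significant-digit radix sort for non-negative integers, sketched in Python: it makes one bucket pass per digit and never compares one element to another, so the n*log n lower bound for comparison sorts simply doesn't apply.

```python
def radix_sort(nums, base=10):
    # LSD radix sort: one stable bucket pass per digit, no
    # element-to-element comparisons anywhere.
    if not nums:
        return []
    out = list(nums)
    exp = 1
    while max(nums) // exp > 0:
        buckets = [[] for _ in range(base)]
        for x in out:
            buckets[(x // exp) % base].append(x)
        out = [x for bucket in buckets for x in bucket]
        exp *= base
    return out
```

With d digits it does O(d*n) work -- linear in n for fixed-width keys.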
What about SMP or NUMA? Surely adding multiple processors is a huge step towards parallel
computing? Nope, sorry. Well, not much of one. There's still a von Neumann bottleneck; the channel
has just been made a bit shorter. We need it to be made infinitely wider by creating a truly parallel
computer that doesn't consist of CPUs all trying to update the same global store.
Face it: Computer Science was a misnomer. It should have been called Johnny's First Universe.
What's Left?
OK, so let's speculate about what would have happened if Johnny had lived another 30 years, and
built his parallel computer based on neural networks and cellular automata theory, and we all used
those instead of the Turing devices. What things out of an undergraduate computer science degree
would still be relevant?
That's easy: Math. Everything that's relevant would stem from math. There would be data structures,
but they'd be different ones, derived from the complexity analysis of the "inner economy" of the
parallel computer. There would be algorithms, but they'd also be different, and designing your own
algorithms would require mathematical analytical abilities, just like they do today. There would be
control systems and feedback loops and type systems and semantic nets, just like today, but all of
those things are just useless superstition without math.
Just like everything, really. It's hard to think of useful courses in school that didn't eventually require
you to start doing some mathematical analysis in order to understand the advanced concepts. Even
philosophy, psychology and social sciences need math -- let's not forget that logic, statistics and
stochastic/probabilistic methods are still branches of mathematics.
Other than math, the other big one is Linguistics. It's a field that has its share of math, to be sure,
and in the end, mathematics will probably solve the problem of why people are so scared of
languages. But until the human brain is really mapped out, which could be a while, an understanding
of languages and how they work is going to be fundamental. We can't even do math without having a
language for it.
Programming Languages
I said earlier that all programming languages are just desperate attempts to provide shortcuts for
operations on the von Neumann architecture. Actually, that's not entirely true. A handful of languages
are designed to express computations in a "pure" form, not tied to Johnny's omnibus.
One such language is Prolog. There's a branch of math (what else?) that focuses on being able to
describe or model a computation as a set of rules, and then having an engine evaluate the rules,
supposedly without having to know or care how the engine does it. In practice, you do need to know
how it does it, in order to be able to compute the complexity of the algorithm. This model is quite
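To make that rule-evaluation model concrete, here's a toy of it in Python (my own illustration, not Prolog itself): facts are tuples, rules derive new facts from old ones, and the engine simply iterates until no new facts appear. Knowing that the engine iterates to a fixpoint like this is precisely what lets you work out the cost.

```python
def ancestor_facts(parents):
    # Rule 1: parent(X, Y) implies ancestor(X, Y).
    anc = set(parents)
    changed = True
    while changed:
        changed = False
        # Rule 2: ancestor(X, Y) and parent(Y, Z) imply ancestor(X, Z).
        # Re-apply until the fact set stops growing (a fixpoint).
        for (x, y) in list(anc):
            for (p, z) in parents:
                if p == y and (x, z) not in anc:
                    anc.add((x, z))
                    changed = True
    return anc
```

Given parent facts abe->homer and homer->bart, the engine derives ancestor(abe, bart) on its own -- you state the rules, the engine finds the consequences.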
But Lisp is particularly interesting, because unlike most other languages, it wasn't designed as a language for the Turing machine. One of Alan Turing's professors, Alonzo Church, was working on a mathematical model for expressing computations at the same time Turing came up with his Turing Machines. Church had created a system called the Lambda Calculus, and he and Turing quickly realized they were formally equivalent, albeit very different-looking. The Lambda Calculus is a system for building up computations by writing and composing functions.
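The flavor of it survives directly in modern languages. Here's the composing-functions idea sketched in Python (an illustration of the calculus's style, not Church's actual notation):

```python
# Computation by composing functions: no memory cells, no
# instruction pointer, just application.
compose = lambda f, g: lambda x: f(g(x))
twice = lambda f: compose(f, f)   # "apply f two times" -- roughly Church's 2
succ = lambda n: n + 1

print(twice(succ)(0))          # the numeral 2 applied to successor and zero
print(twice(twice)(succ)(0))   # composing numerals compounds the effect
```

The first expression evaluates to 2 and the second to 4: numbers and arithmetic fall out of function composition alone.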
As it turns out, Turing machines are easier to implement in (1950s) hardware, and the Lambda
calculus is easier for people to read and write by hand, as you'll know if you've ever tried to write out
a significant-sized Turing machine.
In 1958, the year after von Neumann passed away, John McCarthy invented a mathematical notation
called Lisp, which was based on the Lambda calculus, and allowed him to express programs much
more conveniently than the von Neumann instruction set allowed him to do. Soon afterwards, a grad
student decided to turn it into a programming language. Lisp survives today as the second oldest
programming language still in use, after Fortran, which was invented in 1957.
The fact that Lisp is based on the lambda calculus is important. It means it's not some cheesy hack
to provide syntactic sugar for von Neumann machine language. It's a model for writing programs that's
intrinsically powerful and expressive. For one thing, being a functional language, Lisp translates far
more easily to parallel machines, because it doesn't assume the world works by moving a head around
on an infinite tape; i.e. it doesn't work by side-effect.
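A tiny Python sketch of why that matters (my example, not from the original post): a side-effecting loop threads one mutable accumulator through every step, while an associative, side-effect-free reduction can be split into independent halves and combined afterwards -- which is exactly the shape a parallel machine wants.

```python
from functools import reduce
from operator import add

nums = list(range(1, 101))

# Side-effect style: one shared accumulator, inherently
# one-step-at-a-time.
total = 0
for x in nums:
    total += x

# Functional style: no shared state, so the two halves are
# independent and could run on separate processors; associativity
# of + makes the final merge legal.
mid = len(nums) // 2
left = reduce(add, nums[:mid], 0)
right = reduce(add, nums[mid:], 0)
parallel_total = left + right

print(total, parallel_total)
```

Both compute 5050, but only the second decomposes without rewriting.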
Lisp has some pragmatic additions to allow you to do programming by side-effect. Some people think
this is bad, because that paves the way to ugly bugs. But as long as von Neumann machines are the
norm, it's going to be practical. At least Lisp encourages you to program without side-effects. C,
C++, Java, Perl, and most other languages embrace the von Neumann architecture and force you to
In the 1930s, people were solving large computational problems by accumulating huge masses of
laborers, and doling out little parts of the computation to each person, who would work it out on
paper using a slide rule or sometimes a calculator. People who did bits of calculations for a living were
called "computers", and groups of people dividing up a calculation are now called "human computers". The human computers in the 1930s weren't the first, either; they'd been around and well-documented, here and there, since the 16th century. And you've heard the speculations that the
monks' religious rituals in some ancient monasteries were actually a ruse to carry out some large
computation. But as computing machines increased in power, the need for human intervention in
computations decreased.
Or did it? If that's the case, why are we hiring hundreds more programmers? Are we just churning big
prayer-wheels here?
Mathematicians have traditionally avoided problems that they considered "intractable" to analysis.
They've known about numerical methods for centuries; even Newton devised a few, as did the
ancient Greeks. But brute-force methods have traditionally been met with some disdain because the
I've also bought myself some nice titles that teach you math and math history simultaneously,
including Mathematics and its History by John Stillwell. I've decided that the only way to make math
non-boring is to put it in historical perspective. Reading about the people, the discoveries, the
competition and the breakthroughs makes it interesting in a way my math courses never managed to
do.
And I bought some works by Noam Chomsky and other linguists. Time to familiarize myself with
linguistics, too, since the field has such a huge similarity to the study of programming languages and
other computer-science disciplines like search and natural language recognition. No need to drop
everything else to do this; a few hours of study a week should more than suffice. I'm looking forward
to my new education.
Math every day.
Postscript
There are so many reviews of Stephen Wolfram's insane rant about cellular automata that it's hard to
find any one in particular that you're looking for. I was browsing them looking for the one that
mentioned that Von Neumann was the man who invented CAs; it turns out to be the one titled The
Emperor's New Kind of Clothes by Joe Weiss, and comes up first if you sort the reviews by Most
Helpful First. (Hard to believe that's not our default configuration.)
The first few Most Helpful reviews of Wolfram's book are all quite good, but one of them is so good I
just have to mention it here; otherwise you might not go read it. I've read some funny customer
reviews in my time, but this one might be my new all-time favorite: A new kind of review. Enjoy!
(Published November 15th, 2004)
Back to Stevey's Drunken Blog Rants