A week and a half ago, on Thursday, Sept. 29th, the 5th annual Fisher-Hopper Prize event was held at UC Berkeley before a sold-out audience of one hundred or so CIOs and IT executives from across the US, Canada, and Western Europe. It was a long day, with the doors opening at 7:30am and the evening banquet concluding shortly after 9pm.
The 2nd presentation of the morning segment was by a father/son team: DuWayne Peterson, the former EVP/CIO of Merrill Lynch, and his son, Brad Peterson, the current EVP/CTO & CIO of NASDAQ. It was a stunning performance: relevant, thought-provoking, and, at several points, very, very funny … as the laughter from the engaged audience showed.
Two of their slides caused me to mentally “wander.” The first was a slide with nine hexagons, each containing one of nine “Technology Trends Shaping Financial Services: Innovations Advancing The FinTech Landscape,” among them:
- Cloud Services
- Smartphone Ecosystem
- Machine Intelligence
- Quantum Computing
- Big Data
- Augmented Reality/Virtual Reality
While this was being projected on the central screen and on four large monitors on the east wall of The University Club, Brad made a passing remark about the fact that it wasn’t particularly important that the slide showed “nine trends,” and my mind began to wander. “Nine” was important, possibly not in this exact context but certainly within the broader context of the “bits and bytes” that are the fundamental building blocks of our Digital Age.
A “byte,” I recalled, comprises eight data bits plus (and here my memory was wrong) a parity bit (either “odd” or “even”), which would make Brad’s nine trends coincidentally the same in number as the true number of bits in a byte … but I was wrong. A day or so later, I checked and discovered that a “byte” contains seven data bits and one parity bit, for a total of eight bits, not nine.
However, while Brad was speaking, my mind kept wandering into the murky waters of parity bits, odd or even parity, vertical and horizontal parity, and – finally – error-detecting codes, error-correcting codes, and (more to the point) Hamming Codes. As I sat there in that beautiful room high atop Memorial Stadium, I recalled a 1967 talk by Richard Hamming (then of Bell Labs) to a class of EECS graduate students taking a research course being taught that year at Cal by Bertram Raphael, a Visiting Scholar from Stanford.
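Those parity-bit musings can be sketched in a few lines. This is a generic illustration of even parity over a 7-bit value, not anything from the talk; the function name and the 7-bit framing are my own:

```python
def parity_bit(value: int, bits: int = 7) -> int:
    """Even-parity bit for a `bits`-wide value: 1 if the count of
    1-bits is odd (so the stored total becomes even), else 0."""
    masked = value & ((1 << bits) - 1)   # keep only the data bits
    return bin(masked).count("1") % 2

# 7-bit ASCII 'A' is 0b1000001: two 1-bits, so even parity appends a 0.
assert parity_bit(0b1000001) == 0
# 0b1000011 has three 1-bits, so even parity appends a 1.
assert parity_bit(0b1000011) == 1
```

A single parity bit detects any one-bit error but cannot locate it; Hamming’s codes extend the same idea with several interleaved parity bits so the error’s position can be computed and corrected.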
As Brad and his father were outlining the future as seen through the lens of Financial Technology, I remembered a comment made by Hamming …
Hamming’s talk was given in a smallish lecture hall in the old Engineering Building on the campus of the University of California at Berkeley. His topic was the past, present, and future of computer technology. One thing he said stuck in my memory, “Whenever there is a technological change that causes a functional improvement of one or more orders of magnitude, the impact on human civilization is unpredictable.”
How right he was, and how little did we understand what was about to happen – back then in 1967! One measure of the underlying cause of much we see happening today might be found in the fact that over the fifty-year period 1958–2008, the [invested capital] cost per executed computer instruction per second declined by seven orders of magnitude, from approximately $10,000 (in 1958) to less than 0.1¢ (in 2008).
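The arithmetic behind that seven-orders-of-magnitude figure is easy to check; a quick sketch, treating “less than 0.1¢” as $0.001:

```python
import math

cost_1958 = 10_000.0   # dollars per executed instruction per second, 1958
cost_2008 = 0.001      # 0.1 cent, i.e. $0.001, 2008

orders = math.log10(cost_1958 / cost_2008)
assert round(orders) == 7   # seven orders of magnitude
```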
While I was recalling Hamming’s “order of magnitude” comment, Brad had moved on several slides to one with three recent quotes by the inventor and futurist, Ray Kurzweil:
- Law of Accelerating Returns is the reason why information technologies grow exponentially, and they’ll not only impact businesses but what makes us human.
- Information technology progresses exponentially [whereas progressing] 30 steps linearly gets you to 30. 1,2,3 … you’re at 30. With exponential growth, it’s 1,2,4,8 … [and by] step 30, you’re at a billion!
- Current exponential growth of computing will continue and human-level intelligence will be eclipsed as early as 2029.
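Kurzweil’s linear-versus-exponential contrast can be replayed in a couple of lines (a sketch of 30 steps, taking his “billion” as the round number for 30 doublings):

```python
steps = 30

linear = steps            # 1, 2, 3, ... -> 30 after 30 steps
exponential = 2 ** steps  # 1, 2, 4, 8, ... doubling at each step

assert linear == 30
assert exponential == 1_073_741_824   # ~1 billion after 30 doublings
```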
With Kurzweil’s quotes on the screen, I pondered Hamming’s “order of magnitude” comment from 1967. As I type this a week later, I think Kurzweil made an important but unstated distinction: that we live in a world where most change is linear but, simultaneously, where digital technology change is exponential.
… and then, I mused …
It is the built-in conflict between linear change – in governmental structures, in religious viewpoints, in human-to-human relationships – and the exponential change in information technology that is fueling much of today’s strife. Autocratic regimes of almost feudal structure, and religious leaders pressing for the structures and relationships of a millennium ago, are using 21st Century information technologies to further their retrogressive aims … desires by “the few” for power and control over “the many” that are at cataclysmic (definitely the right word) variance with a new generation that sees, and expects to possess, a growing range of technology-fueled freedoms in a world not of limits but one of plenty.
… and then the Engineer in me wandered back to those seven data bits in an eight-bit byte. Many years ago I was part of a crazy computer project that actually designed, built, and foisted off on an unsuspecting public a 6-bit computer in an 8-bit world … and we did this very, very successfully. Why? Because “main memory” computer storage in the mid-1960s was built from tens of thousands of 1/8th-inch-diameter magnetic donuts called “cores” (one core = one bit) that were not cheap … so our computer, the Friden System Ten, actually outsold IBM’s 1401 in, I think, 1965 (6?) … but not for long.
Seven data bits let you store 128 different “things” in one byte (any one of the 10 numeric digits, the 26 lower-case alphabetic characters, the 26 upper-case alphabetic characters, or the 32 “special” characters, with room for 34 more). A six-bit byte, on the other hand, has just 5 data bits, restricting you to just 32 different things – meaning it took our computer two 6-bit bytes (12 of those magnetic donuts) to store an alphabetic character, compared to computers whose “main memory” was built from 8-bit bytes. As memory costs plummeted, our cost advantage vanished – but we did quite well for a while …
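The 128-versus-32 arithmetic above is just powers of two; a small sketch (the helper name is mine):

```python
def distinct_values(data_bits: int) -> int:
    """Number of distinct symbols representable in `data_bits` bits."""
    return 2 ** data_bits

assert distinct_values(7) == 128   # 8-bit byte: 7 data bits + 1 parity bit
assert distinct_values(5) == 32    # 6-bit byte: 5 data bits + 1 parity bit

# 10 digits + 26 lower + 26 upper + 32 specials = 94, leaving 34 codes spare
assert 128 - (10 + 26 + 26 + 32) == 34
```

Each added data bit doubles the symbol space, which is exactly why shaving two bits per byte saved so many cores yet forced two bytes per alphabetic character.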
Central to all of these musings while DuWayne and Brad were talking was my thinking about some of the terminology we use. We refer to “the Digital Dragon” or “Digital Disruption,” whereas none of our computers is, strictly speaking, “digital”; far more accurately, they are “binary” – very fast processors of a great many “zeros” and “ones.”
It is the unlimited scalability of those “zeros” and “ones,” along with their related and also unlimited precision, that is foundational. Kurzweil tells us that IT progresses exponentially. Perhaps that is because the foundation of that technology is, itself, exponential in nature – more specifically, a technical architecture based on powers of two, 2^n.