A computer, tirelessly combing through patient records, looking for new medical knowledge — that was my 1981 Stanford PhD thesis project — the first example of data mining under autonomous AI control.
Under development at Stanford from 1976 to 1986, RX won many awards and contracts and was presented worldwide.
Who else would've been foolhardy enough to try this in the era when CPUs crawled along, chugging data on mag tape? Now everybody's in the machine-learning and big-data game. It's still not easy!
I bought an Oculus Quest 2 in November 2021. I love it! Here are my "excellent adventures" with this must-have VR headset. (BTW, that's a picture of blogger Nathie — I'm fifty years older.)
Even if you're a senior (age > 39), VR is a great place to hang out while Omicron burns itself out. Here I describe the games and apps I've tried on Facebook's (now Meta's) Oculus Quest 2 headset. I especially enjoy apps for dancing (Synth Riders, Beat Saber, and Dance Central), boxing (Creed and Thrill of the Fight), and parkour (Stride). But my top pick is Eleven, a table tennis (ping pong) game with stunningly realistic physics. What I really enjoy is chatting with opponents from around the globe.
Self-driving cars (SDCs) will be the most important drivers of machine vision, neural network architectures, and AI processor hardware over the 2020s and 2030s. The case boils down to money and technology acceleration. Money (hundreds of billions of dollars in R&D) will be invested in the 2020s. That money will translate into a hundred thousand engineering jobs worldwide as companies and countries in Asia and the West compete for dominance. This article is a heads-up display of that thesis and its ramifications.
Since my days as a math/cog sci undergrad at MIT, I've been interested in two questions:
(And might a "conscious, subjective world" be a necessity for human-level intelligence, for machine intelligence in general, and for self-driving cars in particular?)
Here, I review the state of the art of autonomous vehicles, their sensors, their CPUs, and the companies that make them. Cars will need steering wheels in traffic until at least 2030. But, I think by 2040 we'll be able to throw them away.
Living in Silicon Valley, I'm used to watching Google's self-driving cars dodge me as I ride around town on my electric bike. But, are they conscious?
No (not yet!). But, here I address what it will take to make them conscious — and why would you bother? Also, what's the difference between machine vision and mammalian visual perception? Led by advances in neuroscience, computer vision researchers are rapidly accelerating (but have quite a ways to go).
These are my detailed notes on the many lectures I attend every week at Stanford. These frequently feature cutting-edge research by our faculty, students, and visiting superstars. My WebBrain contains hundreds of archived lecture notes (last updated January 2020). To obtain more recent notes, contact me.
For over fifty years my interest has been driven by just two issues —
I've also had a multi-decade interest in health (especially now, in my seventies). That motivates my close attention to cardiology (eg, Stanford's CV Institute series), molecular biology, and longevity studies (eg, Stanford's Glenn Foundation series).
Here's my CV from 1986, when I left Stanford to go back into clinical practice (for twenty years at Kaiser). Reasons for leaving academic AI:
1) emergency medicine can be a thrill, 2) higher salary, 3) local family ties, 4) love of Silicon Valley, 5) impending AI Winter.
I retired from clinical medicine in 2007 and came back to Stanford (as an Affiliate of the Center for Mind, Brain, and Computation). I've been a perennial student of cognitive neuroscience since my undergrad days at MIT in the sixties. I continued that interest as a neuroscience MD/PhD student at UCSF and (after finishing my residency) as a research associate/principal investigator in AI at Stanford.
MD/PhD students, perhaps daunted by my career switches, occasionally ask me for career advice. Basically, a career choice is constrained by 1) what you love, 2) your skills, 3) the market for your skills, and 4) where you and your family want to live.
My little one-page story showing what real, strong AI will be able to do.
This was my sci-fi reply to the question "when will computers be smarter than humans?"
Will Moore's Law soon hit a brick wall? To make sure it doesn't — Cymer, ASML, and Intel have spent billions developing EUV lithography.
Here is the current state of EUV — it's a matter of when, not if.
Stay tuned for a 2020 update; coming soon.
Although NAND flash has been a spectacular success (as in Amazon's best-selling Samsung 850 Evo SSDs), two new technologies will soon eclipse it, and even compete with DRAM in speed.
In the works for over a decade, startup Nantero's carbon nanotube (CNT) NRAM (non-volatile memory) will finally hit the market in 2018. With a fresh infusion of $21 million for further development, their CNT NRAM has been licensed by several manufacturers. This will truly be a ground-breaking advance.
Intel and Micron made big waves in 2015 when they announced their 3D XPoint tech. It will be marketed starting in 2017 as Optane, initially for servers and subsequently for high-end gaming PCs.
Both of these new memory techs will help usher in a new world of inexpensive genomics, cutting edge brain simulations, and autonomous vehicles.
Stay tuned for a 2020 update; coming soon.
Recently the (Ray) Kurzweil Accelerating Intelligence (KAI) newsletter ran a major article by Lt Col Peter Garretson (US Air Force) entitled "What our civilization needs is a billion-year plan."
Here's what made me bristle in that article: 1) strong advocacy of manned space programs, 2) using those programs to rescue humanity, and 3) pushing the notion of trillions of humans spreading throughout the galaxy.
My rebuttal in KAI argues that 1) manned missions, costing 100X the price of science-based launches, are a waste of precious NASA resources better spent on robotic probes and rovers, 2) humanity is already choking off the biosphere of Earth — we don't need trillions more in space, and 3) humanity is a stepping stone to the profound intelligences that will emerge within the next century or two and that will be the great engineers and explorers of space.
Meanwhile, let's focus on sustaining humanity's home right here on Planet Earth. With luck, we will be able to enjoy life here for many generations to come.
Beating Jeopardy! was a stunning victory for IBM's Watson and its DeepQA architecture. It was headline news in 2010. My writeup (voted the best on the net by Quora) provides links to the best online articles and videos, and summarizes the project's key AI components. Watson was a milestone accomplishment that will lead to cheap, widely available QA systems. It's a step toward passing the Turing Test, but not an advancement in perception as were the DARPA Grand Challenge robotic cars. With Jeopardy! champ Ken Jennings, I too welcome our new AI Overlords, but take heart at having a brain that's the equivalent of a server farm but that runs on coffee and donuts.
This is my 2009 book review of Total Recall: How the E-Memory Revolution Will Change Everything, by tech magnate Gordon Bell and his Microsoft colleague Jim Gemmell. They describe a future of total data capture that's inevitable for many of us.
Kevin Kelly is the renowned futurist and founding executive editor of Wired magazine. As I looked at the rave reviews for his new non-fiction work, The Inevitable, I wondered, "Are Marc Andreessen, David Pogue, and Chris Anderson just giving the Senior Maverick at Wired his proper obeisance?"
No! This is another home run (as was What Technology Wants) — another magnum opus — this time addressing the phase shift in civilization signaled by the amalgam of internet + seven billion souls.
This is my book review of Kevin Kelly's 2010 magnum opus What Technology Wants. Although I disagree with his view of the benign nature of technology, this is an important book that I whole-heartedly recommend.
My figure of merit for books and movies is number of re-reads or re-watches.
This book gets regularly re-read. (My favorite movie list includes Avatar and the recent Star Trek: Into Darkness. My son and I had high hopes for the remake of Star Wars, but it was an artistic dud.)
After Microsoft started their forced cram-downs of Win10 in December 2015, I thought, "Ah, poo-poo — I'll knuckle under." So, I put a blank SSD into my trusty Win7 tower (the case is always open), and let MSFT overwrite my venerable Win7 (see next story).
One of the little nagging problems I had was that I couldn't reliably drag the active window with my mouse. So, I swore at it for a week (there were other problems, too) and then I reinstalled my trusty Win7 (by just plugging in another SSD.)
I did ultimately solve the problem. It's easy, but — really? — this should've been cracked in Microsoft user focus groups. How can they ignore their customers like that? (Answer: they've been relying on their monopoly status for decades. Look at the stock chart.) Microsoft's got troubles.
PS: I've been using Win10 for four years now and like it (ie, I no longer swear at it). But, as with Win7 below, initially there were problems.
In 2016, as I contemplated my shift to Win10, I was amused at the thought that I dissed the then rock-solid Win7 here in this 2012 piece.
In 2016 at Stanford, as Professor Olaf Sporns (of Connectome fame) was lecturing to a packed audience, Microsoft servers tried to do their usual forced update to his laptop. His PowerPoint slides came to a screeching halt as his SRO audience groaned.
Microsoft mainly caters to its business customers — the ones it really cares about — and to whether and when they'll upgrade to Win10 from the (then venerable) Win7.
I have a few old friends who work for the Evil Empire (in research) — so I always dis Microsoft less than it deserves. (Besides, Bill Gates' name is on the Stanford Computer Science building, and the Paul Allen Center for Integrated Systems is close by. Bill (and his late partner Paul) has done great philanthropy.) Note that Bill himself gets frustrated by Microsoft's software.
I kind of believe (former CTO) Nathan Myhrvold's explanation of Microsoft's difficulties: they support thousands of different devices, whereas Apple's machines all fit on one kitchen table (as CEO Tim Cook likes to point out).
In 2016 Microsoft was angering thousands of its customers owing to its nightly forced Win10 cram-downs. (Like those businesses, I was also locked in with all my networked machines.)
My local Apple Store and Microsoft store are immediately adjacent to one another. These 2016 photos speak louder than words. (Apple has a far more enthusiastic fan base.)
Note: Under new CEO Satya Nadella MSFT has been partially exonerated.
(This is a 2007 letter I sent to MIT's Technology Review, responding (in agreement) to an article by Yale's Professor David Gelernter, "AI Is Lost in the Woods.")
In the past decade AI has made headline-generating progress with its self-driving cars, speech-recognizing intelligent assistants, and robots.
Much of that progress in AI and machine-learning has resulted from the adaptation of neural network models.
But, if you think human-level intelligence is just around the corner, you've been misled.
Much of the brain is still terra incognita, and some of that detail may be required to replicate human consciousness.
My review of Intel's new (April 2011) 320 Series solid-state drives (SSDs). They're awesome.
2016 update — I now prefer Samsung EVO SSDs — apologies to my friends who work at Intel. Here's part of what sold me on Samsung SSDs.
The Sammies are cheaper — $86 at Amazon — and their Data Migration software backs up your drive in minutes, even while your computer's on.
If you're still using a rotating hard-drive, you may also want to get this home computer.
This work introduced the noosphere, the ocean of knowledge in which humanity dwells — a theme central to the prescient Teilhard de Chardin, and his disciple, Fr. Thomas Berry.
One of the next big steps in evolution will occur when the AIs master all the knowledge on the internet.
A brief bio I wrote a decade ago about my lifelong interest in the mind/body problem.
Woody Allen would say, "Which is it better to have?"
A chapter-length autobiography of my interest in the mind/brain and software/computer relationship.
I was writing this for a book. My current view is that books are almost obsolete.
Why buy a book when you can read stuff free on the web? Also, who cares about biographies? (My favorite is Lytton Strachey's bio of Queen Victoria. Like Victoria, I favor rule by monarch.) And the public's attention span is at most a month. As an author, why bother?
As an old guy, however, I've enjoyed reading a book at bedtime all my life (and actually turning the pages). My current favorite bedtime reading is Ashlee Vance's bio of Elon Musk. It merits an A+. I also love all of Steve Pinker's books.
Another chapter-length essay I wrote a decade ago on the power of knowledge. (Some of my assertions remain valid even after my latest epoch of fascination with neuroscience.) Yes, intelligence is multi-dimensional — social, emotional, motoric, spiritual — but that's for later.
TED is the annual conference of the tech cognoscenti. Fortunately, their fifteen-minute talks are all available. (I originally wrote this paragraph in 2007, before TED was so widely known.) Here are a few of my favorites — just updated for 2018.
In 2012 while attending the University of Arizona's Towards a Science of Consciousness, I shared several meals with TED's owner/organizer, economist Chris Anderson — getting the inside scoop. Chris is the brilliant, low-key guy who asks the questions after many of the best talks.
Kevin Kelly is the founding executive editor of Wired Magazine. In 2008 Kevin posted an elegant article on Evidence of a Global SuperOrganism. It posited four assertions: the Web is 1) a manufactured superorganism, 2) an autonomous superorganism, 3) an autonomous, smart superorganism, and 4) an autonomous, conscious superorganism.
My response provides evidence that the Web is becoming exponentially smarter. A quantum leap will occur when it can read, synthesize, and learn from its exabytes of content.
Conscious awareness is entirely distinct and is a huge mystery. Awareness seems to be characterized by the large-scale, multi-modal integration of perceptions of self and environment. All mammals are conscious (and probably other chordates and even some invertebrates). In 2008 I reviewed Gaillard's research on gamma oscillations, one possible telltale sign.
My detailed lecture notes appear above.
The Singularity Summit used to be one of my favorite conferences.
(Now, I consume a steady diet of its subdomains: neuroscience, AI, molecular bio, nanotech.)
In 2010 it was held in San Francisco and included a mix of well-known singularitarians (Ray Kurzweil, Eliezer Yudkowsky, and Ben Goertzel); neuroscientists (Brian Litt, Terry Sejnowski, and Demis Hassabis); psychologists (John Tooby and Irene Pepperberg); computer scientists (Shane Legg, Steve Mann, David Hanson, and Ramez Naam); and biologists (Greg Stock, Lance Becker, and Dennis Bray).
Here's a report on the 2012 Summit.
Manna is a poignant dystopian work of social commentary in the tradition of Brave New World and 1984. Written by Marshall Brain (inventor of HowStuffWorks) and freely available on the web, it deserves a wide audience.
This is an unsolicited, unpaid testimonial for TheBrain, a software program I've used daily for a decade to keep track of everything.
WebBrain is the server-based version of TheBrain that I use for sharing the public half of my brain. Most notably, WebBrain is where I upload and store my hundreds of detailed Stanford lecture notes. Court-reporter style, I've got meticulous notes on all our neuroscience, psychology, and AI superstars. Those fields have increasingly essential overlaps. Here's a way to keep up.