• Discovered Jan 17, 2021. The long future of artificial intelligence

    “Firing up a multi-million-dollar server farm to train a cutting-edge natural language model like GPT-3 feels a lot like firing up a building-sized computer in the early days of classical computing. We didn’t get to smartphones by simply shrinking vacuum tubes and punch cards; we got here because we found new substrates for the same computational power, namely laser-etched transistors on silicon. I imagine, in a hundred years, if we have GPT-10 in our pockets, it won’t run on GPUs and TPUs, but on a new material entirely.

    I believe there exist materials that will enable smaller, faster, far cheaper, and far more efficient computers that can see, think, and speak.”

← I don’t think GPT-3 is intelligent (see Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data), but perhaps, on different sorts of computers, models will get closer to, or achieve, an “intelligence” breakthrough, and maybe Linus (aka @thesephist) is right when he writes, “Inventing the computer and the deep neural network may still be one of the first few steps in replicating the magic of the enchanted loom”?!?
