Another post about time travel got me wondering: how far back in time could I hypothetically leave a modern computer such that the most capable engineers of that era could reverse engineer it, even partially?
Depends on what you expect them to do exactly. Today’s transistors aren’t much different from older ones, mainly just smaller. People of, say, 20-30 years ago would have the technology to inspect them (an electron microscope or something like that) and the knowledge to understand them, but not the equipment to reproduce them.
If you go much farther back in time, say before integrated circuits (1958) or even transistors (1947) were invented, I think it’s unlikely that anyone could reverse engineer the thing.
Yeah, modern CPU production is incredible and a pet interest of mine lately. I’d highly recommend the Asianometry YT channel if anyone wants to go deep.
ITT: people conflating “reverse engineer” with “emulate”
That depends on what we mean by reverse engineer.
The overall purpose and function of each component, the PCB, and the PSU could be worked out pretty far back, maybe even prior to the invention of the semiconductor. But without knowledge of electricity, and AC current in particular, it would be very hard, since they couldn’t even power it on. So my bet is around 1880, and it would need to be investigated by Nikola Tesla.
But if we mean constructing a similar one, we’re going to need a lot of technology you can’t infer from looking at the components, no matter what tools you have. The build of a modern CPU/GPU chip is absolutely mind-blowingly complex. 10 years back, for sure; 20 years, likely; 30 years, I’m unsure; 40 years, it’s going to be extremely alien; 50 years, completely impossible.
When the hot air balloon was invented, onlookers thought it was a monster and attacked one of the first ones when it landed, and that was in the 1700s (and that was right before the Hartlepool monkey incident, go figure). If people couldn’t fathom the mechanism of the hot air balloon, an invention of their own day, it would surprise me if anyone before the advent of the retro computer could be convinced a modern one wasn’t some kind of golem.
Zero years. Having a computer chip wouldn’t give much of a clue about how it was made.
The bane of all time travel is materials engineering and supply chains.
Good post, this was fun to read.
I think the answer is somewhere in here: https://en.m.wikipedia.org/wiki/Timeline_of_microscope_technology
I mean it’s just layers that can be removed by lapping. The real question is the ability to see the smallest features.
Chip fabs are the most expensive industrial facilities in all of human history. Production requires massive quantities of rare resources and extreme tooling precision. Start looking up some of the nastiest chemicals ever produced, mostly ones intended to kill people, and you’re looking at the inventory stocking list for a fab.
The YT channel Asianometry is based out of Taiwan and has a lot of ties to the industry, if you want a good idea of what’s involved in various fab nodes and their histories.
Technically everything that a computer does can be simulated using any medium, pen and paper for example, or rocks and sand (relevant XKCD).
As for actually creating the parts needed: well, a modern computer is just a very advanced Turing machine, which only requires 3 parts to operate: a tape for storing memory, a read/write head for reading/altering the data in memory, and a state transition table to instruct the head to move left/right on the memory tape.
The memory tape and transition table can be made of anything, even pen and paper or rocks as in the previous examples. The read/write head could be anything as well; in earlier generations of computers, vacuum tubes switching on and off played that role.
So conceptually, it works in any era where humans were intellectually capable of reasoning out the logic. Their computer would just run much slower and be less useful the farther back in time you go.
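Not from the thread, but the tape/head/transition-table description above can be sketched in a few lines of Python (names and the example machine are made up for illustration):

```python
# Minimal Turing machine sketch: a tape, a head position, and a
# transition table mapping (state, symbol) -> (new_state, new_symbol, move).

def run_turing_machine(tape, transitions, state="start", head=0, steps=100):
    tape = list(tape)
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape[head]
        state, tape[head], move = transitions[(state, symbol)]
        head += move
        if head >= len(tape):   # extend the tape on demand with blanks
            tape.append("_")
    return "".join(tape)

# A tiny machine that inverts every bit until it reaches a blank cell.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine("1011_", flip))  # -> 0100_
```

The point is how little machinery is required: swap the Python list for rocks in sand and a person following the table by hand, and it still computes.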
“Reverse engineering” means tearing a machine down to figure out how it works.
Regardless of how much computation you can do with an abacus or an army of men with flags acting as logic gates, without sufficient microscopy you cannot reverse engineer a microchip.
That’s what this question is getting at: what previous incarnations of civilization would be able to study a computer and figure out what it’s doing?
Well, if we’re considering alternate histories where a civilization gains access to a working computer, then it’s basically impossible to tell. It depends on so many variables: whether someone in that time period takes enough interest to even look into it in the first place, whether they’re smart enough to work out what it’s doing, and even whose hands the computer falls into.
There’s a famous example from antiquity: the aeolipile, described by Hero of Alexandria, a small device that would spin when placed over hot water. We would recognize that device today as a primitive steam turbine, and we know it had the potential to spark an industrial revolution if the right person had gotten a chance to look at it.
So if an ancient civilization got their hands on a modern computer and managed to do anything useful with it at all, it would alter world history beyond recognition. Even if they didn’t directly reverse engineer the computer but instead gained insight into other technologies like electricity or plastic production, world history would change in such a way that the modern computer would almost certainly be produced much earlier than in our own timeline, which kind of nullifies the point of the question.
You have to define what you mean by “modern computer.” If we really break things down, an abacus of infinite size would be Turing complete. It would take a really long time to play Doom on it, though, and it would need a person (or people) to operate it. Still, the technology to do so would have been available starting around 2500 BCE, or even earlier if you want your time traveller to also invent the abacus.

If you want something a bit more pragmatic, we can look to Charles Babbage and Ada Lovelace, who are generally credited with designing the world’s first programmable computer, with concepts still in use today. Babbage was working in the mid-19th century, but given knowledge of his work, it could probably be reverse engineered back a bit further as well.

If you want to go in the other direction and make it even weirder and less practical, you can perform computation with a large room full of people passing slips of paper back and forth, each doing a simple logical operation on them.
My point is that there’s the current state of hardware technology, which depends on a whole chain of technological advances, and there’s computation logic, by which we see the “universal” part of the universal Turing machine.
If you’re talking solely about hardware and modern electronics, there’s a whole set of dependencies on industrial engineering and chemistry that goes from gears to vacuum tubes to diodes, which is interesting in its own right. What I guess I’m saying is that the advancements in the theory of computation (elements of theoretical architectures and mathematics) are distinct from the hardware it runs on. If you were to go back and teach calculus and the theory of computation to Da Vinci, I imagine he’d come up with something clever.
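The “room full of people each doing a simple logical operation” idea above can be sketched concretely (this is my own illustration, not from the thread): give every person a single NAND gate, and the slips of paper passed between them wire the gates into a circuit, here a half adder.

```python
# Each "person" computes one NAND; composing them builds real arithmetic.

def nand(a, b):
    return 0 if (a and b) else 1

def half_adder(a, b):
    # Classic construction of XOR and AND from NAND gates alone.
    n1 = nand(a, b)
    s = nand(nand(a, n1), nand(b, n1))  # sum bit  = a XOR b
    c = nand(n1, n1)                    # carry bit = a AND b
    return s, c

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

Since NAND is functionally complete, enough people (or flags, or rocks) arranged this way can in principle compute anything the abacus-of-infinite-size can.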
Probably not very far, all things considered, because if you go too far back, a modern semiconductor might as well be a magic rock as far as the technology of the time is concerned. You can’t just crack open that flashy new Ryzen to see what makes it tick.