The x86utm operating system is based on an open-source x86 emulator. This
system enables one C function to execute another C function in debug-step
mode. When H simulates D, it creates a separate process context for D with
its own memory, stack, and virtual registers. H is able to simulate D simulating
itself, so the only limit on the depth of recursive simulation is available RAM.
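Purely as an illustration of that "separate process context" idea, here is a
minimal sketch. SimContext, new_context, and the field layout are hypothetical
stand-ins, not x86utm's actual data structures; the point is only that each
nested simulation owns fresh memory, a fresh stack, and fresh virtual
registers, so nesting depth is bounded only by allocation failure.
Code:
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical per-simulation context (illustrative only). */
typedef struct SimContext {
    uint32_t regs[8];            /* virtual general-purpose registers */
    uint32_t eip;                /* virtual instruction pointer       */
    uint8_t *memory;             /* private memory image for this D   */
    uint8_t *stack;              /* private stack for this D          */
    struct SimContext *parent;   /* the simulation that spawned us    */
} SimContext;

/* Each nested simulation allocates a fresh context, so the only
   hard limit on nesting depth is available RAM. */
SimContext *new_context(SimContext *parent, size_t mem, size_t stk)
{
    SimContext *c = calloc(1, sizeof *c);
    if (!c)
        return NULL;
    c->memory = malloc(mem);
    c->stack  = malloc(stk);
    c->parent = parent;
    if (!c->memory || !c->stack) {   /* out of RAM: cannot nest deeper */
        free(c->memory);
        free(c->stack);
        free(c);
        return NULL;
    }
    return c;
}

int main(void)
{
    SimContext *outer = new_context(NULL,  1 << 20, 1 << 16);
    SimContext *inner = new_context(outer, 1 << 20, 1 << 16);
    /* H simulating D simulating itself: one more context per level. */
    return (outer && inner) ? 0 : 1;
}
The actual functions under discussion are the following: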
Code:
// The following is written in C
//
01 typedef int (*ptr)(); // pointer to int function
02 int H(ptr x, ptr y); // uses x86 emulator to simulate its input
03
04 int D(ptr x)
05 {
06 int Halt_Status = H(x, x);
07 if (Halt_Status)
08 HERE: goto HERE;
09 return Halt_Status;
10 }
11
12 int main()
13 {
14 D(D);
15 }
Execution Trace
Line 14: main() invokes D(D)
Line 06: simulated D(D) invokes simulated H(D,D), which simulates another D(D)
This cycle keeps repeating (unless aborted).
Simulation invariant:
D correctly simulated by H cannot possibly reach its own line 09, because
every simulated D(D) reaches line 06 and invokes H(D,D), which begins yet
another simulated D(D) before lines 07 through 09 can execute.
Is it dead obvious to everyone here, when examining the execution trace of
lines 14 and 06 above, that D correctly simulated by H cannot possibly
terminate normally by reaching its own line 09?
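For anyone who wants to run something concrete, here is a toy stand-in that
exhibits the trace above. It is not x86utm: the direct call x(y) replaces
instruction-level simulation, ABORT_DEPTH is an assumed threshold, and
exit(0) stands in for H aborting its simulation. It shows that every D(D)
reaches its line 06 and re-enters H before its lines 07 through 09 can run.
Code:
#include <stdio.h>
#include <stdlib.h>

typedef int (*ptr)();
static int depth = 0;
#define ABORT_DEPTH 3   /* assumed: where this toy H stops */

int H(ptr x, ptr y)
{
    depth++;
    printf("depth %d: H simulates D(D)\n", depth);
    if (depth >= ABORT_DEPTH) {
        /* Stand-in for H aborting its simulation: nothing below
           this point ever resumes, so no simulated D reaches the
           equivalent of line 09. */
        puts("H: abort (pattern repeats); no simulated D reached line 09");
        exit(0);
    }
    return x(y);        /* direct call stands in for single-stepping */
}

int D(ptr x)
{
    int Halt_Status = H(x, x);  /* corresponds to line 06 */
    if (Halt_Status)
        for (;;) ;              /* corresponds to line 08 */
    return Halt_Status;         /* corresponds to line 09 */
}

int main(void)
{
    D(D);                       /* corresponds to line 14 */
    return 0;
}
Running this prints three "depth n" lines and then the abort message:
without the exit(0) stand-in for H's abort, the D(D)-calls-H(D,D)-calls-D(D)
cycle would continue until resources ran out.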