Newbie question: accessing global variable on multiprocessor


Flash Gordon

BGB said:
on DOS, yes, it is the HW in this case...

Yes, which is relevant to the original point. If a compiler conforms to
the standard on the platform it was written for, but does not conform to
the standard when run on hardware/software more recent than the
compiler, then that is *not* a problem with the compiler.

If either software or hardware vendor changes things in a way that
breaks other software for good reason (and there are lots of good
reasons) then I'm afraid that's part of life.
yeah, AMD prompted it with a few of their changes...

but, MS could have avoided the problem by essentially migrating both NTVDM
and DOS support into an interpreter (which would itself provide segmentation
and v86).

That doesn't stop it breaking things due to running too fast. It is also
another piece of software which has to be maintained on an ongoing
basis, so security audits, testing, regression testing whenever Windows
is patched, redoing the security audit if it needs to be patched due to
a patch in core Windows... it isn't cheap to do right.
a lot of the rest of what was needed (to glue the interpreter to Win64) was
likely already implemented in getting WoW64 working, ...

this way, we wouldn't have been stuck needing DOSBox for software from
decades-past...

if DOSBox can do it, MS doesn't have "that" much excuse, apart from maybe
that they can no longer "sell" all this old software, so for them there is
not as much market incentive to keep it working...

It's not simply that there is no money in it. It's also that there are
costs which increase over time. There were costs involved in being able
to run DOS under Win3.1 in protected mode. Getting it working in Win95
required more work and so more costs. Making everything work under
Windows NT would have cost even more (so they did not try and make
everything work). At some point it would require a complete processor
emulator, which can be written, but is even more work and more complex
and would need revalidating every time Windows is patched (the DOSBox
people can just wait until someone reports it is broken, rather than
having to revalidate it themselves for every patch).

There is also the simple old rule that the more lines of code the more
bugs there will be and the harder maintenance and future development is,
and keeping backwards compatibility and obsolete features increases the
line count.
 

BGB / cr88192

Flash Gordon said:
Yes, which is relevant to the original point. If a compiler conforms to
the standard on the platform it was written for, but does not conform to
the standard when run on hardware/software more recent than the compiler,
then that is *not* a problem with the compiler.

yes, this is the fault of the HW vendor(s) for having changed their spec in
a non-backwards-compatible way...

for example, AMD is to blame for a few things:
REX not working outside long mode;
v86 and segments not working in long mode;
....

but, they did an overall decent job, considering...
(much better than Itanium; we can see where that went...).

If either software or hardware vendor changes things in a way that breaks
other software for good reason (and there are lots of good reasons) then
I'm afraid that's part of life.

but, then one has to determine what is good reason.

in the DOS/Win16 case, I am not convinced it was good reason.

That doesn't stop it breaking things due to running too fast. It is also
another piece of software which has to be maintained on an ongoing basis,
so security audits, testing, regression testing whenever Windows is
patched, redoing the security audit if it needs to be patched due to a
patch in core Windows... it isn't cheap to do right.

running too fast doesn't break most apps, and for those rare few that do,
emulators like DOSBox include the ability to turn down the virtual
clock-rate (turning it down really low can make Doom lag, ...).
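
to illustrate, the throttling itself is pretty trivial. a minimal sketch in
C (CYCLES_PER_MS and emulate_one_instruction() are made-up names here, not
DOSBox's actual code):

#define _POSIX_C_SOURCE 199309L
#include <stdint.h>
#include <time.h>

#define CYCLES_PER_MS 3000   /* hypothetical knob, like DOSBox's "cycles" */

extern void emulate_one_instruction(void);  /* assumed CPU-core entry point */

static uint64_t now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000u + (uint64_t)(ts.tv_nsec / 1000000);
}

void run_throttled(void)
{
    for (;;) {
        uint64_t start = now_ms();

        /* burn one millisecond's worth of virtual cycles... */
        for (int i = 0; i < CYCLES_PER_MS; i++)
            emulate_one_instruction();

        /* ...then, if the host finished early, sleep off the remainder,
           so the guest never sees more than CYCLES_PER_MS per real ms */
        while (now_ms() - start < 1) {
            struct timespec nap = { 0, 200000 };  /* 0.2 ms */
            nanosleep(&nap, NULL);
        }
    }
}

DOSBox exposes essentially this as its "cycles" setting: turn the constant
down and the guest slows down, no matter how fast the host is.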

presumably, something like this would be done almost purely in userspace,
and hence it would not be nearly so sensitive to breaking.

basically, in this case the whole 16-bit substructure (including GUI, ...)
would likely be moved into the emulator, with the 16-bit apps then drawing
into the real OS via Direct2D or whatever...
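
something like the following boundary, say (present_fn is a made-up
stand-in for whatever the real host-side plumbing would be):

#include <stdint.h>

typedef void (*present_fn)(const uint32_t *pixels, int w, int h);

typedef struct {
    uint32_t fb[640 * 480];   /* emulated framebuffer, in host pixel format */
    present_fn present;       /* host-side blit, e.g. backed by Direct2D */
} emu_display_t;

/* called once per emulated vertical retrace; the 16-bit GUI has already
   drawn into fb via emulated VGA writes, so the host never sees anything
   from the guest except this finished pixel buffer */
void emu_vsync(emu_display_t *d)
{
    d->present(d->fb, 640, 480);
}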

It's not simply that there is no money in it. It's also that there are
costs which increase over time. There were costs involved in being able to
run DOS under Win3.1 in protected mode. Getting it working in Win95
required more work and so more costs. Making everything work under Windows
NT would have cost even more (so they did not try and make everything
work). At some point it would require a complete processor emulator, which
can be written, but is even more work and more complex and would need
revalidating every time Windows is patched (the DOSBox people can just
wait until someone reports it is broken, rather than having to revalidate
it themselves for every patch).

There is also the simple old rule that the more lines of code the more
bugs there will be and the harder maintenance and future development is,
and keeping backwards compatibility and obsolete features increases the
line count.


a CPU emulator is not that complicated, really...
one can write one in maybe around 50 kloc or so.
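
to give an idea of the shape of the thing, here is a toy fetch-decode-execute
core (a made-up mini-ISA, not x86; the real bulk is in the decoder, flags,
segmentation, ...):

#include <stdint.h>
#include <stdio.h>

enum { OP_HALT, OP_LOADI, OP_ADD, OP_PRINT };    /* toy opcodes */

typedef struct {
    uint16_t ip;          /* instruction pointer */
    int32_t  reg[4];      /* register file */
    uint8_t  mem[65536];  /* flat memory (a real-mode x86 core would
                             address this via (seg<<4)+offset instead) */
    int      running;
} cpu_t;

static void cpu_step(cpu_t *c)
{
    uint8_t op = c->mem[c->ip++];                /* fetch */
    switch (op) {                                /* decode + execute */
    case OP_LOADI: {                             /* reg[r] = imm8 */
        uint8_t r = c->mem[c->ip++];
        c->reg[r & 3] = c->mem[c->ip++];
        break;
    }
    case OP_ADD: {                               /* reg[hi nib] += reg[lo nib] */
        uint8_t rr = c->mem[c->ip++];
        c->reg[(rr >> 4) & 3] += c->reg[rr & 3];
        break;
    }
    case OP_PRINT:                               /* stand-in for an I/O trap */
        printf("%d\n", (int)c->reg[0]);
        break;
    default:                                     /* OP_HALT or bad opcode */
        c->running = 0;
        break;
    }
}

int main(void)
{
    static cpu_t c = { .running = 1 };
    const uint8_t prog[] = { OP_LOADI, 0, 2,     /* reg0 = 2  */
                             OP_LOADI, 1, 40,    /* reg1 = 40 */
                             OP_ADD, 0x01,       /* reg0 += reg1 */
                             OP_PRINT, OP_HALT };
    for (unsigned i = 0; i < sizeof prog; i++)
        c.mem[i] = prog[i];
    while (c.running)
        cpu_step(&c);                            /* prints 42 */
    return 0;
}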

more so, MS already had these sorts of emulators, as they used them for
things like WinNT on Alpha, ... there is little reason why a similar
emulator wouldn't work on x64.

the rest would be to dump some old DLLs on top (hell, maybe the ones from
Win 3.11, FWIW, or a subset of 95 or 98...), and maybe do a little plumbing
to get graphics to the host, get back mouse input, allow interfacing with
the native filesystem, ...

I could almost do all this myself, apart from not wanting to bother (since
DOSBox+Win3.11 works, or I could install 95, 98, or XP in QEMU, ...). but as
I see it, this one should have been MS's responsibility (rather than
burdening end-users with something which is presumably their responsibility).

(DOSBox gives direct filesystem access, but doesn't do very well at keeping
it synced, resulting in extra hassles; 32-bit XP + QEMU, OTOH, would allow
mounting a network share.)


hell, MS could have probably even just included DOSBox, FWIW.

well, ok, it is worth noting that Windows 7 Professional, Ultimate, and
Enterprise do come with an emulator (not seen personally), which I guess
just runs 32-bit XP (and, so yes, 16-bit SW does maybe return on Win-7, in
an emulator...).
(well, with these, one can also get an MS-adapted version of GCC and BASH,
hmm...).

I have Win-7 on my laptop, but it is "Home Premium", and hence also
requires the DOSBox or QEMU trick...


anyways, big code is not really that big of a problem IME:
I am working on codebases in the Mloc range, by myself, and in general have
not had too many problems of this sort.

MS has lots more developers, so probably they have easily 10s or 100s of
Mloc to worry about, rather than just the few 100s of kloc needed to make
something like this work, and maybe even be really nice-looking and well
behaved...
 

Nobody

BGB / cr88192 said:
for the DOS or Win 3.x apps, few will notice the slowdown, as these apps
still run much faster in the emulator than on the original HW...

That depends upon how much you emulate.

For code which interacts with hardware, you may need to emulate the
hardware as well. This is less of a problem for the PC, due to the
widespread presence of clones (i.e. non-IBM systems). On platforms with
little or no hardware variation (e.g. the Amiga), programs would often
rely upon a particular section of code completing execution before a
certain hardware event occurred (or vice versa).

You may also need to emulate the timings for other reasons. A game which
doesn't scale to frame rate may need to be slowed down to maintain
playability. OTOH, a game which does scale to frame rate may need to be
slowed down so that the frame timings fit within the expected range.

A concrete example of the latter: the original Ultima Underworld game
basically still runs under Win98 on a P4. However: it scales to frame
rate, i.e. the distance anything moves each frame is proportional to the
time between frames. Normally this would be a good thing, except that
everything uses integer coordinates. On a modern system, the time between
frames is so low that the distance moved per frame often comes out at less
than one "unit", so positive values get rounded to zero and negative
values to minus one, resulting in some entities (including the player)
being unable to move North or East.
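
A hypothetical sketch of that failure mode (the fixed-point scale below is
invented, not the game's actual arithmetic):

#include <stdio.h>

/* movement scaled by frame time, stored in integer units; the >> 8 is a
   stand-in for the game's fixed-point divide (an arithmetic shift, i.e.
   floor, on the x86 compilers of the era) */
static int step(int velocity, int frame_ms)
{
    return (velocity * frame_ms) >> 8;
}

int main(void)
{
    /* original hardware, ~70 ms frames: movement works both ways */
    printf("70 ms: east %d, west %d\n", step(5, 70), step(-5, 70));
    /* fast machine, ~5 ms frames: +25>>8 == 0 but -25>>8 == -1, so
       positive directions freeze while negative ones still creep */
    printf(" 5 ms: east %d, west %d\n", step(5, 5), step(-5, 5));
    return 0;
}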

tl;dr version: some code is so tied to specific hardware that the only way
you can run it on anything else involves VHDL/Verilog simulation.
 

BGB / cr88192

Nobody said:
That depends upon how much you emulate.

For code which interacts with hardware, you may need to emulate the
hardware as well. This is less of a problem for the PC, due to the
widespread presence of clones (i.e. non-IBM systems). On platforms with
little or no hardware variation (e.g. the Amiga), programs would often
rely upon a particular section of code completing execution before a
certain hardware event occurred (or vice versa).

yeah. DOS emulators typically run them on fake HW.
for example, DOSBox uses a fake SoundBlaster and S3 Virge (from what I
remember), ...

for Win16, this should not be necessary, since Win16 did still have
protection, and generally isolated the software from the HW. even if it did
allow direct HW access, such apps would likely not have run on NT-based
systems anyway (unless NTVDM was actually faking a bunch of HW as well, but
I doubt this).

You may also need to emulate the timings for other reasons. A game which
doesn't scale to frame rate may need to be slowed down to maintain
playability. OTOH, a game which does scale to frame rate may need to be
slowed down so that the frame timings fit within the expected range.

A concrete example of the latter: the original Ultima Underworld game
basically still runs under Win98 on a P4. However: it scales to frame
rate, i.e. the distance anything moves each frame is proportional to the
time between frames. Normally this would be a good thing, except that
everything uses integer coordinates. On a modern system, the time between
frames is so low that the distance moved per frame often comes out at less
than one "unit", so positive values get rounded to zero and negative
values to minus one, resulting in some entities (including the player)
being unable to move North or East.

granted, poor code is allowed to break, presumably...
I think in general the Ultima games were known for being horridly
unreliable/broken even on the HW they were designed for...


anyways, the point would be to make old software work, not to make buggy
software work.
what is asked is that the vast majority of old SW works, not that every
last program with obscure bugs does.

tl;dr version: some code is so tied to specific hardware that the only way
you can run it on anything else involves VHDL/Verilog simulation.

errm, I doubt this...

most full-system emulators fake things at the level of the IO ports, ... and
this in general works plenty well (both OS's and apps generally work). other
things, such as the DMA and IRQ controller, ... can similarly be faked in
SW, and don't require full HW simulation.
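
for example, faking a keyboard controller at the port level is little more
than a switch on the port number (a rough sketch, only loosely modeled on
the real 8042):

#include <stdint.h>

static uint8_t kbd_output;      /* fake 8042 output buffer (port 0x60) */
static uint8_t kbd_status;      /* fake 8042 status register (port 0x64) */

/* called by the CPU core whenever the guest executes an IN instruction */
uint8_t io_read(uint16_t port)
{
    switch (port) {
    case 0x60:                  /* keyboard data */
        kbd_status &= ~0x01;    /* clear "output buffer full" bit */
        return kbd_output;
    case 0x64:                  /* keyboard status */
        return kbd_status;
    default:
        return 0xFF;            /* unconnected port: floats high */
    }
}

/* called by the host UI when a real key event arrives */
void host_key_event(uint8_t scancode)
{
    kbd_output = scancode;
    kbd_status |= 0x01;         /* guest will now see data pending */
    /* a fuller model would also raise the fake IRQ1 here */
}

QEMU, Bochs, DOSBox, ... are, at this level, mostly big piles of handlers
of this sort, one per faked device.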

on many newer systems, the bus controller itself contains a processor and
some code, which emulates some legacy devices in much the same way: watching
IO ports, responding, ...

granted, not everything works exactly, but within reasonable bounds it does:
QEMU or Bochs will probably not give HW-accelerated graphics, for example,
but most other things work.

bit-twiddling != need for VHDL...
 

Nobody

BGB / cr88192 said:
errm, I doubt this...

most full-system emulators fake things at the level of the IO ports, ... and
this in general works plenty well (both OS's and apps generally work). other
things, such as the DMA and IRQ controller, ... can similarly be faked in
SW, and don't require full HW simulation.

But they only emulate the hardware to the extent sufficient for "typical"
use.

That's not a problem if the system you're trying to emulate includes a
"real" OS. You only need to emulate to the level at which the OS uses the
hardware, and at which it permits other applications to use it.

On platforms where it was common for applications to just kick the OS
out of the way and access the hardware directly (i.e. most of the 8- and
16-bit micros, and PCs before Win3.1 took over from DOS), anything could
happen (and often did).
 
