Doug Evans wrote:
Dave Korn wrote:
Doug Evans wrote:
This is new ground, so we can decide how we want things to look, and then
make it work.
Well, what I'd particularly like in this case would be for my pc to increment
by one for each 24-bit insn, rather than have the model pretend to be an
8-bit CISC machine processing all 3-byte instructions, if you see what I mean.
Righto. That should be doable regardless.
I hope so. Should I just take the blunt sledgehammer approach, and define
setters and getters for hardware units h-addr and h-iaddr that multiply/divide
by 3 on the fly, and let the underlying framework pretend it's a CISC machine
that uses 3-byte operands and data? Or is there a cleaner way to go?
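
For concreteness, here's roughly what I mean by the sledgehammer version: a
define-hardware for the pc whose getter and setter scale by 3, so the raw
storage counts 24-bit insns while the framework still sees byte addresses.
This is only a sketch of the idea; the mode names, attrs and the raw-reg
usage are my guesses rather than anything lifted from a working port.

  ;; Rough sketch of the "sledgehammer" approach -- modes and raw-reg
  ;; usage are assumptions, not taken from an existing .cpu file.
  (define-hardware
    (name h-pc)
    (comment "program counter, stored as an index of 24-bit insns")
    (attrs PC)
    (type pc)
    ;; The framework works in byte addresses, so scale on every access.
    (get () (mul AI (raw-reg h-pc) 3))
    (set (newval) (set (raw-reg h-pc) (div AI newval 3)))
  )

The same trick would presumably have to be repeated for h-addr and h-iaddr,
which is what makes it feel like a sledgehammer rather than a clean fix.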