Re: Subbing EZ80 microcontrollers for Z80A CPUs

From: William Boucher <wboucher6_at_cogeco.ca>
Date: Fri Dec 02 2011 - 19:38:47 EST

An interesting thought, but until Z80A chips become as rare as hen's teeth, it's likely not worth the effort. While the new chip may be able to run the original program code, it would require a custom 144-pin-to-40-pin adapter PCB just to plug in, and about 90% of the new chip's capability would be wasted. The adapter PCB might need additional 3.3V-to-5V buffers for the outputs, and it would also have to provide a 3.3V regulator. In order for the new microcontroller to start up properly, it may require some internal programming, although I did read in the datasheet that it may be able to start running external Z80 program code automatically if properly set up.

Regarding speed in Star Trek: the later levels seem to slow down and lag because there are so many objects to draw and compute. The Z80 CPU always runs at the same speed regardless of the code that it is running. The more vectors there are to draw, the longer it takes to draw the screen image once, so the more complicated the image, the slower the game will seem to be. As you destroy lots of objects, the frame rate will increase again and seem more normal. If the new microcontroller were to run the program code 3.5 times faster, the drawing speed would become way too fast for the G08 monitor to keep up. There would be no way, that I am aware of, for the CPU to know how and when to speed up and slow down to make things appear to run at a steady pace. One of the microcontroller's internal clocks could certainly be used to generate a vector-drawing pacemaker, but that would require somewhat of a program rewrite to accomplish.
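To make the pacemaker idea concrete, here is a minimal C sketch of how such a rewrite might regulate frame rate: a timer interrupt bumps a tick counter, and the game loop waits for a fixed number of ticks before drawing the next frame, so the display pace no longer depends on how fast the CPU retires instructions. All names here (tick_count, TICKS_PER_FRAME, timer_isr) are invented for illustration; a real eZ80 port would hook an actual hardware timer interrupt rather than call the ISR from the wait loop.

```c
#include <stdint.h>

/* Hypothetical frame-pacing sketch, not actual G80 code.
   A hardware timer ISR increments tick_count at a fixed rate;
   the game loop blocks until TICKS_PER_FRAME ticks have elapsed,
   holding the frame rate constant regardless of CPU speed. */

#define TICKS_PER_FRAME 4u   /* e.g. 4 ticks of a 160 Hz timer = 40 Hz frames */

static volatile uint32_t tick_count;   /* bumped by the timer ISR */

/* Stand-in for the eZ80 timer interrupt service routine. */
static void timer_isr(void) { tick_count++; }

/* Called at the top of each frame: wait until TICKS_PER_FRAME
   timer ticks have elapsed since the previous frame started,
   then return the tick at which the new frame begins. */
static uint32_t wait_for_next_frame(uint32_t last_frame_tick) {
    while (tick_count - last_frame_tick < TICKS_PER_FRAME) {
        timer_isr();  /* on real hardware the ISR fires asynchronously */
    }
    return last_frame_tick + TICKS_PER_FRAME;
}
```

The point of the sketch is the subtraction `tick_count - last_frame_tick`, which stays correct even when the tick counter wraps around, and the fact that a frame that finishes early simply waits longer, while a heavy frame waits less.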

The bottom line is that I don't think simply replacing the original Z80A with something that runs faster is a solution to the apparent game lag issue. If it were, simply increasing the crystal frequency would accomplish the same goal, that is, of course, assuming the Z80A could run that fast, which it cannot.

I suspect that running a Z80A CPU core on an FPGA would be easier to implement; however, someone who is an expert with the EZ80 might beg to differ.

William Boucher
http://www.biltronix.com
  ----- Original Message -----
  From: Zitt Zitterkopf
  To: vectorlist@vectorlist.org
  Sent: Thursday, December 01, 2011 3:19 AM
  Subject: VECTOR: Subbing EZ80 microcontrollers for Z80A CPUs

  Has anyone ever tried substituting an EZ80 microcontroller in place of a Z80A CPU?
  Specifically, I was thinking it might help solve some of the CPU lag issues seen in the higher levels of the G80 Star Trek.

  According to the Z80 WIKI; the EZ80 can retire instructions at an average of 3.5x faster. Supposedly it’s binary compatible with the original Z80A.

  By no means is such a conversion trivial – was just curious what people may have tried or have comments on?

  I’ve only taken a cursory scan of the datasheet:
  http://www.zilog.com/docs/ez80acclaim/ps0192.pdf

  John

---------------------------------------------------------------------------
** Unsubscribe, subscribe, or view the archives at http://www.vectorlist.org
** Please direct other questions, comments, or problems to chris@westnet.com
Received on Fri Dec 2 19:38:40 2011

This archive was generated by hypermail 2.1.8 : Sat Dec 03 2011 - 05:50:01 EST