Re: SDCC porting feasibility study, part 1: the assembler

> I am contemplating writing a new toolchain from the ground up at this point.
> I'm rapidly learning that open source C toolchains are in short supply, and
> the ones that exist either (A) don't target 8086 at all, (B) are not
> documented well enough (or clearly enough) for a newcomer to add support,
> (C) output things in ways that are undesired, or (D) are so complex that all
> ye who enter there abandon all hope, specifically thinking of gcc.
> Particularly with smaller CPUs than 8086, it seems C compilers are
> ill-suited. I am thinking of the 6502, of which many systems exist with
> massive (for a 64K address space) amounts of bank-switched RAM; its minimal
> number of 8-bit registers and interesting addressing modes make it hard to
> compile good code for from a language like C. The 65816 being the 16-bit
> variant makes it highly desirable to port ELKS to (wouldn't it be nice to
> have ELKS working on the Apple IIgs?)
>
> Maybe what we need to be doing is making a list of the features that we need
> a compiler to support, rather than taking each one in turn and trying to jam
> the pegs in the holes?

What scares me there is that writing a whole toolchain isn't trivial.
In college, I took a class where we wrote the most basic of basic
compilers in Java (using a nice grammar-parsing library), reading in a
simplified ALGOL and targeting MIPS, with no optimization or real
register allocation at all.  It barely supported functions with
parameters and return values.  No strings.  No floating point.  Just
32-bit ints.  We used an existing assembler that did all the dirty
jump calculations for us.  It was a lot of work just to get code to
execute, and without a linker, it only ran in a setup where loading
was performed by hand (e.g., a simulator).  The entire semester was
just to get it to that point.

I think even if we have to severely re-engineer things or even do a
ground-up rewrite, starting by digging into something like SDCC and
using it as a reference point would help considerably.  Especially if
we're aiming for a multi-target compiler, you almost have to design
your targets in at the beginning to make sure you're not missing the
infrastructure to support them.

Other questions...

Should the toolchain be able to self-compile?  (How close is SDCC to
self-compiling?  That's probably not a bad indication of how good it
is in general.)
Should it be able to self-compile on a real target?  If yes, are gobs
of RAM a safe assumption?
Language features supported?  (I think SDCC might have limitations
here; I saw some indication that it won't copy entire structs, which
sounds like kind of a big deal but probably not terribly hard to
overcome.)
Optimizations?
Data types to support?
Standard library?
How much architecture-specific code is appropriate to leave to the
applications or libraries?

"Software ecosystem" questions...

How much existing software would be easy to adapt to this toolchain?
How useful is the work done on the toolchain to other free software
communities or hobbyists, etc?
--
To unsubscribe from this list: send the line "unsubscribe linux-8086" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html

