Apple to ditch IBM, switch to Intel chips
Once Apple switches to x86 (and perhaps IA-64), it may become possible to run Mac OS X on an ordinary home-built computer, provided Apple doesn't prevent it, for example by altering some opcodes.
One other thing is sure: they will lose some performance. As long as they ship one set of binaries for all Apple computers, they will have to support both PPC and x86 in a single distribution, which will undoubtedly be a lot bigger, if not slower too.
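The size increase comes from the universal ("fat") binary format, which simply concatenates one slice per architecture behind a small header table. A minimal sketch in Python of how such a container is packed, with made-up byte strings standing in for real PPC and x86 code (the magic number and 20-byte `fat_arch` layout follow the real format; the payloads are fake):

```python
import struct

FAT_MAGIC = 0xCAFEBABE      # magic of a Mach-O universal ("fat") binary
CPU_TYPE_POWERPC = 18
CPU_TYPE_I386 = 7

def make_fat(slices):
    """Pack (cputype, payload) pairs into a minimal fat container.
    All header fields are big-endian, as in the real format."""
    header = struct.pack(">II", FAT_MAGIC, len(slices))
    arch_table_size = 20 * len(slices)      # each fat_arch entry is 20 bytes
    offset = len(header) + arch_table_size
    archs, bodies = b"", b""
    for cputype, payload in slices:
        # cputype, cpusubtype (0 here), file offset, size, alignment (2^0)
        archs += struct.pack(">iiIII", cputype, 0, offset, len(payload), 0)
        bodies += payload
        offset += len(payload)
    return header + archs + bodies

ppc_code = b"\x90" * 1000   # fake PowerPC slice
x86_code = b"\xcc" * 1000   # fake x86 slice
fat = make_fat([(CPU_TYPE_POWERPC, ppc_code), (CPU_TYPE_I386, x86_code)])
# The fat file is roughly the *sum* of the per-architecture sizes,
# plus a few dozen bytes of header.
print(len(fat))  # 2048: 8-byte header + 2 * 20-byte arch entries + 2 * 1000
```

So for executables the doubling is real, though shared resources (images, nibs, data files) are stored only once.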
It will be interesting to see how this issue will evolve.
4 thoughts on “Power PC dumped”
Not necessarily. I still think Intel might be doing PPC chips and/or WiMax and the DRM they recently rolled out silently. We’ll see. One hour to go…
In any case they will still have to support a whole new architecture, which means either a drop in speed or an increase in size (they have to choose between optimizing for both architectures separately or emitting only common, slower, code).
There’s no reason the larger executables should be slower, as only the “correct” half of the executable will ever be loaded by the OS, so the foreign instructions won’t be thrashing cache lines and may never even be resident in memory.
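That selection is also cheap: the fat header is just a small table of (cputype, offset, size) entries, and the loader maps in only the slice matching the host CPU. A rough Python sketch of the idea, with a hand-built two-slice blob and dummy payloads (constants follow the real Mach-O fat format, everything else is illustrative):

```python
import struct

FAT_MAGIC = 0xCAFEBABE
CPU_TYPE_POWERPC = 18
CPU_TYPE_I386 = 7

def select_slice(fat, host_cputype):
    """Mimic what the loader does: read the arch table and return only
    the slice matching the host CPU; the other slice is never touched."""
    magic, nfat = struct.unpack_from(">II", fat, 0)
    assert magic == FAT_MAGIC
    for i in range(nfat):
        cputype, _sub, offset, size, _align = struct.unpack_from(
            ">iiIII", fat, 8 + 20 * i)
        if cputype == host_cputype:
            return fat[offset:offset + size]
    raise ValueError("no slice for this CPU")

# Hand-built fat blob: 8-byte header, two 20-byte arch entries, two payloads.
ppc, x86 = b"P" * 16, b"X" * 16
header = struct.pack(">II", FAT_MAGIC, 2)
archs = struct.pack(">iiIII", CPU_TYPE_POWERPC, 0, 48, 16, 0)
archs += struct.pack(">iiIII", CPU_TYPE_I386, 0, 64, 16, 0)
fat = header + archs + ppc + x86

print(select_slice(fat, CPU_TYPE_I386) == x86)  # True
```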
The Intel processors will not support the old instructions, so there will be no optimizing for common opcodes or anything of the sort. PowerPC instructions that need to be executed will run on a simulated G3, at a much slower speed than native execution.
i.e. Recompiled software will probably run faster (assuming the Intel chips are faster, which they basically are), while software that has to fall back on emulation will run really slowly.
Code size remains an issue, but disks are so large, and data so much more of a bit-consumer, that I don’t think the increased executable size will really matter at all.
Larger binaries mean more to load and scan through when starting an executable, and more to install as well. But indeed, once it has loaded there isn’t much difference. Still, people should keep in mind that the size of their executables and libraries will roughly double.
Emulation shouldn’t be much of a problem these days, since it can be done with JITs or AOTs: either translating the opcodes of an executable to another architecture ahead of time, or doing that translation at runtime. Both will certainly cost some speed, but not much more than the drop you get from compiling for generic x86 instead of, for instance, an AMD Athlon.
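The difference between pure emulation and ahead-of-time translation can be sketched with a toy stack-machine “instruction set” (entirely made up here). The interpreter decodes every opcode on every run; the translator converts the program once into host code (Python standing in for native code) and then runs the translated version directly:

```python
# Toy "foreign" program: a list of stack-machine opcodes.
PROGRAM = [("push", 6), ("push", 7), ("mul", None), ("push", 5), ("add", None)]

def interpret(program):
    """Pure emulation: decode every opcode each time the program runs."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op in ("add", "mul"):
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op == "add" else a * b)
    return stack[-1]

def translate(program):
    """AOT-style translation: convert the opcodes to host code once,
    then run the translated function at full speed afterwards."""
    lines = ["def translated():", "    stack = []"]
    for op, arg in program:
        if op == "push":
            lines.append(f"    stack.append({arg})")
        elif op in ("add", "mul"):
            sym = "+" if op == "add" else "*"
            lines.append("    b, a = stack.pop(), stack.pop()")
            lines.append(f"    stack.append(a {sym} b)")
    lines.append("    return stack[-1]")
    ns = {}
    exec("\n".join(lines), ns)
    return ns["translated"]

print(interpret(PROGRAM))    # 47
print(translate(PROGRAM)())  # 47: same result, decode cost paid only once
```

A real binary translator is of course vastly more complex (registers, memory, self-modifying code), but the speed argument is the same: the per-opcode decode overhead is paid once instead of on every execution.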
But I heard they plan to release a normal Mac OS X and an Intel Mac OS X, the latter being the one with Intel executables. In that case code size wouldn’t be an issue, except for third parties wanting to deliver their product as one download/disc/etc. instead of several.