One of the challenges of using a large-screen device in the sun is that most screen technologies (including transmissive LCD and OLED, but excluding eInk and transflective LCDs) are inherently "black": they absorb the light energy that hits them. Even when they're displaying a white screen, they're absorbing nearly all of the light hitting them, and then emitting their own energy.

One of my mild annoyances with the iPhone 12 is that there's no white option. I have a white iPhone 10, and it's essential equipment for me as a convertible driver: on a sufficiently sunny day, the phone will overheat on the center console screen-up, even when it's powered off! But with the white side up, it's fine even on sunny days breaking 110 degF.

The only way around this I could see for laptops and tablets in the sun would be a return to transflective displays or other non-black options, which would also be a huge boost to battery life in well-lit conditions. But none of these techs seem to have comparable image quality, and there are some inherent challenges (potentially addressable, but challenges) with color balance and the like.

“New Intel chips were often delayed and offered only small improvements over previous generations.”

As noted on HN and elsewhere, Intel's struggle has been in manufacturing. They minimized risk ("got lazy") when Moore's Law reigned and their fabs were 1-2 generations ahead of everyone else, but have since squeezed every last architectural drop out of 14nm. Put Tiger Lake on TSMC 5nm and a lot of the M1's lead goes away.

The M1 is a damned fine chip and I'd like to own one. But the recent hero worship is a bit awkward to read, as Apple is generally following an accepted playbook in the End Times of Moore's Law:

1. Save power by creating accelerated logic blocks without locking yourself out of algorithmic improvements.
2. A chip respin costs 3-4 months and who knows how much money, so you need to go to production with the A0 silicon you powered up.
3. At best, all mathy code can be shoved down SIMD pipelines for high IPC. At worst, you're dealing with a bunch of branchy integer code. The M1 has solved for both with vector extensions and a massive reorder buffer and register file.

It's a great chip and Apple got there first. They'll be rewarded with sales and increased market share, and developers will have one architecture to wrench on for all platforms.

You would be right to say that Intel couldn't make the M1. Intel caters to the entire market (one the M1 just noticeably shrunk), with myriad integrators, workloads, and users, and each has different concerns and priorities, making accelerated algorithm blocks useless to some and not fast enough for others. It's not because Intel's fabs are struggling or because Apple has found tricks no one else in the world knows about. It's because Apple controls the entire software stack and therefore knows what they need on the chip. That applies up and down the stack for everything Apple does.

The people who suggest breaking up Apple are correct that it would destroy them, by forcing them to use inefficient generic components at every level. If we want competition against Apple, the way to get there is not by hobbling them with the inefficiency of the old paradigm; it's by the rest of the industry figuring out how to collaborate on delivering similar gains without vertical integration. The places where people want Apple broken up are generally places where we all know it will make the result "worse" but will enable interoperability and competition, break down lock-in, empower users, and lead to more repairable and less wasteful devices (and, and, and).
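The "mathy SIMD code vs. branchy integer code" distinction mentioned above can be sketched in miniature. This is purely illustrative (the function names are invented, and in this sketch it is NumPy's compiled loops, not your Python, that actually reach the SIMD units):

```python
import numpy as np

def mathy(x):
    # Pure elementwise arithmetic on an array: no data-dependent branches,
    # so the work maps cleanly onto wide SIMD pipelines (NumPy's compiled
    # loops here; NEON/vector units on real hardware).
    return 0.5 * x * x + 2.0 * x + 1.0

def branchy(xs):
    # Data-dependent control flow: every iteration hinges on comparisons.
    # This is where branch prediction and a deep reorder buffer earn their
    # keep, rather than SIMD width.
    total = 0
    for v in xs:
        if v % 3 == 0:
            total += v
        elif v % 5 == 0:
            total -= v
    return total
```

The first shape vectorizes trivially; the second is the kind of code where a large out-of-order window, not vector width, buys you IPC.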
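One way to read "accelerated logic blocks without locking yourself out of algorithmic improvements" is: put fixed-function hardware behind a dispatch layer that can always fall back to software, so a better algorithm can ship without a respin. A toy sketch of that shape — every name here is invented for illustration, not any real driver or Apple API:

```python
# Toy dispatch layer: prefer a fixed-function block when one exists,
# but keep a software path so the algorithm can still evolve.
# All names are invented for illustration.

ACCELERATED_OPS = {}

def accelerated(name):
    """Register a stand-in for a hardware-accelerated implementation."""
    def register(fn):
        ACCELERATED_OPS[name] = fn
        return fn
    return register

@accelerated("sum_squares")
def sum_squares_hw(xs):
    # Stands in for an accelerated logic block: fast, but frozen at tapeout.
    return sum(x * x for x in xs)

def sum_squares_sw(xs):
    # Software path: slower, but free to adopt a better algorithm later.
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_squares(xs, use_hw=True):
    # Use the accelerator when present and enabled; otherwise fall back.
    hw = ACCELERATED_OPS.get("sum_squares") if use_hw else None
    return hw(xs) if hw else sum_squares_sw(xs)
```

The point of the pattern is that callers see one operation; whether it lands on silicon or software is a dispatch decision, so an algorithmic improvement only has to replace the software path.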
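To put rough numbers on the earlier screens-absorb-light point: clear-sky full sun delivers on the order of 1000 W/m². The area and absorptivity figures below are illustrative assumptions, not measurements of any actual phone, but they show why a dark slab on a dashboard heats up even when powered off:

```python
# Back-of-envelope solar heating of a phone-sized slab.
# All constants are assumed ballpark values, not measurements.
IRRADIANCE = 1000.0        # W/m^2, typical clear-sky full sun
AREA = 0.07 * 0.15         # m^2, roughly a phone's face
DARK_ABSORPTIVITY = 0.95   # a "black" screen absorbs nearly everything
WHITE_ABSORPTIVITY = 0.40  # a white surface reflects much of the visible light

def absorbed_watts(absorptivity, area=AREA, irradiance=IRRADIANCE):
    # Absorbed power = irradiance x area x fraction absorbed.
    return irradiance * area * absorptivity

dark = absorbed_watts(DARK_ABSORPTIVITY)    # ~10 W soaking into the device
white = absorbed_watts(WHITE_ABSORPTIVITY)  # ~4 W with the white side up
```

Several watts of continuous input is comparable to a phone's own peak dissipation, which is consistent with the overheating-while-off anecdote above.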