Apple’s failure to develop its own modem detailed in new report

BuddyTheBeefalo@lemmy.ml to Technology@lemmy.world – 167 points
theverge.com

I though "failure" was an absolute term? They obviously aren't done developing yet, but that doesn't mean they never will.

I bet they have working prototypes and it comes down to something like power draw being too high. They probably want something that's at least on par with what Qualcomm has.

And even when the first "retail" version is done, I find it highly unlikely that they put it in the flagship iPhone first. The modem having a bug or other weird behavior in their most popular product would be detrimental. They'll test it in cellular iPads, or maybe even in MacBooks. If they test it on an iPhone, it'll probably be on the iPhone SE first.

And after all that, they'll put it in their flagship iPhones.

It's a "when", not an "if".


Imagine trying to navigate the patent minefield when developing something like a modem

This is the best summary I could come up with:


According to a detailed report from the Wall Street Journal, Apple’s attempt to develop its own in-house 5G modem has been stymied by issues resulting from the iPhone maker underestimating the complexity and technical challenges of the task, and a lack of global leadership to guide the separate development groups siloed in the US and abroad.

“They hate Qualcomm’s living guts,” says Edward Snyder, a wireless industry expert and managing director of Charter Equity Research, in comments reported by the WSJ.

After settling its dispute with Qualcomm in 2019, Apple quickly acquired Intel’s smartphone modem business, along with a few thousand engineers to help advance its development efforts.

That’s why Apple extended its modem deal with Qualcomm — which would have expired at the end of this year — just days before the iPhone 15 was announced.

And while some have lauded Huawei’s HiSilicon chip design business for beating Apple to the punch with the apparent development of its own 5G modem in China’s Mate 60 Pro, lab tests show that Huawei’s chips consume more power than competitors’ and cause the phone “to heat up,” which is bad for performance.

Apple’s custom modem work continues, and Bloomberg’s Mark Gurman suggests we’ll likely see them gradually roll out before the current Qualcomm deal expires in 2026.


The original article contains 381 words, the summary contains 212 words. Saved 44%. I'm a bot and I'm open source!

I don’t understand this - didn’t Intel have a working cellular modem chip before Apple bought that segment of the business? Sure, it wasn’t good, and Intel probably saw that it was going to be difficult, but with the amount of money Apple invested in this, starting with a working product, how do they not have a working product?

I think you answered your own question. It wasn’t good. Apple isn’t willing to sacrifice battery life, since it has been one of their biggest selling points on the iPhone for years. As for why they haven’t figured it out yet: it is probably pretty difficult. Intel spent tons of money on it and couldn’t succeed. A chip maker gave up. That should tell you how difficult the process is. The 5G modem industry is basically a monopoly; if it were easy to do, a ton of companies would be trying.

How did Huawei do it so well if Apple and Intel struggled?

Drop support for frequency bands and older mobile standards not used in China. Don’t worry about battery life.

Works great for Huawei, no good for Apple.

No, I'm talking about the 5G transmitters and receivers they sold in the USA before we banned them

Loads of companies make the tower equipment, including Huawei, Cisco, Nokia... in those cases, size, cool running and low power draw aren't as important. Apple gives no shits about that part of the industry.

Power draw for cellular tower equipment has become a major concern and product differentiator for mobile operators. I work in telco sector.

AFAIK Apple develops their modem in Germany. The initial effort to get into this market is tremendous. After that it’s incremental improvements, but you must start somewhere.

I can’t find any reviews of the chip itself, just announcements. It is too early to say they succeeded. Also, Intel did make a 5G modem for phones. It just was a couple of years behind Qualcomm’s chip. There is a good chance that Huawei’s 5G chip will be the same. If you look at the phone they are going to sell with the new 5G chip, it has a main processor that performs at the standards of two years ago. The 5G modem could be the same way. Furthermore, Huawei could be infringing patents and be fine as long as the phone isn’t sold outside of China. The main difference is that Apple doesn’t want to put a worse 5G modem in their products and can’t pretend that patents don’t exist.

It's not that surprising. Despite Jobs's lies about "patenting" multi-touch or whatever, they never developed tech. Most of these Silicon Valley companies don't; they staple together tech that's developed in the public sector and take all the credit and profit.

Edit: I forget that people don't generally know about this:

https://i.pinimg.com/736x/82/3e/f0/823ef0be785ee604eccea26ff6583156--mariana-ux.jpg

All of the actual tech is public sector. The form factor is a rectangular mini computer around a touchscreen. That wasn't special either, there were lots of devices that were the same. The thing that made the iPhone "special" was the capacitive touchscreen, which wasn't a design innovation, but a technological innovation. They put it in a shiny box and sold it to you. The other thing they did was the app store, which was a software repo with a shiny coat of paint that charged money (most software repos up to that point and to this day are free).

The other thing they did was take billions in government grants to start Silicon Valley. All the big tech giants are a product of government spending on private companies to sell us public sector innovations.

If you think the iPhone or anything sold to you by a company is special, you've been duped by marketing. It's understandable, because they will spend billions of dollars to figure out the best way to make you want their crap, but you were still duped.

Edit 2: Lots of people saying I'm wrong, but nobody actually explaining how. I think you just don't like being told you were duped.

Innovation is always based on what's already been done. If some tech company takes off on tech someone else invented, the question is why the inventor was not able to monetize it. It's not always as simple as "tech company stole it". Invention and prototyping are very different from making a product that people want.

The difference is that Silicon Valley got billions in government grants to kickstart their industry so they could buy licenses and pay huge teams of engineers and designers whose job it is to make something marketable. Historically speaking if you actually invented something, you got nothing but a wage or a very small payout.

That's it. They don't innovate, they don't develop, they package.

We have an economy that rewards exploitation, not work. That's not the fault of the workers, it's the fault of the ruling class who made it that way.

Historically speaking if you actually invented something, you got nothing but a wage or a very small payout.

That's true for most innovations ever and not exclusive to the US tech industry.

The point is that the tech industry markets itself as this big leader in innovation, but it's not. It markets and packages existing innovations. Capitalism in general is sold to us as "driving innovation", but it's a lie. The fact this is normal in general strengthens my point.

Devil's advocate would say "okay, then just go make your own iPhone if apple isn't actually doing anything" but I don't really want to be defending apple, lol

You can hate on Apple all you want (and I really do) but they made the right device at the right time. Tech might all have been there but the combination and usability of the first iPhone was groundbreaking.

It's not that Apple makes amazing stuff, it's that other companies really put out barely shiny turds.

Look at the Zune; the tech was fine, or so I have heard, but it looked like an ugly brick. Seriously, a regular red brick looks better, even a yellow brick does.

I have a Subaru, and while I love it, the infotainment system is garbage. Clearly there was no effort to make it look good and usable.

UX is hugely undervalued, I wonder if one of the reasons is because you don't notice good UX, it's not in the way, but you noticed bad UX. So good UX without a lot of marketing is invisible.

UX is hugely undervalued, I wonder if one of the reasons is because you don’t notice good UX, it’s not in the way, but you noticed bad UX. So good UX without a lot of marketing is invisible.

I absolutely agree. It's especially underestimated how hard it is to make actually good UX, because what feels intuitive can be highly individual. In addition, the typical techie nerd that does the programming is more interested in technical puzzles than in trying to view the program with the eyes of an end user (which feels pretty schizophrenic at times, since you know how the thing works but need to dissociate from that knowledge).

Strange comment to make about Apple of all manufacturers.

Where's the lie?

Claiming Apple doesn't develop tech is ridiculous, and raising them as an example even more so, because I can't think of a vendor with a higher portion of hardware built in-house. You could make an argument for Sony (camera sensors) and Samsung (screens, also Exynos for their phones), but they're up there.

What tech do they develop? All you've said is they build hardware. That's not developing tech.

Lol I work in software in The Valley. Trust me, we write this shit.

Yeah, nobody's saying you don't write software. This is a history lesson in where the tech comes from.

Even then, a lot of the work done there is stapling together APIs, right? A lot of those APIs are implementations of tech developed, again, in the public sector.

And if you are writing novel stuff, I'd bet good money all the interesting stuff comes from research done in universities, right? Most of the interesting things I've ever programmed were based on public sector research.

And even then, the industry got started with public sector money. Maybe your company got its start from VC funding or whatever, but that's after the whole sector was jump started. Now the big companies in your field don't pay taxes, in fact a lot of them are paid by your taxes.

I mean, if you want to explain where I'm wrong, go for it. Right now all we have is "trust me", which is famously strong evidence.

You're overgeneralizing. Government money is in everything. It needs more effort to prove it's causal for every innovation there is.

How? What? Explain your objections beyond "needs more effort" please. Your objections need more effort.

No. You're the one with the big claims that the whole industry (or in your other reply even the whole capitalist world) doesn't innovate. So you first provide some actual evidence. So far your arguments are just "trust me" themselves.

The default assumption shouldn't be that they innovate; they very clearly only package existing technology. They clearly don't have the know-how to make a functioning modem based on existing specifications, much less develop new tech. Why do you believe they do innovate? Because they told you? I'd suggest the evidence against the null hypothesis just doesn't exist.

The graphic I linked shows the reality, that all the underlying tech is from the public sector.

Also, you didn't even bother to contradict what I said that most of the programming is stapling together existing APIs. That's true, isn't it?

You have no idea how modern technology is produced. Any particular product is usually the result of dozens to thousands of iterations, some funded with public money and many not. Let's take an example from your chart: DRAM. I actually don't know when DARPA "developed" DRAM (since DARPA usually funds private companies to do development for them), but it must have been before 1970 when Intel designed the 1103 chip that got them started. Do you think that pre-1970s design is remotely similar to the DRAM operating on your device today? I'll give you a hint: it's not.

And no, modern device development does not consist of gluing a bunch of APIs together. Apple maintains its own compilers, languages, toolchains, runtimes, hardware, operating systems, debugging tools, and so on. Some of that code had distant origins in open source (e.g. WebKit), but that's vastly different from publicly funded, and those components are usually very different today.

They're failing to produce competitive modems because modern wireless is one of the closest things humans have to straight-up black magic. It's extremely difficult to get right, especially as frequencies go up, SNR goes down, and we try to push things ever faster despite having effectively reached the Shannon limit ages ago.
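For a rough sense of that ceiling: the Shannon-Hartley theorem caps error-free throughput at C = B · log2(1 + SNR), so past a certain point no amount of modem cleverness buys you more bits. Here's a minimal sketch (the 100 MHz bandwidth and 20 dB SNR figures are purely illustrative, not from any real deployment):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: maximum error-free bit rate of a noisy channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative figures only: a 100 MHz channel at 20 dB SNR.
snr_linear = 10 ** (20 / 10)  # 20 dB -> linear power ratio of 100
capacity = shannon_capacity(100e6, snr_linear)
print(f"{capacity / 1e6:.0f} Mbps")  # ~666 Mbps, no matter how clever the modem
```

Once you're near that bound, the only levers left are more bandwidth (higher, harder frequencies), better SNR, or more spatial streams, which is exactly why getting a competitive modem right is so hard.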

So you've vaguely waved your hands in the direction of innovations that you think are different now than in the 1970s but not explained how they're different or where those innovations came from.

You aren't actually pointing to any serious innovations Silicon Valley has done.

Modern device development consists of more than gluing a bunch of APIs together, but it largely does consist of that.

Apple maintains those things not for innovation purposes, but so they can keep a walled garden. If they maintain Objective-C and iOS and macOS on their own terms, then they can keep people locked into their ecosystem, overcharge them for devices, and then overcharge for repairs in order to upsell people to the next model. They are notorious for this shitty behaviour. It's not real innovation.

And when you say wireless is straight up black magic... you mean it's a real technology that was developed by researchers, not capitalists, because real R&D is expensive, so capitalism socialises the costs and privatises the rewards.

I haven't explained what the differences are because almost everything is different. It's like comparing a Model T to a Bugatti. They're simply not built the same way, even if they both use internal combustion engines and gearboxes.

Let me give you an overview of how the research pipeline occurs, though.

First is the fundamental research, which outside of semiconductors is usually funded by public sources. This encompasses things like methods of crack formation in glasses, better solid state models, improved error correction algorithms, and so on.

The next layer up is applied research, where the fundamental research is applied to improve or optimize existing solutions / create new partial solutions to unsolved problems. Funding here is a mix of private and public depending on the specific area. Semiconductor companies do lots of their own original research here as well, as you can see from the Micron and TSMC memory research pages. It's very common for publicly funded researchers here to take that research and use it to go start a private company, usually with funding from their institution. This is where many important semiconductor companies have their roots, including TSMC via ITRI.

These companies in turn invest in product / highly applied research aimed at productizing the research for the mass market. Sometimes this is easy, sometimes it's extremely difficult. Most of the challenges of EUV lithography occurred here, because going from low-yield academic research to high-yield commercial feasibility was extremely difficult. Direct investment here is almost always private, though there can be significant public investments through companies. If this is published (it often isn't), it's commonly done as patents. Every company you've heard of has thousands of these patents, and some of the larger ones have tens or hundreds of thousands. All of that is the result of internal research.

Lastly, they'll take all of that, build standards (e.g. DDR5, H.265, 5G), and develop commercial implementations that actually do those things. That's what OEMs buy (or try to develop on their own, in the case of Apple modems) to integrate into their products.

So you've admitted that all primary research is done in the public sector, because of course it is. Exploratory primary research is not guaranteed to be profitable. That leaves only what is relatively guaranteed to be profitable for private industry. And what is guaranteed to be profitable is... not exactly very innovative. By definition it has to be fairly obvious.

So the other things you've mentioned are all implementations; optimisations towards local minima/maxima. Mass production is a more or less solved problem; figuring out how to deal with specific problems that crop up in individual production pipelines is work, certainly, but it's not innovation. This is boilerplate stuff. Every factory since the dawn of factories has done it. And even to get there, you had to dig deep into relatively unknown Silicon Valley companies that most people have never heard of. They are not Apple, not even close.

And the semiconductor industry is somewhat unique in that they're in the middle of a half-century-long slide down towards a distant local minimum, specifically the size of their transistors. That means they can continue to make iterative changes that they know will give them payoffs. We know they know these will pay off because there are multiple companies in competition developing this technology in parallel and they are more or less keeping pace with one another. If they were coming up with true innovations, that wouldn't happen.

Number of patents tells you nothing about what is actually being developed. I will admit I was perhaps too zealous in saying they "don't develop", because obviously the things you mentioned are a form of development, it's just that they are predictable advancements, which means anybody with the funding to hire engineers could pull it off. They are advancements in the same way a train advances along a track. They are low-hanging fruit. These companies didn't till the soil, plant the trees or nurture them. They just came along at harvest season, plucked the apples off the branches and called themselves farmers.

If you call that sort of thing "innovation", I would say you have a very low bar for that term.

Also, it seems to me that you, like the other person, threw out a lot of technical sounding details all at once that certainly seemed to drive towards a point. Perhaps you hoped I couldn't discern the difference between what you were saying and real innovation - you did say I had "no idea" - or perhaps you actually think they're the same thing. In any case, I do understand enough about this process to see through this, but someone less technically informed wouldn't. I'm not terribly impressed with someone attempting to bury me in the weeds like that.

If you had a real slam-dunk example of innovation that you could point to, I imagine you would've done it. You can't though, because for example with DRAM, future iterations of the same technology aren't fundamentally any different and operate on the same basic principles developed in the public sector, and I think you know that.