Would you be willing to fund a Linux port to Apple Silicon? (twitter.com/marcan42)
403 points by huhhuh on Nov 29, 2020 | 528 comments



No, I wouldn't, not even if I could print billions in a snap.

I'd rather help fund the port to a 10 times slower but open platform than to a technically superior but proprietary, closed one, whose owner would make it incompatible two seconds after they smelled competition from Linux in any of their core business fields. No thanks, I'll send my small quid to whoever appears to be genuinely in favor of openness, no strings attached, which is not the case with Apple, nor with Google or Microsoft.

https://www.washingtonpost.com/technology/2019/09/05/how-app...


No. No. no. no. no. no. no. no.

A million thousand times no.

As someone else said, every spare dollar I have is going to fund a company that values open source hardware, open source software, and the right to repair and own.

A company that purposefully makes manufactured obsolescence a business model, intrudes on privacy, pays lip service to OSS, bullies right to repair in the courts, and evades taxes doesn't deserve any free work from the skilled people in the community. All of this just so that they can make more billions while people around the world suffer, giving their restrictive supply chain their blood, lives and tears.

You know why open source hardware is expensive for lower quality? Because Apple pays the premium on the premium quality stuff. But in that process they also drive up the prices.

Hardware manufacturers want the Apple cash, so they wait for the Apple contract, and when they get it, any other contracts you might have will get delayed and sidelined with lower quality stuff. Essentially, a monopoly in the supply chain.


Wow, that's some conspiracy theory. Only a teeny tiny fraction of hardware companies have any relationship with Apple whatsoever. For those that do, yes, their Apple work is lucrative, but for most of them, like Sony or Samsung, it's a minuscule fraction of their overall revenue, and even then it's only for very specific components. Large swathes of their output have nothing to do with Apple. The supply chains of most PC manufacturers never go anywhere near Apple or any of Apple's suppliers at all, yet somehow Apple is responsible for their shoddy quality and poor compatibility? Amazing.

Also the cognitive dissonance is deafening. The company with industry leading device lifetimes and premium long lasting second hand market value is, er, guilty of built-in obsolescence. How does that work exactly?

I mean, I've got to think you must know both those points are true. The fact that Apple has nothing to do with the vast majority of the PC supply chain, and the fact that Apple devices have industry-leading lifetimes and service and support, is just overwhelmingly supported by the evidence. I just don't see how you can not know those things. Yet here you are making this argument. I just find it really puzzling. Are you really not aware of these things, or do you deny that they are true? Are these things we actually need to dig into and discuss?

I'll concede on the repairability point, yes that's a valid argument. I would contend that it's mostly just a tradeoff between compactness and simplicity of design, versus repairability and it's up to customers to choose which they value most, but Apple has sometimes gone too far in unnecessarily obstructing repairability.


>I mean, I've got to think you must know both those points are true.

Hardware-related comments on HN have always been fairly low quality, to say the least, compared to many other subjects. But Apple's M1 sort of brings out the worst in them. By my current count, the 10K+ comments suggest most have a very limited understanding of hardware tech, little of the hardware business, and quite literally zero of supply chains.

So I would not be surprised if many don't know.

I could count on one hand the people who have to constantly comment and correct all the wrong assumptions: TSMC capacity limitations, NAND and DRAM price fixing, etc. etc. And now we have "Apple's supply chain monopoly".

The world currently ships approximately 1.2B smartphones every year, and Apple represents less than 20% of them. If you include tablets, PCs and laptops (which share the same supply chain), Apple is much closer to 10%. The premium Apple pays is not for leading-edge parts (apart from TSMC's 5nm) but for very strict quality control and capacity guarantees.

My observation on this issue is that most software developers and tech workers are so far removed from the old (non-tech) model of business that they have very little understanding of logistics, supply chains, lead times, sales channels, distribution and discovery (marketing). And ironically enough, the world's best company at all of the above is also a tech company called Apple.


FWIW, it is because we read articles like this one, and because we are constantly told that Apple has exclusive deals for good touch screens from Samsung (who happens to also make the one other good line of competing phones, and which you explicitly stated was a place where people are wrong).

https://thenextweb.com/apple/2011/08/21/apple-locks-its-riva...

> How Apple locks its rivals out of the tablet market with exclusive sourcing deals

Care to explain how we are wrong instead of just claiming we are dumb? I promise that if you educate me (somewhat famous in the "hating on Apple" space) I will help educate others. (Though, FWIW, I do know that a lot of Apple's hardware is semi-custom design; so if that is the argument, I already know, but find it irrelevant for this comment thread. I am focusing on the belief that Apple makes it hard for others to source parts, which has a lot of history to it.)


I guess I will write a long blog post when I have time. But that article is from 2011. That perspective was correct at the time but is no longer relevant now.

For example, when the first iPad using NAND launched, Apple secured nearly 60% of the world's total NAND production capacity. That was not a monopoly or an exclusive deal; it was out of necessity. Securing a stable supply of critical components is the norm in any industry. It is up to the manufacturer to decide whether they should expand their capacity to meet demand (from others).

To use NAND as an example again: since the cost of building a new fab is high and the amortisation time far too long, most would delay any capex spending until absolutely necessary. With high demand and fixed supply, NAND and DRAM prices were sky high, and Samsung managed to be the most profitable company for three quarters in 2017, only to be overtaken by Apple. The profits then hasten the capacity-increase cycle because the risks are now comparatively lower.

Smaller players who do not have the volume, forecasts and commitment will always be left at the mercy of their manufacturing partners. But partners and vendors will always complain that these players don't give (accurate) forecasts. Capacity planning is a profession in itself. And these practices are common, if not the norm, across all manufacturing industries, from food, metals and raw materials to transport like DHL, where they have data scientists working on flight space (their unit of capacity).


Apple is superficially "premium" nowadays.

It has been releasing faulty keyboards on its MacBooks for years. They knew. They didn't care. They just continued to release new models every year with the same faulty keyboard.

Not to mention staingate (which affected me personally).

Those $2700+ MacBooks with ghosting screens. The stupid chip that makes a weird sound when you stop your music.

The list just goes on and on.

And this is the stuff on the surface. Go open a macbook and analyze every single component quality...


Apple has always been superficially premium

It’s an Intel machine

They’ve released versions with bad graphics cards, bad batteries.

It takes time to re-design and ship. They replaced numerous “bad keyboard” models for people.

Meanwhile, every plastic Lenovo or Dell needs a new hinge or battery cover every year, and screws fall out.

I'd rather take a physically solid machine with hard-to-source issues like ghosted screens and occasional sound hiccups, since I'm only using it for text and browsing, than a laptop that falls apart after 2 years.

My recent 15” MBP has outlived Dells of the same age at my employer.

80% of the way there is better than 60%


What Lenovo/Dell models were those? And how are you exactly calculating these percentages?

It looks like you speak from your own experience, and so do I.

Enterprise grade laptops from either Lenovo or Dell are known to be very sturdy and well made. Not sure if the materials will be as nice for your fingers as the MacBook materials, but they for sure won't fall apart.

I have had thinkpads that would have survived falling from a third floor. I just don't see how a macbook is more reliable than a thinkpad or a precision.

Try throwing a macbook out your window.


Came here to defend my plastic Lenovo. Two years and counting, looks and feels like new. The keyboard still makes me want to just keep typing.


[flagged]


Oh right, you know what these hundreds of millions people actually need better than they do themselves and they are all idiots, so none of the evidence counts. Discussion over. I'm done.


> You know why open source hardware is expensive for lower quality? Because Apple pays the premium on the premium quality stuff. But in that process they also drive up the prices.

"pays the premium on the premium quality stuff" reads very weird to me. Isn't that how generally the world is supposed to work? You pay a premium for premium stuff? If not Apple, somebody is anyways going to do it right...


I agree with the majority of your post, but this part here is flat out wrong:

> You know why open source hardware is expensive for lower quality? Because Apple pays the premium on the premium quality stuff. But in that process they also drive up the prices.

Hardware is hard. Some Atmel or whatever low-brain microcontroller can be had on a board with hand-painted, self-etched circuits or shoddy breadboard stuff; at the frequencies these things operate at, signal integrity doesn't really matter.

But anything with a brain powerful enough to run Linux? Then you're delving into differential signalling at extremely high frequencies - which means you have to precisely account for trace widths, trace thickness and trace length matching simply to ensure that the signal from your CPU arrives at your RAM chip intact. You'll need an awful lot of awfully precise voltage levels other than 3v3/5v, with equally precise requirements on the timing of bringing them up. You also won't be able to get away with a two layer board; you'll need three or more layers and a lot of vias - and now you can't manufacture your own prototypes any more but need to order from a PCB shop, which means you either pay through the nose for speedy turnaround at low volume or wait weeks for pool orders.

And once you have the PCB you'll need to populate and solder it - where you will run into such nasty things as minimum order quantities and the need for a pick-and-place machine if you don't want to go nuts. And if that's not enough, have fun with NDAs and no support for hobbyists by big SoC vendors due to small volumes (I have extensively written about that topic here: https://news.ycombinator.com/item?id=25208056).

The best solution if you need smarts is to look into adopting a Raspberry Pi 4 Compute Module if you can get away with its performance, or look into COM Express if you need serious, up to Intel Xeon scale, levels of performance - but be warned, designing for that ain't exactly easy either.


You seem to be forgetting about the powerful EDA software that will help with the design and its verification.


Which isn't free (or easy to use!) either.

And I forgot about another thing too: the supply of people. While everyone and their dog can attend some six-week coding bootcamp or teach themselves using free and open resources and can then start contributing to open source code, dealing with hardware requires a good education in fundamental physics, good amounts of money for components and tooling (a good soldering iron, air filters, a fine-controlled hot air gun, a decent oven, the equipment and chemicals for etching, a precise micro-drill, ...). Many people don't have that amount of upfront money and those skills, and those that do often don't have the free time to help advance open source because, as I wrote, it takes lots of time and money to deal with smart hardware.


> everyone and their dog can attend some six-week coding bootcamp or teach themselves using free and open resources and can then start contributing to open source code

Perhaps this was intended to be hyperbolic, but no, you can't become a kernel developer, say, in six weeks. You can learn to dabble in web development in that time, but you'll barely have taken the first step toward being a serious software engineer.


They said that it was possible to contribute to open source, which is surely true. Not that they could be a kernel developer.


Granted, but six weeks is easily long enough to learn how to assemble a computer, or to do basic soldering. If we're going to compare hardware skills and software skills, let's do so in a reasonable way. If we're talking about building a new system from the ground up, I think the comparison to kernel development is about right.

Much skill is needed to make a serious contribution to an open hardware project, but the same is true for many types of software.


I don't think you understand very well how hardware design works. Most hardware designers write code just like software developers. Most of them have never used a hot air gun, and most of them can't explain in much detail how a transistor works at the physical level (they learned this once in their education, but have since forgotten, simply because they don't need to know this in their daytime jobs). Yes, once you need to fab your design (and iterate on it) you need other disciplines, but the work here is more or less the same for any design.


PCB design doesn't involve much coding. It's typically done in a semi-manual way using a CAD-like interface. Here's an example for a simple board:

https://www.youtube.com/watch?v=2b1UdOmxVrw

There is, as yet, no magical piece of software that automatically produces good layouts - not even for simple designs like this where signal integrity is of little concern.


> Most hardware designers write code just like software developers.

You're talking about people doing FPGA work which is a totally different skillset than making your own PCBs - and even more lacking in people contributing to open-source projects because it is (at least in my opinion) even harder to get a grasp on, and capable FPGA development boards cost a boatload of money.


I'm pretty sure he meant people who design hardware using Hardware Description Languages which aren't only used to configure FPGAs... you can build ASICs using them too.

https://en.wikipedia.org/wiki/Hardware_description_language


I completely agree with your overall point, but on a pedantic note, you don't need to etch your own PCBs these days.


> A company that purposefully makes manufactured obsolescence a business model

Apple's mobile devices, laptops, and desktops have some of the best longevity and the best resale values in the market.


> As someone else said, every spare dollar I have is going to fund a company that values open source hardware, open source software, and the right to repair and own.

What is this whimsical company, and how exactly does it make money?

> You know why open source hardware is expensive for lower quality? Because Apple pays the premium on the premium quality stuff. But in that process they also drive up the prices.

Apple is not even in contention over the same stuff. Nothing open source can fund the development process of a 5nm IC, as the tooling alone would be tens of millions if not more. The total cost of an IC would be in the hundreds of millions (making something of the M1's class is billions).

Open source is lower quality unless there is a commercial sponsor. This is almost universally true for software and hardware, because people naturally have fewer resources to invest in something with little to no financial return.


> What is this whimsical company, and how exactly does it make money?

https://puri.sm/products


What is the point of having the right to repair a phone that, 3 years after its crowdfunding campaign, now has 5 year old hardware, still can't reliably make calls, and has a sub 2 hour battery life?


It can reliably make calls [0] and has a decent battery life [1] (even though it cannot sleep yet!). If you have more concerns, you can check the community FAQ [2].

Concerning the old hardware: it's the most powerful phone that can run the latest Linux kernel and the only one recommended by the FSF [3]. Specs are not the whole story in smartphones [4].

[0] https://source.puri.sm/Librem5/community-wiki/-/wikis/Cellul...

[1] https://puri.sm/posts/librem-5-4500mah-battery-upgrade/

[2] https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque...

[3] https://www.fsf.org/givingguide/v11

[4] https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque...


"Can make calls sometimes" and "reliable" are, I suppose, a matter of experience and opinion.

https://www.techrepublic.com/article/librem-5-review-the-lin...


This is not a review of the mass production batch (Evergreen). Please stop spreading FUD. Evergreen started shipping in November. Some people received theirs recently:

https://forums.puri.sm/t/received-my-librem-5-evergreen/1087...

https://www.reddit.com/r/Purism/comments/jzj6s1/i_have_recei...


Ok, it looks like you're quite right, I'm out of date. My apologies.


What's the point?

You own it. You control it. YOU have the power. That's the damn point.


You have all the power but none of the utility.


You are spreading FUD. The phone can do almost everything, including 3D games.


Fine, so you can port tuxracer to it.

If you want to play an actual game people care about, and you think there are game companies building those for PureOS, you're delusional.


Depending what you call "actual game", you can play some of them: https://www.youtube.com/watch?v=S_HXQJkWjUQ.

Concerning "game companies", this is a typical problem of new hardware and is quite expected (in the beginning). The difference however is that you do not have to rewrite anything, just recompile for Debian ARM and make it fit the screen. Potentially thousands of games should run well after quick adjustments.


Is their hardware really open source? Are they making money? Is a $799 phone with hardware that dates 5 years back considered a "viable consumer product", or is this a niche device that just shows how open source fails to produce viable consumer products?


See my reply above and check the FAQ: https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque....

> Is their hardware really open source?

The phone is going to get Respects Your Freedom certification by the FSF.


Can I download a zip file which contains all of the phone’s engineering information, which I can then submit to Foxconn and manufacture millions?



So the answer is a resounding “no”:

>> Purism published the KiCAD schematics files for the DevKit, which was entirely designed with free software tools. However, Purism has not released the CAD files or the Gerber files for the Librem 5, in order to prevent the creation of clones. CEO Todd Weaver says that Purism needs to recover its development costs before releasing the Gerber files, which they are “thinking about releasing in a time capsule” of “3 years, 5 years, something like that.”

Not to mention the plastic CAD files, tooling, etc.


Your "resounding no" is the very definition of FUD. In fact, it is more "yes" than "no". Name me any other company which publishes x-rays, schematics and promises to publish KiCAD files for their hardware.


Not sure where the FUD is (that's the third time in this thread that you've thrown the "FUD" accusation, just saying). Purism does not publish manufacturing files, with a specific claim that they want to prevent their product from being copied. They say so themselves, in the link you provided.

There's very little you can do with electrical schematics alone. It's probably just enough for someone to claim a product is "open sourced" while keeping it proprietary for all intents and purposes.


> (that's the third time in this thread that you've thrown the "FUD" accusation, just saying)

Yes, you are right. The reason is the wrong claims written in a convincing tone and (intentionally?) ignoring facts.

For example, here you ignore that this company is providing much more than any other company typically provides (schematics). Purism also promises manufacturing files and has a history of fulfilling its promises (even though sometimes with delays). I have no idea how you can say "resounding no" if you take the alternatives into account.


> What is this whimsical company, and how exactly does it make money?

I mean, Olimex isn't exactly poor. https://www.olimex.com/


Olimex makes low end development tools, not consumer products; and it’s not a large company by any means.


Lol they highlighted running Linux in the first announcement of the ISA switch, they see no threat in Linux at all. Their business model would need to change dramatically for that to happen. They’re a hardware company that makes the hardware more attractive with software, service and device integration. Their software restrictions are part of the value proposition, but anyone fully ejecting from that would be on their own. Linux becoming attractive on their hardware would almost certainly be unsupported, but I seriously doubt it would be undermined.


> Linux becoming attractive on their hardware would almost certainly be unsupported, but I seriously doubt it would be undermined.

This. Apple doesn't even care about Hackintoshes, which are (theoretically) orders of magnitude more threatening to Apple's business model since they (theoretically) cannibalize Mac sales.


Reminds me of that Bill Gates quote from 1998:

> Although about 3 million computers get sold every year in China, people don't pay for the software. Someday they will, though," Gates told an audience at the University of Washington. "And as long as they're going to steal it, we want them to steal ours. They'll get sort of addicted, and then we'll somehow figure out how to collect sometime in the next decade.

I am sure Apple loves that Hackintosh users are in the Apple ecosystem (often developing software for Apple machines), and many will eventually start buying real Apple hardware.

Apple will turn a blind eye to Hackintosh as long as the process stays too difficult to ever cannibalize sales.


Did they ever start getting license revenue from China? I've had a few recent interactions with Chinese businesses that suggest piracy is still rampant and shameless.


http://www.chinagoabroad.com/en/article/microsoft-s-cloud-co...

"A survey conducted by Forrester Research earlier this month showed that Microsoft has already become the country’s second-biggest cloud services provider, with its Office 365 and Azure platforms."

Looks like the subscription model did the trick.


They're probably still better off than if they clamped down on piracy and China decided to develop a competitor to Windows and switched to that


China did; it's (or was) called Red Flag Linux. It's shut down now. I believe this product was mainly a tactic to get the Chinese gov't a better position in negotiating deals with MS (probably regarding both cost and access to backdoors).


> Apple doesn't even care about Hackintoshes

That's because their numbers are so low that they don't represent a threat, but they still contribute to keeping users hooked on the Apple ecosystem, which is vital for them; out of 10 people with a Hackintosh there may be no real Mac users, but many of them are likely to own an iPhone and use other Apple services; Apple needs to keep those customers, so they close both eyes on Hackintoshes. Also, since they offer no warranty for non-original Apple hardware, it wouldn't much impact their corporate user base, which is often forced to use the real thing.

One side effect of making their own line of CPUs, and a system that, as of now, runs only on them, is that it could also be seen as a way to kill the Hackintosh "competition" - or that may just be a nice side effect for them. Time will tell.


The main reason to run a hackintosh, as far as I know, is to write software for macOS or iOS. Apple won't encourage corporations to build hackintoshes, but as long as it isn't too easy to install macOS anywhere, I don't see how - on net - it hurts Apple.


> The main reason to run a hackintosh, as far as I know, is to write software for macOS or iOS.

I suspect a lot of Hackintoshes users do so because they like using macOS day-to-day, but don't see anything appealing in Apple's hardware lineup.

I'm one of them.


Won't Apple Silicon eradicate hackintoshes eventually? I mean, you won't be able to just buy Apple's components and assemble a computer. If you could, it would really just be a Mac, wouldn't it? I imagine they'll support x86_64 based systems for a few more years, but there won't be new drivers or support for new CPUs, etc.

I wonder if it occurred to someone at Apple that this is somewhat of an added benefit to them.


I'm on my second hackintosh now, and while setup is easier these days, I still need to be careful with what I buy/update/upgrade. Right now, I'm getting way more bang for my buck (faster CPU, faster GPU, normally priced RAM+storage).

With Apple Silicon, I hope they release a decent 'Mac Pro' or something that is good enough for me. Prices won't go down obviously, so here's hoping we can at least install third-party RAM + storage.


I don't agree with you but I am glad you exist.


What makes MacOS good?


When they moved to HiDPI screens, everything worked. In contrast, when I moved my Windows 10 box to a 4K monitor, everything was unbelievably tiny until I set the system scaling to 300%. Which then broke some apps (far too large text, etc.) that I had to go through the "run in this scaling compatibility mode" debacle. Except some other apps still had small text and icons (looking at you, Chitubox!) which required the "run in this other scaling compatibility mode" finagling. Imagine having a "mature" OS where people are still dealing with this after years.

(and this is just one of the pain points I get from Windows 10 every day that I don't get from macOS.)


So you have more pain points with Windows 10 than with the Apple corporation?

My biggest pain is how Apple has treated everyone for the last 30 years. It's unbelievably frustrating.


> So you have more pain points with Windows 10 than with the Apple corporation?

Yes. Especially if you add in how the Microsoft corporation has treated people over the last 30 years too.


Generally good design and user-centered design (but that's being eroded with every new release) along with attention to detail.

Small examples: you can scroll a window without having to click on it to make it active, applications don't refer to files using paths but file ids instead so you can generally move files around a disk while they are open without things breaking.


> you can scroll a window without having to click on it to make it active

This works on Windows, and most sane Linux DEs either default to that or can be configured to do it.

> applications don't refer to files using paths but file ids instead so you can generally move files around a disk while they are open without things breaking.

Do you mean they hold an open fd for the file, which is also not special at all? Otherwise it is pretty interesting, please elaborate.


I only really use macOS these days, so they've clearly improved or borrowed features since I last looked, but I guess the point is that those features were implemented long ago.

The ability to implement file access that way is obviously possible on any OS, but on the Mac it's consistently implemented that way in applications, except for a couple of bad actors. On other OSes it's much more mixed. The same can be said for many other small features. There's just a lot better consistency overall on macOS.


> Do you mean they hold an open fd for the file, which is also not special at all? Otherwise it is pretty interesting, please elaborate.

Exactly. Special or not, it's still a nice feature, and not something Windows has to my knowledge, at least not commonly (i.e. you get "File is in use" errors). On macOS you can e.g. move a file to the trash while it's still being used by some running program.


It's somewhat inconsistent though. But I don't know if that is the fault of the application or of Windows.


The scrolling behaviour also exists in Windows and in Linux.

I believe (if I understand what you mean correctly) that the file system behaviour is also a Linux thing. Deleting a file that is open is completely fine in Linux.

I think UX between the three OSs is almost completely a matter of habit and familiarity these days. I for one can't see how anyone can be productive with the (what I believe is abysmal) window management on a Mac but millions of people like it so I'd probably get used to it if I wanted to.


file system behaviour is also a Linux thing

It's required by POSIX, so all modern UNIX-like systems have this behavior:

When the file's link count becomes 0 and no process has the file open, the space occupied by the file shall be freed and the file shall no longer be accessible. If one or more processes have the file open when the last link is removed, the link shall be removed before unlink() returns, but the removal of the file contents shall be postponed until all references to the file are closed.

https://pubs.opengroup.org/onlinepubs/9699919799/functions/u...
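
Since this subthread is about macOS, here's a minimal Swift sketch of the behaviour quoted above (the file name is a placeholder and error handling is elided):

    import Foundation

    // Create a scratch file and open it.
    let path = NSTemporaryDirectory() + "unlink-demo.txt"
    FileManager.default.createFile(atPath: path, contents: Data("still here\n".utf8))
    let fd = open(path, O_RDONLY)

    // Remove the last link: the directory entry disappears immediately...
    unlink(path)
    print(FileManager.default.fileExists(atPath: path)) // false

    // ...but the open descriptor still reads the contents. The space is
    // only reclaimed once the last reference is closed.
    var buf = [UInt8](repeating: 0, count: 64)
    let n = read(fd, &buf, buf.count)
    print(String(decoding: buf[0..<n], as: UTF8.self)) // "still here"
    close(fd)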


> the file system behaviour is also a Linux thing

I think the interesting part is being missed here: the behavior is that the file is identified to apps by a unique identifier which is not the path, so referring apps still find it after it's been moved, even when they didn't already have it open. More like using the inode as the identifier, maybe? This is a behavior that goes all the way back through the pre-OSX Macs. It was part of some sort of philosophy for Macs, though I forget its name.
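
If you want to poke at the modern incarnation, it's exposed in Foundation as bookmarks (descended from the classic Alias Manager, which may be the name I was forgetting). A minimal Swift sketch - file names are placeholders and try! is just for brevity:

    import Foundation

    // A bookmark identifies the file itself rather than its path,
    // so it keeps resolving after the file is moved.
    let dir = NSTemporaryDirectory()
    let original = URL(fileURLWithPath: dir + "tracked.txt")
    try! Data("hello\n".utf8).write(to: original)
    let bookmark = try! original.bookmarkData()

    // Move the file elsewhere on the same volume.
    let moved = URL(fileURLWithPath: dir + "renamed.txt")
    try! FileManager.default.moveItem(at: original, to: moved)

    // Resolving the bookmark still finds the file at its new location.
    var stale = false
    let resolved = try! URL(resolvingBookmarkData: bookmark,
                            bookmarkDataIsStale: &stale)
    print(resolved.path) // .../renamed.txt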


Reason I bought a Mac way back when is that I wasn't going to go back to Windows if I could avoid it, and I needed to run Photoshop.


This sounds a tad irrational, but I also hate windows once every 4 years.


That sounds a tad presumptive.


- Normally it's attached to very good-looking and sturdy machines

- You can have a *nix-like terminal in half of the screen and Office, Photoshop, and most of the shiny SW ("except games") running flawlessly in the other half, without any kind of VM

- The battery of the very nice machine it comes attached to may last a day

- Customization is a bit more restricted than on its *nix cousins, so most of the time it runs flawlessly even if you don't know what you're doing

- It connects very well with the other device that a lot of people carry in their pockets

- There are not a lot of combinations of HW + macOS that you can run, so online support tends to be very good


Can't really take this seriously when your first line is false.


Would you care to elaborate?

Try as I might, I have never found any laptop that is even half as sturdily built as a MacBook. Yes, Apple plays dirty tricks that prevent inexpensive repair, but TBH, no one else produces a product that physically lasts long enough to be worth repairing by the time the board-level components give out.


Butterfly keyboards. Lack of ports. That's just from memory.

I'm sure Apple denies any faulty hardware as they did with their keyboards.


Oh gotcha, I certainly agree that the keyboards were a problem, and it was not cool how Apple pretended it was a non-issue for so long. It definitely prevented me from purchasing one during the time they were using those keyboards.

That said, I am using one of those Macbook Pros for work, and the overall build is still very impressive. I would expect this machine to last longer physically than any of the non-mac laptops I have ever used.


UNIX with good taste and a decent UI.


That makes up for Apple being Apple?


Does it make up for running on well designed hardware? Well, yes it does. That's also irrelevant to the question I was answering (well, I know you were trolling, but anyway).


Don’t shift the goalposts. You can ask what makes [product] good, or what excuses [company]’s actions, but it’s not the same question.


As an example: I find Amazon to be one of the most blatantly evil large consumer facing tech companies; I hate giving them money and avoid it as much as I can often at my own expense; their consumer facing products are wildly good for most common usage scenarios and I understand why they’re successful even though I wish they weren’t.


Unix based OS where sleep and hibernate work flawlessly, decent UI, with good commercial and open source software support? Also all the hardware works?

I like Linux, but it's nowhere near as polished as macOS. Wasn't 20 years ago. Isn't now.

Hell, I’m thinking I should just reinstall windows and use WSL2.


That first paragraph isn't accurate.


> That first paragraph isn't accurate[...]

"...in my opinion."


It's a *nix with real Office and Photoshop


I'm not following why these are related. I use Linux for dev, I use windows for office and Photoshop.

None of those require macOS.


That's nice, I hope you enjoy your setup.


Really makes you wonder how many people are just buying because of marketing and psychology tricks.


Or perhaps they just value different things than you? Assuming that the only reason someone might enjoy a product is because of marketing trickery is pretty elitist.


The main reason for me is that Apple hardware is (from my experience) unacceptably unreliable and their warranty service is infuriating.

Add to that a lack of expandability, customizability, (user) repairability and (hardware) compatibility.

Life is just a lot less miserable without having to deal with Apple hardware. PC components seem to be very reliable when you spend just middle-range prices and you can easily buy and swap out components when something does go wrong.

I generally go many months without having to reboot my Hackintosh, and I can't ever remember having a kernel panic after initial setup. Both of those things were not true with Apple hardware.


They were a hardware company, but are moving into services for continued growth, which puts them at odds with an open platform. The business model is changing. https://www.statista.com/statistics/382136/quarterly-segment...

You can see services revenue is up to 14 billion compared to iPhone's 26 billion.


Why would services put them at odds with an open platform? (I’m pretty sure they use that open platform for many of their services)


Making it harder not to use their services permits them to profit thereby (e.g. via the Apple tax).


Why would an open platform running on their hardware impact service usage at all? Let’s be clear what an open platform on ARM Mac is: people (such as a lot of folks here) who already prefer an open platform and want M-series performance, not a significant proportion of the macOS user base developing a preference for Linux.


And that is revenue. I guess margins are the main question.


That's pretty easy to ballpark. Everything Apple does targets a whole-package 25-30% margin. If the services aren't generating that margin, the hardware is making it up. Last I heard the iPhone is on that margin target, as are all or nearly all Macs. So it's pretty reasonable to assume the services are pretty close in line.


Services actually hover in the 60% gross margin, vs 30% for hardware. But they sell so much hardware that the overall profits are still 80% from that.

Source: https://sec.report/Document/0000320193-19-000119/


> whole package margin

> If the services aren’t generating that margin, the hardware is making it up.

I really don't like that way of looking at profits.

Going by that standard, if you sell a thousand dollar device with $260 in margin and no software, everything is great.

But then if you add on $600 in software to increase your margin to $330, that's bad somehow?


Huh? I wasn’t making a value judgment at all, just observing Apple’s historical behavior. That is how they look at profit. And they move all sorts of revenue around to make certain price points more appealing. Mac mini gets cheaper at entry level, macOS is free. To the extent those are lower margin, they make it up in services or other hardware lines.


Huh? I wasn't judging you at all, just insulting Apple's historical behavior, how they look at profit.


> they see no threat in Linux at all

Imagine thinking macOS is threatened by Linux. Guys like this will never understand the appeal of the Apple ecosystem; it is beyond their comprehension. There are some good reasons to run Linux, but shouldn't you at least be capable of seeing why Apple makes the kind of money they do, beyond "well those customers must obviously be stupid and less enlightened than myself, a true warrior of the Free Software tribe"? Why are people like this?


> "well those customers must obviously be stupid and less enlightened than myself, a true warrior of the Free Software tribe"

Sort of.

For most MacBook users, a Chromebook would be a more sensible choice.

And iPhones sell because of fashion.

iOS is locked down, and the Mac App Store is also a candy shop. Because there are stupid people who spend money on shit mobile games, Apple makes quite a lot of money.

And they do a lot of deceitful marketing. Apple software is getting shittier year by year.

Most people buy Apple because of the "brand", get accustomed to it, and don't like the slightest change.


My next machine is going to be a Linux machine. Having said that, Apple builds high quality products - my 2012 MacBook Pro is still very good, and the software is incredibly stable. Yes a lot of stuff is locked down but that also means the interface is predictable and stable, mostly things just work.


+1. The real problem is not software incompatibilities or restrictions, but simply the lack of any vendor support and documentation. For a Linux port to "fail" (in the sense of very low usability), Apple doesn't even have to do anything actively; staying unresponsive and unsupportive is sufficient. Without comprehensive documentation, the development will be a long struggle.

- First, getting good hardware support even on x86_64, a known platform with vendor support, can be problematic.

Even on a standard laptop where Linux works just fine, it's still a struggle for the community to get all the peripherals working after a new laptop has been released. Usually a new computer can take a year or two before everything is sorted out. For example, on my Lenovo tablet, it took a whole year before every driver had stabilized, yet the integrated audio still didn't work (even with two company employees actively supporting it on Linux), and I had to carry a USB sound card on the go, a really frustrating experience. Eventually I found the problem in the source code, fixed it, and submitted it to LKML - happy ending. But it's not the only problem I had, and not everyone is lucky enough to have an easy problem - the more non-standard the platform is, the more problematic it is; just ask Microsoft Surface Book users for confirmation.

- Second, power management is already a somewhat unsolvable problem even on x86_64 - again, a known platform, often with vendor support.

It's increasingly difficult to do power saving on modern laptops with highly integrated SoCs - a single component can prevent the entire system from entering a deeper power saving state. In the old days, CPU and graphics were basically everything, but nowadays there can be an endless number of traps - a SATA driver or an audio driver can be the culprit. And even when developers have gone through the trouble of supporting power saving in a device driver based on the existing documentation, it's often plagued with mysterious bugs that baffle everyone (the Windows driver often includes nonstandard workarounds that are difficult to discover without familiarity with the hardware and systematic testing). Not to mention that many subsystems are so poorly documented that power management cannot be implemented at all. On many machines, it's already safe to say that nobody knows how power management really works, and nobody knows how to fix it.

- Third, reverse engineering GPU drivers is hard; GPU drivers are still a disaster on Nvidia x86_64 and on ARM SoCs.

In the past decade, it's well known that Nvidia did a successful job of destroying Nouveau, the free and open source driver for Nvidia graphics, by simply not providing documentation or tech support. It took years of reverse engineering to get basic 3D rendering, and another few years for reclocking support. And the introduction of signed firmware for hardware initialization and control essentially rendered Nouveau unusable on the last few generations. The problem is not limited to complex hardware like Nvidia graphics; even the graphics on relatively simple ARM SoCs have it - documentation is nonexistent and reverse engineering is slow. The Mali GPU commonly found on ARM single board computers still has problems with performance, stability and hardware support.

Even on an extremely popular platform, with known documentation (after requests from the community) and strong community support - the Raspberry Pi - it still took 3 years or so before the 3D graphics driver, VC4, was developed into a usable state.

---

Porting Linux to Apple Silicon is a challenge that requires developers to accomplish a similar task, but (1) on a brand new platform, (2) with unknown, new and non-standard peripherals, including the GPU, and (3) without adequate documentation. It's easy to make the kernel boot, but making it usable is the sum of all nightmares.

You'll find that none of the peripherals work, and the community reverse engineering efforts will take years to reach basic functionality. Even then, the battery will only last 2 hours and will forever be broken. Graphics acceleration will be nonexistent or just borderline usable. And only Apple knows the technical details; nobody else can fix it.

Unless the circumstances change and indicate otherwise, I think the Apple Silicon port will be a serious waste of community time, talents, and resources. It's better to spend time on a platform where vendor support and documentation exists...

However, this conclusion assumes that a serious competitor to Apple Silicon will eventually emerge and be more supportive of the community, which won't necessarily happen. I'm somewhat afraid that biting the bullet and reverse engineering Apple Silicon could be the only way to get a high-performance Linux desktop on ARM - I hope not.


This is, unfortunately, the conclusion I've come to, as well. Thanks for the write up.


> Lol they highlighted running Linux in the first announcement of the ISA switch, they see no threat in Linux at all

First they ignore you

Then they laugh at you


> they see no threat in Linux at all

Yet.


I mean, hopefully they haven't entirely sold their hardware-company soul, in which case Linux support would only help hardware sales, even if by a little bit. Apple certainly didn't do much to interfere with people porting Linux and BSDs to their PowerPC hardware back in the day (and indeed, the Powerbook G4 sitting on my bed right now is mostly-happily running OpenBSD).

However, considering Apple's behavior around iPhones and iPads, I don't expect those hopes to come to much fruition.


If you use Linux on an Apple laptop, you are less likely to use their software stores, mobile phones, tablets, headphones, etc.

The integratedness of Apple's offerings depends on the control they have over both software and hardware. I think it's the integratedness that sells Apple's products.


But is the number of people who want to use Linux on an Apple laptop going to be relevant to Apple? Given that most consumers would rather use the preloaded OS, and given the state of Linux on the desktop, I don't really think it matters to Apple.


> I don't really think it matters to Apple [given the] number of people who want to use Linux on an Apple laptop

Agreed. But the point I was arguing against was that Apple does not care because they sell hardware either way. I think they care a lot that their hardware is used to run their software, as this is a big reason they sell more hardware.


Despite what people say, I view the possibility that Microsoft might switch to the Linux kernel to run Windows as very much real.

NT is very much legacy code in maintenance mode; switching to something more modern that's also the base for Android makes a lot of sense.


Right, but at least before iOS those software offerings were specifically to make the hardware more attractive. Being able to install Linux or Windows or what have you doesn't really impact that much, since the option to reinstall macOS and return to that "integratedness" always exists, and since there's nobody holding a gun to "normal" users' heads saying "you must install an entirely different operating system" (and indeed, Apple taking a stance of "we don't officially support this, and strongly suggest you stick with macOS, but if you really know what you're doing we won't get in your way" would be perfectly reasonable).

Nowadays, especially on iOS, it seems like the software is the focus. I'd argue that's a mistake that will sooner rather than later bite Apple in the ass. Hopefully the switch to M1 is a sign that Apple realizes this and wants to move back toward making top-notch hardware instead of coasting on App Store rent-seeking and planned obsolescence.


The phrase 'I seriously doubt' should have to be backed up by money.


Okay. I’ll give you a dollar if I’m wrong


> They’re a hardware company that makes the hardware more attractive with software

They also make hardware that needs to become obsolete for you to take advantage of the new software features.

Linux extends the life of any hardware it runs on


As spiffy as Apple mobile devices are, I never found the locked down environment worth it. Nice hardware, too bad about the OS/App model...

This new hardware is awesome!

I couldn't care less overall, though.

Frankly, open computing can be done fast enough to not be a worry. I will stick with all that and see where this new path leads.

And my mind is open. Who knows? Maybe it all ends up in a good place.

I am quite happy to run FOSS on the many devices that will do that reasonably for the foreseeable future.


The Mac M1 platform is about as open as any other laptop platform. Your points are quite valid for the i-class devices, but there is nothing preventing anyone from installing linux on an M1 Mac once driver support arrives.


> once driver support arrives.

How will GPU driver support arrive unless Apple opens up the specs? Either they open up like AMD did and the community writes a driver, or they keep it closed and people try to reverse engineer something that is a poor second best (hello Nouveau).

What do you realistically think will happen?


There's no OSS driver for Apple Silicon (x86 platforms have OSS Linux drivers maintained by the platform companies) and the boot process is proprietary (x86 UEFI is an open spec).


Driver support will never arrive. The reverse engineering effort without specs from Apple would be gigantic. And during that period the FLOSS stack would run much worse than the benchmarks we see and want right now. All that for a moving target that Apple could lock down with a random update at any time.

It's just not worth it.


Do you seriously trust that not to change?

When is driver support coming?

When will audio on Macs work on Linux?

When will the touchbar drivers be made available?

Apple doesn't give a shit about Linux, but they also don't make it easy to use their hardware.

And to answer the question posed, no, I would never give this company money.


FWIW, both times I dipped my toes into Linux on some old PC laptops (once Mint, once Ubuntu) I ran into audio, trackpad, and WiFi driver issues.


Which is exactly why these days I'm mindful to buy hardware from companies (and using components) that don't make it hard to run FLOSS - neither actively, by locking the bootloader or such, nor passively, by withholding the documentation needed to write drivers.

There are options to buy from and support manufacturers who actively help this process. Why would I put my money towards making Apple's hardware more attractive, when their stance is at best "we don't care, but we might change things any second and break this, simply because we don't care"?


I will probably look for a laptop in the foreseeable future, or maybe a new desktop. I have never had a laptop before[1]. Which brands or models do you recommend for Linux? And for a desktop? Like, for the CPU, is AMD good? What about motherboards? Asus, Gigabyte, MSI?

[1] OK, I do. I have a T-42 with an unknown supervisor password. :( I installed OpenBSD on it ages ago, but locked myself out.


A Lenovo ThinkPad or a Dell is fine for Linux. Lenovo runs their QA off of Ubuntu live USBs.

AMD is traditionally better for desktop GPU support but either is fine.

Particular hardware doesn't really matter these days; just check the manufacturer website / reviews. Level1Techs on YouTube covers Linux support well for various mobos.


> Do you seriously trust that not to change?

No, I don't. On the Mac platform, Apple has consistently elected to give users ultimate control. They could have a change of heart some day, but so could Intel, Microsoft, or any other vendor.


The same Apple that continually locks down / harder and harder with each OS release, and makes developers jump through more and more hoops?


As a power user, you can however still jump through all those hoops, even with custom kexts.

I agree that the situation sucks for us, but on the other side - you can give a piece of Apple hardware to a literal child and won't have to deal with either getting it running in the first place (as with anything Linux based) or a persistent rootkit or other malware (which is all too common and easy on Windows).

Being easily open for power users, hard to exploit for malware and hard for incompetent people to mess up, unfortunately, is a Hard Thing.


This is mostly just Apple fixing the bonkers default security settings of a unix desktop. As devs we're used to the idea that, e.g., any app we run should be able to modify any files that belong to us, regardless of whether we asked it to. But that's actually an insanely lax security model.

It's honestly not hard to disable any security settings that are getting in the way. I run whatever software I like on my Mac, no problems.

People have been talking about how Apple were going to lock down Mac OS "real soon now" for at least a decade. There's no real indication that it's ever going to happen.


I don’t understand why people complain instead of taking FOSS they like, packaging it as an app, distributing it on the App Store & publishing the source. Does Apple prohibit that?

I remember the days of very little Apple compatible software on the shelf at retailers, and no games. I was probably one of the first students to use a MacBook with OSX for CSCI 101 because many classes required Windows. That was lack of freedom. The App Store has more software than any other distribution channel at any point in my life.

Edit: I also love freedom from malware, which I understand is probably a temporary situation. But it’s been a good run so far.


> I don’t understand why people complain instead of taking FOSS they like, packaging it as an app, distributing it on the App Store & publishing the source. Does Apple prohibit that?

Apple does not, but the GPL does.


If the source of the app is released by the author, what about GPL prohibits this?

Edit: I ask because I’m curious to learn and because many systems run apps that are GPL without the whole system being GPL. For example, I love using GIMP on Windows.


Section 6 of the GPLv2 states:

> You may not impose any further restrictions on the recipients' exercise of the rights granted herein.

This conflicts with the App Store's Terms of Service, particularly its "Usage Rules". https://www.fsf.org/blogs/licensing/more-about-the-app-store...

This is of course different from merely running GPL software on a proprietary OS. In fact, iOS ships with GPL software on board.


That’s a helpful reference on GPL. I’ll save a link to it.

Not sure this Stack Overflow is current/correct, but it appears other FOSS licenses might be compatible with iOS App Store terms. https://stackoverflow.com/questions/459833/which-open-source...


Note that this applies only if you are using GPL software owned by other people.

If you write entirely your own code, and include only more freely licensed dependencies, you can release your own code under the GPL and still distribute it on the App Store.


Sure, but that's only because if it's your own code, you don't have to follow any rules at all. You can do whatever the heck you want, up to and including making the whole thing closed source.


Apple will remove software from the App Store if they get a whiff of it including GPL code.


It costs time and money to do that. Who is gonna do the porting, and who is gonna pay for it?


I understand it will take effort and don’t have answers for those questions.


This is all a lot cheaper on open systems. Just saying.

Eric Raymond explained why: use value. Open systems can be exploited easily and the people who use them get high use value because of all that.

Incentives to port and build on known open systems are stronger, barriers to access lower.

Result: more FOSS on said system


>owner would make it incompatible two seconds after they smelled competition

We're talking about Apple, right? The company that made an entire software platform called Bootcamp, specifically to enable a competitor's OS to run on their hardware? A company who custom-developed drivers specifically to be compatible with a competitor's OS on their own hardware? You're talking about that Apple? The one who said they would be open to allowing Windows to run on M1 if Microsoft developed a compatible version of Windows?

I just want to make sure we're talking about the same company, because you seem to think we're talking about a company who runs in the face of competition, where the Apple I know confronts it head-on and goes out of their way to be compatible with the competitors.


Yes, we are talking about that Apple which integrated soft fuses that can be triggered remotely, and will brick your device physically.

We are talking about that Apple which sued a recycling vendor in Canada because they recycled devices instead of trashing them.

We are talking about that Apple which bribed judges in the US to deny its endusers the right to repair.

We are talking about that Apple which removed even the audio port of a device to prevent its competition from breaking out of the locked hardware with their "upgrades" like better cameras, or even payment systems that won't be supported by Apple.

We are talking about that Apple which even changed their goddamn shell from the BSD fork because they didn't want to "open source" (Apple thinks this is zip file dumping) their own bash, which was the only reason bash was outdated for decades.

Yes, we are talking about that Apple.

Once anything is a competition, it will be extinguished by Apple. They only care about profits, and nothing else.


>Yes, we are talking about that Apple which integrated soft fuses that can be triggered remotely, and will brick your device physically.

Care to expand on that?

> We are talking about that Apple which sued a recycling vendor in Canada because they recycled devices instead of trashing them.

That's incorrect. They sued the company because they were supposed to take the devices apart and recycle/recover the parts. Not resell them as manufacturer refurbished.

> We are talking about that Apple which bribed judges in the US to deny its endusers the right to repair.

Again, please expand on that.

> We are talking about that Apple which removed even the audio port of a device to prevent its competition to breakout of the locked hardware with their "upgrades" like better cameras, or even payment systems that wont be supported by Apple.

Arguably, for the 2 things you have mentioned, the Lightning port would be much better suited. They took out the headphone jack because wires are a pain in the ass and the state of the art for wireless headsets is good enough for the majority.

> We are talking about that Apple which even changed their goddamn shell from the BSD fork because they didnt want to "open source" (Apple thinks this is zip file dumping) their own bash, which was the only reason bash was outdated for decades.

So, a brief history of the default shells in macOS. For the first 3 releases it was tcsh (BSD - inherited from NeXTStep), then for a while it was the GPLv2 version of bash (the last version to be released under GPLv2 was 3.2), and more recently they moved to zsh (MIT). The critical part to get here is licensing. The GPLv3, like it or not, is a controversial license. Apple has an issue with GPLv3, so they avoid software that uses it. Frankly, when bash transitioned to GPLv3 is when Apple should've made the change, but that is a different discussion.


>That's incorrect. They sued the company because they were supposed to take the devices apart and recycle/recover the parts. Not resell them as manufacturer refurbished.

While they ended up on the market as refurbished, (not necessarily manufacturer refurbished) the devices were sold to refurbishers, who would have treated them as any other device they received for refurbishing.


> Care to expand on that? (brick your device physically)

The lock feature is implemented via iCloud and the "Find My Mac" page, which is able to remotely shut the soft fuses via BIOS/EFI, and in turn physically bricks the hardware. Anybody that has control over your iCloud account (e.g. a malicious third party) can potentially do this. See [1]; this happened to a lot of companies (customers) that had problems with malicious third parties getting access to their network.

> That's incorrect. They sued the company because they were supposed to take the devices apart and recycle/recover the parts. Not resell them as manufacturer refurbished.

How come they could only prove that because more than 18% of the devices sent to be trashed still had active GPS antennas and active remote iCloud software tracking? [2] Also, recycle is defined as re-cycling, not re-abandoning. The general population's definition of recycling seems to differ from yours.

Do you honestly think Apple would've dropped the lawsuit if the devices would've been repaired and given away for free? Also kind of related: The lawsuit that happened in Norway, where Rossmann was called as a witness [4]

> They took out the headphone jack because wires are a pain in the ass and the state of the art for wireless headsets is good enough for the majority.

Literally a couple of months after PayPal and other payment providers were locked out by the removal of the headphone jack in the 8th-gen iPhone, Apple Pay was introduced. You're making it sound like 2 or 3 tiny low-voltage wires are harder to implement than Bluetooth antenna wiring... c'mon, really?

> (history of default shells in macOS)

You're making it sound as if bash was only inside macOS for a short time. bash, for the sake of argument, has been in use since macOS 10.2, which was released in 2002 [3], whereas 10.0 was a public beta and 10.1 was released the same year, in 2001. With tcsh in use for 9 months and bash in use for over 19 years, the argument runs quite the opposite way around from how you seem to perceive it.

=========

[1] https://support.apple.com/en-us/HT208987

[2] https://www.theverge.com/apple/2020/10/4/21499422/apple-sues...

[3] https://opensource.apple.com/release/mac-os-x-102.html

[4] https://repair.eu/de/news/apple-crushes-one-man-repair-shop/

=========

Again, my arguments were to counteract the argument that "Apple is acting in good faith for the people", because I do not think that anything innovative can happen on a crippled platform.

Apple actively shuts down innovation every time it could be seen as an alternative to their own products, which they release only once market adoption has been driven by pioneers. This has happened many times with hardware, and many times with software.

I personally would choose repairability over anything, always, because I've learned my lessons the hard way with financial burdens.


> The lock feature is implemented via iCloud and the "Find My Mac" page, which is able to remotely shut the soft fuses via BIOS/EFI, and in return physically bricks the hardware. Anybody that has control over your iCloud account (e.g. a malicious third party) can potentially do this. See [1], this happened to a lot of companies (customers) that had problems with malicious third-parties getting access to their network.

So, a security feature? Hardly Apple acting maliciously, is it. Regarding iCloud being compromised, this is inarguably down to a lack of security policy on the part of the business or the end user. iCloud now enforces 2FA by default as a result, largely because (and I'm speaking from experience here) most businesses can't be trusted to do this themselves.

> How come that they could only prove that by having more than 18% of devices that were sent to be trashed with active GPS antennas, and active remote iCloud software tracking? [2] Also, recycle is defined as re-cycling, not re-abandoning. The definition of recycling among the population seems to be a different perception than yours.

Had Apple sold the lots to the company, you'd be right, but that's not what happened. Apple paid the company to destroy, recover and recycle. The company acted in bad faith and against the terms of the contract they had with Apple. With regard to your claim that Apple tracked active devices - when a business has any kind of electronic equipment destroyed, they are required, by law, to record certain information about the devices, like serial numbers, etc. Unsurprisingly, some of the devices turned up at Apple stores for repair. The article that you linked to has the following from Apple: “Products sent for recycling are no longer adequate to sell to consumers and if they are rebuilt with counterfeit parts they could cause serious safety issues, including electrical or battery defects...”. Devices that can be refurbished, and meet the requisite standards, are resold.

> Literally a couple months after PayPal and other payment providers were banned by removing the headphone jack in the 8th gen iPhone, Apple Pay was introduced. You're making it sound like 2 or 3 tiny low-voltage wires are harder to implement than a Bluetooth antenna wiring... c'mon, really?

This is nothing more than a conspiracy theory! Do you really believe that they removed the headphone jack to fuck with PayPal? This is utterly ridiculous. I'm not, by the way, suggesting that wires are a pain in the ass from an engineering perspective, rather a user perspective.

> You're making it sound as if bash was only inside macOS for a short time. bash, for the sake of argument, has been in use since macOS 10.2 which was released in 2002 [3], whereas 10.0 was a public beta and 10.1 was released the same year in 2001. The argument of a tcsh being in use for 9 months, and bash being in use for over 19 years is quite the opposite way around than you seem to perceive it.

Your link shows that bash was included with 10.2 (it was included with 10.0/10.1 and NeXTStep too), not the default. tcsh was still the default until OS X 10.3 (bash-2.05a-release) - not 9 months, 2.5 years. bash 4.0 was formally released in 2009, with a change to GPLv3. Apple had updated bash to 3.2 and stuck with that, as it was licensed in a way that was acceptable to them for distribution. I know this to be the case because I actually used these OSes on a daily basis. As I said originally, Apple should have switched to a different shell at that point. The wonderful thing about UNIX shells is that the user can change the default and use what they want, including updating the pre-installed one, so suggesting this as Apple being user-hostile is well wide of the mark.

> Again, my arguments were to counteract the argument that "Apple is acting in good faith for the people", because I do not think that anything innovative can happen on a crippled platform.

That's your opinion, and a perfectly fine and valid one to have, but...

> Apple actively shuts down innovation, every time it could be seen as an alternative to their own product, which they will release only once market adaption has been done by pioneers. This was happening a lot of times with hardware, and a lot of times with software.

...is simply not accurate in the way you are framing it.

> I personally would choose repairability over anything, always, because I've learned my lessons the hard way with financial burdens.

That's fine. Others wouldn't. I'd suggest convenience is top of the list for the majority of people that use Apple products. It's the same argument that goes against the customisation camp.


> bribed judges

...

Say more...?


And they did all of that in response to community solutions that weren't as solid as their first-party support. They went out of their way to support and extend the viability of their users' clear desire for dual booting, back when that competitor was much more formidable than MS is today.


It wasn't altruism, they did it because it would help sell more hardware and bring more people into the Apple ecosystem.

If they see a similar use for Linux support, they'll do the same. If they think it will hurt them in some way, they won't.

But I think it's fair to at least consider the "undermine" possibility considering that on iPhones they've consistently blocked attempts to jailbreak or open that platform in the way an alternate OS does on their Macs. We've already seen incremental moves to make MacOS app distribution more centrally controlled. So I honestly don't know where Apple believes their long-term interests to be on the topic of facilitating, or at least not actively blocking, Linux.


> It wasn't altruism, they did it because it would help sell more hardware and bring more people into the Apple ecosystem.
>
> If they see a similar use for Linux support, they'll do the same. If they think it will hurt them in some way, they won't.

I agree and said... pretty much the same thing. I just don’t think there’s a snowball’s chance in hell Apple will see Linux as a threat.

> But I think it's fair to at least consider the "undermine" possibility considering that on iPhones they've consistently blocked attempts to jailbreak or open that platform in the way an alternate OS does on their Macs. We've already seen incremental moves to make MacOS app distribution more centrally controlled. So I honestly don't know where Apple believes their long-term interests to be on the topic of facilitating, or at least not actively blocking, Linux.

Nope. Not likely, unless the mechanism for getting Linux on the M-series is software exploits of the macOS platform. They don't do anything to undermine booting entire other OSes on their hardware, even iDevices. But they do patch up their own stack.


You can't "nope" your way around Apple putting up a few walls around the MacOS garden just because they've left a door in one of them. If they go no further, then it's fine. But "not likely" is a bit too soon to say until they've gone a while without further limitations. That is, unless you consider something like the Mac equivalent of Project Sandcastle running on outdated hardware to be an acceptable situation, which appears to require a jailbreak of their stack anyway. There's simply nothing stopping Apple from going that route, and some years of incremental changes to MacOS that have inched it closer. Again, I'll believe they won't do that when they stop taking those incremental steps. Until then, "nope" really just means "not yet".


I'm pretty amused by all the "open" and "free" advocates the last few days telling me what I "can't" or "should have to" do.


The company that forced me to buy a mac to develop for them?


How about Windows? Did someone force you to buy a computer to develop for it?


You can pretty much develop software for Windows on other platforms too. Cross compilation is supported in several languages, and when you can't - you can usually compile stuff under Wine, or even ReactOS. So while obviously you need a computer, you don't strictly need Windows.


Yes you do need Windows. Wine is great, but there's no way I'll publicly release a windows application that hasn't been tested on an actual windows install.


Microsoft gives virtual images of their OS for free for development. You can work on any platform and test on VMs.


You can cross-compile from other OSes using MinGW and test via the free Windows VMs that Microsoft distributes. You can also download the free Windows development VMs and develop from them exclusively.


In theory one could do Windows dev on hard mode (on Linux/Mac): get one of the Windows virtual machines that Microsoft provides to test for old IE compatibility (https://developer.microsoft.com/en-us/microsoft-edge/tools/v...), install the freely available compilers (or even the open source Cygwin one), and code away!
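
For the compiler step it's roughly the following (a sketch assuming the MinGW-w64 cross toolchain mentioned above; package names vary by distro):

    # Debian/Ubuntu package for the Windows cross compiler
    sudo apt install mingw-w64
    # cross-compile a C program into a Windows PE executable
    x86_64-w64-mingw32-gcc hello.c -o hello.exe
    # then copy hello.exe into the Microsoft-provided VM (or run it under Wine) to test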


Microsoft distributes free Windows VMs with developer tools installed for the explicit purpose of developing on Windows, there's no need to use their IE images.


I wasn't clear enough. I meant I'm forced to buy a Mac to develop for iPhone.


Bootcamp is dead on M1. In addition they have taken steps towards further locking down which apps can run on OSX. So he's probably talking about that Apple.


Bootcamp isn't available as Microsoft doesn't have a version of Windows compatible with the new platform. The parent comment even says that. And Gatekeeper has been around for 8+ years, always with ways to run whatever you want, but I'm sure the "locking down" is coming any day now.

It's just business: Apple won't intentionally remove features that make their hardware more attractive. It's where they make most of their money.


I didn't mean to imply it was Apple's fault Windows did not run on M1 (in fact it has been shown that Windows ARM64 does run natively on M1 albeit through a patched QEMU due to lack of driver support), but by designing a custom SoC they have abandoned the approach which would afford them the most compatibility (staying with x86). This may have been the right direction for them, but it's not the same approach as before and definitely abandons compatibility in favor of better performance.

Second, while Gatekeeper has been around for 8+ years, the default of contacting an Apple server on every program launch was new (or so I thought) with Big Sur. It's a step in that direction. People could not launch apps when this server was down.


Nah, gatekeeper has been doing the remote check thing for years (just, 2 of them, I think).

It only became noticeable because there was an oversight in the "fail-fast" algorithm: if Apple's server was reachable, it was assumed it would also be fast, and it wasn't during the iCloud outage.


I followed the instructions on how to use Little Snitch to skip Apple's app start stuff.
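
For those without Little Snitch, the blunt variant of that workaround that circulated at the time was a hosts-file entry (note this also disables a legitimate certificate-revocation check, so it's a tradeoff, not a recommendation):

    # /etc/hosts -- null-route the Gatekeeper/OCSP check host
    0.0.0.0 ocsp.apple.com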


Though “BootCamp” is dead, they 100% give you the option to boot unsigned software (unlike the iPhone), and it will be possible to boot another OS. Obviously, though, that OS will need drivers, and they aren't available and Apple isn't making them - hence no “BootCamp”. But it's totally possible, and people will almost certainly start to boot Linux to a certain degree, although it seems unlikely we'll get a proper accelerated GPU driver (I'm sure it will be possible for some basics to work, and someone will do that).


Well, it remains to be seen if something like Boot Camp will return. Federighi's comment made me at least think that if MS allowed (and encouraged) Windows on ARM to run on Macs then Apple might do what it takes to make it happen. Time will tell.

Federighi quote: “We have the core technologies for them to do that, to run their ARM version of Windows, which in turn of course supports x86 user mode applications. But that’s a decision Microsoft has to make, to bring to license that technology for users to run on these Macs. But the Macs are certainly very capable of it.”

To be completely clear, this quote may have only been referring to virtualized operation, but given a change in the WinARM solution licensing model, Apple might support booting as they did on x86 Macs.


> We're talking about Apple, right? The company that made an entire software platform called Bootcamp, specifically to enable a competitor's OS to run on their hardware?

Let's rephrase that: the company that made a generic x86 computer and intentionally created a non-standard way to boot operating systems on their run-of-the-mill x86 hardware, so that other x86-compatible operating systems running on the same hardware would have issues and make OSX look better by comparison.

Yeah, that company. I don't expect them to do jack shit for compatibility.


> The one who said they would be open to allowing Windows to run on M1 if Microsoft developed a compatible version of Windows?

How generous of them. And will they provide microsoft with the specs needed to do so?


Will Apple sell M1s to other laptop manufacturers?

I already pay for closed source Intel and AMD, I wouldn't mind being able to use M1s, but, I just don't like Apple laptop hardware.


Apple did not have a good experience the last time they rolled out licensing of that sort. Watered down the brand especially given some very crappy PowerPC clones.


And the ones that weren’t crappy were better than Apple’s


Maybe those good ones were few and far between. I had the misfortune of supporting some of the clones, and finding myself in front of one was always a much more unpleasant experience than a PowerMac G3.


That’s not how they roll


> I'd rather help funding the port to a 10 times slower but open platform

This is an idealism vs. pragmatism debate. Ideally, we’d have a solid hardware stack for Linux. Pragmatically, we have top-of-the-line metal for which there will or will not be Linux. I know which one I would fight for if I still believed in consumer-grade Linux.


Yeah, I'm in the same boat on this one.

I am just as much in favor of fully open source hardware, software, the whole stack (heck, I'm a huge RISC-V nut), but I won't bury my head in the sand and pretend that I'd rather have a 10x slower, but fully open, machine.

I applaud Purism and other companies that strive to make a fully open source full stack, but I won't always have the financial freedom to purchase products that are either slower or less ergonomic than a closed alternative. It's sad, but it's pragmatism.


> I'll send my small quid to whomever appears to be really in favor of openness

That would be these guys then: https://www.sifive.com/boards/hifive-unmatched


I think you misspelled that link. ;)

https://mntre.com/media/reform_md/2020-05-08-the-much-more-p...

Edit: and here I was, thinking a smiley would clarify things. For the record, I'm of course not arguing against supporting sifive. I just think that in the context of this thread, what MNT are doing is much more directly comparable to porting Linux to M1


    > a 10 times slower but open platform
With a mindset like this, they will never smell any competition from Linux.


[flagged]


A more charitable explanation is that the parent poster is someone who puts their principles and a long-term view above short-term convenience. Seems like good marriage material to me.


Yes, having no willpower and giving up after being forced to is a strength...


Wtf? Should they rather marry you, someone who goes ahead and makes them look like somehow less of a worthy human, just because they would opt to spend their money in a way you disagree with?


> they would smell competition from Linux in any of their core business fields

I don’t think you or Apple or anyone else has to worry about that


Well I wouldn't pay for it if I could print billions in a snap because then I would start my own company that creates open RISC-V laptops and pcs. But in the current market I will because there's simply no alternative.


What do you mean by no alternative?


All the current 13 inch devices have vastly underperforming cooling systems. Well, every device except the Apple MacBook Pro.


If you're not willing to look outside of the exact form factor and specs that Apple happens to support, then open devices aren't a priority for you.

I use a 1 inch thick Thinkpad that's almost a decade old because it runs Libreboot. There is an alternative, but you need to be in the mindset where freedom is more important than hardware.


I do too, and that machine is a beast! Plenty fast.


Exactly you need to compromise significantly on speed and form factor. It's simply not practical for me to do that.


No computers available in the years preceding the present were practical for you? I doubt that.


Not in a 13 inch form factor.


If you desire the latest and greatest no matter how proprietary, that's your prerogative. But to then pretend you want open hardware but have your hands tied is silly.


No, it's practical... you just don't want to do anything more than what's easiest, because that's how humans have evolved.

You could drive across country right now without a cell phone, just buy a Rand McNally road atlas. That's all you need.

You won't... because it would require you to pay attention to signs and exits, instead of having Google Maps do all that for you.


I don't have a smartphone so I do in fact use maps.


I agree with your message in general, but I believe the new M1 Mac mini has the best (and sufficient) cooling among the new lineup.


What is worth what?

Serious question: freedom / open computing, or spiffy hardware?

Pick one. This is where we are right now.


Did you not own a computer prior to this month?


Of course I do. But there is no 13 inch laptop that fullfills my nerds.


> there is no 13 inch laptop that fullfills my nerds.

I choose to believe this is not a typo.


X86 based PCs are no alternative?


Unfortunately not. There are no 13 inch x86 devices on the market that don't throttle afaik.


Dynamic Voltage and Frequency Scaling (DVFS) is a desirable feature. You're not getting a 60W CPU in a 13" form factor, but you can get a 28W CPU that will run at 28W indefinitely while also being able to temporarily boost to 60W when needed and thermally able to.
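
On Linux you can even read those two limits directly (an illustrative sketch via the intel_rapl powercap interface; exact sysfs paths vary by kernel and platform):

    # sustained (PL1) and boost (PL2) package power limits, in microwatts
    cat /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw
    cat /sys/class/powercap/intel-rapl:0/constraint_1_power_limit_uw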


It would be fine if I could get an ultrabook that could sustain non-boost performance. But even that is too much to ask of Intel. I haven't tested recently released AMD CPUs, so I can't claim AMD also falls short.


AMD laptop chips are pretty decent: you can find laptops/ultrabooks for about $700 running on a Ryzen 7 4700U. Based on a vs comparison on CPU Monkey it looks like it gets about 80% of the performance of the M1, though only about 25-50% graphics. However, you have much better ram & storage options, all at a lower price, and the graphics are still 2x to 4x better than Intel's on board graphics.


You've been misled if you think the 13" Macbooks run at full boost clock all the time.


It looks like the M1 does run all out all the time (maybe because it never turbos that much to begin with).


When plugged in, perhaps, but when on battery those slower cores are going to get used more.


No, if you run a heavy computational load on a battery powered Macbook Pro it runs all out on all cores. Lots of reviewers looking at this. Battery life is worse of course under heavy load but still incredible relative to x86 for identical tasks. Faster and less battery usage.


No, because it is already a unix OS. I imagine this is unpopular in HN, but what matters to me is just running on a good solid unix. I don't really care if it is a unix or Linux. What I want is shell scripting, pipes, forks, make, python, awk, grep, find, du, df, etc... Most anything that doesn't come pre-installed can be built from source. This wasn't true in the early days of OS X, but now the only things I find that don't work (that I care about) are tools that depend directly on devfs or other very Linux specific system APIs.

Unix command line driven tools are a big part of my work, and the native OS unix does everything I need it to do. It seems a lot of people became committed to Linux coming from Windows, with no unix system under the hood. I suspect that if Windows had been unix based like Mac OS, Linux would not have become quite so popular. I'm sure avid Linux users will disagree, but for me the unix-style command line interaction is what matters. Go easy, I'm not trying to make an argument for one unix over another. Just arguing that the need for a capable unix is already met by the native OS.
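
To make it concrete, the kind of thing I mean runs unchanged on macOS, Linux, or any other unix (a trivial illustration, nothing more):

    # count the source files under src/ that still contain a TODO
    find src -name '*.c' -print0 | xargs -0 grep -l 'TODO' | wc -l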


Whenever I use the CLI on a Mac, I am frustrated by the absence of all the GNU extensions in the core utilities, and by how ancient the bundled version of bash is.

I mean I still prefer it over windows, but it is not the same experience as linux.


This is really a non-issue and easy to get around. You can easily install the GNU utilities which take precedence in your PATH. I installed them all with Homebrew in a few minutes, including bash5. You can follow this guide[0] as a start.

[0] https://gist.github.com/skyzyx/3438280b18e4f7c490db8a2a2ca0b...
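
The short version of that guide is something like this (a sketch; the gnubin directory is where Homebrew's coreutils formula puts the unprefixed GNU names, and $(brew --prefix) resolves to your Homebrew install prefix):

    brew install coreutils bash
    # put the unprefixed GNU tools ahead of the BSD ones in PATH
    export PATH="$(brew --prefix)/opt/coreutils/libexec/gnubin:$PATH"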


It is an issue if I need to rewrite my scripts specifically for macOS and they are no longer cross-platform.

Yeah, you can alias all utilities called g<something> to <something>, but you can't reasonably expect all users of your scripts to do so as well.

Using macOS is a PITA.


If you're using GNU extensions, then you are not writing cross-platform scripts in the first place.
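
The classic example is in-place sed, which needs different invocations on the two platforms:

    sed -i 's/foo/bar/' file.txt     # GNU sed (Linux)
    sed -i '' 's/foo/bar/' file.txt  # BSD sed (macOS) requires an explicit backup suffix (here empty)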


I’m not well-versed enough to know how well it measures up, but Mac switched to ZSH as default shell about a year ago.


I think zsh might well be more powerful than bash but it's extremely opinionated about how it does stuff (unless you want to go down a configuration rabbit hole) - for me, coming off 20+ years of bash muscle memory, it was unusable and I switched back.


It is the same experience as UNIX though, including the fact that each flavour has its own way of doing stuff.


The last time I used a Mac I remember the absolute headfuck that was all the different ways to make it "more unixy"

"Oh just use Macports!"

"No macports is trash, use Homebrew!"

"Homebrew sucks, use Gentoo/kMacOSX!"

Of course none of them were particularly interoperable at the time and all had extremely arcane invocations and setups. It was very lipstick on a pig


Not much more of a headfuck than the flavours of Linux distros and their package managers and GUIs. "More unixy" is also a bad pun; there's nothing more unixy about it, just a package manager - pick one and use it. I've been using Homebrew without much of a problem for almost a decade.


I haven't used homebrew but I use macports all the time without any issues. shrug


I'm very happy with Homebrew. I find the model way easier than Linux packages.


The value of great looking hardware and a great looking OS is strong.

The value of being able to open the lid on a brand new machine and just run a terminal is significant. It’s compelling.

It sounds so silly but it’s the same silly that makes us buy the same beer and the same coffee and shop in the same stores all the time.

I migrated off macOS for good during the pandemic though. I yearned for rc.local and xinitrc too much. I live a somewhat (ahem) humbler life now with i3, Firefox, python, vim, and Notable: they were all I really needed, but I miss the shiny.


Same. Only reason I even look at OSX is because of the terminal/BASH and how easy that makes programming-related work. For everything else, I think windows is way better.


I actually like the Mac GUI for GUI specific tools, but to each their own. No big deal. I used to use Macs in the early days and always had a separate unix workstation. When OSX gave me both, I was all set.


I used OSX a lot 8-10 years ago, and while it is in fact a Unix, that doesn't mean it is just another *nix distro; there were some massive disadvantages to it for my use cases (people who can stand constantly switching between touchpad and keyboard seemed to fare a bit better in this regard, as the touchpads were excellent):

- modifier keys varied by application (was it alt-shift-left, CMD-shift or ctrl-shift-left I should use to select one word back in this app..?)

- CMD-tab only switching between applications is probably nice if you have plenty of time to ponder your next move. When I only want to go back to whatever application-or-window I came from, and I end up in a completely different place because the last place I was happened to be another window of the same application, it breaks my flow and is infuriating.

So yes. I'd very much want a Linux distro on it.


It's unpopular because it's not good or solid, unless you don't care about openness and privacy. In which case you're not really worth thinking about in the context of Linux distros. There is plenty of closed software that compromises your privacy around.


Agree completely. My OSX environment fully mirrors various Linux deployment environments I run well enough to test any kind of shell scripting or crontabs or anything else I need to test locally before deploying them. Also, Parallels and VMWare both say they'll release M1 compatible virtual machines soon should you need a specific release to test on. I don't see the point of shelling out for Apple hardware if you just want to run a Linux distribution as your main OS. I'm more bothered by the current inability to boot into Windows.


> I suspect if Windows had been unix based like Mac OS, Linux would not have become quite so popular.

I fully agree with you. Microsoft's biggest mistake was not giving the POSIX subsystem the same level of relevance as Win32, instead just carrying it around to fill check boxes.

Had they been serious about it, it would have been more than enough for UNIX like workflows.


It's an old, crufty Unix that had various components replaced with proprietary Apple software. Take a look at how a modern Unix does networking, versus how Apple is forcing the market to use their proprietary network APIs.


No. Because (in no particular order):

- it shouldn't be the community's (or a crowd-funded dev's) responsibility to provide software support for hardware produced by one of the largest companies out there (bonus: with zero hardware specs);

- Apple could make all this futile with a push of a button (SecureBoot can be disabled for now, but what guarantees are there this won't change?);

- other arm64 machines will be available soon enough, most if not all of them with publicly available specs;

- I do not own an M1 machine, nor do I plan on buying one;

From a technical perspective, it's doable. Looks like it has UEFI and can run Windows. But we know nothing about possible silicon errata and required driver changes (or at least I don't).

Anyway, I'm sure others would like to see this happening and would actually pay - hopefully the Twitter poll will reveal whether this is actually worth it.

Disclaimer: I ported things to arm64 for a couple of years as a contractor.


For the record, I disagree with some of your points, but the one point I agree with is really important.

There's no clear guarantee that there will be other performance-competitive arm64 CPUs in laptops anytime soon. I don't think anyone has as much incentive as Apple does. Who else is as incentivized to make a laptop/desktop-class arm64 chip? Maybe ARM themselves ... but without a mainstream OS to run it on (with mainstream software available for it), I don't see it happening in the next 5 years. It's a chicken-and-egg problem that Apple is uniquely suited to address with their vertical control over the Mac ecosystem (hardware/dev tools/OS/competitive software).

Server chips, maybe - but we can already see with Azure that competitive x86 chips from AMD have killed Microsoft's plans to deploy arm64 on their cloud service.

But this:

- Apple could make all this futile with a push of a button (SecureBoot can be disabled for now, but what guarantees are there this won't change?);

This is huge. We could all contribute to getting Linux ported to M1, and then Apple could shut us down with little or no effort. And ... maybe they won't? They probably won't? But who knows? Why build an ecosystem around a hostile hardware vendor?


> Server chips, maybe - but we can already see with Azure that competitive x86 chips from AMD have killed Microsoft's plans to deploy arm64 on their cloud service.

I have heard a theory that ARM Servers have a difficult time because there aren't really many developer machines that run ARM. With Apple changing that, there is a chance that the next round of ARM server chips will have better success.


I think this is a contributing factor, but not the whole story. Another part is that in order to switch to arm64, your entire software stack needs to support that architecture. If you are using Linux and open source software, you'll be fine for most, maybe all, of that stack, especially if you are willing to compile things yourself. But it takes just one component to block the transition.


I see it as being a bit like the move from Python 2 to Python 3, but easier.

Most software that can run on x86_64 can run on ARM after recompilation. Some software does require changes (anything using vector intrinsics, for example). But in general, the biggest barrier is the dependencies.
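
As a sketch of how mechanical the recompile usually is (assuming Debian's cross toolchain; the compiler and package names vary by distro):

    sudo apt install gcc-aarch64-linux-gnu
    aarch64-linux-gnu-gcc -O2 app.c -o app-arm64
    file app-arm64   # should report an ARM aarch64 ELF executable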


I agree. Cross-compiling is just awful in general, unless someone works really hard to put together a really high-quality cross-compilation toolchain which includes compiling, deployment, and remote debugging. That's what you get with the iOS and Android toolchains. But it just isn't there with Linux (or Windows) in general. There's a whole lot of work that's gone into the iOS and Android toolchains. It's easy to overlook that.


That seems silly. Developer machines rarely have AVX-512 or multiple TB of ram too...


Various people have suggested to Intel that they made a mistake by not selling desktop/workstation chips with AVX-512.


They DO sell desktop chips with AVX-512. I'm writing this from one: i9-9900X.


They do now. But for a long time they didn't (I think). Or maybe they always did but were unreasonably expensive.


Amazon has had its own ARM server chips out for some time. They are competitive: https://www.anandtech.com/show/15578/cloud-clash-amazon-grav...


I know - which is why this is all so interesting.

But my point stands - 3 years ago Microsoft was talking about deploying arm64 to Azure, and those plans were cancelled with Zen 2. Given that GCP will be gone by 2023, that leaves half of the cloud titans with arm64, and the other half without. Will AMZN sell Annapurna chips to other services? My guess is they won't.

Someone has to make arm64 chips for other data centers and cloud providers, and Cavium/Marvell and Ampere have failed trying. It's a huge investment for an uncertain payoff. Annapurna had buy-in from the rest of Amazon.


You do know there is a version of Windows for ARM64 already, right? It was largely hampered by the speed of the available CPUs, but it does exist.

What I’m really curious about is how the ISA for the M1 is different from the CPUs Windows for ARM already supports. And as others have mentioned — I think GPU support will be more difficult, but I don’t have any data to support that.

But, it’s not like alternative (and mainstream) OSs don’t exist for ARM64. Maybe with some faster ARM64 CPUs, this could be the push to really make ARM a viable architecture for more than just Macs.


> You do know there is a version of Windows for ARM64 already, right?

You do know that there's no software available for it, right? Without software, arm64 Windows is not a viable operating system for an arm64 laptop. ARM Windows is as relevant as MIPS Windows or Itanium Windows without the huge ecosystem of software you get on x86.

And who goes bankrupt first to build this platform? Is it the laptop makers, who lose millions investing in laptops that almost no one will buy because there's no software for them (actually, they already did that)? Or is it the software devs, who lose millions porting their software to laptops that no one owns? When Apple says the future is ARM, everyone knows the hardware will be coming and they'd better fire up their IDEs. There's no such confidence in ARM Windows.


> When Apple says the future is ARM, everyone knows the hardware will be coming and they'd better fire up their IDEs. There's no such confidence in ARM Windows.

Pretty critically, Apple was able to modify their chips so that they can efficiently emulate x86 code. Microsoft will not be able to do the same.


There is plenty of Linux software that runs on arm64, though. Even if Windows remains the dominant desktop OS, there is certainly a niche of developers and others who would want to run Linux on a high-performance arm64 laptop. And then there is Chrome OS.


I don't think the "developers who want a high performance ARM laptop" niche is large enough for any serious hardware manufacturer to address.


You can't design a desktop-class CPU for a market niche. Even board-level design doesn't scale that way - every good linux laptop on the market is actually a Windows laptop that's had linux installed on it and benefits from the economies of scale in the Windows market (as much as it saddens me to say this, typing this on a thinkpad running fedora 33).


Fujitsu has made some ARM chips which could be interesting for servers if they ever get out of the HPC segment, which I think they are in exclusively for now. Specifically the A64FX.


Eventually they will come. And I think they will be Linux laptops, not Windows. The legacy support for closed source software that Windows has is working against them atm. Qualcomm, for example, already has hardware that is one or two iterations away from being competitive with the M1 (with different strengths and weaknesses), and they can mainline their drivers if they want.


I agree that competition will come, but I am skeptical it will be from Linux. Linux has less than 2% of the desktop market share (that includes laptops). Where's the payoff for the huge investment required to make a dent? The numbers just don't add up. Apple has made this happen using their massive iOS product revenue to fuel their custom CPU development teams.

https://gs.statcounter.com/os-market-share/desktop/worldwide

As you say, Qualcomm is probably the closest to being competitive, but there is a lot of work to do to catch up. Nvidia is trying to acquire ARM, so they seem to be interested in moving into the space. Samsung and AMD are now working together in ARM processors, so they are another player. Intel used to have an ARM presence via their DEC acquisition, but that was sold to Marvell about 10 years ago. Marvell might move into the space. There are also some ARM startups, some of which were founded by Apple engineers. Lots of activity. Intel is probably the biggest loser in all of this. Sad to see that. They will respond, but it will take time.

At any rate, in terms of desktop market size, Windows is the biggest (76%) and getting a chunk of that processor revenue is a big enough payoff to warrant the required investment. Doubling or even tripling Linux desktop share is comparatively small potatoes. Microsoft is mostly agnostic, so they will encourage cannibalizing x86 Windows in favor of ARM Windows rather than lose market share. I think we are going to see a big uptick in ARM Windows investment and product announcements.


> Eventually they will come

Perhaps - if it makes sense, cost/performance/power-wise to put a core like that in an Android mobile phone. I guess it would? But in the next 5 years?

Remember that anything Qualcomm makes will be optimized for mobile phones and only mobile phones. They won't waste any die-area at all on anything that isn't required by the Android phone market. The Android phone market is the only market for these chips that sells enough units to pay for their design, and it's fiercely competitive.

Especially with ARM's own designs improving so much and Samsung abandoning their own under-performing designs in favour of ARM's, Qualcomm is going to get a lot more competition in the next few years.

Anything they put on those chips that makes them more expensive or use more power than competitive chips is going to cost them design wins. No way will they sacrifice 10% of their mobile market for some pie-in-the-sky, maybe-maybe-not ARM laptop market that doesn't exist and will depend on lots of theoretical buy-in from Microsoft/Redhat/Canonical/Adobe/Lenovo/Dell/etc.

> The legacy support for closed source software that Windows has is working against them atm.

That's pretty outlandish. Try saying that to someone who uses Excel or After Effects or Photoshop for their work. A performance linux-based arm64 laptop has everything working against it that a windows arm64 laptop does, and arguably even more.


>it shouldn't be the community's (or a crowd-funded dev's) responsibility to provide software support for hardware produced by one of the largest companies out there

You must be thinking of a different Open Source community than I am, because the Open Source community I know thrives on providing community support for software on major vendors' hardware.


I submitted arm64 patches to OSS ranging from the kernel to the most obscure userspace applications.

All my contributions required some support, or at least confirmation from the hardware vendor that my assumptions were correct - e.g. I submitted a patch for a GICv3 errata on a specific chipset; I had to confirm with the vendor that my findings were correct - sure, the patch "worked", but was it doing the right thing or just hiding the real issue? (e.g. why did writing zeros to some magic register fix the problem we observed? was it a hardware issue or a software issue in the kernel? without feedback from the vendor, such things are a lot harder to isolate and fix properly).

I agree about OSS thriving on major vendor hardware, it's just that Apple is special in this case and intentionally makes it hard for the OSS community to provide support for their hardware.


It isn't UEFI-based, it uses iBoot. Proprietary up and down.


My bad, I didn't do proper research when writing that comment. Thank you for the correction.

And that's too bad, UEFI would have been easier to deal with imo.


Most of the solutions in this thread are simply not viable. Stuff like the Pinebook and Raspberry Pi are several orders of magnitude away from Apple's M1 SoC. No amount of hackerish imitation will bridge that gap; the laws of physics are not kind to consumer product hardware hackers. It is very obvious that most of the people commenting here have never held a soldering iron, nor taken a single computer architecture class. The only true way to match Apple on both performance and efficiency is to fund custom silicon. Cerebras needed 100 million to get started. Apple has already laid the architectural groundwork; a chip with similar performance numbers can probably be commissioned with ~50 million. Another 10-20 million for the emulator/hypervisor crowd to get x86 virtualization up and running. This could probably be done by existing players, so the cost can be discounted. A crowd investing approach is most likely the best bet, with the initial rounds being led by a VC experienced in hardware.

This is the Sputnik moment for open source. Should ACPI (and its associated laissez-faire mindset) become obsolete as a standard, then general purpose computing on consumer devices would become history.

All the comments about how Apple's Macs are not worth it are entirely missing the point. There is nothing comparable to Apple's M1 right now. Your Purism/Lambda Labs/Starlabs/System76/Pine64 usual Clevo shell machine with Linux pre-installed would never be able to compete with an M1-powered system. This isn't about 16:10/3:2 display ratios, machined aluminium, or HN's favorite complaint: "build quality". Even a well-funded premium laptop company like Razer would not be able to produce something to compete with the MBP if they don't have a similarly efficient chip. Forget about the bigger players like Lenovo and Dell. Intel and AMD dropped the ball big time, and if the open source community doesn't step up, then the future will be locked-down ARM systems. RISC-V will not be competitive for another half a decade, and there is no guarantee that it won't be locked down if the community does not push back against the embedded hardware culture (where VHDL/Verilog libraries are referred to as "intellectual property") which pervades ARM's heritage.


> Apple has already laid the architectural groundwork, a chip with similar performance numbers can probably be commissioned with ~50 million.

I think you might be two orders of magnitude off here. A class leading general purpose compute chip on a cutting edge process? I don’t think $5 billion to produce working chips is unreasonable. If it were only $50M to produce A14/M1 level chips, Qualcomm would already have done so.

Cerebras is cool, but they are not on TSMC 5nm and don’t have to design general purpose silicon, where you have pesky things like huge reorder buffers or 8-wide decode blocks, which are not seen in any other production CPU right now.


Agreed. Apple has acquired multiple companies and been building custom CPUs for 10 years with a significant sized design team. They paid about 280M for P.A. Semi in 2008. They reportedly also hired a number of DEC StrongARM refugees as well as several other small design teams. Apple was also an original ARM investor and have been using ARM cores for various uses for decades.

A competitor also needs 10 years experience designing higher and higher performance ARM cpus in tremendously heat/power constrained systems (phones). There are a lot of long term threads that have come together in the M1 chip.


Building a useful 8-wide decode block on x86 might be extremely expensive in terms of area and power.

For a long time, we've been told that the nasty x86 ISA is no big deal and that a good CPU can cache decoded instructions and play other tricks to make the ISA irrelevant. But x86 really does have fundamental problems. The length of an instruction can't be determined until the instruction is almost fully decoded, which makes decoding in parallel quite nasty. And the x86 memory model is fundamentally more expensive than ARM's.

M1 is the first serious attempt to make a competitive high-performance non-x86 CPU. Perhaps the real lesson from M1 will be that x86 can’t compete.
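
You can see the boundary problem directly by disassembling the same function for both ISAs (a sketch assuming gcc plus the aarch64 cross toolchain and binutils are installed): the x86-64 listing mixes instruction lengths from 1 to 15 bytes, while every arm64 instruction is exactly 4 bytes, so a wide arm64 decoder can start each lane at a fixed offset.

    echo 'int f(int a, int b) { return a * b + 1; }' > f.c
    gcc -c -O2 f.c -o f-x86.o && objdump -d f-x86.o
    aarch64-linux-gnu-gcc -c -O2 f.c -o f-a64.o && aarch64-linux-gnu-objdump -d f-a64.o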


You are right, I am mostly estimating based off upgrading existing architectures with insights obtained from the M1. 50 million may be enough for the raw manufacturing cost of one batch, but research and development from scratch would not be cheap. I am basing the cost on the assumption that there are already experienced chip designers at Intel/Qualcomm who are willing to take the opportunity to build a startup with a 50 mil budget.


I share your view of the likely future, but unfortunately it seems very unlikely to me that the open source community is going to unite and prevent this in any significant way. What's happening might give a boost to open source hardware and that would be wonderful, but I don't think this hardware will have much of a chance of being economically competitive and capturing a market.


Ehh I think people are overestimating M1 impact.

Sure, it's the best laptop/iPad (or other tablet) chip out there. And it (and its next versions) will likely remain the best for quite a while.

But it's not that the rest of the chips out there turned into garbage. They are still very usable.

Personally I have moved away from laptops, back to desktops. Having a desktop and a phone/tablet combination is, for me, better than just a laptop, or a laptop + phone/tablet. I still have my laptop, but I very rarely use it now (mostly if I travel, which I don't because of covid :) ). But having the latest, greatest laptop is not anywhere near the top of my purchasing priorities.

And on desktops, latest AMD chips and lots of fast ram, are quite decent.

Bottom line: even if OSS never has anything that matches the M1, it will still be fine.


OSS for operating systems is unlikely to be fine: out of the four classes - server, desktop, laptop, phone/tablet - one is already fully locked down, and with the M1 the laptop class also becomes less open and hackable.

Most people - unlike you - don't care that much about desktops, they either have laptops or tablets.


If the M1 is a close derivative architecture of high efficiency mobile scaled to desktop, would scaled up Android processors really be that far away from M1? I know Apple A-series had some edge over Snapdragons, but was the gap between Android processors that big?

There are many hundred billion dollar companies motivated to close the gap. It may not be this year.


“I know Apple A-series had some edge over Snapdragons, but was the gap between Android processors that big?”

Absolutely


Remember there is the 5nm advantage.


What about the other A-series processors? Excluding the A14, they all used the same process as other ARM based processors, but also have a big performance advantage.


The A14 is significantly faster than the Kirin 9000 (Cortex A77 on TSMC 5nm) on SPECint and SPECfp: https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...


There is the Microsoft SQ2 and the Snapdragon 8cx Gen 2.


Apple has the same sort of ownership of M1 as it did with Thunderbolt, maybe a little bit more but not much more than you'd think. Let me explain.

Apple has the performance edge now because it purchased ALL of the 5nm chip fabrication capacity of TSMC. This does not mean:

- they will be able to do so forever

- someone else won't be doing 5nm in the next couple of years

So, the same sort of performance will come to other devices in the near future. The best Apple can hope for at this time is to get enough attention to gain some new users and hopefully lock them into their cozy ecosystem.

Most importantly, most users don't care about expensive, powerful devices. The highest-selling consumer devices are still budget devices and will continue to be so for the foreseeable future. Servers, on the other hand, will continue to demand powerful hardware, and Apple isn't doing anything there.


Well, PC clones only happened thanks to Compaq.

Personally I would have been happier with Atari or Amiga instead. I targeted mobile dev back in the J2ME/Symbian days and did have access to console devkits for a while, so as far as I'm concerned everything is good as it is.


Your argument against an open platform is "more open hardware is slower", as if that matters somehow. 99% of what you use in a day doesn't depend on a fast platform, and the remaining 1% is stuff that you should migrate away from or modern videogames (which arguably you should also move away from).


> and the remaining 1% is stuff that you should migrate away from or modern videogames (which arguably you should also move away from).

Would you expound on this in greater detail? I am interested in your thoughts.


With some notable exceptions applications have become slower in a way that is just fundamentally unnecessary. Fifteen years ago you could do text editing, browsing the internet, chatting and possibly editing spreadsheets all in parallel on a machine that today would not be able to do anything with the same (but upgraded!) applications. This is not due to features that are "just more expensive" (though as said there are some notable exceptions, like probably decoding 4k video on youtube) but because the development process has migrated further and further away from any sort of performance concern as the platforms you could expect became faster.

That is not to say the computing landscape was fundamentally better fifteen years ago. I think it is almost the same. What has really improved is the general usability and stability of drivers (no more rebooting, random freezes and if you run linux your X server not working every other update), which is definitely something to be proud of. What has not improved is basically everything else and I'm really not sure why.

If I had to guess I would say it has something to do with managers the world over just not giving a shit about delivering something good, as long as it sells. As platforms have started having more headroom for bad software, "as long as it sells" has started to have a wider and wider performance profile. Landing us where we are today: with many of the same features, much less privacy and every other app packed with an entire browser "because it makes it easier to develop".

Even as recently as last year I had a more relaxed stance about this but I guess it has just been bothering me now that I have more time to think about the state of my work (and everyone else's).

Edit: I see now I responded to something slightly different than what was requested.

Modern (AAA) videogames should not be bought or played because they are produced by a sick and exploitative industry that really needs to die. Instead, we can prefer the indie games that work well with what they have: Generally a lower graphical budget that runs significantly better on hardware that is "out of date".

For office-type processing and messaging, there are generally application choices and tradeoffs that can be made for more performance instead of more eye candy. The average office worker (which I guess includes most people here) will not materially care that it doesn't look quite as good. Another upside is that as a company you don't need such an enormous budget for office hardware.


> the remaining 1% is stuff that you should migrate away from

I'm not following. Compiling and media editing depend on a fast platform, are you really suggesting people should avoid doing those tasks?


Photoshop (i.e. "image editing") and video editing software have existed for a long time. They have not become better, just heavier. There are still alternatives for those applications that run just as well on slower hardware, though obviously your movie edits might have to render overnight instead of in three hours. Nobody will die if that happens.

Edit: You mentioned compiling, which is not an incredibly salient point. If you're not doing Linux kernel development (and even then...) you do not need a fast machine for this. Even the Linux kernel compiles within a few minutes (maybe 5? It has been a few months since I tried this) on a Raspberry Pi 3B. For more contained projects (e.g. something you might write in Lisp, Rust or Go) it'll take somewhere between one and twenty seconds.


What on earth are you talking about? Time is made up of seconds, minutes, hours... etc. Reducing wait time increases productivity. The larger and more complex a project becomes, the more these delays compound. A sub-$1k laptop with the computational power of a workstation that costs many times that and isn't portable is an absolute game changer in many areas. If I worked for National Geographic or the New York Times as a photographer and needed to preprocess several thousand images or 4/8k footage from the field so I could make print deadlines, 3 hours vs overnight means everything. And more importantly, the amount of power consumed by the cycles is far from trivial. An x86 mobile workstation can't even complete certain render tasks on a single charge that the MacBook Air can do on less than a quarter of a battery charge.

As for compile times, believe it or not, lots of projects are written in something other than highly optimized C code and compile time is a real pain point. You don't always have a choice of what language or dependencies you work with.

Just because there is an alternative way to do something, or a way that takes longer, doesn't mean it's viable. Performance improvements matter, a lot. Power consumption improvements matter, a lot.


> This is the Sputnik moment for open source

Does it need to be? x86 had every opportunity to become a locked-down, exclusive standard too, and it grew the way it did, into a cross-vendor standard, for a reason. Why will a similar trajectory not apply to the M1?


The history of the x86 instruction set is very complicated and involves a number of lawsuits and settlements. What you see today is the fruit of decades of fighting and wrangling.


Because M1 / Arm has a locked down base to draw from, unlike Intel / AMD.

Way different roots in mobile when compared to PC


As far as I understand, x86 is a duopoly, and AMD is allowed to produce it because antitrust litigation resulted in Intel being required to allow a second party to implement their instruction set.

Also Intel is a very different entity than Apple. They produce commodity hardware, where Apple creates consumer experiences. Apple's incentive is to keep M1 closed to give their software ecosystem an edge vs. competitors.


If there’s anyone in the community I’d trust to do this, it would be marcan, who, among countless other things, got Linux working with GPU acceleration on the PS4. https://media.ccc.de/v/33c3-7946-console_hacking_2016


I maintain mbpfan and have contributed to a few other Linux on Mac tools.

Linux on Mac is not worth the effort. I ran Ubuntu on a 2011 MacBook Air and now run Fedora on a 2014 MacBook Air. There were several papercuts even with these relatively open models where it took a while to get suspend, trackpad, fan, webcam, wifi, etc. working properly. You will need to use binary drivers for the latter two features and some strange EFI bootloader. It seems that newer MacBook compatibility is worse:

https://github.com/Dunedan/mbp-2016-linux

I use Debian on ThinkPad at work and plan to buy this or a Dell XPS for my next laptop.


I'd suspect, though, that most Linux people who considered running it on Apple hardware just shrugged and picked very similar PC hardware instead.

Apple's M1 computers have no PC analogue and may not for some time. I've not used one, but it appears they really are different from, and better than, anything else in their class.


> Apple's m1 computers have no PC analogue

And they don't need to. There's hardware available that works just as well, the buzzy marketing space around M1 is just that: marketing. Nobody needs this.


As much as I do not really care about Apple machines (I neither own one nor plan to in the near future), it seems that the M1 laptops have a significant advantage in power consumption over amd64 laptops. And as a Linux user, this only gets worse with Linux. I agree however that from a performance perspective the M1 seems to be competitive but not much faster than existing amd64 products.


I have had 24h+ battery life on an x86_64 machine - clearly good battery life is possible already.

Edit: Yes, this was with the lid open and me doing things. Not videogames, just some light coding.


On a form factor similar to a Macbook Air? I'm sure it is possible to have 24h+ battery life on a Linux laptop; if you ignore the other parameters (battery size and performance) it is meaningless.


thin, no fans, normal keyboard. I have not held a macbook air so I can't comment on weight. The battery size doesn't matter, only how long it lasts while using it :)


Apple won't support it, therefore support and maintenance will mean working around Apple's proprietary hardware and special drivers. Linux is not meant to live in that environment; I'd argue bettering support for other OEMs is a more fruitful goal.


If anyone has been following the nvidia/Linux story, this nails it on the head. Apple has never played well with FOSS and isn't going to start now. Apple's implementation will always be ahead, better, with fewer bugs, and the Linux port will always be a shitty experience that takes several weeks to get working properly, and then it'll be slower than expected.

Get yourself a laptop with an nvidia card and Ubuntu. Even today it's a garbage experience that takes up hours/days of debugging to get right, and then it's still way worse than the mac/windows experience.


Apple have a FOSS kernel, one of the three remaining web renderers, CUPS, and a bunch more smaller things. They’re not the most religious of FOSS companies, but it’s a bit unfair to say they don’t play well in the places that they play.

If you take FOSS to mean only GPL, then yes, I get your point.


Yeah I think it’s more often that FOSS that doesn’t play well with Apple. It’s understandable - Apple keeps a lot of desirable software for itself - but perhaps unfortunate.

Federighi has said in one of his interviews that Apple wants people to hack on the M1 machines, so I'm hopeful that they will be more open than on iOS.

All of that said, I won’t hold my breath.


> Apple wants people to hack on the M1 machines

Of course they do. Like they wanted people to hack on OSX 20 years ago; and once they reached critical mass, they pulled up the drawbridges (dropping anything GNU, dropping Java, restricting access to the OS, pushing AppStore, etc etc).

Chances they're going to do exactly the same with M1 and its follow-ups are 99.99%... if anything because their management is largely drawn from the very same people who executed that strategy.


They dropped anything GNU because of the GPLv3, which was specifically designed to stop what the authors saw as exploitation of others' work.

I can see both sides here - the viral nature of GPLv3 is anathema to a company like Apple, so simply walking away was inevitable; and people were exploiting loopholes in the GPLv2 (I have no idea whether Apple was, but there are documented cases of others doing it).


It's just one datapoint - there are plenty more. Simply speaking, when they are the underdog and have to attract developers, they open up; and when they don't need it, they close doors. That's just what they do, it's a perfectly rational strategy (if cynical). They are hardly the only ones at this game, Microsoft does it too. I am just pointing out that promises of openness with Apple typically come with an expiry date.


Apple and Microsoft have never pretended to be “open” and their behaviour is quite predictable - Apple most of all. You might not like their behaviour but they haven’t tried to trick anyone.

Contrast with Google and the infamous tweet about Android:

https://mobile.twitter.com/Arubin/status/27808662429


XNU is not FOSS, and 99% of the foundation classes, Aqua, and drivers are all proprietary.


Why is XNU not FOSS? Has something changed? Is Wikipedia wrong? It says that XNU is APSL which is approved by OSI and FSF.

While I agree that much of the good stuff is proprietary (but certainly not all, eg WebKit), that doesn’t mean that the free stuff isn’t free.

https://en.m.wikipedia.org/wiki/Apple_Public_Source_License


Webkit was likely "saved" by the original KHTML license being GPL. At the time, Apple were wise enough (or desperate enough) to figure that they could work with such a license, although they were eventually careful to chisel out anything they could into BSD-licensed modules. And still they had to be dragged into the light more or less kicking and screaming (e.g. they had no public VCS until KDE people kicked up a stink in the press, and were just throwing huge swaths of code "over the fence" like they still do with XNU).


KHTML is LGPL, isn’t it?


If I remember correctly KHTML used to be GPL when WebKit started, and was later relicensed. The difference is largely irrelevant anyway; the way they used KHTML, they would have had to release sources even under the LGPL. Had KDE used BSD, MIT, or Apache back then, we likely wouldn't have had WebKit.


I’m trying to find a specific commit to disprove this, but that’s not my recollection. Certainly this blog post from 2005 indicates LGPL: https://web.archive.org/web/20050428230122/http://www.kdedev...


You might be right. Still, it shows Apple's attitude at the time, and how the license helped change their ways. Note how the post complains they were simply using OSX APIs... without the modification-release clauses, WebKit as a reusable library would never have happened.


Indeed. Git repo at https://github.com/apple/darwin-xnu.

Note that that is 2 years behind. You can get a newer source dump from https://opensource.apple.com/source/xnu/xnu-6153.141.1/.

So, open source, but development isn’t done in the open. You can’t really expect you’ll be able to see recent commit messages or even just regular source dumps.


The development methodology has nothing to do with whether something is FOSS or not.

If you're complaining that you want to see daily updates, that's a completely different thing than claiming it's not FOSS (as OP did).


FOSS means being able to develop it the same way they do. That ought to mean being able to see the VCS history, bug trackers and so on - their own developers would use those things when developing.


It really doesn’t mean that. Neither Stallman’s four freedoms[1] nor Perens’ Open Source Definition[2] have anything to say about code history, bug trackers, development standards etc etc, and they’re the only commonly accepted definitions of what free and open source software is.

[1] https://www.gnu.org/philosophy/free-sw.html.en

[2] https://opensource.org/docs/osd


> The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.

> In order for freedoms 1 and 3 (the freedom to make changes and the freedom to publish the changed versions) to be meaningful, you need to have access to the source code of the program. Therefore, accessibility of source code is a necessary condition for free software. Obfuscated “source code” is not real source code and does not count as source code.

IMO exactly the same logic applies to code history and bug trackers: you need access to those things to be able to study how the program works. A code dump is much the same thing as obfuscated source code: you can build the program from it, but you can't understand the program from it.


Right, but, like, that’s just your opinion man.

FOSS means you get the source code, everything else is gravy.


Source "in the preferred form for making modifications". IMO that means a VCS checkout rather than a code dump.


A code dump is nothing remotely like obfuscated source code. They run competitions for obfuscated source code, take a look and compare.

Seriously, at this point I think you have to be just defending a position that you know is wrong but since you originally stated it, you’re not backing down. If not then let’s just agree to disagree...


I've been struggling with nvidia on my laptop for quite some time, but I definitely wouldn't say it took debugging. There are clear limitations, primarily that on older chips the gpu never fully powers down when not offloading, but that was pretty clear from the start. And that's on Arch, using a port of a tool that was made for Ubuntu. On newer computers, it works even better. No matter what, there isn't much room for debugging, be that a good or bad thing.


I have a Dell XPS 9500 (the 2020 model) with an Nvidia GPU. I run Pop!_OS as my daily driver and it works with the GPU just fine.


Can it run sway?


I haven't attempted sway specifically, but I don't see why not. The OS installs with nouveau on wayland by default, and runs perfectly fine (at a ~60-70% performance level, based on Linux/Proton gaming FPS). I use GNOME on Xorg, so have no issues with the proprietary driver.


Looks like that’d be the GTX 1650 Ti? That’s of the NV160 family, and per https://nouveau.freedesktop.org/FeatureMatrix.html (assuming it’s up to date), that lacks all 2D and video acceleration, and power management. That sounds rather like a mediocre toy for a large fraction of users. And Wayland needs Nouveau, because the NVIDIA proprietary driver is a hostile environment that does everything its own way rather than the way everyone else does things.


Then use the Intel card and you'll have full acceleration + wayland. Or don't buy the laptop. I was simply answering the question.

You're being unnecessarily antagonistic here. I have a modern 2020 laptop, it works for all of my needs with an nvidia GPU.


Look, the whole context here is “NVIDIA cards have bad Linux support”. And that’s just what we demonstrated from saati’s comment onwards in the example of Sway: that if you want to go Wayland, you’re stuck with bad functionality. Because graphics cards and computers are powerful enough, many people will be able to live with this crippled functionality and might not even notice it, but compare it with the story for AMD dedicated GPUs, or AMD or Intel integrated GPUs, and the point is substantiated: if you want to actually use the GPU fully, you just can’t do so properly.

(And a large portion of the problems people are talking about are with Nouveau; people that can use the proprietary driver—specifically, people using X rather than Wayland—will have a better time of it, though still worse than with GPUs of other brands.)


I've owned an Ubuntu laptop with an NVIDIA card since 2014 (HP ZBook 15 with Quadro K1100M). Not perfect but not garbage. Hours of debugging / workarounds accumulated over 6 years, yes; days, no.

The remaining problems on Ubuntu 20.04:

1. The brightness control keys don't work. They generate the event they are supposed to generate, then I see an error in syslog. Workaround: two hotkeys bound to windows+fn+brightness up/down that run a bash script which increases / reduces the backlight level (a sketch of that kind of script is at the end of this comment).

2. The screen has been running at 40 Hz since Ubuntu 18.04. I expected it to be unusable but actually I can't notice the difference from 60 Hz. Nouveau works at 60 Hz but it's unusable for other reasons (I don't remember the details, I checked months ago).

Problems I had because of the NVIDIA driver: the laptop wouldn't shut down, only restart. Workaround: I press the power button right after the restart, when the BIOS shows up. Not a big deal because I shut down a couple of times per year.
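
For the curious, a minimal sketch of the kind of backlight-stepping script I mean, written in C rather than bash; the intel_backlight sysfs path and the 10% step are assumptions you'd adjust for your hardware:

    /* Hedged sketch of a backlight stepper. The sysfs path and step
       size are assumptions; it needs write access to the brightness
       file (root, or a udev rule relaxing permissions). */
    #include <stdio.h>
    #include <stdlib.h>

    #define BL "/sys/class/backlight/intel_backlight/"

    static long read_long(const char *path) {
        FILE *f = fopen(path, "r");
        long v = 0;
        if (!f) { perror(path); exit(1); }
        if (fscanf(f, "%ld", &v) != 1) v = 0;
        fclose(f);
        return v;
    }

    int main(int argc, char **argv) {
        long max = read_long(BL "max_brightness");
        long cur = read_long(BL "brightness");
        long step = max / 10;  /* 10% per key press */
        long next = (argc > 1 && argv[1][0] == '-') ? cur - step
                                                    : cur + step;
        FILE *f;

        if (next < 0) next = 0;
        if (next > max) next = max;

        f = fopen(BL "brightness", "w");
        if (!f) { perror(BL "brightness"); return 1; }
        fprintf(f, "%ld\n", next);
        return fclose(f) == 0 ? 0 : 1;
    }

Bind one hotkey to run it with "-" and the other with no argument, and you get the same up/down behavior as the bash version.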


I was able to fix the brightness controls on Ubuntu by switching to Linux 5.8, if you haven't already tried that.


> not garbage

> The brightness control keys don't work.

> The screen works at 40 Hz

I think you need to use a laptop (with factory OS) produced within the last 2 decades, because your garbage detector seems to be broken.


I didn't downvote you, because there are tradeoffs and they are very subjective.

BTW I do use Windows and MacOS a few times per month, via remote desktop, to run two of a customer's programs that only run there. Not a complete experience, but add to that my continued (Windows) or sketchy (Mac) use of those OSes since their very first releases (80s / 90s).

I'm trading those two nuisances, one very small, the other invisible, for avoiding other much larger problems:

1. Windows, which is a pretty much horrible experience. Good for gaming, which I no longer care about on a PC; bad UX (and I don't mean only the GUI). And I'm targeting Linux servers anyway.

2. MacOS and its top bar with the menu at the top. It was OK on the very first Mac because the screen was so tiny that it actually saved space. It became perplexing or infuriating when screens got larger. Unfortunately it stuck and will be like this forever. And Macs don't have physical buttons on the touchpad.

So I'm happy with my Gnome desktop, configured with: the top bar at the bottom merged with a Windows like task bar, no dock, hotkeys to swap virtual desktop by customer project, no animations, visible permanent scrollbars (but I show only their outline on the background color of their window.)


You have your preferences about UI and GUI and that's ok. But 40Hz screen and non-working brightness controls are a non-starter for any but the most dedicated users. 144Hz is common on decent laptops nowadays, along with automatic brightness.

To me, this trade-off is very simple.


As I wrote, it's very subjective. I've been working at 40 Hz for 2 years and I didn't notice the difference. I press windows-fn-F9/F10 instead of only fn-F9/F10, not a big deal. Automatic brightness control as on phones? I would probably disable that anyway.

By the way, brightness control didn't work in 2014, then started working, then it stopped working again. Kernel versions / NVIDIA drivers, who knows.


Exactly this.

For many of us the disadvantages of using Linux are tiny compared to the advantages:

- from 30-50% faster compiles (vs Windows)

- instant git (vs Windows)

- choice of Desktop environment (vs both)

- choice of hardware (vs Mac)


> Get yourself a laptop with an nvidia card and Ubuntu. Even today it's a garbage experience that takes up hours/days of debugging to get right, and then it's still way worse than the mac/windows experience.

This. Writing this from a laptop running Ubuntu with an nvidia card and I can feel my body tense up just reading it.

I remember several nights in college just struggling to get the drivers to work and finding the right cuda spec from the repository, after combing through multiple sources.

It's still a nightmare and I always dread that the StackOverflow and Ask Ubuntu posts in the bookmarks folder that I've reserved for this arduous process will be obsolete when the GPU crashes again for the umpteenth time.


The new XPS 15 I am writing this on works fine on Ubuntu. So did the Thinkpad I used before it and the XPS 13 before that. I spent 0 time debugging. I am not sure how the Windows experience would compare, but I am fairly happy, with maybe the exception of battery life, which from what I gather kinda sucks on the XPS.


I disagree with this on a few points aside from the hyperbole:

>Apple's implementation will always be ahead, better, less bugs...

No, it's not and really hasn't ever been. Every platform has a shortcoming and claiming it's ahead of the game is a biased stance. The M1 specs may be impressive now, but the same can be said of every "new" SoC chip.

>... linux port will always be a shitty experience...

People who really love Linux and FOSS actually prefer to tune their environment to their expectations. The weeks of tinkering is part of the hacker mindset that is slowly eroding, and I'll take the weeks long config experience over a locked environment.

To sum it all up, Linux and consumer open systems have a different target audience from Apple. Apple users are like car lessees: they just want to hop in and drive their car until the lease is up. Linux users are like the mechanic working out of their garage who drives a hodgepodge vehicle they pieced together. To say that the Apple experience is better is just hype, and calling the personalization/optimization process "shitty" is a closed-source, closed-minded view.


> Linux is not meant to live in that environment,

Linux is like the cockroach of the OS world; it can do just fine in any environment. After the apocalypse, I'm pretty sure it's the OS the cockroaches are going to use.

If it hadn't been for reverse engineered proprietary drivers, Linux would have never gotten to the point it is now.


On the other hand, we have basically one single model per year to support. That hardware is used by millions and usually users keep it for 5 or more years.


That's not too big an advantage - e.g. the major problems with GPU drivers don't seem to be caused by too many companies producing them (2 or 3 depending on how you count) or too many models (a new architecture appears only once every couple of years).

(I won't trust myself to speculate on what is making GPU drivers so difficult to get right)


How is that different from simply picking the best option from some other manufacturer each year and supporting only that? There's no mandate that you need to support every device a company makes.


The Linux community won't go along with that. People keep buying laptops that don't support Linux then ask for the community to support them.


If I thought there was a good chance of having the work upstreamed so that a mainline Linux kernel would eventually work on these computers, maybe.

In my experience, there are not many ARM SoCs that have mainline support, even some of the Raspberry Pi boards don't, and they're some of the most popular ARM SoCs used for Linux.

A lot of these projects get to the stage where a forked kernel will run on the SoC, but then support and attention peters out when it comes upgrading the fork or getting the work into mainline Linux because a new and better ARM SoC will be released, and then that requires the same amount of work on a new Linux fork. Since each ARM SoC released is a unique hardware configuration, a lot of this work will be for naught when Apple releases new SoCs that require the same amount of work, or even more work, to get Linux running on them.


This would absolutely all be done with upstreamability in mind, and upstreamed as soon as is practical.

All that horrible forked kernel stuff is the realm of companies with closed development teams who don't care and just want a product out the door.

The PS4 Linux stuff I did was not upstreamed for lack of time/interest, but I absolutely considered it being decent enough to upstream when writing it. The patches for running PS3 Linux under GameOS mode were indeed upstreamed (though there was a bit of a lawsuit along the way, so all of my code ended up attributed to a friend instead, since I was keeping a lower profile at the time...).


> All that horrible forked kernel stuff is the realm of companies with closed development teams who don't care and just want a product out the door.

Or people who gave upstreaming an honest try but got fed up. It's not the easiest project to contribute to.


You do need to know what you're doing, meet quality standards, and do things in a way that fits existing architecture (unless you have a really good reason to revamp something). It's not beginner-friendly for sure, but I can tell you the vast majority of non-upstreamed vendor code is not upstreamed because it's terrible (which is also bad for users: things like blatant security problems are much more common in forks than upstream).


If you can't be bothered to learn how to write linux kernel patches properly you shouldn't be making them at all, in a forked kernel or otherwise.


I think this is a legitimate concern, but it seems less likely to me, specifically in the case of Apple hardware, for two reasons:

1. There will be far more interest in being able to run Linux on that hardware, and in being able to keep it up to date (since it'll be a daily driver).

2. Apple are probably less likely to regularly make drastic changes between hardware revisions because they will need to support multiple hardware revisions themselves (Apple are typically pretty good with supporting old hardware). Their only strong motivator would be to actively prevent people running other OSes but I don't think Apple care that much about screwing over that niche demographic - especially when they're already paying for the hardware.


With regard to #1, Linux support on existing Intel Macbooks since 2015 has been pretty poor[1]. That doesn't give me much confidence about support while Macs get more and more esoteric hardware configurations.

As for #2, perhaps, but we've already seen what happens when Apple adds their proprietary hardware to existing x86 systems with things like the T2 chip or the Touchbar. Support for either on Linux is still poor and nonexistent. Full integration with hardware and software on Apple's end means that they don't have to worry about compatibility like 3rd party hardware manufacturers do.

[1] https://github.com/Dunedan/mbp-2016-linux


I'm skeptical. Linux is reasonably popular for "Windows" hardware, most statistics put it at about 0.5% market share. I've never seen anyone running Linux on Intel Macs.

Why would ARM Macs, which seem quite a bit more locked down, ever become a better supported and more popular Linux hardware platform?

I wish they would become one, but I don't see how we can make this leap.


Anecdotally, I just bought my first Mac just because of the M1. It's an absolute game-changer and at least 5 years ahead of the competition. If macOS becomes too burdensome or heavy-handed, I anticipate I'll move to some Linux distribution assuming there's support (and would help fund its development). I'm guessing there are at least a few people like me who will constitute a market for developers to cater to.


Hi from Linux on an Intel Mac.

(I've got fixing up amdgpu to work on this 2015 iMac on my TODO list too... might even do that as a bonus warm-up if this takes off, while I get my hands on an M1 mac)


I admire the work but you have to admit that even after all this time it's easier to install Ubuntu, say, on a random PC :-)


To be honest, the only problems on this iMac are audio (fair to say Apple's fault), the screen only running at 4K max (Apple custom stuff involved for 5K, but hey, not complaining about 4K, go find me an AiO PC with a 5K screen), the amdgpu issue (not really an Apple problem, it's just a rarer chip and probably a dumb bugfix, and the older radeon driver works fine), and the Ethernet and SD card reader being problematic (that's a Broadcom chip and their silicon is universally buggy; not Apple's fault, these chips have issues on PCs too).

So really just audio is the one thing that jumps out as broken and specific to this being a Mac. I personally happen not to care as I use an external audio interface anyway :-)

Actual installation is trivial, it's just standard UEFI pretty much. As long as the right GPU driver loads you're fine. Thunderbolt and all that works out of the box.


The Broadcom part is huge IMO. I try to avoid Broadcom as much as possible on every (programmable) thing I own, but that is not possible with Apple devices.

I mean, I'm sure we can make brcm* not suck on Linux (or it already works well now), but I don't want to support such a hostile vendor.


I ran Linux on an MBP as my primary machine from 2009 to 2015 (on 3 different hardware revs). The late 2015 MBPs moved a bunch of peripherals off of the USB bus, making them harder to use with Linux at the time. In the years since, support has been added for all those peripherals.

I still use my late 2013 MBP running NixOS. That machine was a great piece of hardware at the time and still works fantastically well even today. (It's really sad how terrible most non-Apple speakers are in laptops.)

I would buy an M1 MacBook Air if it could run Linux and had good power management. (I'll say the same about an M1 MBP in the generation where they give up on the Touch Bar and go back to hardware buttons.)


Wouldn't device tree solve (or at least mitigate) this problem somewhat? e.g. most broadcom chips have the same I2C hardware block, just mapped to a different memory address.


I'd be curious if you had a rough outline of the steps involved in porting Linux to a new target. In particular, I see folks working on a downstream port to the 3DS: https://github.com/xerpi/linux_3ds. I'm kind of curious where people even start?


Here's an outline for getting Linux running on ARM SoCs: https://elinux.org/images/a/ad/Arm-soc-checklist.pdf
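
For a flavor of where the bulk of the work goes after early boot: a port mostly accretes per-device drivers that get matched from the device tree. A minimal, hypothetical sketch of that shape ("vendor,demo-block" is a made-up compatible string, not a real device):

    /* Hedged sketch: a minimal Linux platform driver, the unit most
       SoC support code comes in. All names here are hypothetical. */
    #include <linux/module.h>
    #include <linux/platform_device.h>
    #include <linux/of.h>

    static int demo_probe(struct platform_device *pdev)
    {
        dev_info(&pdev->dev, "demo block probed\n");
        return 0;
    }

    static const struct of_device_id demo_of_match[] = {
        { .compatible = "vendor,demo-block" }, /* matched from the DT */
        { /* sentinel */ }
    };
    MODULE_DEVICE_TABLE(of, demo_of_match);

    static struct platform_driver demo_driver = {
        .probe  = demo_probe,
        .driver = {
            .name = "demo-block",
            .of_match_table = demo_of_match,
        },
    };
    module_platform_driver(demo_driver);

    MODULE_LICENSE("GPL");

Multiply that by every clock controller, pinmux, interrupt controller and peripheral on the SoC and you get a rough picture of the work.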


> In my experience, there are not many ARM SoCs that have mainline support

You have a point, but the pmOS project, among others, is working on upstreaming mainline support for these devices. This involves a lot of effort, to be sure, but meaningful progress is being made.


Most of what makes the M1 interesting is TSMC's 5nm process; why wouldn't we expect similar results from future x86 CPUs on 5nm or smaller?

At least AMD is supporting upstream Linux development for GPU support, if you care about Free software and control over your computers, throw your money at companies better aligned with those goals. Don't reward Apple for being assholes.

<=5nm AMD Zen SoCs will be fantastic as well


5nm certainly helped, but what really made M1 is the team Apple has assembled. Apple’s fat margins allowed them to attract and retain the best engineers with top of market total comp. Meanwhile Intel viewed engineering as a cost center to be optimized. So here we are.


Comparing the Apple A13 and Snapdragon 865 (both TSMC 7nm) would be a good way to see what Apple's chip design team does well.

https://www.anandtech.com/show/15603/the-samsung-galaxy-s20-...


Those are some of the worst colors ever chosen for a bar chart. Topped off with "Chart series order in same order as legend order" while having the bars vertically stacked, and the legend with two columns.

I don't know how some of these tech writers made it out of 6th grade science class with those charting skills.


You optimize for where your competition is. For the past couple of generations, Intel had to optimize for price, and with hiccups in the process improvements, this pressure got even worse.

When Zen 2 came out, IPC became a serious issue for Intel. I imagine that M1 makes it a lot more urgent.

Also keep in mind that, in a sense, M1 is a dead end. Integrated fast memory is great when the size fits, but it's terrible if it doesn't.


I’m not sure that’s exactly accurate. The special JavaScript instructions, and the optimizations to retain/release counts within memory blocks, seem much more interesting. https://mobile.twitter.com/ErrataRob/status/1331735383193903...

I haven’t the foggiest idea if any of this could make it to competing ARM chips, but given how many years Snapdragons have been lagging relative to the iPhone “A” chips, I don’t feel too optimistic.


The “special javascript instructions” are a single instruction whose only impact is removing a perf advantage Intel has due to JS exposing x86-specific sentinels to the web.

The only thing it does is remove a branch in arm64 where the various JITs have to do:

    int value = (int)some_double;
    if (flags) value = x86_sentinel;  /* 0x80000000 */
Whereas x86 systems don’t have a branch by definition.

That is all the “special instruction” does.
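
Concretely, a hedged C sketch of the semantics in question: x86's cvttsd2si collapses NaN and out-of-range doubles to the 0x80000000 sentinel, which is what JS engines key off and what FJCVTZS reproduces directly on arm64:

    /* What the JITs must compute: x86-style double-to-int32. Plain
       arm64 FCVTZS saturates instead of producing the sentinel,
       hence the extra branch on pre-ARMv8.3 cores. */
    #include <stdint.h>
    #include <math.h>

    static int32_t to_int32_x86_style(double d)
    {
        if (isnan(d) || d >= 2147483648.0 || d < -2147483648.0)
            return INT32_MIN;  /* the 0x80000000 sentinel */
        return (int32_t)d;
    }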


I don't think it will be that hard for Intel & AMD to add little operations like the JavaScript float one (which is a standard ARM instruction, BTW), or special instructions to help garbage collection and retain/release counting work more effectively, which all programming languages could leverage. It's something else that makes the M1 better.


And that instruction is to replicate the x86 FP rounding behavior, which ended up in the JS standard. Nothing else...


> The special JavaScript instructions

That's an ARMv8 feature, not specific to Apple.


ARMv8.3 to be specific.


Are they really assholes? Competition is important to progress and I like that their M1 release has shaken things up so much. We should hopefully be seeing similar CPUs released in the future, now that Apple has shown a glimpse of what's possible.

Until then I'll be enjoying the M1 because even if they're assholes, they're assholes who can make a computer I actually like using.


>Most of what makes the M1 interesting is TSMC's 5nm process, why wouldn't we expect similar results from future x86 CPUs on 5nm or smaller.

It's many things working together, 5nm is not "most of it".

Not to mention TSMC is already working on 3nm.


Something nobody ever mentions is Apple is the new Intel: they will tend to be at least one process node ahead of the competition.

The reason for this is they have the scale to book all of TSMC's processing capacity for a new node. The AMDs of the world have to wait a year to receive the scraps while Apple move to the next TSMC node.


Do the better-aligned companies make full-metal laptops challenging the MacBook Air's durability and design while being 100% Linux-compatible?


There's e.g. the Dell XPS "developer edition", which ships with Ubuntu and has near-100% hardware support (I think there's some issue with the fingerprint reader).


I wonder at what point they got rid of the Intel RAID/RST controller that'll never be supported in upstream Linux - it's in the non-"developer" 9350.


Coincidentally I have this exact model (9350 non-developer) and it runs (Ubuntu) Linux great.

Laptop is completely silent, all the hardware runs flawlessly, battery life is after 5 years still pretty good, just overall great experience.

I honestly don't remember if I had to at some point disable something in BIOS ...


I have it too. The battery is shot after 4 years (?) but lasts quite a bit longer in Ubuntu. It's had coil whine as long as I've had it, the supplied Toshiba NVMe drive died, and I've also replaced the Wi-Fi card with a mostly sane Intel 9560.

If you're using a SATA drive with the default UEFI settings, Linux will see the drive. If you use a PCIe drive, it won't until you set it to AHCI mode.

Edit: oh and Windows frequently wouldn't recognise the USB-C port unless you had something plugged into it during boot.


You turn it off in the BIOS before installing your system, and that's it.


Which is what I've done when installing Linux. Am I losing performance? Power efficiency? Features?

It also makes dual booting trickier if you don't plan it beforehand.

It frustrates me that there's not that many reviews that cover this kind of stuff, so it's hard to avoid silliness like this when making a substantial purchase like a laptop.


> Which is what I've done when installing Linux. Am I losing performance? Power efficiency? Features?

From what I read a year ago before doing this, you don't lose much, and software-based RAID (if you go for that sort of thing) in GNU/Linux is just as efficient/reliable, and maybe more so. And anyway, if you only have one HDD/SSD, there is no point in RAID.

> It also makes dual booting trickier if you don't plan it beforehand.

I don't think so: installing Windows will work just the same with Intel's RAID turned off.


> And anyway, if you only have one HDD/SSD, there is no point in RAID.

Obviously. But why did they bother to equip this laptop with RAID by default?


Apparently, Intel's RST is not just RAID, but it's also supposed to help when a laptop is equipped with two storage devices, one small and fast (SSD) and another big but slower (HDD): https://superuser.com/a/1578326

IIRC there were Dell XPS models like that in the lower price segment. Never tried one personally.


It is simple enough to set up Windows to use the normal boot rather than the RAID one. I've got my XPS set up with one drive running Linux, one running Windows - and I boot into the drive I want to be in.


I'm hoping in the future System76 does; they're already making excellent desktop machines.


My understanding is that some of the newer ThinkPads fit that bill.


The other thing that makes the M1 interesting is that it's an ARM chip with specific hardware support for fast x86 emulation, which IMO is unlikely to be replicated in a top-of-the-line laptop for the next 5 years.
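
The widely reported mechanism is an optional total-store-ordering mode in the cores: x86 orders ordinary stores by default while ARM is weakly ordered, so an emulator on stock ARM has to sprinkle barriers through translated code. A hedged C sketch of the message-passing pattern that breaks without them:

    /* x86's memory model orders this for free; on weakly-ordered ARM
       the release/acquire pair below compiles to real barriers, which
       an emulator translating naive x86 stores must insert everywhere
       - unless the hardware offers a TSO mode, as the M1 reportedly
       does for Rosetta 2. */
    #include <stdatomic.h>

    static atomic_int data, ready;

    void producer(void)
    {
        atomic_store_explicit(&data, 42, memory_order_relaxed);
        atomic_store_explicit(&ready, 1, memory_order_release);
    }

    int consumer(void)
    {
        while (!atomic_load_explicit(&ready, memory_order_acquire))
            ;  /* spin until the flag is published */
        return atomic_load_explicit(&data, memory_order_relaxed);
    }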


x86 emulation is not an issue for 99% of the ARM market. I think it's safe to assume M1 is the first Apple chip that has it.


A12Z has it.


That's kind of weird. What is it used for in iOS?


Nothing, but the DTK uses it.


Interesting. It's either that Apple wanted to use it in the first-gen ARM-based Macs, or that they planned for it to be used internally and in the DTK while the M1 wasn't ready.

Either way, it's quite impressive to have deliberately engineered dark silicon on consumer devices.


I could be mistaken, but I’m unaware of any kernel capabilities/optimizations for heterogeneous cores with differing power requirements. Shared system memory for both GPU and CPU workloads would also be a new requirement needing some thought.

I’d expect it to be difficult for amd/intel to match performance without similar software capabilities.


Aren't you basically describing big.LITTLE, which has been standard in most mobile SoCs for years now? (And was probably introduced in Android before iOS.)


ahh you are correct! I see support was added in kernel 3.11


Without any independent actual measurements of the same stack running on the same hardware with those optimizations turned off, we can't draw any conclusions regarding their significance.

Considering Apple has been so adamant about misrepresenting what's really TSMC silicon as "Apple Silicon", I'm viewing everything they say through "PR oozing with insecurity about not actually controlling access to their chip's manufacturing process, and desperate to convince consumers (and investors) of this being a uniquely Apple advantage" glasses.


Based on your qualification of whose silicon it is, AMD, Broadcom, Marvell, MediaTek, Nvidia, STMicroelectronics, and Qualcomm do not manufacture chips... Even IBM uses TSMC. That's a lot of the CPU silicon shipping right now.

The market has changed, and most CPU vendors don't fab their own chips anymore. The IP is in the chip design. Are you saying TSMC designed the M1?


I'm saying TSMC makes non-Apple chips as well, so Apple isn't enjoying as defensible of a moat when it comes to silicon as their marketing wants you to believe.

We are in agreement.


But the point is that manufacturing has become commodified. If TSMC doesn't work out, there are other fab companies to use.


Or do support Linux on M1 because you do want it, and reward Apple for making good hardware.


The other thing that makes it interesting is getting performance comparable to the fastest Intel/AMD processors at 3 GHz. I expect that AMD will catch up with Apple on performance when they switch to 5nm, but Apple's low clock speed means they're likely to retain the power usage advantage.


My AMD APU begs to differ.


Is there an ETA for 5nm AMD SoCs?


Not AFAIK, but it's not like 7nm AMD SoCs are intolerable dogs.

Most M1 vs. PC comparisons are being made against Intel's ancient process, for obvious Apple-favoring reasons.


It’s mostly compared to Intel since there’s no official AMD support on MacOS whatsoever. You very well could compare hackintosh AMD to Apple’s M1, but it wouldn’t be comparing Apples to Apples. There are indeed AMD Windows comparisons, however[0].

0: https://youtu.be/4MkrEMjPk24?t=10m39s


Although Apple booked TSMC’s entire production line (for an unknown length of time)[0], this official graph looks like it indicates 5nm before 2022 (so 2021)[1].

0: https://www.extremetech.com/computing/315186-apple-books-tsm...

1: https://youtu.be/iuiO6rqYV4o?t=12m56s


Huawei got a large fraction of TSMC's 5nm production this year because it had to stockpile chips ahead of the trade sanctions kicking in.


Yes. Because it'd be good to provide an alternative. Yes, it's a locked platform, but performance/price-wise it'll be interesting to see what happens, and it's unlikely that we'll have an open alternative to it soon.

People replying "no" seem to be doing so for ideological reasons with no regard to practicality. Obviously it'd be better if it was an open platform, but it isn't, and that's fine. Whatever you think of Apple, you have a very simple option of not using their products. I don't use a Mac for that reason. But with the performance and price of new M1 machines, I can definitely see a usecase, probably not as a daily driver, but I'd still need it to run linux natively. Anyway, people who will take on such an effort (with the appropriate funding) aren't necessarily going to be working on something else if this doesn't materialize.

I don't think that asking this kind of question on twitter/hackernews is representative. Start with something and open up some way to fund it, then either continue or stop if it makes sense.


Having this project funded would also send a signal to Apple that there is at least a group of people willing to put some of their money into having Linux on their hardware, and that there would be interest if they gave the Linux project some support.


I wonder—how certain are we that Apple would be unwilling to assist with a project like this? They've said that Windows support is "up to Microsoft"[1], and they explicitly demonstrated Linux VM support at WWDC. Apple's primary mission is to sell hardware, and they know that Linux is important for certain consumers.

I wouldn't expect Apple to write or finance any code, but I could certainly see them providing documentation, and making an engineer available to answer questions, if the request came from a suitably professional source.

1: https://www.macrumors.com/2020/11/20/craig-federighi-on-wind...


Something I'm just not understanding here. Why the obsession with single-thread performance in 2020?

Give it 2 years and a competitor will beat it, Apple will be behind yet again and the price will be higher than ever.

And again, this is only for single thread. My computer is already blazing fast, and intense programs will use multiple threads.


I don't know what computer you have, but it probably produces more heat and has a shorter battery life than the new M1 Macs. That's Apple's real accomplishment—they made an ARM chip that is (very) competitive with modern x86 processors, but which has the power efficiency typical of ARM.

Will a competitor beat them in two years? It's possible, but I expect Apple to keep their lead for a while. CPU-wise, the iPhone has been well ahead of competitors for several years (even as other specs like memory lag behind), and no one else appears to be making serious inroads in this space.


Yeah, heat and battery life are non-issues.

The iPhone is a good example because supposedly it's the fastest, but it doesn't feel like it. Maybe it's the animations, maybe it's the annoying Apple popups to sign in and update. Whatever the case, it doesn't make for a great device. (My experience is from 2018)


> The iPhone is a good example because supposedly it's the fastest, but it doesn't feel like it. Maybe it's the animations, maybe it's the annoying Apple popups to sign in and update. Whatever the case, it doesn't make for a great device.

So, I take it you can see why the ability to use Apple's fast processors with a non-Apple OS might be appealing!


Hmm. I have considered that as well in the past.

Although I'd be a bit skeptical that Apple would allow this (as others talk about).

It seems extremely high risk, high cost, for an unnoticeable reward that will be obsolete in a year.


Linux has never worked well on Apple hardware. At one of my previous jobs I had the choice between a very high end "MacBook Pro" (1TB SSD, 16GiB RAM, etc) and a lower-end PC.

Since my job was developing a Linux distribution from scratch, I was going to be running Linux on whatever I got. I (naively) assumed that since I saw a lot of MacBook Pros at open source conferences and other places where open source software is written, and the MacBook Pro was x86-based, Linux would just work -- but it was a disaster.


I've actually had no problems - on an older MacBook Pro (2012?) the only issue was that the retina console fonts were really tiny. I used Ubuntu and Arch.


I think if you want a laptop to run Linux then you should just buy a laptop that comes with Linux. There are some pretty good options out there today (Dell XPS Dev Edition, ThinkPad, System76, etc.) that you can buy from the manufacturer with Linux.

Trying to take a laptop designed for macOS and turn it into a Linux laptop is going to be a painful and frustrating experience. It's like buying a Tesla and trying to put the Toyota navigation operating system on it. If you like Toyota's system then just buy a Toyota.


The only reason those Linux laptops exist is because thousands of engineers (mostly volunteers but also commercial vendors like Red Hat) improved Linux hardware support to the point that commercial system integrators like Dell and System76 could select components well-supported by Linux to make the Linux laptops.

Hypothetically, if one day the performance gap between Apple's chips and the rest of the industry cannot be ignored (contrary to Apple's marketing, it's not there yet), then having Linux well-supported will allow third-party vendors to provide ongoing Linux support (because the daunting heavy lifting will have been completed by marcan).


A very important chunk of those developers who improved Linux hardware support work at Intel, and it shows -- the safest choice for Linux-compatible hardware is Intel (GPU, Ethernet, WiFi, ...). It's very doubtful that Apple is going to do the same.


Keep in mind that something as fundamental as the GPU driver wouldn't be available even if they managed to get Linux up and running.

And the M1 GPU is Apple's full custom GPU. Even if someone tried to reverse engineer a driver, it would take decades.


He seems relatively optimistic about getting a GPU driver going: https://twitter.com/marcan42/status/1333126014910701568

I don't think it'd take decades. Nouveau was pretty ok after like 5 years, wasn't it?


And Nouveau has to work with dozens of Nvidia cards, not just one, and full of legacy nonsense. I'm fairly optimistic it won't take 5 years.

(Especially not to the point of having a composited desktop; long tail game compatibility is less of a critical thing since, well, there are next to no games for ARM Linux...)


If you are just planning to hack something to show some basic and buggy 2D rendering, 5 years might be realistic.


This talk was posted in another comment chain: https://media.ccc.de/v/33c3-7946-console_hacking_2016

This was 3 years after the hardware release, presented on substantially more locked-down (but x86 and PC-like GPU to be fair) hardware with a demonstration of Portal 2 running in real time. Presented by the person you are replying to. I'm also assuming there was no funding involved given the requirement to build an exploit just to boot anything on the PS4.

As an unabashed cynic myself: Your cynicism as someone uninvolved with the project is outweighed by the informed opinion of someone who has demonstrated ability in the field of porting linux to other platforms.


To be fair, as the author of that: that was mostly a job of reverse engineering what was weird/broken/different about the PS4's GPU, which is otherwise a standard Radeon. It was effectively adding support for a new Radeon variant chip of an existing GPU generation. There was definitely a deep dive into the platform and I reverse engineered the proprietary Radeon firmware CPU architecture in the process (which nobody had done before), though, so I think I get some credit for that. But e.g. the userspace side "just worked" after a few trivial library patches; AMD's entire userspace blob GL/Vulkan driver worked completely unmodified once the kernel side was fixed up.

This would be adding support for a completely different GPU, which is a whole different ballgame and order of magnitude of complexity.

That said, as many hours as went into the PS4 Linux project, it was a hobby thing and I'm pretty sure if you add up the hours spent on the GPU side it wouldn't hit one month's worth of full-time work. I'm also offering a whole different order of magnitude of time investment here.


If you get funding to do the Linux to Apple hardware port, have you considered streaming the whole process on e.g. Twitch?

Would be very interesting to watch everything from the sideline. And as a reason to do it, aside from being inspiring and educational to others, by live-streaming your work you will certainly be able to attract even more funding over time.


I don't think I'd do literally the whole process as that could get old and tiring quite quickly, but I am definitely considering doing streams when it makes sense and I'm in the mood, or maybe setting up a regular schedule, or something like that.


Somewhat responding to a twitter post you made, but the Apple GPU is absolutely not a legacy-free architecture: it still looks broadly (very broadly) similar to a PowerVR GPU, which I don't think anyone would claim is even remotely sane.

To give you some context for the scope of the task: with all the hardware documentation and close to 32 years (combined) experience with that specific architecture it took around two years to write a conformant Vulkan implementation. Not including the compiler, kernel driver, or firmware.

Source: I have worked both on PowerVR and Apple GPU drivers.


Thanks for the context. I know the GPU isn't going to become a fully compliant implementation on par with e.g. the Radeon drivers in months, with only one person working on it.

That said, are you talking about a full graphics stack, or just the GPU-specific bits? Modern Linux graphics heavily emphasize code reuse across architectures, both on the kernel side and userspace side, which is completely different from how vendors typically do things.

The goalposts are quite different between having a fully compliant implementation that can deal with software that uses the GPU intensively (and hits all the corner cases properly), which is what Apple has to build, and being a usable desktop experience for Linux - having the whole stack be open source makes debugging things a lot easier too. Getting to full Vulkan compliance is definitely not going to be a one-person job. But we've seen fairly fast progress on GPU drivers going from triangles on the screen to simple game ports in some cases, with only one or two developers involved. I obviously don't have any yardstick for how the Apple GPU compares to everything else, but a priori, my hope is to be able to push things past the initial cliff to where the GPU starts being useful, and then of course other contributors can help improve support.

In my experience, the initial hurdle in building a community project is often the hardest part, as a lot of people have no idea where to start supporting undocumented hardware (and I'm particularly good at doing black-box reverse engineering of this kind of stuff; I can claim a good 15 years+ of experience there myself). For example, once e.g. the shader ISA is documented and a PoC backend implemented, I expect people with more shader optimization experience than me to want to pitch in and make it faster.

Re firmware, I would expect to use Apple's (at least initially).


I know very well who marcan is, and I also have quite some experience in this area. I know how challenging these kinds of things are; I just didn't expand on my argument because it is a very ambitious project.


"Just 2D" isn't really a thing, given modern GPU usage is all based on fully programmable 3D pipelines anyway. What I mean is the endless frustrating job of debugging closed source game corner cases (and often outright game bugs) is less critical here, since those games won't exist for ARM anyway. If it gets to the point where X11 and Wayland, desktop environments, common widget toolkits, browsers, media players, etc work smoothly and stably and perform well, then a lot of people will be very happy already. The long tail will always be there to chip at, and that will take years (not like vendor drivers are bug-free on any OS either!), but it's not a blocking factor to a usable desktop Linux experience. Lower-hanging fruit first.


It has been longer than that and 3D support is still a bit shaky. And this is based on known hardware.


Maybe Apple can provide some blobs? Like the closed Nvidia drivers


I doubt Apple would though. Why help people run non-standard OSs on your hardware with no App Store?


Because it sells the hardware, where they make the most money. I don't think they will do something like that, but I think the reason is that they just don't care enough to dedicate internal resources.


Well, even for Boot Camp, a marketed feature, they only dropped a half-arsed set of drivers which worked very poorly and were pretty much never updated. We need to use hacked AMD drivers to get updates for the Radeons, and the Macs still overheat and burn a lot of energy due to the lack of even basic power saving features. Heck, even the GPU switching isn't implemented.

Why would there be expectation they'd do anything more for Linux?


> Why would there be expectation they'd do anything more for Linux?

Linux is more complementary, compared to Windows which has been an existential threat to the Mac since the 90s.

I agree with an earlier comment: I don't expect Apple will provide any assistance, but neither do I see it as entirely unlikely.


Windows in the Boot Camp role was also complementary, in that it enabled a company or individual worker to consider Apple hardware where Windows was a requirement or desired as a personal preference.

This also enabled those users to spend time in MacOS or switch to virtualizing Windows and may lead to one or more personal purchases for themselves or members of their household.


Apple is turning into a services company, though, look at their numbers.

And one of those core services is the App Store.


I'm pretty sure the vast majority of Mac apps still aren't distributed on the App Store. The Mac App Store almost certainly isn't a big money-maker.


They're turning the screws: the ARM Macs are the first Macs that will not run unsigned binaries.

It's already at the point where macOS will treat apps as if they're radioactive if the developer didn't pay the $100 Apple tax before distributing them.


They don’t run unsigned binaries, but they run self-signed binaries (to the same extent that Intel Macs run unsigned binaries) and the linker automatically does the signing. It’s an architectural simplification, not a substantive tightening of the screws.


Self-signed applications are treated as if they're radioactive by macOS, too[1].

[1] https://lapcatsoftware.com/articles/unsigned.html


Yes. My point is that blocking truly unsigned applications on Apple Silicon is not a substantive change from the status quo on Intel Macs.


Programs on Intel Macs can do some funny things to invalidate their code signature in ways that Apple Silicon won't support, which I guess you could call a change. But I agree that the transition was probably mostly made for simplicity.


Yet.


By the way, there's a crowdfunding project to improve Linux touchpad support titled "Linux touchpad like a Macbook: goal worth pursuing?".

I don't know how successful that project was, but it might provide some ideas around crowdfunding improved hardware support.


I used to run Ubuntu on an Intel MacBook back around 2009 or so. In general stuff worked OK, but little things like the touchpad were goofy, with only basic functionality. The webcam had all kinds of problems. Suspend/resume didn't really work. Battery life was terrible. Screen brightness buttons didn't work.

So I'm sure it's cool and people will like it, but no, I'm not paying $5/month for a crappy computer experience. Good luck though!


That sounds exactly like every Linux-on-desktop experience I've had with any hardware.


I'm not sure what you've tried. If you run Linux on a fairly generic computer (like an Intel NUC) then everything will work out of the box. The more you deviate from the average the more likely things will break.


I'd throw some money at an OpenBSD port, but that's even less likely to happen IMO.


One thing that's stumped me on the Apple M1 chip design is how they are getting consistent timing on their different kinds of cores, with the layout being so asymmetric. Like the 8 core GPU, the Icestorm cores, the neural engine cores.

Or is it that the physical distances are so short with the newer process that a small timing delay doesn't matter any more?

The Firestorm cores, at least, appear to be laid out symmetrically, which makes sense.

https://images.anandtech.com/doci/16226/M1.png


I would much rather fund the development of a decent UI/UX and associated application suite that runs on Linux.


You know, Apple runs a lot of servers too. Those servers run Linux. Why wouldn't Apple want to run Linux on their own chips? The power savings would be enormous.

Apple could port Linux without much trouble. It's just Linux drivers -- they are very capable of the work.

Something like a less beautiful Mac Mini M1 would probably suit their server needs decently.

I wish they would sell the chips and boards as XServe again though, and perhaps offer some cloud computing too. I don't see it happening though -- they are going all in on the consumer market.


Do we expect that this port will be complete before another chipmaker is able to achieve similar performance?


If mobile is any indication, expect the performance gap to widen as time goes on.


Isn't Qualcomm kind of the same distance behind with the most recent hardware? I don't get the impression that the gap is widening, with the new high perf ARM cores it seems to be shrinking, even.


The M1 represents Apple’s best effort running on the world’s most advanced manufacturing process.

It’s a pretty safe bet that up until now the Qualcomm team hasn’t been given the freedom or the resources to make their best effort on a general-purpose desktop-class chip. Now that Apple has shown what is possible, you might think that the gloves are coming off. Especially if Microsoft and/or Samsung* indicate that they would buy such a chip at the price it would have to be sold at.

* Microsoft and Samsung currently sell ARM-based laptops, with the Microsoft version being a slight modification of the Qualcomm chip in the Samsung.


I am absolutely willing to contribute to such an effort, even though I am happy to run Linux inside a VM on my Mac. The M1 is the breakthrough CPU that ARM needs to get truly established on the desktop and consequently gain more momentum on the server. As Linus said, developers need to develop on the architecture they want to deploy on.

Having Linux run natively on the M1 would bring more Linux devs to the platform; Linus himself would be one of them. It also means that if Apple deprecates older machines, there is a second life for them, even if one preferred macOS in the machine's first life. Linux can run well on older hardware for most tasks; just look at how much is done with the Pis.

Yes, this seems to be an uphill battle, but that is exactly why funding is needed. And while there are many obstacles, there is a chance that it can be done, and I am willing to take that chance with some of the funding.

Apple supports Boot Camp on their Intel hardware, so they don't seem to be fundamentally opposed to people running another OS on their machines. The best way to get any support from Apple for this, be it some documentation or some binary drivers, would be to show clear interest and an existing community.

P.S.: Any larger movement toward running Linux on the M1 would also help push other vendors toward offering ARM-based systems for running Linux, as it shows a fundamental interest in such a platform.


A question, as long as we're on the subject: software companies can decompile competitors' software and figure out how they did something. Can Qualcomm or another chipmaker "decompile" the M1 and figure out how Apple made it? Apple seems so far ahead of the other ARM chipmakers; can the other guys catch up by studying the M1 chip? Or is it too small/involved/etc. to learn much from?


If Qualcomm and the other ARM chip designers haven't been able to keep pace while the last 8 years of iPhone CPUs have been near the top of their game, I don't see that changing anytime soon unless they hire the right people to be in charge. And even then, chips like Apple's A series are hand laid out and very, very time consuming to create. They have the ability to use a scanning electron microscope to look at the design, but even if you can start to reproduce it (very time consuming), you're taking resources away from designing other processors.

I asked a friend who used to work in chip design at IBM, and they were blown away by the design of the M1. According to my friend, the design is extremely custom and most likely hand laid out for Apple's specific optimizations. They were also stumped by how Apple could be getting consistent timings between cores when the design isn't totally symmetric for a particular cluster.


No.

I am not going to spend time on anything for a company that taxes every developer, takes $100 yearly, and virtue-signals about its focus on privacy while phoning home every time a binary is executed.

I am sorry, but I don't like to build my home inside someone else's walls.


I would not fund this, because I would be suckered out of money by someone who not only fails to deliver, but who is surpassed by others who follow a conventional bazaar model.

A successful Linux on Apple Silicon would start as something almost completely useless that only hackers want to use (gets you a Bash prompt with GCC, that kind of thing), not something everyone wants to use. This is perfectly fine.

Imagine if, in 1990-ish, Torvalds had announced, "I'm taking donations for making a Unix clone for 80386 PC's that everyone will want to use in production".


I'd rather crowdfund a 100% open mobile chip. We've got 'open phones' already, but they're all still using Qualcomm blobs. I think that is a huge vuln in our communications system. It's as if one company controlled the entire internet... every piece of data being passed in and out of the same chip, right down to the way radio channels are chosen, ciphers used, transports checked, and control messages processed, all hidden, and using so much insecure crap. How many people use phones for sensitive info? This stuff is Swiss cheese, nightmare material.


Wait, did Apple actually add an option for unsigned boot on the M1? I remember their WWDC overview saying they were only going to have either iPhone-style latest-version-only or any-version options, and them later saying they'd only be supporting virtualization. If they have an owner override for secure boot on the M1, then that's absolutely fantastic, and I suppose it lends more support to Apple's assertion that Microsoft needs to license Windows on ARM for the M1 before we can have Boot Camp back.


Yes, you can use recovery mode to set the ‘permissive’ security policy, which “Does not enforce any requirements on the bootable operating system”.

https://support.apple.com/en-gb/guide/mac-help/mchl82829c17/...


From the very Twitter thread you're replying to: https://mobile.twitter.com/marcan42/status/13331260180689551...


Yes, you can turn off all of the secure boot features. Booting other OSs is most likely coming.


Well, from what I understand, it's less "turn off secure boot features" and more "create/add a signature for your custom code to secure boot". https://mobile.twitter.com/never_released/status/13263157410...

I can't find a primary source for the comment in the tweet, but it seems widely regarded as true at this point, so fingers crossed it pans out that way.


Hm, interesting. I can't seem to find my source any longer either. The wording "kernel" does suggest to me that this is about a check the bootloader performs. If you were to boot another OS, you'd probably have a different bootloader as well.

This tweet https://twitter.com/never_released/status/133243677102043545... from the same author suggests M1 is "not any more locked down than [...] Intel", so I'm not fully sure about the technical details, but in the end you should be able to boot your own code.


The lowest security level, permissive, allows you to mess with the Secure Boot policy at will.

When you use kmutil to add a custom boot entry, enrolling the hash of your executable to the Secure Boot policy is handled automatically as part of the tool.
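
A minimal sketch of that flow, in Python (untested on real hardware; the kmutil subcommand and flags here are assumptions based on descriptions of the tool, so check the kmutil man page before relying on them):

    import subprocess

    def add_custom_boot_entry(kernel: str, volume: str) -> None:
        # Register a custom kernel/boot object; under the 'permissive'
        # policy, its hash should be enrolled in the Secure Boot policy
        # automatically, as described above.
        subprocess.run(
            ["kmutil", "configure-boot",
             "-c", kernel,   # path to the custom boot object (hypothetical)
             "-v", volume],  # volume whose boot policy to update
            check=True,      # raise if kmutil reports an error
        )

    # Hypothetical usage:
    # add_custom_boot_entry("/Users/me/vmlinux.macho", "/Volumes/Linux")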


https://developer.apple.com/videos/play/wwdc2020/10686/?time...

Reduced security allows you to run any version of macOS, including the versions that are no longer signed by Apple.


You can go lower than that; this is the “medium” security setting.


Usually the SoC vendor provides the drivers to the Linux kernel tree as they have access to the hardware register specs of the various constituents of said SoC.

With these M1 chips, it's going to take a long time to get to that level of understanding. I think Apple would likely need to provide Linux driver support to get this to a complete state.

I'm sure the chip can be reverse engineered, though, as people got Linux running on the T2 chip when it was jailbroken.


No. This would solve only one of my issues with using a Mac, which is that I want to use a Linux distribution. It doesn't affect the other issue, which is that Apple hardware, while very nice, is heavily overpriced and extraordinarily expensive to upgrade.


Does OS X still run on top of Darwin? If so, seems like a GNU/Darwin environment might be a better bet.

https://en.wikipedia.org/wiki/Darwin_(operating_system)


Since macOS runs on top of Mach, maybe we could port GNU/Hurd to it.

/me exits the room.


Twitter seems like a poor choice for this (anything really?). Is there a gofundme or something?


The Twitter link is just a poll to survey interest. If you read further in the thread, he suggested the actual fundraising would be on Patreon.


He should do a Patreon. Other people have done this sort of thing with this model over there. Seems sustainable.


Yeah, I forgot about this. Not sure how well it would work for a project without a regular set of deliverables, but probably better than GoFundMe.


Github Sponsors is better. No fees/VAT.


Certainly not. I'd be willing to fund a Linux port to an open architecture, but Apple is not bringing anything positive to the table. Why in the world should we give them software for free to increase their proprietary hardware sales?


Can someone explain the purpose of writing such a long thread... on Twitter?


The audience is on Twitter.


Interesting, $6300/mo currently committed. That's not insignificant.


It's pretty easy to "pledge" by clicking on a Twitter poll. Maybe 5% of that will actually turn into real money.


True, true.


It doesn't have to be Apple. Apple is the company that is most interested in complete lock-in.

The Pinebook Pro, an ARM-based Linux laptop, is currently pretty poor, but it costs US$200. For US$400-600, perhaps someone could make a good ARM-based Linux laptop.

Amazon's Graviton 2 processors are meant to be pretty good. Perhaps they could do it. Or someone else.

Ars has a review of the Pinebook Pro: https://arstechnica.com/gadgets/2020/06/pinebook-pro-review-...


Pinebook is possible (as are most of the SBCs we have) because there is an almost endless supply of processors and SoCs designed for cellphones and tablets. If a company decides to make a part that's really focused on performance (and tuned to the OS) the same way the M1 is, and that matches the needs of a laptop, then it'd be possible. But, unless the part is incredibly well tuned for this scenario, it'll fail the same way as the Windows on ARM laptops have - same price as an x86 at meh performance.


I don't quite believe that the people who might want to run Linux on the M1 would consider Linux-on-"currently-pretty-poor" hardware to be as comparable as you make it out to be.


marcan's Patreon is up: https://www.patreon.com/marcan


No. It would be an ongoing effort; there are plenty of platforms for Linux already, and we have more flavors than needed as well.


I think if Linux really wants to run well on consumer hardware - in other words, not servers - then we’ll have to build the hardware.

No idea how that happens from a business perspective, but Raspberry Pi seems to be a great model.

Perhaps a Pi Pro could be designed that meets some of those needs. The Zero is almost too mobile for a laptop to be usable, but who knows, maybe someone will come up with a unique design around it.

But if we don’t control the hardware, the. It’s always going to be a second or third class citizen on mobile (laptops or tablets).


The Raspberry Pi is not a good example if you want control of the hardware. Closed GPU and microcode, no chance of finding actual hardware documentation (beyond coarse schematics), and no way to buy components. Broadcom is not aligned with your goals.


I’d wait for Nvidia to produce a similar CPU for Windows and expect it to be open enough, meanwhile using amd64 on desktop


I'm assuming this is for the kernel itself. After that, does the distro matter?


A distro specifically targeting the Apple form factor would be great. Even with Linux as it is now, there's still a huge number of compatibility issues and jankiness on a per-distro basis.

A distro targeting just Apple hardware would greatly simplify what needs to be done and overall lead to a better user experience.


Not really; all the distros use the same low-level components. There would be work on a per-distro basis to use the right #defines and patches though.


Asking "Per month" for non-specified period of time is not a well posed question. Do you suppose to pay per lenght of developement period? Or you expect users to pay for the resr of their life for priveledge of using such port?


Why ask the community and not the Linux Foundation?


I understand why the developer would like a recurring form of revenue, but as a consumer, buying a subscription makes no sense at all!

The goods obtained (Linux on MacBook Air X, Y) are very much non-recurring in nature...


This is either the now-standard software subscription package or a stipend to bootstrap a project. Both feel pretty normal!

Even if it is the former, updates & improvements take work. And they seem likely to be necessary here.


The deliverables of any software project are obviously incremental in nature (e.g., initial boot with software-rendered graphics, then 2D graphics, power-saving features, 3D graphics, etc.), so the funding model makes sense.


Come on Marcan, work on ARM's GPUs instead.


Hmm, donate to the homeless, or Apple?


No. I appreciate the effort and the skill this would take, but since the bulk of the work would be reverse-engineering the custom GPU to get a driver working, I'm not super-interested in a solution that, even in an ideal world (assuming this takes months and not years), still won't be as performant as a native driver. Performance is the whole reason to do this, so settling for substandard performance confuses me; maybe focus on more open ARM-based machines and chipsets instead.

There is also the ongoing work. Apple currently releases a new iOS AX chipset every year. It’s reasonable to expect something similar for the M series. That means additional work to reverse-engineer. I’m not trying to be negative, but the amount of money required for that kind of ongoing work really doesn’t seem ideal for crowdfunding. Someone downthread made a point that most of the good Linux hardware support in recent years has come from Intel. That’s a great point. For the work this would need to be really viable and not just a fun toy, this needs to be backed by a corporation or by Apple itself.

If this were explicitly stated to be a toy/proof-of-concept goal, I might chuck $50 at it as a one-time thing. Because why not. But for an ongoing thing with real support, I'm sorry, I don't think this is feasible to undertake. I would love to be wrong about this, but I'll add that expectations are a lot higher when you start out charging.

And here’s the real thing: As happy as I am that people want to explore this and as open as I will be to experimenting with this stuff when people inevitably reverse-engineer the M1 enough to get X11 running (because that will happen without any crowdfunding campaign — we’ve had people successfully get Android and other versions of Linux running/booting on older iOS devices), the real future of Linux on M1 and macOS will continue to be through virtualization. QEMU and other hypervisors (including the native HyperKit) are how most Mac users interact with Linux anyway — and QEMU is quite performant.

I get the appeal of doing something because you can. I get the appeal of wanting to use really nice hardware for other purposes. I don't necessarily understand the appeal of running a Linux distro on bare-metal in a hacky way when running it virtualized is likely going to offer better performance than a reverse-engineer job funded by the potentially small Venn-diagram intersection of people who both care enough about Linux on the desktop to fund this sort of project AND who don't have a problem buying hardware from a company that really gives zero fucks about FOSS ideals and is literally the antithesis of open hardware. (For the record, I'm not in that intersection, because I personally don't care about Linux on the desktop on modern Apple hardware (I do actively support people building PPC Linux distros for obsolete hardware).) Especially when macOS IS Unix and will be able to run most of the same userland packages as Linux once they are ported to ARM, albeit with a much prettier windowing system and much better commercial software support.

If you’re someone who really loves KDE or GNOME or Xfce or whatever DE, well, you do you bro — happy for you. But you’re likely going to be way happier on a machine that doesn’t cost $1500 and isn’t designed in ways that actively discourage you from replacing that DE or make booting to another kernel more difficult.


No.

Between VMs and other, cheaper hardware, this feels very silly, to put it lightly.


Can't a $2 trillion company fund this by itself?


Yes, but they make macOS, not Linux. The amount of hardware sales generated by Apple putting people on writing code to support Linux on their hardware would be less than the Apple sales team's collective bar tab.


> Yes, but they make macOS, not Linux.

And why is that? Instead of maintaining an entire PC OS themselves they could build macOS XI as a desktop environment on Linux and benefit from the vast ecosystem. But they don't, because...? Presumably because they value total control.


Because why? What would the motivation be? They are a company whose goal is to make money for its shareholders. They purchased NeXT, which was built on Mach plus BSD and first released in September 1989 (work started in 1985), two years before the first Linux release. Mach plus BSD was a stable, working platform for Apple when they acquired NeXT in 1997. There was zero reason for them to move working code to Linux 2.x (2.0 shipped in 1996). Heck, I had a NeXT setup and a Slackware setup at that time, and NeXT was better hands down. Today the kernel they use is tightly coupled to their hardware, and they have control over it for hardware features like the T2. Why would a $2T company put itself at the mercy of a kernel maintainer to take its patches? The vast Linux ecosystem is of no use to Apple in most cases.


Btw, I hate Twitter for this. Why isn't someone as technical as Marcan posting this on a blog?

The world went mad.


Eyeball count matters a lot.


Not sure. As long as Apple are hostile to running anything else on their hardware, reverse engineering it (for all the drivers, etc.) might simply be a waste of time and resources that could be better applied to something useful and actually friendly to openness.

Especially since such reverse engineering will require constant chasing after changes to stay relevant.


False. Respectfully, it seems you didn't read much of the thread.

According to the thread, Apple have actively left the door open so that other (currently non-existent) OSes could be booted, but as the author said, they've left it open without providing documentation.

From the thread:

“Linus himself has said he'd love to see Linux running on M1 macs, but doesn't think it'll happen. Not because they're locked down, but because Apple won't help us with documentation.

But they at least did this:

https://mobile.twitter.com/marcan42/status/13331260180689551...

The fact that Apple allows this in their secureboot policy is critical. Linux ports to game consoles / iPhones / etc are all fun and games, but you're always at the mercy of exploits, and the resulting cat and mouse game.

That means all effort is wasted unless exploits keep up, and regular users would never want to do this and severely limit their access to official upgrades and such. But this isn't an issue with M1 macs.”

The thread continues with a discussion of the limitations.


> but because Apple won't help us with documentation.

Providing no documentation is being hostile to other OSes running on their hardware.


I agree. Not locking the bootloader doesn't mean they're going to embrace running other operating systems.

As a long-time Mac user, the idea of them supporting other operating systems seems really un-Appley to me. What happens if I put Linux on my (still under warranty) MacBook and then have to take it in to an Apple Store for some reason?


Presumably the same thing that happens if you put Linux on an Intel MacBook, which has always been possible. I don’t know what that is exactly. But in my experience, Apple does tell everyone to back up their data before handing a computer in for repair; there’s no guarantee the computer will come back with disk contents intact. So assuming you’re backed up, you could just wipe the drive and temporarily install macOS if it becomes an issue…


It's certainly possible, but I don't think it's supported. I'm not just talking about hardware issues, but general support. As far as I know, if you have a Mac under warranty/AppleCare you can book an appointment with a 'genius' to help you with any sort of issue: hardware, software, or even just general instruction on how to do stuff.

I would guess that if you showed up with a Mac under warranty but with a custom OS installed, they would boot macOS in some fashion and run whatever their diagnostic tool is, and if the hardware checks out they'll tell you to buzz off.

I guess my overall point is that Apple seems pretty intent on curating and supporting a specific user experience with their products (which is a big part of their preoccupation with controlling everything top to bottom in the first place), and supporting third-party OSes seems at odds with that.


See the comment by Linus Torvalds about it. Leaving the door open is a poor excuse, because they can close it at any time, sending all your invested effort down the drain. I'd say: don't waste your time.


People have been saying that since the first implementation of System Integrity Protection/Gatekeeper 5 years ago. It's been long enough that I think your comment counts as FUD... yes, they can do it, but there hasn't been any indication that they would.


I wouldn't trust Apple with anything. Lock-in is their bread and butter. Just because they haven't locked something down for some time means nothing, given that they are very aggressive with lock-in in other areas. You don't need FUD to know their infamous reputation.


The problem is that this notion can't be disproved. While I'm completely confident that 10 years from now the Mac will still be open enough to be considered a "real" computer, I'm also confident that 10 years from now someone will be predicting its imminent demise, because the water just keeps boiling hotter and hotter.

So we'll keep having this argument (and the argument over whether apps will be installable in the future without Apple's permission), over and again, year after year.


IMO it's not an unreasonable notion to have. I don't think it's out of the question that they would change course and lock things down. I'm not saying it's likely or probable, but I don't think it's an unreasonable concern. In any event, it annoys me that the parent commenter is being dismissed out of hand and downvoted for expressing a legitimate concern.


It's a tired, un-nuanced concern that doesn't indicate any understanding of why Apple favors lock-in.

It's like an online political debate: it's possible to persuade someone, maybe, but you need a better argument than "mean old Apple wants to take our toys away."


I'd say it's not tired, and it's completely legit. Apple have no one to blame but themselves for having such a reputation. They simply didn't do enough to earn trust, while doing a ton to earn distrust.


> It's a tired, un-nuanced concern that doesn't indicate any understanding of why Apple favors lock-in.

Well, why do you think that is? As far as I can tell it's some combination of protecting users and protecting App Store revenue, both of which apply just as well to macOS as to iOS. The only difference is that the Mac App Store isn't as established, so they can't force everything to go through it today.


I think you're probably right in that this discussion isn't changing anyone's opinion.

That said, I don't know how it could be tired, as these laptops were just released. I also don't know what you mean by 'un-nuanced', but I don't see how understanding why they favor lock-in is relevant. He's not talking about the merits of a closed ecosystem or of locking things down, just about whether or not they might lock things down further in the future.

I think most people understand the motivation behind Apple trying to control every aspect of their products, whether or not they think it's a good practice.


> As long as Apple are hostile to running anything else on their hardware

Are they? Apple isn't only iOS. Macs have always had unlockable bootloaders.



