iCloud ‘Optimize Mac Storage’ breaks the Mojave installer

Yet another example of a really bizarre macOS bug that’s pretty inexcusable as a test escape, given it occurs with the default installation settings on a completely clean OS install.

In short, the Mojave update installer does not work (on High Sierra at least) if you have ‘Optimize Mac Storage’ enabled for iCloud Drive (System Preferences > iCloud pane > iCloud Drive Options… button > Documents tab > Optimize Mac Storage checkbox).

Specifically, the installer reports:

Installation requires downloading important content. That content can’t be downloaded at this time. Try again later.

…and indeed fails to download the actual Mojave update files (the installer app as ‘installed’ via the App Store is merely a 22 MB bootstrapping app that downloads the actual image only after you run it & start the installation).
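If you’re not sure which flavour you’ve ended up with, the size gives it away – a quick check from Terminal (the path assumes the App Store’s default install location):

    du -sh "/Applications/Install macOS Mojave.app"

The bootstrapping stub weighs in at ~22 MB; a complete installer is more like 6 GB.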

Even more obnoxiously, if you use the dosdude1 Mojave Patcher Tool to force-download the entire installer, as soon as it completes the 6.5 GB download and produces the ‘Install macOS Mojave’ app in /Applications, the system deletes the downloaded installation files out from under that app, rendering it just as broken as the official App Store version. Infuriating.

Aside: to be clear, turning off ‘Optimize Mac Storage’ enabled me to produce – and keep – a working installer as downloaded by dosdude1’s tool. I did not verify that it also fixes the regular installer as downloaded via the App Store.

I also ran into the “The recovery server could not be contacted” error message even before all the above, but thankfully that was fixable via the means normally prescribed online – forcibly syncing the system clock against Apple’s time servers:
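    sudo ntpdate -u time.apple.com

(On more recent macOS releases ntpdate has been removed entirely; sudo sntp -sS time.apple.com appears to be the equivalent incantation, though the ntpdate form is what worked for me on High Sierra.)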

The Myth of The Web

The recent kerfuffle with Microsoft Edge vs YouTube was particularly interesting since, while I have no specific knowledge of that instance, I certainly do have some cultural insight from nearly eight years working inside the so-called Chocolate Factory – though not on anything web-related, to be clear, so my experience is of the internal culture in the broadest sense.

Like everyone else last week, I was trying to determine how much intent or malice was behind Google’s actions, but with a marginally more informed perspective – or at least a relatively unusual one.

Permit me to first provide some larger context, though.

When I worked at Apple, back in ~2006-2010, I insisted on using Camino, because it was superior to Safari at the time (which, among other flaws, was particularly crashtastic in its early years – an attribute which thankfully is long gone, but has burned itself permanently into my emotional memory).

That choice to not use Safari caused periodic issues because some Apple-operated websites wouldn’t work properly with anything but Safari. When I reported those issues internally, the typical response was “we only support Safari”. From the perspective of Apple, once they had their own web browser, that was all that mattered. Thankfully it wasn’t a huge issue since the web wasn’t that important to day-to-day work at Apple, as they used native applications for most things (sidenote: I still miss Radar… I didn’t miss Xcode for a long time, until I was forced more recently to use IntelliJ). And certainly the world outside Apple didn’t care about this cute little ‘Safari’ thing, at the time.

My experience at Google was essentially the same.

The vast majority of Google’s internal websites do not work properly in any browser except Chrome. This is a very real problem since it’s practically impossible to perform any job function at Google without using their internal websites heavily, since Google is so dogmatically opposed to native applications. Google has worked extremely hard to [try to] make it possible to do almost anything through Chrome (often to the point of absurdity).

Ironically even Microsoft – whom I currently work for, via LinkedIn – are on the Chrome bandwagon, as some of their websites – that I am required to use for work – require Chrome.

Most interestingly – and distinct from Apple’s behaviour, where dysfunction in browsers outside their own was predominantly down to actual functional differences between them – this ‘requirement’ to use Chrome is often not because of any actual, functional dependency on Chrome. Rather, it’s because Google’s (and Microsoft’s) web developers will specifically require a Chrome user-agent, and explicitly block any other browser. While this is easily worked around by spoofing the user-agent field – which is how I know that the purported Chrome requirement is usually fiction – it emphasises the mentality at Google:

There is no web, there is only Chrome.
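(Verifying that sort of gate is trivial, for what it’s worth – a rough sketch using curl, with an illustrative Chrome user-agent string and a hypothetical URL:

    # Fetch a page while claiming to be Chrome – UA string & URL are illustrative
    curl -A "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.110 Safari/537.36" https://example.com/some-gated-page

If a page renders fine with the spoofed user-agent but is blocked without it, the ‘Chrome requirement’ is policy, not capability.)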

This is, I believe, the crux of the matter in not just this Edge vs YouTube issue, but with web development broadly, past and present. The vast majority of web developers don’t create content for The Web; they create content for a browser. One browser, usually.

I saw it unashamedly unfiltered inside Google, but it inevitably leaks out in time – through things like carelessly & needlessly crippling other browsers’ performance on YouTube.

While today that browser happens to be Chrome, before Chrome existed there was still always that browser – e.g. the 90s and much of the 00s were defined by Microsoft Internet Explorer’s dominance, and by the refusal of the majority of so-called web developers to create content for The Web rather than just Internet Explorer. (Of course, back then The Web really was almost synonymous with Internet Explorer, with ≥90% marketshare for many years, so at least that fixation was more pragmatic back then than Chrome obsession is now.)

So, I’m actually sceptical that the YouTube team explicitly sabotaged Edge – rather, I think it’s just one of endless cases of web developers not really caring about The Web – ignorance & indifference, in other words, rather than [outright] malice. But just as caustic & dangerous.

What particularly concerns me today is that it’s not quite the same as the terrible 90s and 00s. Then, when Internet Explorer dominated, the vast majority of important websites were not operated by Microsoft.

Today, Google’s web properties are dominant in mindshare if not marketshare, to the point of essentially being monopolies (certainly in the case of YouTube, at least) – far beyond anything Microsoft’s web properties ever achieved.

More to the point, 90s web developers chose to develop for Internet Explorer exclusively – they were not coerced into it for the most part, nor firmly bound to that choice, because their corporate masters did not have a horse in the browser race and were pragmatically & unemotionally going for audience reach. A meritocracy was possible, and existed to a degree, and was essential to the rise of Firefox, my nostalgic Camino, and yes, even Chrome.

Google very much does have a horse in that race, and I know – from many years of experience inside Google seeing their unfiltered opinions – that they absolutely do want Chrome to become the only horse in that race. Not because of some comically-evil secret council scheming at the heart of Mountain View, but because they culturally & corporately just don’t care about anyone else. Modern Google is just as paranoid, fearful, power-hungry, and ruthless as 90s-era Microsoft ever appeared to be – Google want control, and the browser today is as fundamental to control as operating systems ever were.

Given all that, my fear is that there’s no longer a practical way for another browser to compete on merit with Chrome – any more than a third-party app store can compete on iOS, for example.

Chrome is open source in the literal sense, but not in the more important governance & existential senses. The only way to give The Web a chance is to remove any corporate browser bias from the minds of the top websites’ developers – Google’s web developers. (Or, technically, to just supplant Google’s numerous dominant web properties. Good luck with that.)

This assuredly won’t happen anytime soon by way of government intervention, given the current U.S. political circumstance, but it is conceivable that Google themselves would perform this surgical separation voluntarily, for the good of The Web.

Sadly, I fear that’s unlikely in an era post-“Don’t be evil”.

Full Disk Access is required to access Time Machine backups in Mojave

I’ve been struggling since Mojave came out to deal with its overbearing expansion of SIP (“System Integrity Protection”), which is basically a super-root notion that blocks access – even to root – to lots of basic parts of the system, including obvious & mostly sensible ones like /System and /Library, but also, less usefully, things like any & all Time Machine backups.

Blocking access to Time Machine makes it very difficult to actually use Time Machine, since it’s then difficult to retrieve files from a backup (you’re forced to use the stupid ‘warp’ Time Machine interface, which is slow, ugly, and buggy).

Luckily, it turns out there is a fairly simple solution that isn’t disabling SIP entirely (which requires multiple reboots, so is typically quite disruptive & slow). It appears that any application granted Full Disk Access (System Preferences → Security & Privacy → Privacy → Full Disk Access) can read Time Machine backups.
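With that granted (to Terminal, say – though see the caveat below), backups become directly browsable again. For example, assuming a mounted Time Machine destination:

    # Print the path of the most recent backup, then list its contents
    tmutil latestbackup
    ls "$(tmutil latestbackup)"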

In case you’re unfamiliar, the symptoms of this problem include:

  • Being unable to navigate into Time Machine backups in Open / Save / etc. dialogs.
  • Being unable to see – through ls or similar tools – the contents of Time Machine backups via Terminal (as in the transcript below).
  • Apps reporting errors like “The file “Foo” couldn’t be opened because you don’t have permission to view it”, or bluntly “Operation not permitted”, when trying to read something in a Time Machine backup.
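In Terminal, the failure looks something like this (the volume name here is illustrative):

    $ ls /Volumes/Backups/Backups.backupdb
    ls: Backups.backupdb: Operation not permitted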

There’s a strange & ironically very bad security quirk though – curiously, any tools run via Terminal inherit Terminal’s Full Disk Access setting (or lack thereof). They don’t use whatever setting might be specified for them individually in the Security & Privacy preferences. This is pretty baffling, as it means that to give Full Disk Access to anything you run via Terminal, you have to give it to everything you run via Terminal. Conversely, anything you specifically grant Full Disk Access won’t actually receive it if it happens to be launched via Terminal (which confused me for a while, since it’s so unintuitive).

I’m guessing whatever mechanism enforces all this so-called security is based in LaunchServices or somesuch – while the Finder and most things in general will launch apps via LaunchServices, as detached & independent process sessions, Terminal doesn’t – everything it runs, from the shells down, runs under it in the process hierarchy, and seemingly shares its security & privacy settings.
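A quick way to see that hierarchy from a shell – a rough sketch that just walks parent PIDs up to launchd:

    # Walk from the current shell up the process tree; you should see the
    # shell, login, and Terminal itself as ancestors.
    pid=$$
    while [ "$pid" -gt 1 ]; do
      ps -o pid=,comm= -p "$pid"
      pid=$(ps -o ppid= -p "$pid" | tr -d ' ')
    done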