iOS Family Sharing users cannot mix authentication schemes

Apple supports two styles of two-factor authentication, which it calls (and distinguishes as) “two-step” vs “two-factor”.  “Two-step” is the older method, though functionally the two are basically equivalent.

If you have multiple accounts on a Family Sharing arrangement, and some use “two-factor” while others use “two-step”, you’re in for a bag of hurt.

For example, any time you change the password on any of the non-master accounts, you’ll have to reauthorise all of that account’s devices with the master purchaser.  Whenever you try to download apps, make purchases, etc., you’ll be prompted with a dialog saying “Your Family Organizer, [foo], must enter the security code for their payment method”, with a field for some kind of input.  There is literally nothing you can enter there that will make it work.  Not the password for any of the relevant Apple IDs, not any security code for any of the credit cards, nada.

The problem is that it’s asking for a verification code that you can only create on a device which has “two-factor” authentication enabled.  Compare for example what you see with “two-factor” authentication enabled on your iDevice:

Screenshot of two-factor authentication enabled in iOS account settings

Versus what you see with “two-step”:

Screenshot of two-step authentication enabled in iOS account settings

That “Get Verification Code” “button” is what you’re looking for.  As you can see, it simply doesn’t exist with “two-step” authentication enabled.

The only solution – to allow your family members to download apps, purchase music / videos / books / etc, or pretty much do anything else on their iDevices – is to force the master account over to “two-factor” authentication.

To do this, you have to go to https://appleid.apple.com/ and turn off “two-step” authentication (which will require you to complete some stupid ‘security’ questions).  You cannot turn off “two-step” authentication from any of your actual iDevices’ Settings apps.

Then, stupidly, you can’t actually enable “two-factor” authentication from that same website.  That can only be done in the Settings app on one of your iDevices – by (in iOS 10.3 or later) going into Settings ➜ <your name at the top of the list> ➜ Password & Security.

There’s no way to enable “two-step” authentication anymore.  And not having any form of two-factor authentication enabled is a very bad idea.  So if any of your family’s accounts have “two-factor” authentication enabled, you basically have to switch to “two-factor” on all of them.

Which would be broadly fine, if Apple hadn’t made it so needlessly complicated, and the two systems so incompatible that their own software can’t figure out what’s going on.

EXIF metadata stores random gibberish for dates & times

I hadn’t ’til yesterday realised that EXIF metadata doesn’t actually store dates & times correctly.  Whoever came up with the spec all those decades ago clearly didn’t know how to handle dates & times.  This is immensely frustrating, since we now have countless images whose timestamps are collectively gibberish.

The problem is that the standard doesn’t specify time zones in almost all cases (the sole exception being GPS timestamps, which are in UTC).  Which means if you see the date & time “2016-02-03T10:36:33.40” in your photo, the actual moment it represents could lie anywhere in a window of roughly 26 hours, since real-world UTC offsets range from −12:00 to +14:00.
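
For a concrete sense of the ambiguity, here’s a minimal Python sketch (the date format follows EXIF’s DateTimeOriginal convention; the window arithmetic is just the span of real-world UTC offsets):

from datetime import datetime, timedelta

# EXIF stores dates as naive local time, e.g. "2016:02:03 10:36:33",
# with no time zone attached.
naive = datetime.strptime("2016:02:03 10:36:33", "%Y:%m:%d %H:%M:%S")

# Real-world UTC offsets run from UTC-12:00 to UTC+14:00, so the actual
# instant the photo was taken lies somewhere in a ~26 hour window.
earliest_utc = naive - timedelta(hours=14)  # if the camera was at UTC+14
latest_utc = naive + timedelta(hours=12)    # if the camera was at UTC-12
print(earliest_utc, "to", latest_utc)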

I realise now, in hindsight, that programs like Aperture & Lightroom manage this by implicitly associating a time zone with photos as they’re imported (and both offer controls, of varying sophistication, for ‘correcting’ photo times in cases where the camera’s clock was set wrong – including being set to the wrong time zone).  They leave it to the user to ensure the time zone set at import matches whatever the camera was actually set to when the photos were taken.
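
That import-time approach boils down to something like the following sketch (a hypothetical function, not either app’s actual code; a real importer would use a proper tz database, as Python’s zoneinfo does here, so DST is handled for you):

from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

def import_photo_time(exif_datetime: str, camera_tz: str) -> datetime:
    # The user asserts which zone the camera's clock was set to; the
    # EXIF data itself gives us no way to know.
    naive = datetime.strptime(exif_datetime, "%Y:%m:%d %H:%M:%S")
    return naive.replace(tzinfo=ZoneInfo(camera_tz))

# e.g. the user tells us the camera was on US Pacific time:
aware = import_photo_time("2016:02:03 10:36:33", "America/Los_Angeles")
print(aware.isoformat())  # 2016-02-03T10:36:33-08:00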

However, if you’re processing images at scale and don’t have that explicit information from the user(s), you’re SOL.

Additionally, I don’t know anyone with a DSLR who hasn’t at least occasionally forgotten to change the date & time on their camera to account for daylight savings changes, or a move to a new time zone.  If the time zone were recorded, this wouldn’t really matter, since you could reliably correct it later.  But since it’s not, it’s impossible to tell programmatically when and where the time zone changes within a given series of photos.

Now, you might think that since the GPS timestamp is recorded as a real, definitive time, you could just use it to determine the time zone of the other dates & times in the metadata (by simply looking at the difference between them).  Unfortunately, the GPS timestamp is defined as the time at which the GPS data was recorded, not when the photo was created (or edited, or any of the other kinds of timestamps recorded in EXIF metadata).  Which means that in practice the GPS timestamp can be an unspecified & unpredictable amount of time older than the other timestamps[1. And that’s assuming the camera’s clock isn’t set wrong anyway – it’s possible to include GPS data in your photos but not sync the camera’s clock, in at least some popular cameras like Nikon’s.].

If it were just a matter of a few minutes’ difference, this wouldn’t be an issue: almost every time zone is offset in steps of at least half an hour[2. Wikipedia reports that a couple of small regions of Australia & New Zealand use 15 minute offsets, as does the entirety of Nepal, but those are the only exceptions.  And only a small minority of zones use half hour offsets, as opposed to whole hours, to begin with.], so you could just round and get things right most of the time.  Unfortunately, at least some notable GPS implementations in popular cameras have potentially huge deltas (hours or more) – e.g. all of Nikon’s SnapBridge cameras, including the D500, D5600, & D3400.
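
The naive GPS-difference approach, for what it’s worth, looks something like this in Python (a sketch only – it assumes you’ve already extracted DateTimeOriginal and the GPS date/time with whatever EXIF library you use, and it inherits exactly the flaw just described):

from datetime import datetime, timedelta

def guess_utc_offset(date_time_original: datetime, gps_utc: datetime) -> timedelta:
    # Difference between the camera's local wall clock and the GPS fix's
    # UTC time, rounded to the nearest 15 minutes (the smallest real
    # time zone step).  Only valid if the GPS fix happened at (nearly)
    # the same moment as the shutter release -- which, on e.g. Nikon's
    # SnapBridge bodies, it often didn't.
    delta = date_time_original - gps_utc
    quarter_hours = round(delta / timedelta(minutes=15))
    return timedelta(minutes=15 * quarter_hours)

local = datetime(2016, 2, 3, 10, 36, 33)  # EXIF DateTimeOriginal (naive)
gps = datetime(2016, 2, 3, 18, 37, 2)     # EXIF GPS date & time (UTC)
print(guess_utc_offset(local, gps))       # -1 day, 16:00:00, i.e. UTC-8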

Building John The Ripper Jumbo for macOS Sierra

It’s quickly apparent that John The Ripper Jumbo doesn’t build out of the box on macOS, and probably hasn’t for a long time, due to its complaint about missing OpenSSL headers.

This guide was almost helpful, except it’s out of date – e.g. the Makefile.in patch it provides no longer applies cleanly – and its approach simply doesn’t work: once you get the John The Ripper configure script to see the OpenSSL development headers, it then just complains that it can’t find a valid libssl anyway.

Even the Pro version of John The Ripper, which isn’t cheap, doesn’t look like a good option since its web page has the hallmarks of something that hasn’t been updated in many, many years.  e.g. talk about support for Mac OS X 10.7 Lion being planned.

And although it appears to support using CommonCrypto instead of the now-deprecated OpenSSL, that doesn’t work – even when configured that way, it still compiles in code that requires OpenSSL for SHA-1.  Sigh.

Trying to get it to use a fresh build of OpenSSL (1.1.0) also seems intractable – OpenSSL 1.1.0 removed the SSL_library_init function (it’s now just a macro over the new OPENSSL_init_ssl), so out of the box it produces libraries that don’t contain the SSL_library_init symbol John The Ripper expects, and any attempt to use them is foiled by missing symbol errors in the link phase.
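
You can confirm the missing symbol for yourself before ever reaching the link phase – a quick Python sketch around nm (the library path here is hypothetical; use wherever your OpenSSL build actually installed):

import subprocess

# Dump the exported symbols of the freshly built libssl and look for
# SSL_library_init.  Against an OpenSSL 1.1.0 build this prints an
# empty list: 1.1.0 turned SSL_library_init into a macro over
# OPENSSL_init_ssl, so there's no such symbol left to link against.
out = subprocess.run(
    ["nm", "-g", "/usr/local/openssl-1.1.0/lib/libssl.a"],  # hypothetical path
    capture_output=True, text=True,
).stdout
print([line for line in out.splitlines() if "SSL_library_init" in line])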

And OpenSSL 0.9.8zh’s build system is just screwy.  By default it builds only 32-bit and only static libraries (no matter how hard you tell it to build shared ones).  You have to bypass its first layer of configery and do it ‘manually’, like so:

./Configure --prefix=<install location> darwin64-x86_64-cc -no-shared enable-camellia

You can then configure JohnTheRipper to point to that version of OpenSSL, like so:

./configure CPPFLAGS='-I <OpenSSL install location>/include' LDFLAGS='-L <OpenSSL install location>/lib' OPENSSL_LIBS="-lcrypto"

Now it’ll finally get past the OpenSSL issues, and build successfully.

Note:  don’t use the Makefile.in patch provided in the aforelinked guide.  That actually breaks the build now, even if you properly apply it manually.