EXIF metadata stores random gibberish for dates & times

I hadn’t ’til yesterday realised that EXIF metadata doesn’t actually store dates & times correctly.  Whoever came up with the spec all those decades ago clearly didn’t know how to work with dates & times.  This is immensely frustrating, since we now have countless images whose timestamps are collectively gibberish.

The problem is that the standard doesn’t specify time zones in almost all cases (the sole exception being GPS timestamps, which are in UTC).  Which means if you see the date & time “2016-02-03T10:36:33.40” in your photo, the actual instant it refers to could fall anywhere within a window spanning roughly 26 hours (UTC offsets range from −12:00 to +14:00).
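Here’s a minimal sketch of the problem in Python, assuming a raw DateTimeOriginal string pulled out of the metadata (the sample value is hypothetical):

```python
from datetime import datetime

# EXIF stores DateTimeOriginal as a bare "YYYY:MM:DD HH:MM:SS" string --
# no UTC offset, no zone name.  Parsing it can only ever yield a naive datetime.
raw = "2016:02:03 10:36:33"
naive = datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")

print(naive.tzinfo)  # None -- the actual instant is unrecoverable from EXIF alone
```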

I realise now, in hindsight, that programs like Aperture & Lightroom manage this by implicitly associating a time zone with photos as they’re imported (and both have controls, of varying sophistication, for ‘correcting’ the time of the photos in cases where the camera’s clock is set wrong – including being set to the wrong time zone).  They leave it to the user to ensure the time zone set at import matches what the camera was using when the photos were recorded.
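In effect, the importer takes the camera’s naive timestamp and pins it to whatever zone the user claims the camera clock was set to.  A minimal sketch of that step, assuming Python 3.9+’s zoneinfo and a hypothetical user-supplied zone:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# The zone is an assumption supplied by the user at import time -- nothing
# in the file itself confirms it.
camera_zone = ZoneInfo("Australia/Sydney")

naive = datetime(2016, 2, 3, 10, 36, 33)
aware = naive.replace(tzinfo=camera_zone)

print(aware.isoformat())  # 2016-02-03T10:36:33+11:00
```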

However, if you’re processing images at scale and don’t have that explicit information from the user(s), you’re SOL.

Additionally, I don’t know anyone with a DSLR who hasn’t at least occasionally forgotten to change the date & time on their camera to account for changes in daylight savings time, or movement to a new time zone.  If the time zone were recorded, this wouldn’t really matter, since you could reliably correct it later.  But since it’s not, it’s impossible to tell programmatically when and where the time zone changed in a given series of photos.

Now, you might think that since the GPS timestamp is recorded as a real, definitive time, you could just use it to determine the time zone of the other dates & times in the metadata (by simply looking at the difference between them).  Unfortunately, the GPS timestamp is defined as the time at which the GPS data was recorded, not when the photo was created (or edited, or any of the other types of timestamps recorded in EXIF metadata).  Which means that in practice the GPS timestamp can be an unspecified & unpredictable amount of time older than the other timestamps¹.

If it were just a matter of a few minutes’ difference then this wouldn’t be an issue, since the vast majority of the world only uses time zone offsets in half-hour steps² and thus you could just round and get things right most of the time.  Unfortunately, at least some notable GPS implementations in popular cameras have potentially huge deltas (hours or more) – e.g. all of Nikon’s SnapBridge cameras, including the D500, D5600, & D3400.
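To make the failure mode concrete, here’s a sketch of that rounding approach in Python (the function name and sample values are hypothetical; I round to 15-minute steps to cover the Nepal-style offsets mentioned in the footnote):

```python
from datetime import datetime, timedelta

def guess_utc_offset(local_dt: datetime, gps_utc: datetime) -> timedelta:
    """Difference the EXIF local time against the GPS UTC time and round to
    the nearest 15 minutes (the smallest real-world zone step)."""
    delta = local_dt - gps_utc
    quarter_hours = round(delta.total_seconds() / 900)  # 900 s = 15 min
    return timedelta(seconds=quarter_hours * 900)

# A fresh GPS fix rounds cleanly to the true offset...
print(guess_utc_offset(datetime(2016, 2, 3, 10, 36, 33),
                       datetime(2016, 2, 3, 0, 36, 12)))   # 10:00:00

# ...but a fix that's hours stale yields a confidently wrong offset.
print(guess_utc_offset(datetime(2016, 2, 3, 10, 36, 33),
                       datetime(2016, 2, 2, 21, 36, 12)))  # 13:00:00
```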

  1. And that’s assuming the camera’s clock isn’t set wrong anyway – in at least some popular cameras, like Nikon’s, it’s possible to include GPS data in your photos without syncing the camera’s clock.
  2. Wikipedia reports that a couple of small regions of Australia & New Zealand use 15-minute offsets, as does the entirety of Nepal, but those are the only exceptions.  And only a small minority of zones use half-hour offsets, as opposed to whole-hour offsets, to begin with.

macOS 10.12.2 appears to have brought with it some GPU issues

I run Einstein@Home, using both CPU cores & my GPU.  Other than a period of a few months where Einstein@Home was issuing broken GPU work units, I’ve been doing this successfully for years, I think.  Longer than I can really remember, in any case.

It appears, however, that 10.12.2 has introduced some serious issues affecting those GPU tasks.  While there have always been occasional performance issues while running these GPU tasks – e.g. Amazon streaming video drops frames – I’ve not had any major complaints.

Now, however, I have this:

[Screen shot showing massive graphics corruption]

That’s what I get when I render a Nikon NEF file, pretty much anywhere in the system.

The exact symptoms of the issue seem to vary depending on where & what type of NEF file I render – e.g. rendering them in Preview mostly constrains the graphics corruption to Preview, and doesn’t readily lead to the whole system hanging.  Using the Finder for its previews, or Quick Look, however, very quickly leads to massive graphics corruption and, for Nikon D7100 NEFs, quickly hangs the system entirely.  Oddly, Nikon D500 NEFs don’t tend to cause immediate system hangs, but will prevent the system from restarting or shutting down – it ends up hung at a black screen, after seemingly closing the window server, with a very consistent pattern of corruption and a frozen mouse cursor.

I never saw this, or anything like it, prior to the 10.12.2 update.  Sigh.

FWIW, the particular work unit triggering this right now is:

[Screen shot of the Einstein@Home work unit properties dialog]

Nikon SnapBridge

Finally.

Nikon have released the SnapBridge app, so the much-touted Bluetooth+Wi-Fi capability of the D500 can actually be used.  A mere eight months after it was announced.  Fuck you too, Nikon.

However, as I’d clearly forgotten, it’s not very useful anyway.  It doesn’t work with raws, you see.  Doesn’t even acknowledge that they’re in the camera, on the card.  It took me twenty minutes of screwing around with the app, wondering why it was so completely broken and dysfunctional, before I stumbled upon a tech support article for it buried half a dozen layers deep inside Nikon’s website (yes, there’s essentially no documentation within the app itself).

It does appear to at least work for geotagging & time sync, which is something.  Something Nikon could have provided natively with a $1 GPS receiver, without having to kill my iPhone battery to accomplish such rudimentary tasks.

The almost saving grace of the D500 is its speed – specifically the UHS-II support, which helps it clear out its buffer extra snappy, given a decent SD card.  That means I can turn on NEF+JPEG without much concern about slowing down burst shooting, and only marginal concern about the wasted SD card space.

But it’s only almost saved by that speed.

The problem, you see, is that even if you abuse the NEF+JPEG option to yield little JPEG turds on your SD card – and even though those JPEGs can be surprisingly decent quality, even on ‘Small’ and ‘Basic’ settings – in NEF+JPEG mode the camera insists on using the JPEG version for all in-camera playback.  It becomes completely impossible to view the actual NEF.

Now, granted, when ‘viewing’ NEFs in-camera you’re only getting the JPEG preview that’s built into them anyway, but still – it’s at least a decent-quality, full-size preview.  You can at least zoom all the way in.  Not so if your JPEG turds are not full-size.

Which might be a good enough option, if you’re willing to waste up to 50% of your space saving full-size JPEGs alongside the NEFs.

But SnapBridge transfers the images via Bluetooth only.  Even when you’ve configured it to bring over the originals, at up to 10 MB each.  It can take minutes to transfer a single image of that size at Bluetooth speeds – I know, I accidentally proved it empirically.
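The back-of-envelope arithmetic bears that out.  The link’s real sustained rate isn’t documented, so the figure below is an assumption – something in Bluetooth LE territory:

```python
# Assumed sustained throughput -- SnapBridge's actual Bluetooth link rate
# isn't published; ~60 KB/s is a plausible LE-class figure.
ASSUMED_BYTES_PER_SEC = 60 * 1024
ORIGINAL_BYTES = 10 * 1024 * 1024  # one ~10 MB original, as configured above

minutes = ORIGINAL_BYTES / ASSUMED_BYTES_PER_SEC / 60
print(f"~{minutes:.1f} minutes per photo")  # ~2.8 minutes per photo
```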

Now, you can limit the transfer to 2 MP versions of those JPEGs, but 2 MP is tiny, even by Shitagram standards.  The ‘Small’ JPEGs the D500 saves natively are 5.2 MP, for point of reference.

So the 2 MP transfer – call it “Thumbnails only” – is not a practical or useful option.

So we’re back to having to use full-size JPEGs, alongside the real photos (the NEFs).

And remember the prior point about abysmal Bluetooth transfer speeds?  To make SnapBridge’s auto image transfer plausible to use with any frequency – let alone leave on permanently – you need tiny file sizes.  Even on the highest compression setting (vanilla ‘Basic’) the 21 MP JPEGs are several megabytes.  Only by using the ‘Small’ image size – which is frankly still good enough for Instagram types – can you get the sizes into the sub-MB range, and transfer times down to ‘merely’ a few seconds per photo.

So you’re stuck between a rock and a hard place.  The net result is that the whole image download thing’s kinda horrible and useless to me.  Which makes me sad, because it could easily have been implemented much better.

The CamRanger remains a significantly better experience in almost every respect – the main drawback being the additional monetary cost it imposes.