WWDC 2019 Keynote Observations

AltConf Screening of WWDC Keynote

Steve and I attended an event called AltConf, a WWDC-adjacent conference. We got to watch the keynote from Apple’s Worldwide Developers Conference live, alongside all the developers who wanted to be in the area but couldn’t get into the conference itself.

I’m sure by now you’ve either watched the WWDC keynote yourself or read the lowdown on every single thing Apple announced. I don’t want to talk about every little thing, but rather highlight some of the information I found interesting.

I’ve categorized my observations into three categories: Things that got huge applause, things I found really cool, and things that made me go, “hmmm.”

Things that got big applause

Let’s start with things that got a big reaction from the audience.

Some PlayStation and Xbox controllers will work with Apple’s gaming service, Apple Arcade, when it arrives on Apple TV.

They announced QuickPath, a swipe keyboard gesture for faster typing on iOS. They didn’t really demo it, but people got pretty excited.

New Maps features with the binoculars got a big reaction. Specifically, people were wowed when the presenter demonstrated using a street-level view with the binoculars and then suddenly gliding along fluidly as though you were riding in a car. I liked how in that view you could see the name of a store and tap on it to learn more about it.

Sign In with Apple for iOS 13 – They explained that apps will be able to offer a sign-in option via Apple. When you use Sign In with Apple, you’ll be able to choose whether or not to share your real email address with the developer. If you choose to hide your email address, the developer will instead get a unique address @privaterelay.appleid.com. This lets you selectively shut that one address down if the developer bothers you.
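The per-developer relay address idea is easier to see in code. This is just a toy model of how any relay service could map unique aliases to a real inbox and shut them off one at a time; the class and method names are my own invention, not Apple’s actual implementation.

```python
import secrets

class PrivateRelay:
    """Toy model of a private email relay: each app gets its own
    alias that forwards to the real inbox until it's disabled."""

    def __init__(self, real_email):
        self.real_email = real_email
        self.aliases = {}  # alias address -> {"app": ..., "enabled": ...}

    def address_for(self, app):
        # Hand each developer a unique, random forwarding address
        alias = f"{secrets.token_hex(8)}@privaterelay.appleid.com"
        self.aliases[alias] = {"app": app, "enabled": True}
        return alias

    def disable(self, alias):
        # "Shut down" one developer's address without touching the rest
        self.aliases[alias]["enabled"] = False

    def deliver(self, alias, message):
        entry = self.aliases.get(alias)
        if entry and entry["enabled"]:
            return (self.real_email, message)  # forwarded to the real inbox
        return None  # silently dropped
```

The key property is that disabling one alias affects only that one developer; every other alias keeps forwarding.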

In the explanation of enhancements to Photos they talked about a bunch of new adjustments they’ve added on iOS and how the UI has been redesigned but it didn’t get much reaction. Then they said that you’d be able to apply these adjustments to videos, which is really cool. But people were most excited when they said you can rotate a video inside Photos. Finally!

The Files app got some real love. I was excited about a column view (small things make me happy), but the big money maker was when they said you can use SMB file sharing to reach a local server, and that you can plug in USB drives and SD cards and see them in the Files app. I can remember people asking for this as far back as the original iPhone.

For many of us, the lack of USB and SD card support on iOS has just been an annoyance: we want to bring images onto an iPad or iPhone without importing them into the Apple Photos app. But there’s a group of people who had a more critical problem. I’m not sure how widespread this problem has been, but it has affected the blind community. A company called Orbit Research tweeted about their Braille display, the Reader 20. In the past, users had to use a Windows PC to upgrade the Reader 20’s firmware, but now they’ll be able to do it directly on their iPad. I don’t know why it wasn’t possible with a Mac, but this is still great news.

The single thing that got the most applause from the developers at AltConf was the Voice Control video and demonstration. This is a new capability built into macOS Catalina that will allow those with mobility impairments to move around the screen, make selections, and type, all via voice. I think the audience reacted so strongly because adding accessibility to development projects is a lot of work and in some cases is very difficult. Having a framework that makes this easier can only be a good thing for everyone.

In case you hadn’t heard, Nuance, the makers of Dragon Dictate, announced a few months ago that they were no longer going to support the Mac, which was devastating to many. It seems easy to infer that Apple’s advancement in this space made the business model untenable for Nuance. I also heard through the rumor mill that Apple has been recruiting Nuance engineers like crazy lately, so the whole thing makes sense. I’m thrilled for those who need these features to be productive. Even lacking physical disabilities, I would truly love it if dictation on the Mac were significantly better. I find it quite ironic that dictation into my Apple Watch is the most reliable, dictation into my iPhone is slightly worse, and macOS brings up the rear in terms of accuracy.

Things that I thought were cool

Decibel-level warnings to tell you if your environment is too loud. I have wanted to be able to easily measure (and characterize) sound levels for a while. At our gym, we had the spin cycle room right next to the changing room, and they would often leave the door open. I wanted to be able to hold up a device and point to a screen saying, “It’s too loud!” Sounds a little like “get off my lawn,” doesn’t it? In any case, when they described the loud environment warning, they made a point that the device doesn’t record or save the audio it’s measuring around you.
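That privacy point makes sense when you look at the math: a decibel reading only needs the energy of the signal, not the audio itself. Here’s a minimal sketch of the idea; the calibration numbers are hypothetical placeholders (real devices calibrate each microphone), and this is not Apple’s implementation.

```python
import math

def spl_from_samples(samples, calibration_db=94.0, reference_rms=1.0):
    """Estimate a sound level in dB from normalized audio samples.

    calibration_db / reference_rms pair a known reference level with the
    mic's output at that level -- hypothetical values for illustration.
    Only the RMS energy is kept; the samples themselves can be discarded.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Decibels are 20 * log10 of the ratio to the calibrated reference RMS
    return calibration_db + 20 * math.log10(rms / reference_rms)
```

A signal at one-tenth the reference amplitude reads 20 dB lower, which is why the scale compresses such a huge range of loudness into friendly two-digit numbers.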

Cycle tracking for women seems to have been a big missing feature, so it’s pretty cool they finally brought it. It might have Sherlocked a few apps.

Reminders being rewritten from the ground up excites me. I like Reminders, but subtasks are super useful, so I’d always reach for other apps that had them. I hope there’s more goodness in there.

They’re adding the ability to share an ETA in Maps. I’ve always used a third-party app for that, like Glympse. Sadly, it’s been getting dodgier all the time and often simply won’t send the notification to the recipient. It would be pretty swell to have this built into Apple Maps.

Some people thought it was dumb, but I thought the enhancements to Memoji were pretty fun. They explained it via a video of two Memoji talking to each other as they changed their hair to blue, added modern glasses, changed lipstick and eye shadow, and even put in tongue rings. More personality means more fun. They also came out with Memoji stickers, which will be awesome.

Surprisingly, they extended Memoji down to any device running an A9 or later processor. They weren’t really clear on what that meant, but it seems logical to assume they mean the stickers, not the animated characters, because those older devices don’t have the TrueDepth camera. Still, Memoji stickers will be really fun.

That means you won’t need Face ID. Or does it mean they’ve enhanced the software so it can animate them using the regular front-facing camera on those devices?

I inferred from the announcement that because HomePod will be able to recognize different voices, we’ll be able to have more than one Apple ID attached to it. They didn’t exactly say that, but I’m hopeful.

The enhancements to the newly-named iPadOS looked like another big step toward making the operating system function more like a desktop OS. They demonstrated having two Notes documents open side by side, more Notes documents open in Split View with other applications, and then showed using Exposé to see all of the open Notes windows.

iPadOS will now serve up desktop websites instead of mobile sites. I presume they’ve changed Safari’s User Agent string to cause this. On a 12.9″ iPad Pro that makes sense, and it still makes sense on the 11″ iPad Pro and even the 9.7″ iPad, but I’m not sure how that will work out on the iPad mini.
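To see why a User Agent change would be enough, here’s the kind of check many sites do to pick a layout. The token list and UA strings below are illustrative examples, not any particular site’s logic: if Safari on iPadOS stops saying “iPad” and starts looking like a Mac, checks like this fall through to the desktop site.

```python
# Tokens a site might sniff for to decide on a mobile layout (illustrative)
MOBILE_TOKENS = ("iPhone", "iPad", "Android", "Mobile")

def serves_mobile_site(user_agent: str) -> bool:
    """Return True if this (hypothetical) site would serve its mobile layout."""
    return any(token in user_agent for token in MOBILE_TOKENS)

# Example UA strings: an iPad identifying as an iPad vs. one identifying as a Mac
old_ipad_ua = "Mozilla/5.0 (iPad; CPU OS 12_0 like Mac OS X) Version/12.0 Mobile/15E148 Safari/604.1"
mac_like_ua = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15) AppleWebKit/605.1.15 Version/13.0 Safari/605.1.15"
```

With the Mac-like string, nothing in the token list matches, so the site serves its desktop layout without any changes on the server side.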

Owning an Apple Pencil makes you search for excuses to use it. Marking up documents is one of the most obvious things people want to do, but to do this on anything that isn’t already a PDF, you have to:

  1. Hit the appropriate button to go to print.
  2. Use a secret hand gesture (pinch out on the screen), which brings the document full screen.
  3. Use the share button to send it to another app as a PDF, where you can then mark it up.

Now though, you can simply swipe up from the corner of the screen with Pencil and start marking it up. That’s just what you would hope it could do.

I really enjoyed Craig Federighi’s presentation of the breakup of iTunes. He was hilarious as he said that they’ve heard our requests, and that overwhelmingly users want them to put more things into iTunes. He showed mockups of iTunes now containing Calendar, Photos, and more. He really leaned into everything that’s wrong with iTunes.

You know I have to love there being a dedicated Podcasts app from Apple in macOS. According to Jason Snell on Six Colors, the Podcasts app is built with Catalyst, meaning it’s the iPadOS (or possibly iOS?) app ported to the Mac. Jason said it was indistinguishable from a native Mac app.

In the keynote, they showed that if you plug your phone in now, instead of launching iTunes or any other app automatically (which drives me bonkers), you’ll just see your iPhone as an attached device in Finder. When you select it in the Devices section, you’ll see the familiar screen showing how to sync your content. I think that’s fantastic.

You know I’m a big fan of Luna Display, the little hardware dongle that lets you use an iPad as a secondary display (or even a primary display for a headless Mac mini). I like it enough that I agreed to go to a happy hour sponsored by the Luna Display people in Los Angeles. The happy hour just happened to be a day or two after the rumors started to fly about Apple coming out with a competing technology called Sidecar.

I asked the CEO, the founder, and the Chief Technology Officer what they thought, and they expressed hope that Apple’s implementation would be a limited version of what they do, thereby increasing the overall market by letting people know this type of technology exists.

The rumors came true, and Apple did announce Sidecar. They didn’t give a lot of detail on how it would work, but they did say you’d be able to use your tablet either wired or wirelessly to mirror or extend your Mac’s desktop. I wrote down a quote, “Works across all apps that support tablets,” followed by a bunch of question marks wondering what the heck that means. I have a feeling this could work better than other solutions like Luna Display because Apple has lower-level access to the hardware.

On Apple’s page all about macOS Catalina, as pointed out by Steven Goetz, it says, “Bring the ease and precision of Apple Pencil to your favorite creative Mac apps with Sidecar. Use Apple Pencil to design in Illustrator, edit photos in Affinity Photo, or create 3D models in ZBrush.” I sure liked seeing my favorite Affinity Photo up in lights like that! By the way, there was a footnote next to Sidecar in that sentence that said, “Some features require Sidecar-enabled apps.” So it might not be as seamless as Luna Display until apps get up to speed.

Apple announced that there’s no longer Find My iPhone and Find My Friends; they’ve been combined into simply Find My. I think that’s an unfinished name, but they didn’t ask me before they chose it. Anyway, that wasn’t the cool part I wanted to highlight. The new thing I got excited about was that Find My will locate your Mac even when it’s offline. They explained that the Mac sends out a secure beacon that other nearby phones and devices can see, and then somehow Apple can geolocate the device from that information. I’m not sure how those other devices get roped into helping, though; that seems kind of weird. Still, if it works, it will be awesome.

Things that made me go “hmmm”

A lot of focus went to Apple Watch apps. In general, I haven’t found any of them compelling at all, so this seemed curious. One big announcement was that they will allow independent apps, meaning apps won’t require an iPhone app to function. They’re also bringing the App Store to the Apple Watch itself. That seems rather bizarre to me; I can’t picture it on the small screen.

They said that with iOS 13, app downloads from the App Store will be 50% smaller and upgrades will be 60% smaller. I would like to know what kind of sorcery they’re performing to cause this. Perhaps the packaging in which apps are held by Apple was hugely inefficient? But to cut the size in half, the extras would have to be at least as large as the apps themselves. Seems curious.

They talked about a new method of selecting text on iPadOS. They said you’d be able to simply drag across a sentence and it would be selected. They were very excited that you would no longer see the magnifying glass while trying to get your finger to just the right spot in the text. They described a three-finger pinch to copy, a three-finger zoom to paste, a three-finger swipe left to undo (no more shaking an iPad), and a three-finger swipe right to redo. That sounds kind of complicated and difficult to remember and execute.

In the demo, the poor guy had a terrible time getting it to work and more than once said, “Oops, let me try that again.” That lost the whole game for me; fumbling is exactly what we’re trying to get away from. The three-finger left and right swipes for undo and redo might be good, but I’ll have to use the selection technique myself before I’ll believe it’s any good.

Bottom Line

It was great fun to hear and see the keynote in the company of so many developers. It helped to gauge the real reaction of that community.
