The reasons Apple gave on stage for removing the headphone jack were space, the connector being antiquated, and the advent of wireless. Ben Thompson brought a BuzzFeed article to my attention that goes more in-depth on the space portion of this justification:
Apple executives told BuzzFeed that removing the headphone jack made it possible to bring that image stabilization to the smaller iPhone 7, gave room for a bigger battery, and eliminated a trouble-spot when it came to making the iPhone 7 water-resistant. It’s a solid argument, albeit one not quite worth Schiller’s hubris.
The article also cites easier water resistance as a benefit of removing the jack, which I believe is probably true, even though other devices manage to be water resistant while keeping their headphone jacks.
Thompson also raises an excellent point about the iPhone 7 Plus camera. Because there are two sensors, paired with software that combines triangulated information about the subject with educated guesses about depth, it's possible to use the iPhone camera to create 3D imagery, even if only very slightly (the greater the space between the two sensors, the more you can triangulate and capture of the 3D scene; a rough sketch of that relationship follows Thompson's quote below). Here's how he puts it:
[W]hat Apple didn’t say was that they are releasing the first mass-market virtual reality camera. The same principles that make artificial bokeh possible also go into making imagery for virtual reality headsets. Of course you probably won’t be able to use the iPhone 7 Plus camera in this way — Apple hasn’t released a headset, for one — but when and if they do the ecosystem will already have been primed, and you can bet FaceTime VR will be an iPhone seller.
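To make the triangulation point concrete, here's a minimal sketch of the depth-from-disparity relationship that any stereo camera pair relies on. Every number below is purely illustrative; these are not Apple's actual iPhone 7 Plus camera parameters, and Apple's real pipeline is far more sophisticated than a single formula.

```python
# Minimal sketch: estimating depth from the disparity between two cameras.
# All values are hypothetical, chosen only to illustrate the relationship.

def depth_from_disparity(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate the distance to a point seen by both sensors.

    The same feature appears shifted (the "disparity") between the two
    images. For a given subject distance, a wider baseline between the
    lenses produces a larger shift, which makes depth easier to resolve.
    """
    return focal_length_px * baseline_m / disparity_px

# Hypothetical numbers only:
focal_length_px = 3000.0   # focal length expressed in pixels
baseline_m = 0.01          # ~1 cm between the two lenses
disparity_px = 12.0        # pixel shift of the subject between the two images

print(f"Estimated depth: {depth_from_disparity(focal_length_px, baseline_m, disparity_px):.2f} m")
```

The key takeaway is in the denominator and the baseline term: a tiny baseline like the iPhone's yields tiny disparities, which is why the effect is "only very slight" — but the same math is exactly what artificial bokeh and VR imagery build on.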
What also struck me about this presentation was how conspicuously Schiller brought up machine learning as the technology powering some of these new features. Machine learning, along with artificial intelligence and virtual reality, is the new hotness in the Valley, and it seems to me that Apple is making long-term plays and hedges with the iPhone 7 Plus camera sensor.