Greetings Mac users!

Happy Fall to you all. 🍂🍁🌤 I hope you’re enjoying the glorious color displays being put on for us right now; I know I sure am!

It’s early October, and Apple just held its annual iPhone introduction a few weeks ago. I received my iPhone 13 Pro last Friday, and I’m happy to report that it’s definitely a worthy upgrade, especially if you haven’t upgraded in a few cycles.

While you may have read that the 13/13 Pro/13 Pro Max are just “incremental” updates, I beg to differ. Of course, the real upgrades are most apparent in the camera modules. Again, many tech reporters are describing the camera upgrades as nothing special, but to me, they’re significant. On the base model, the plain ol’ 13, there are two lenses, as on the 12, but they’ve been reconfigured into a diagonal arrangement. That’s because the lenses are significantly larger than the 12’s, as they are on all the iPhone 13 models. The iPhone mini lives on as well, at least for another year, as the iPhone 13 mini; for those of you who like a smaller smartphone, the mini is definitely worth a look. Where the camera upgrades really shine, of course, is in the two “Pro” models, the iPhone 13 Pro and the iPhone 13 Pro Max. In fact, a lot of the features that were exclusive to the Pro Max in its iPhone 12 iteration are now available in both Pro flavors of the iPhone 13.

In both iPhone 13 Pro models, there are the usual three camera modules, and they’re all pretty major upgrades from the previous versions. There’s the standard “wide” module, the telephoto, and the “ultra wide”. The standard wide lens is a serious upgrade over the previous versions, with a larger sensor and a faster f1.5 aperture. The telephoto lens goes from a 65mm “2X” equivalent to a 77mm “3X” equivalent, but it also has a slower f2.8 aperture. You’re going to get the best quality images out of that lens in strong light, but for most of us that will be fine, since we’re primarily using it for landscapes outdoors. Lastly, the ultra wide lens gets three great upgrades: it goes from an f2.4 aperture on the 12 to an f1.8 on the 13, it goes from fixed-focus to auto-focus, and that auto-focus in turn allows it to gain a macro feature for super close-ups, with a minimum focusing distance of just 2 cm!

In last year’s models, only the Pro Max had “sensor shift” optical image stabilization, which provides much sharper images in most circumstances, especially in low light. This year, they’ve included the feature in both iPhone 13 Pro models, which is a major plus for those of us who don’t want to have to carry around the much larger Pro Max.

Another feature that’s getting a lot of press is the new Cinematic mode for shooting video with either of the two Pro models. Cinematic mode simulates adjusting the aperture, or f-stop, changing the depth-of-field effect of your video while you’re shooting, and it lets you edit that effect after you’ve shot your video. It also automatically changes what’s in focus (a “rack focus”) from one subject to another based on cues like a subject glancing away from the camera, and it lets you edit all of that after the fact as well. Google some examples online; it’s pretty amazing. And while far from perfect, it does portend a huge advantage for videographers in the next few years as the feature evolves and improves, much like Portrait Mode has improved over the last few years.

Other improvements this year include a brighter OLED display on all models, as well as a feature on the two Pro models called ProMotion, which makes for a much more responsive display for scrolling and other effects, thanks to its ability to dynamically adjust the display’s refresh rate. If that’s all g(r)eek to you, don’t worry; you can either google more details about the feature, or just know that your new iPhone 13 will feel a lot more responsive, especially if you’re upgrading from an older model like a 7 or an 8. HDR playback has also been much improved across the iPhone 13 model line. For more on that, check out this video.
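For the developers among you, here’s a rough sketch of how an app can ask a ProMotion display for a higher frame rate, using the CADisplayLink API Apple added in iOS 15. It’s just an illustration of the idea, not something you need to know to enjoy the smoother scrolling:

```swift
import UIKit
import QuartzCore

class ScrollAnimator {
    private var displayLink: CADisplayLink?

    func start() {
        // Create a display link that calls step() on every screen refresh.
        let link = CADisplayLink(target: self, selector: #selector(step))

        // On iOS 15+, an app can tell a ProMotion display what frame rates it would like.
        // The system still adjusts dynamically; this is a request, not a guarantee.
        link.preferredFrameRateRange = CAFrameRateRange(minimum: 30, maximum: 120, preferred: 120)

        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func step(link: CADisplayLink) {
        // Drive the animation here; on a ProMotion iPhone this can run up to 120 times per second.
        // (Note: on iPhone, apps also have to opt in via an Info.plist key,
        // CADisableMinimumFrameDurationOnPhone, to actually run above 60 fps.)
    }
}
```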

Now for a couple of examples of the new features in iOS 15 (as well as a few things that have been fixed from previous releases). Probably one of the most impressive is the Live Text feature, which lets you copy and share text within photos and other images super easily. To access this feature, you’ll see a new icon in a lot of text input fields:

“Live Text” icon

…which will take you to a camera interface that lets you capture text from pretty much anything you can point your camera at! It’s actually the evolution of a long-standing computer feature called OCR (optical character recognition), which began as a way to scan a document and turn it into editable text. Those of you familiar with OCR will remember how awful it was in the early days, even though it was a lot better than having to retype a page of text from scratch. Through the magic of AI and ‘machine learning’, Live Text is now extremely powerful and effective, and I think you’ll find more and more uses for it all the time (for the technically curious, there’s a little code sketch below showing what modern OCR looks like). Another cool aspect of Live Text is that you can now search through your Photos library for any text that might appear in any of your photos; you can even use it to translate signs you saw on a trip you took years ago! A related feature, Visual Look Up, has its own icon that you’ll see in the row of icons below a photo in the Photos app:

“Visual Look Up” icon

…which indicates, when you’re looking at a photo in the Photos app for instance, that other data might be available about the content of that photo. When you tap that icon below the photo, if there’s an animal in the photo, you might see a little paw print icon on the photo. If there’s a plant, you might see a leaf icon. Tap that icon, and it will attempt to identify the plant or animal involved. For instance, when I tried it on a photo of some of our Virginia Creeper vine in full fall color, it correctly identified the plant! On the web, you’ll need to long-press on an image, then select “Look Up” to see the icons described above. I tried it on a web search for flower images, and it worked quite well there too.
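As promised, here’s a rough sketch of what modern OCR looks like in code on Apple’s platforms, using the Vision framework that Apple makes available to developers. I can’t say this is exactly what Live Text does under the hood, but it gives you a feel for how a photo gets turned into searchable text:

```swift
import UIKit
import Vision

/// Pulls any recognizable text out of a photo, roughly the way an OCR feature would.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // Ask Vision for text recognition; "accurate" trades a little speed for better results.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        // Each observation is one region of text; keep the best candidate for each.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true

    // Run the request against the photo.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

That’s just the raw text recognition, of course; Live Text layers the camera interface, Photos search, and translation on top of that basic idea.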

Here are a couple of nice “fixes” in iOS 15. First, they brought back the little magnifier for when you’re moving your cursor around in a text field. As you may or may not know, for years you’ve been able to move around in a text box by pressing and holding, then dragging the cursor around from there. In iOS 14, they removed the little magnifier that helped you see where you were dragging the cursor, but in iOS 15, it’s back! That’s a welcome change, for me anyway, though I’ve noticed it can be a little buggy, in that it doesn’t always show up. [A related tip: if you press-and-hold on the space bar when entering text on an iPhone/iPad, the keyboard turns into a sort of ‘trackpad’ that allows you to move the cursor around the text field. It’s a really handy feature!]

Another fix is that they got rid of the awful time-selection tool from iOS 14 and returned to a variation on the tool that existed in most previous versions, where you can drag up and down on the time selector to change the time for a reminder or an event, instead of having to type in the numbers. Another welcome reversion in my book!

Well, that’s about all I have time for this go-round. I hope I’ve given you some helpful info about the new iPhones and some examples of the cool new features in iOS 15. Stay tuned for more Mac Doc news, coming your way again soon!