Category Archives: Hardware


Louche Long

Taste and money rarely mix.

Apple has had several justly famous advertising campaigns, from the ‘1984’ ad, in which an athlete hurls a sledgehammer at a movie screen in a theater filled with automatons, to the ‘Think Different’ series which adulated original thinkers. But maybe the most beloved was the long-running ‘I’m a Mac and I’m a PC’ campaign, with the comedian John Hodgman as the nerdy and lovable PC-using klutz and, well, Justin Long. Long portrayed the oh-so-cool Mac user, and his smarmy, condescending, hipster presence did nothing to endear prospects to the Apple brand, for it was Hodgman viewers tuned in to see. One of the best known ads had PC swathed in bandages head to toe, explaining that his multiple crashes were the cause. Another had him on the shrink’s couch relating how unloved he was. Hodgman simply nailed it.


Nerd and hipster.

Before examining the new Intel ads claiming their CPUs are superior to Apple’s new M1 – a CPU universally lauded as redefining the realms of possibility in Macs – it bears relating Apple’s history with CPU makers. The Motorola 68000 family in early Macs could not hold its own, Motorola falling behind in the performance game, and gave way to the PowerPC G3/4/5 series from Motorola and IBM. Capable performers, these suffered from high heat output and, when Steve Jobs asked for a cool-running successor for the fabulous PowerBook notebook, the best IBM could offer did a more than passable imitation of a toaster. It ran that hot. So Steve started the team working on converting the product line to Intel’s CPUs and did so successfully until …. Intel started repeating the errors of Motorola and IBM. Slow development cycles, loss of competitive position, we had seen it all before. But Apple, as always looking down the road, had an answer, having been designing its own ARM-based CPUs for the iPhone and iPad, an increasingly tailored approach, unwilling to rest on the laurels of a commodity product meant to suit all comers.

This exercise culminated last year in Apple going whole hog and developing its own M1 CPU which not only derived from the state-of-the-art A14 in the iPhone, it also spanked the competition on performance (high) and power use and heat output (low). It was such a success that Apple has started migrating its notebooks and the Mac Mini to the M1 and later this year will do the same for the iMac and Mac Pro.

So Intel, always a day late and an idea short, felt it had to strike back and hired the louche Long, ever willing to prostitute his C-list Hollywood credentials, to talk up the advantages of Intel’s latest (very late) and (not so) greatest CPUs. And they got it so wrong, it’s comical to behold. Not only is Long still smarmy and condescending – characteristics as tied to the actor as the sneer is to Donald Sutherland – it’s really quite unclear what he is going on about.


See what I mean about Long?

For the whole story, capably reported by Apple Insider, click here.

AMD Radeon GPUs

An element of future proofing.

For an index of all my Mac Pro articles, click here.

Apple simply cannot leave OS X alone. An OS that was perfectly solid with the Snow Leopard iteration back in 2009 continues to see annual ‘upgrades’ which add nothing but useless bells and whistles. And with OS X High Sierra (10.13 – 2017) Apple made the last version of OS X which works with recent Nvidia GPUs. Nvidia and Apple had parted company a while back, and Nvidia decided that the small user base no longer justified coding drivers for later versions of OS X, despite having some of the best GPUs made. Then Apple made its Metal GPU technology a requirement, meaning that from OS X Mojave (10.14) onward, cards which depend on Nvidia’s web drivers no longer work. A solution looking for, and finding, a problem.

So why care, when OS X High Sierra works well? Because app makers are increasingly coding for Metal, and if one of your apps needs an upgrade that requires it, you are stuck with migrating to a recent-model GPU from AMD/ATI.

I have not researched this in great depth, but the AMD HD7950 is known to work well, as are later models. I bought a used 3GB HD7950 on eBay – it was fraudulently advertised as a later R9 280, so what else is new with eBay – and it works well in High Sierra. Here are the comparisons from HWCompare:


Nvidia GTX980 4GB vs. AMD HD7950 3GB.

My tests on my Mac Pro 2010 confirm that the AMD card – it ran me $220 – is marginally slower than the Nvidia/EVGA GTX980 in use, but as I do not do heavy video processing that does not concern me.


Unigine video tests. The upgrade to a faster CPU is irrelevant here.

Nova returns similar performance comparisons:


The GTX980 returns 2640.

Static power consumption, measured using a Kill-A-Watt meter, is 225 watts, compared with 252 watts for the Nvidia card. That rises to 460 watts with a BenQ 32″ 1080p display attached while running Unigine Heaven. That is well within the power handling of the Mac Pro and, more importantly, the 200 watt maximum power draw of the AMD GPU is below the 225 watts available to the card from the PCIe slot (75 watts) and the two motherboard cable feeds, one 6 pin and the other 8 pin, which provide 75 watts each, for a total of 225 watts. The AMD card is very quiet and there is no start-up whirr, unlike with the Nvidia card. While the seller stated that the boot screen works (my MacVidCards-modified GTX980 provides a boot screen on its DVI and DP ports, though not on the HDMI one), that is not the case: I could not get boot screens on any of the MDP, DVI or HDMI ports. Oh! well ….

Should you ‘upgrade’ to an AMD card? Well, if your latest apps insist on Metal and you want or have to install OS X Mojave (10.14) or Catalina (10.15), you have no choice: only AMD GPUs will work. Does the AMD card work with the latest OS X, Big Sur (macOS 11)? It’s too early to say.

Adding 4K to the movie Mac Pro:

My motivation was different. Having added a big screen LG OLED TV to the home theater a couple of years ago, I wanted to deliver 4K video to the display, which is mostly driven by a single-CPU 2009 Mac Pro, a robust and reliable file server which also offers internet access to streaming services like Netflix and Amazon Prime Video. But the HDMI card in that Mac Pro is an ancient GT520, limited to 1080p. Thus I have been using an Apple TV4K to deliver 4K streams, but that’s not an especially elegant solution as you have to switch between sources. That machine is running OS X Yosemite (10.10 – 2014) and has no need of Metal technology. So it occurred to me that I could both future proof my 2010 Mac Pro by installing an AMD card and at the same time switch the GTX980 over to the 2009 movie Mac Pro, where it will happily deliver 4K definition (3840 x 2160 in the 16:9 aspect ratio, compared with 1920 x 1080 in 1080p). That would allow sale of the Apple TV4K, as all movies would be streamed through the Mac Pro and no input switching would be required. The home screen looks like this:


The Mac Pro movie server home screen – all icon driven. DVDPedia – at top left – catalogs movies on the server.

I use a small app named Img2Icns to generate icons from images found on the web.

HiDPI

When Apple migrated its screens to the Retina Display, icons shrank to a quarter of their original size, making them very hard to see and click. Most external monitors, like a 4K TV, do not trigger the HiDPI scaling built into the displays on Macs, so HiDPI has to be enabled manually to show icons at a decent size while not affecting 4K definition in 4K movies. HiDPI scaling can be enabled by starting Terminal (in Applications->Utilities) and typing the following (works with OS X Mavericks 10.9 or later):

sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool true

Hit enter, type your password, hit enter again and reboot. HiDPI is now enabled. Go to System Preferences->Displays, click on ‘Scaled’ (you may have to hold down the Option/Alt key on your keyboard) and HiDPI options will now be shown. Click on 3840 x 2160 (HiDPI) and your icons will revert to regular size. You can verify that you’re getting the right resolution by clicking the Apple Menu at top left, selecting “About This Mac”, then the “System Report” button, then clicking “Graphics/Displays” in the list on the left. You will see something like this:

Displays:

AV Receiver:
Resolution: 3840×2160 (2160p 4K UHD – Ultra High Definition)
UI Looks like: 1920×1080 (1080p FHD – Full High Definition)
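
The same information is available from Terminal, should you prefer it; the system_profiler tool ships with OS X and its Displays section mirrors the System Report entry above:

system_profiler SPDisplaysDataType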

If you decide you want to revert the change above, just use this terminal command:

sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool false
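
And to confirm the current state of the flag before rebooting, you can read the same plist entry back; a result of 1 means HiDPI scaling is enabled, while 0 or a ‘does not exist’ error means it is not:

defaults read /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled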

SwitchResX:

If you struggle to get the right definition and frame rate in System Preferences->Displays, install SwitchResX, a small, inexpensive utility which allows you to set both. I find 24fps is inadequate for some movies and, with HDMI on a Mac Pro (which supports HDMI 1.4, not the faster 2.0), you can increase the frame rate from 24fps to 30fps, which works well. I have it set at 3840×2160, which is 4K, and 30fps.

Conclusion:

So if you need to add Metal/Mojave et al. compatibility to your Mac Pro, a late-model AMD card works well and needs no special drivers. An older Nvidia card (GTX680 or later, but no later than the GTX980) can be used to stream movies in 4K definition, a definition also supported by the AMD card. However, the Nvidia card is limited to OS X High Sierra (10.13) or earlier.

No more identity theft

Hasta la vista, Zuck.


No more theft.

iOS 14.5 for the iPhone and iPad will be released shortly. Unlike previous versions of the operating system, where the user had to opt out of having apps track their activity, iOS 14.5 requires apps to obtain the user’s conscious agreement to be tracked. The opt-in screen appears above.

Why is this a big deal?

Let me flash back to my son’s 6th grade year in California. That was in 2014. As we were walking home I noticed that all the kids in the playground were busy staring at their smartphone screens.

“What are they doing, Winnie?” I asked in all innocence.

“Facebook, Dad”.

This set me off on a process of discovery and disclosed what has to be the greatest evil of our time. Not only was Facebook absorbing and wasting huge amounts of time for these fertile young brains, it transpired that it was tracking everything these kids did even when they were not on Facebook. And unless you have been in a nuclear blast-proof bunker the last few weeks with no access to any sort of connectivity, you will also know that Facebook extended its evil ways as an organizing vehicle for traitors, seditionists and insurrectionists. Censorship of hate speech be damned, thanks to Mr. Zuckerberg. The people who stormed the Capitol on their Pig’s orders on January 6, 2021 had organized their meetings on Facebook and, to a lesser extent, on Twitter.

But it gets even worse. Four years ago a very close US presidential election awarded that same Pig the Oval Office, thanks to the Russkies’ massive campaign of disinformation on …. yup, you guessed it, Facebook. And every time those seditionists clicked on the site of their local guns and ammo supplier, Facebook was there making money off their clicks. Zuckerberg was, simply stated, being paid by the makers of deadly weapons.

Now Zuckerberg is up in arms about Tim Cook’s privacy decision. He argues that the requirement to opt in to being tracked will make your “….advertising experience worse.” Excuse me? Is there such a thing as a good advertising experience?

Come to think of it, while you are at it, you might as well install an ad blocker on all your devices to cut the noise and disruption ads cause in the reading experience.

So when iOS 14.5 is released, I advise all iOS users to upgrade immediately and refuse to opt in to tracking of their activity. If you prefer to be watched, sold, tracked, filed and numbered while enhancing Mr. Zuckerberg’s bloated net worth, then stick with your Samsung cell phone. iOS 14.5 works on the iPhone 6S or later.

As for my son, he gave up Facebook shortly after the experience explained above, and has never been happier or more productive.

Up periscope!

The future approaches.

This year or next will probably see the addition of an optical zoom lens to high-end iPhones. I wrote “high-end” because the change in Apple’s marketing strategy with the iPhone 12Pro and Pro Max is clear: they are distinguished from lower models by the addition of a longish lens (65mm on the Max) and, in the case of the Max, bigger sensors. And bigger margins, of course.

Rotating turret lenses in cine cameras have been around for decades:


The Bolex H16, originating in 1935, was last made in 2016 by the Swiss Paillard company.

Compared with zooms, the turret’s lenses were lighter and faster. And mostly sharper, to boot.

Never one to resist an opportunity to make yet another gadget, Leica went all out with a turret attachment for its 35mm film cameras, coming up with this monstrosity:



The Leica turret attachment from the 1940s.

While you might argue that simply changing lenses would be easier, Leitz persisted with this nuttiness into the Leica M era, which saw the old, slow screw mount give way to a fast bayonet variant; the turret remained available, now with bayonet mounts. The pocketable aspect of the small and elegant Leica body was rather lost in the process.

But zooms were the way of the future and, while they came with limitations, they were a lot more appealing to the average consumer. 2002 saw the introduction of Minolta’s Dimage digital camera with a periscope zoom, and it was a knockout.



The elegant Minolta Dimage of 2002.

The periscope optical zoom, vertically oriented inside the case, deflected light rays through a right angle using a mirrored prism. This allowed the incorporation of an otherwise lengthy optical path within the tight confines of the body, a small 3.3″ x 2.8″ x 0.8″. For comparison, my iPhone 12Pro Max in its ‘bumper’ measures 6.5″ x 3.1″ x 0.3″. You can read DPR’s 2002 review of this 2 megapixel digital masterpiece here.

This cutaway view shows how it worked:



Illustration of the ‘folded’ optical path.

While the Dimage sported a 37-111mm (3:1) zoom with a modest aperture of f/2.8-3.6, I think we can expect a lot more from the iPhone 13 or 14. For this user a 28-200mm (7:1) f/2 optic would be perfect, leaving the ultra-wide (UWA) lens as a separate choice. That makes the optical designer’s job easier and, let’s face it, you really do not need a zoom starting at 12mm given the relatively infrequent use of something so wide. Nor do you need a turret.

Once that iPhone Zoom hits the market the sole remaining users of traditional DSLRs or their mirrorless brothers will be press photographers and the fashion set, because both would be laughed off the set were they to be seen using an iPhone. And, of course, the few remaining nuts taking nature photographs because, you know, of the trillions of images already out there, all available for pennies from stock vendors, there must be something yet undiscovered. As for the camera divisions of Canon, Nikon, Sony et al, say goodbye.

The technology is out there. A 2019 Huawei cell phone uses it and you get free Chinese spying software as part of the deal. Wait for the real thing.

iPhone 12Pro Max bumper

Protecting the lens assembly.



Click the image to go to Amazon.

It’s not that easy to find a pure bumper for the iPhone 12Pro Max. Most cases come with a variety of front and rear covers, neither of which is wanted by this user, as I use a belt holster.

I did not want to get a case for the iPhone 12, both to keep bulk down and to retain the better grip afforded by the square sides. However, I found the sides rather slippery and, more importantly, noticed that the protruding lens assembly means the iPhone does not rest flat on a desk or table: that corner rests on the lenses themselves. Not good.

So I caved and got one of the above. It’s slightly less slippery than the native edge, the square profile of the sides is retained, the feel of the buttons remains good (though the mute switch is a bit tough to access) and it has corner protrusions which protect the lenses when the iPhone is placed lens-down on a flat surface. While the package included a screen protector, I consider that a waste of time as I have never known an iPhone screen to scratch unless something truly thoughtless is done to it, like putting it in a pocket with unprotected keys. As for covering the lovely Pacific Blue back, why on earth would you want to do that?