Yearly Archives: 2012

Microsoft Surface recycled

More of the same.

Microsoft Surface was a visionary product from Redmond, a decade ago. A huge table-style touch screen allowed icons and elements to be moved around by touch, and the whole thing was just so tomorrow.

Of course it was so good that it had to die; Microsoft has long known how to kill a good thing. Though there are many brilliant, innovative thinkers at Microsoft, neither quality has been in abundant supply in the corner office, occupied by a doofus whose only claim to fame is that not even he has managed to screw up one of the world’s great annuities – Windows + Office. The only other product which has sold in volume in the Ballmer years is the Xbox360/Kinect, emphatically not a Microsoft invention, and one whose profitability is a rounding error, so brutal is price competition in the game console segment.

For a long while I have refrained from knocking Microsoft in this journal as it has just become sort of assumed that anything from Redmond would be a costly time sink, and why mock the afflicted? But Microsoft’s latest just begs for a thorough thrashing, which it is my pleasure to administer.

Doofus with his latest disaster.

I mean, goodness, must he really insist on demonstrating something he is clearly clueless about? Has no one at MSFT the courage to tell this emperor that he has no clothes? Imagine the management culture at a company which prohibits criticism.

You see, Surface is the name for MSFT’s new line of tablets. Yes, the same tablet which Doofus was knocking (in this case, knocking on) two years ago. That was just when MSFT’s previous tablet effort was scrapped. And there’s little reason to think that the new one will last any longer. Goodness, they couldn’t even make up a new name for the product.

Each iPad comes with bulletproof software, iOS; robustly interlinked Mail, Scheduling, Address Book and the rest through iCloud; and a vast ecosystem of hundreds of thousands of applications developed over many thousands of man-years.

Now look at MSFT’s offering. First they are offering the tablet with two completely different CPUs – ARM and Intel. Eh what? Lucky developers having to craft apps for that. Then they come with plain, touch or type keyboard covers. There are two sizes – 32 and 64gB. And while they are mum on cellular, you can bet they will have to offer at least two carriers in the US, in addition to wi-fi only. And of course they come with a stylus, presumably because the touch interface is so poor there is no alternative.

And this from a manufacturer whose experience in hardware is largely limited to making mice and has little skill in complex hardware supply chain management.

Oops, did I mention applications? Ummm, no, there are none as of the time of writing.

Worst of all, the Surface Mk.2 runs Windows. Two different versions, no less. That spells DOA to me. And by the time these things hit the stores, if they ever do, iPad4 will have been released, instantly obsoleting whatever claims to currency Surface2 may have made.

So thanks, Microsoft, for reinforcing my disgust in you and confirming my decade old decision never to touch one of your foul products again. Apple may make mediocre, overpriced PC and laptop hardware, but OS X and iOS are robust as they come, the iPad is insanely great, and the whole ecosystem linking the two hardware platforms is not something that the dysfunctional corner suite in Redmond is remotely capable of disciplining into a functional whole.

Kudos to Bill Gates, however. He got out at the top and is doing truly wonderful things with his fortune. I would prefer to see a fool like Ballmer running MSFT into the ground than have Gates return and leave behind his groundbreaking philanthropic work. The only thing which mystifies is that Gates allows this Boob to continue running the business.

Erich Salomon

A master of the candid photograph.

Before the Leica popularized the candid snap in the hands of the likes of Cartier-Bresson, there was Erich Salomon (1886-1944) and his Ermanox.

With f/1.8 lens and plates.

The lens was very fast for the time and the body took 6 x 4.5cm (2.4″ x 1.8″) glass plates. Leicas came with an f/3.5 Elmar as standard; the faster 50mm Leitz f/2 Summar was not introduced until 1933. By contrast, the Ermanox with its f/1.8 lens was first sold by Ernemann, a German maker, in 1924, so it’s not hard to see why Salomon favored it. Two stops may not seem like much in the day of 6,400 ISO digital, but film was 5-10 ISO at best back then, a full 10 stops slower! In addition to the faster lens, the Ermanox plate needed only half as much enlargement as a Leica negative for the same size print, reducing apparent movement blur and grain.
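
For those who like the arithmetic spelled out, here is a quick sketch of those stop calculations; I have taken ISO 6.25 as a representative value from the 5-10 range quoted above, and the rest of the figures are from this paragraph:

```python
# Stop arithmetic for the Ermanox vs. Leica comparison above.
# ISO and aperture values are those quoted in the text; the formulas are standard.
from math import log2

def iso_stops(slow_iso, fast_iso):
    """Each doubling of ISO is one stop."""
    return log2(fast_iso / slow_iso)

def aperture_stops(fast_f, slow_f):
    """Each full stop multiplies the f-number by the square root of two."""
    return 2 * log2(slow_f / fast_f)

print(iso_stops(6.25, 6400))      # 10.0 stops: 1920s emulsions vs. ISO 6,400 digital
print(aperture_stops(1.8, 3.5))   # ~1.9 stops: Ermanox f/1.8 vs. Elmar f/3.5
```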

Clunky as the camera may have been, with the plates meaning only one snap at a time, this German master made the best of what he had, pulling off great photojournalistic snaps in the 1920s and 1930s before the Nazi killing machine chewed him up at Auschwitz. Salomon was a German Jew, trained in engineering and law before devoting himself to photography, something we can all be grateful for.

I was vividly reminded of his work when contemplating the current spectacle of Europe’s evil, corrupt men (and now women) destroying all around them in the interest of self rather than that of their fellow human beings. They call this a Union?

Evil men, wondering how to safeguard their supply of brandy and cigars.

The picture shows various purported diplomats at the 1930 Second Hague Reparation Conference, where the assembled victors of 1918 are trying to figure out how to squeeze dry what is left of Germany in the name of war reparations. Brilliant economic concept, that – tax the poor into oblivion and drive them to extremism. Among the collected toadies are Louis Loucheur, French Minister of Labor, holding his hands to his eyes, the poor tired dear, and French Premier André Tardieu, wondering when the cognac will run out. Next is Germany’s Foreign Minister Dr. Julius Curtius (all Germans are Doctors, it’s a well known fact), yet to realize that he would soon be so much chopped meat. Henri Cheron, French Finance Minister, is on the right, seated in the high-backed chair, hoping his mistress is in town.

What caused this flashback? After all, these were pictures I had first seen when knee high to a grasshopper. Well, just look what is being done to the poor nations of Europe right now by the rich ones. And it’s the same lot, with all their expenses paid by the taxpayer, residing in their fancy palaces, transported in chauffeur driven bulletproof limousines, with legions of servants, wondering how to best screw the taxpayer while preserving their life of comfort and sloth.

Former French Prime Minister Aristide Briand points to
Salomon, whom he dubbed ‘The King of Indiscretion’. 1930.
Such a witty, charming rogue, that Aristide.

How sad that Erich Salomon was murdered by the same evil men whom he so ably portrayed. Truly a great photojournalist.

A little more speed for the HP100

A little tweak ….

The other day the Hackintosh HP100 got a nice performance boost when the boot+applications SSD was upgraded from SATA2 to SATA3. Fast disk I/O is essential for best Lightroom and Photoshop performance. Now it’s the CPU’s turn.

Geekbench is a test of CPU speed. It’s a simple and quick comparator of great use to photographers as apps like Photoshop and Lightroom are far more dependent on CPU speed than on the latest in GPUs. Little is to be gained, data suggest, from using a high-end gaming GPU.

Cinebench frame rates are a measure of GPU speed. My Hackintosh HP100 (Sandy Bridge Core i5, 16gB RAM) uses a three-year-old, low power draw Nvidia 9800GTX+ GPU, yet returns a very high Cinebench frame rate.

One of the beauties of the Sandy Bridge and later Ivy Bridge CPUs is that overclocking is trivially simple, unless you go crazy. Clock speed is a near-linear indicator of effective speed for like CPUs: double the clock speed and you should see an almost proportional change in the Geekbench score. In practice, the Sandy Bridge i5-2500K overclocks from 3.3gHz stock to 4.4gHz with one key entry in the BIOS. The i7-2600K goes from 3.4gHz to 4.5gHz for the same effort. As long as you dispense with the inept stock Intel fan and fit a Coolermaster 212 ($27) or similar, you will be thermally protected. Further, the BIOS has many failsafes to turn things off if heat rises too much.
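
As a rough illustration of that near-linear relationship, here is a sketch of the arithmetic. The 100MHz base clock is the Sandy Bridge default; the linear scaling of Geekbench with clock speed is the working assumption stated above, not a guarantee:

```python
# Overclocking arithmetic for a Sandy Bridge 'K' CPU.
# CPU frequency = base clock x multiplier; the Geekbench score is assumed to
# scale near-linearly with frequency, as stated above. Illustrative only.

BASE_CLOCK_MHZ = 100  # Sandy Bridge default BCLK

def cpu_ghz(multiplier, base_mhz=BASE_CLOCK_MHZ):
    return multiplier * base_mhz / 1000.0

def scaled_score(measured_score, measured_ghz, target_ghz):
    """Project a Geekbench score at a new clock, assuming linear scaling."""
    return measured_score * target_ghz / measured_ghz

stock, oc = cpu_ghz(33), cpu_ghz(44)   # i5-2500K: 3.3gHz stock, 44x multiplier
print(f"{stock:.1f}gHz -> {oc:.1f}gHz, a {oc / stock - 1:.0%} clock increase")
# Feed scaled_score() your own measured baseline to project the expected gain.
```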

I have been running my Sandy Bridge i5 at 4.0gHz since inception, or 21% over the 3.3gHz stock, but the new Geekbench Ivy Bridge data for the just released MacBook Pros spurred me to action. Here are those data:

Meanwhile, HP100, perking along at 4.0gHz, records the following in Geekbench 64:

Actual speed with several apps running. 4.0gHz is correct, GB states it incorrectly.

Hmmm. Not good enough, even if my environment measures ‘real world’ results with Mail, Finder, Safari, Firefox, etc. running. You can also bet that the above MacBook Pro data were taken in an ideal setting with no other apps running. That’s how Apple does data.

So I hopped into the BIOS on the Gigabyte motherboard, changed the ‘Frequency Multiplier’ from 40x to 44x, meaning the clock speed is now 4.4gHz, and restarted. Two minutes later I had the following result:

Core i5 Sandy Bridge at 4.4gHz. 42% faster than stock.

That’s more like it. A 10% clock frequency increase realizes a 9.4% CPU speed gain, equaling the fastest, latest and greatest from Apple, at no incremental cost to me.

Heat, that bugbear of all computers, remains unchanged.

Temperature graph at 4.4gHz.

The above graph reports the temperature of the four CPU cores from restart. The usual start-up spike quickly disappears to settle at 109F, indistinguishable from the reading at 4.0gHz. The CPU cooler is set in BIOS as a variable speed device, meaning it cranks up only when needed. It sounds just a little louder than at 4.0, meaning it’s working harder but just as effectively. On the other hand, when I was running this test, ambient temperature was a high 85F (we have no air conditioning as it rarely gets that warm in the SF Bay Area) so there’s little to worry about. Things can only get cooler on regular days. The spike toward the right results from starting Lightroom 4. Starting Photoshop CS5 does not make any discernible difference. I have had no stability issues so far.

The Cinebench tests for GPU speed are outstanding. Brown (#5) is for HP100 at 4.0gHz, Orange (#4) is at 4.4gHz (not 4.0 as shown) – 13.3% faster. The highest reading here (#1) is for a Xeon equipped machine with a high end gaming GPU – meaning $1,400 more for the CPU and $1,300 more for the GPU – for a 25% speed increase. Goodness, the all-in cost of HP100 is less than one of those components! And PS and LR do a very poor job of multi-threading, so a 12-thread CPU is money wasted. Those economics do not work for me, nor does any photographer need to spend that sort of money. #3 is for HP100 running at stock GPU speed but with all other apps closed – hardly realistic, but impressive if you are a marketer. Marketing, after all, is lying for a living.

GPU results from Cinebench.

And when Apple gets faster, you can bet on one thing. With a tweak or two, HP100 will be right there at very little or zero cost.

Looking forward:

If you accept that CPU speed increases are leveling off, and that the focus will increasingly be on lowering power consumption, then simply dropping an i7 in place of the i5 will yield a 25-30% speed increase, for a net upgrade cost of maybe $150 after reselling your i5. I doubt Intel will be able to increase its CPUs’ speeds by more than 5% annually henceforth.
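
Framed as dollars per unit of speed, a trivial sketch using those guesses (both figures are estimates, as stated):

```python
# Cost per percentage point of CPU speed for an i5 -> i7 swap, using the rough
# estimates above ($150 net cost, 25-30% gain). A framing device, not a benchmark.
net_cost_usd = 150

for gain_pct in (25, 30):
    print(f"{gain_pct}% faster: ${net_cost_usd / gain_pct:.2f} per percentage point")
```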

GPUs are already so far ahead of anything photographers need that spending lavishly here makes little sense. Any conceivable pixel density is already supported.

In both cases, Adobe’s software is a long way behind what the hardware can do. Poor use of multi-core, multi-threading technologies means that far greater gains are to be had from software design than from hardware upgrades. Lightroom, in particular, is showing massive code bloat, with no improvement in operating speed. LR4 is some ten times the size of LR3.

The next frontier is peripheral I/O, and that is where Intel’s LightPeak comes in. (We keep hearing that Apple’s Thunderbolt is the latest invention from Cupertino, when in reality it is simply LightPeak, on which Apple’s one year exclusive – largely wasted – has now expired.) LightPeak promises disk reads and writes ten times faster than USB2, maybe three times faster than USB3. Whether it succeeds like USB2 did, or fails like Firewire has, remains to be seen. Very slow adoption is not encouraging, and I suspect it’s simply not a mass-market selling point. External drives are hardly the norm in the average home. If it does succeed, you can bet cheap PCIe cards will become available and that photographers’ Hackintoshes will be adding these for a few dollars.
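
To put those ratios in perspective, here is a rough sketch of what they would mean for moving a photo library around. The sustained USB2 throughput and the library size are my assumptions; the 3x and 10x multiples come from the paragraph above:

```python
# Rough transfer-time comparison using the speed ratios quoted above.
# The 35MB/s sustained USB2 figure and the 100GB library size are assumptions.
LIBRARY_GB = 100
USB2_MB_S = 35

interfaces = {
    "USB2": USB2_MB_S,
    "USB3 (~3x USB2)": USB2_MB_S * 10 / 3,      # LightPeak is ~3x USB3 per the text
    "LightPeak (10x USB2)": USB2_MB_S * 10,
}

for name, mb_s in interfaces.items():
    minutes = LIBRARY_GB * 1024 / mb_s / 60
    print(f"{name:22s} ~{minutes:5.1f} minutes for {LIBRARY_GB}GB")
```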

Update:

As I’m not about to be beaten by Apple’s poor hardware, I set about adding a little more fire to the pot by tweaking the i5’s frequency multiplier from 44x to 45x, for a CPU frequency of 4.5gHz, and increased the VCore voltage to 1.385 volts for stability. Nothing else was changed. System ambient temperature remains at 113F (45C) and is stable. At stock VCore it kernel panics. Intel specifies the maximum safe VCore at 1.52 volts, so it’s not like I’m really pushing it here. This is the result – faster than the Core i7 in the fastest MacBook Pro:

Intel Core i5 – 2500K at 4.5gHz CPU speed.
1.4% faster than the fastest MacBook Pro, with more to come.

Cinebench GPU data remain unchanged.

I have shutdown failsafes in the BIOS set at a CPU temperature of 176F (190F is the danger point for the i5 2500K) so everything remains conservatively specified.
These data suggest that a modestly overclocked i7 – 2600K Sandy Bridge should be good for 16,000 or more. But you do need proper cooling to do this sort of thing, not Intel’s stock cooler.

The Z68 chipset on the motherboard does not work happily with OS 10.7.4, and while there are workarounds, it’s not worth the effort. (It slows to a crawl). H67 and P67 chipsets have no issues with 10.7.4. So on the HP100 I’m sticking with 10.7.3 for now. Only P and Z motherboards support overclocking.

Another 25% in speed?

Sure. Get a Core i7-2600k. Look here.

100mm, f/1.4

Nikkor MF lenses on the Panasonic MFT bodies.

This piece finally joins the heretofore parallel threads of the Nikon D700 and Panasonic G3 systems I use. Absent the one in the iPhone 4S and an old Panasonic Lumix LX-1, I have no other cameras.

Adapters and their limitations:

Adapters, most around $25, are available to use Nikon and Canon and a host of other manufacturers’ lenses on MFT bodies made by Panasonic and Olympus. But just because you can do that, does it make sense?

For the most part the answer is a resounding ‘No’.

You have no autofocus, auto-exposure is aperture-priority only, and Canon EF and Nikon ‘G’ lenses require specialized adapters to control the aperture. Otherwise you are restricted to full aperture only, as those lenses lack a manual aperture ring. Olympus MFT bodies have anti-shake built into the body; Panasonic puts it in the lens, so a Panny user loses that feature as well. Any VR/IS in a Canon or Nikon lens is lost. The sheer bulk of most full frame lenses destroys the compact concept of the MFT body’s design and the whole idea has a rather Rube Goldberg aspect to it. Cool to tinker, useless in practice.

Still, I plonked down $23 for one of these the other day and just received it. It adds some value in specialized applications and works with Nikon pre-Ai, Ai’d, Ai, Ai-S and AF-D (manual focus) lenses. If you want to adapt a G series AF-S lens as well as all older Nikkors, buy the costlier adapter with a mechanical aperture control ring. Read on.

Click the picture to go to Amazon US. I get no click-through payment.

Adapter quality:

I opted for the Rainbow Imaging version as user reviews suggested it has a better release catch for Nikon lenses than other cheap ones. Manufacturing quality is very high; the interior is semi-matte, but that’s unlikely to have any effect on image quality as its reflectivity is low. Fit of both the Nikon end and the Panasonic end is excellent. Novoflex makes adapters for $300. Save your money. The cheap ones are fine. You can see the full range of Rainbow Imaging adapters by clicking here. There are 30 adapters for MFT alone, including such odd ducks as Alpa (a superb Swiss 35mm film SLR whose quality of engineering puts Leitz to shame), movie C-mount, Contax/Yashica, Retina Reflex (!), Exacta/Topcon, Zeiss Ikon Contax rangefinder (!!), and many others. Fotodiox makes an inexpensive adapter for Hasselblad lenses to MFT.

Checking the flange-to-flange dimensions with a micrometer I found a maximum-to-minimum variation of 0.0001″ (0.0025mm), right at the limit of accuracy of the measuring tool. That would be tough to beat at any price. The grinding of the front flange, which mates with the Nikon lens of choice, is to a very high standard. The body of the adapter is made of very thick alloy and not about to flex, regardless of the lens fitted. The serrations on the barrel provide a decent grip for installation and removal on the camera. A small set screw on the rear flange provides adjustment of tightness of fit on the camera. Springs permit adjustment of the tightness of the front mount. Both front and rear on mine were set just right on receipt, but it’s nice to know that adjustments can be made in the event of wear.

Best lenses:

So which lenses make sense? The MFT sensor is one quarter the area of a full frame one, meaning that you are using only the center of the image projected by a full frame lens. Thus a 50mm lens frames like a 100mm. However, the depth of field remains essentially that of a 50mm lens: depth of field is governed by focal length, aperture and subject distance, not by the sensor behind the lens. A 50mm lens on a 4″ x 5″ plate camera projects the same image, with the same depth of field at any given aperture and distance, as a 50mm lens on medium format, full frame, APS-C, MFT, you name it.
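
For the numerically inclined, here is a small sketch of that depth of field arithmetic using the standard thin-lens/hyperfocal approximation. The circle of confusion values (0.030mm for full frame, 0.015mm for MFT) and the 2m portrait distance are conventional assumptions of mine, not measurements:

```python
# Depth of field sketch: a 50mm lens on MFT frames like a 100mm, but keeps far
# more depth of field than a true 100mm would on full frame at the same distance.
# Standard hyperfocal approximation; circle of confusion values are assumptions.

def total_dof_mm(focal_mm, f_number, subject_mm, coc_mm):
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
    far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return far - near

subject = 2000  # 2m, a typical head-and-shoulders distance
print(total_dof_mm(50, 1.4, subject, coc_mm=0.015))   # 50/1.4 on MFT: roughly 66mm in focus
print(total_dof_mm(100, 1.4, subject, coc_mm=0.030))  # 100/1.4 on FF: roughly 32mm in focus
```

Which is exactly why the too-much-DOF complaint dogs MFT: the equivalent focal length doubles, but the depth of field does not shrink to match.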

That pretty much means wide angle lenses from full frame bodies are a waste of time. Even a super wide 17mm, with all its associated bulk, becomes a semi-wide 34mm on MFT. You are far better off using the kit zoom with all its automation, than using a gargantuan FF wide. It just gets worse the wider you go. A monster 14mm Nikon or Canon is a not so wide 28mm on MFT. Silly. If you want really wide, use something like Panny’s 7-14mm or Oly’s 9-18mm. I use the latter and it’s an outstanding optic.

Likewise, modest aperture standard or medium long lenses make little sense. The Panny kit zoom – 14-42 or 14-45 – meaning 28-90 equivalent on FF, has you covered. And if you want something really long, using a monster FF telephoto on MFT bodies makes little sense unless you need a very fast aperture. But then why bother with an MFT body when FF will deliver superior results with little aggregate change to weight and bulk? The superb Panny 45-200mm (=90-400mm) has decent apertures fully open and built-in anti-shake, making it perfectly useable at the long end hand-held. And it’s tiny compared to anything from a full frame body.

That leaves fast FF lenses and special purpose ones.

50mm f/1.4 Nikkor-S on my Panasonic G3 body.

The fast 50mm makes for a fine portrait lens and permits limited DOF effects, if you can handle manual focus.

Winston. One 60 watt bulb for lighting. Nikkor-S 50mm f/1.4 at full aperture, Panasonic G3, ISO 1600.

As you can see from the snap, DOF is extremely limited fully open and close-up.

In use on the Panny G3:

To enable control of the adapted lens you switch the body to Custom->Use Without Lens (go figure; I saved this to C2-2 – the G3’s custom settings allow one slot on C1 but three on C2, the latter selectable using the LCD rather than the top dial). Here’s where one of the great advantages of the electronic viewfinder in selected MFT bodies kicks in. With the camera set to aperture priority automation, as you stop the lens down the finder brightness remains unchanged. It’s as if you were using a standard auto-aperture MFT lens! The EVF adapts as the FF lens’s aperture changes; only the perceived depth of field changes. If only the D700 came with an EVF ….

So aperture automation is not an issue, though the finder will report the aperture as 0.0 regardless of how it is set. You have to check the lens to see which aperture you are using. With aperture-priority automation, the shutter speed is correctly displayed in the EVF.

As for focus, Panny has another trick up its sleeve. Depress the control wheel into the body and the G3 gives you a 10x magnified center rectangle (the magnification is variable at will), picture-in-picture, which makes manual focus trivially simple and dead accurate. (Panny’s MFT bodies do not have a focus confirmation LED.) Far easier than using MF on the FF D700! Press again or touch the shutter release and the EVF returns to normal display. (In the earlier G1 the whole finder image is magnified, but the functionality is near identical.) Thus, with a 50mm lens you are getting the focus accuracy of a 500mm, and even at smaller apertures the magnified image snaps in and out of focus sharply, leaving little room for doubt.

Picture-in-picture 10x focus tool in use on the G3.

For my purposes there are just a few lenses in my extensive Nikkor MF collection which make sense to use on the G3. They include the 50mm f/1.4 and 85mm f/1.8 for their fast apertures and shallow DOF when fully open (one of the banes of MFT is too much DOF with just about any lens), the 105mm f/4 Micro-Nikkor for its close focusing ability, and the 300mm and 500mm Nikkors for extreme reach. The 300mm is sort of silly as it’s large, heavy and hard to hold at the best of times, but the 500mm (1000mm equivalent) is a real surprise. This mirror lens, with its slow f/8 fixed aperture, is an absolute pig to focus on the D700. The focus LED indicator is at the very limit of its capability (it starts checking out much below f/5.6) and the finder image is dark. With the G3, the finder image is bright as can be and focusing is a joy. No need for the 10x focus feature. The unmagnified image is easy to focus in any light. And the 500mm Reflex Nikkor, once you get the hang of it, is really a special lens – positively a midget for that focal length and sharp as can be when properly handled. Balance on the small G3 body is excellent.

500mm Reflex Nikkor on the G3.

Neighbor’s backyard test target. 500mm Reflex Nikkor, 1600 ISO, G3, 1/1000.

The above was snapped hand held through a dirty window; the ‘target’ is some 100 yards away.

So the FF->MFT adapter has its uses, even if they are somewhat limited. However, a mirror reflex on the G3 is a joy and a pleasant surprise. It’s almost as if the Reflex had to wait all these years for a body capable of doing it justice.

Using the adapter with the Nikon Micro-Nikkor 105mm f/4 makes for a powerful combination. At closest focus you get 1:1 reproduction, compared with 1:2 on an FF body. Despite the small maximum aperture, critical focusing is very easy thanks to the EVF, and the outfit balances nicely in the hand.

An even better body for use with really long lenses would be the recently released Olympus OM-D MFT body, which has in-body image stabilization, though I do not know whether the IS in that camera works with adapted lenses. However, at $1,000, this overpriced body currently costs twice as much as the G3.

A note on CPUs, processing and EXIF data:

If you have installed CPUs in your Nikon MF lenses, as I have, these do not interfere with the adapter. EXIF data in LR or whatever you use for processing will be missing any lens information, as the camera has no way of knowing the focal length used. Thus if you want to apply a lens correction profile, it will have to be selected manually. As only the central part of the image is being used, the need for lens correction profiles is lower than with FF sensors.
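
If the missing lens data bothers you, it can be backfilled after the fact. The sketch below assumes the free exiftool utility is installed and that you work on copies of your files; the file name and the choice of tags are illustrative only:

```python
# Backfill EXIF for shots taken with an adapted manual lens, via exiftool.
# Assumes exiftool is installed and on the PATH; try it on copies of your files first.
import subprocess

def tag_manual_lens(path, focal_mm, max_aperture, lens_name):
    subprocess.run(
        [
            "exiftool",
            f"-FocalLength={focal_mm}",
            f"-MaxApertureValue={max_aperture}",
            f"-LensModel={lens_name}",
            path,
        ],
        check=True,
    )

# Hypothetical file name, for illustration only.
tag_manual_lens("P1000123.JPG", 50, 1.4, "Nikkor-S 50mm f/1.4")
```

The lens correction profile, as noted above, still has to be picked by hand in LR.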

Crop a 16MP full frame sensor – four times the area – down to the same framing and you would be left with just 4MP; the 16MP G3 sensor uses all of its pixels for same-sized prints. That’s perfectly adequate for 18″ x 24″ prints, as the walls around me testify, provided your technique is up to it.
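
A quick check of the print arithmetic, assuming the G3’s nominal 4592 x 3448 pixel dimensions (conveniently, both the sensor and an 18″ x 24″ sheet are 4:3):

```python
# Print resolution check: a 16MP G3 file on an 18" x 24" sheet (both are 4:3).
# Pixel dimensions are the G3's nominal values.
width_px, height_px = 4592, 3448
print_w_in, print_h_in = 24.0, 18.0

ppi = min(width_px / print_w_in, height_px / print_h_in)
print(f"~{ppi:.0f} pixels per inch on paper")   # roughly 190ppi
```

Around 190 pixels per inch at that size is comfortably adequate at normal viewing distances.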