PC Pro

Steve wonders why you need two days of supercomputing power to work out the effect of a peloton, and attempts to make older Macs work better.

STEVE CASSIDY

If there’s one consistent trend in business networking these days, it’s that everything is bigger. From the vantage point of the long, hot summer of 2018, it’s easy to see the reason. The Internet of Things has come of age in the past couple of years, making all the preceding efforts at computing and storage seem puny, shrunken, introverted little projects. IoT is a monster, no matter what your business or your intended purpose.

Take the guy I met who made wire for windings in electric motors: billions of kilometres of the stuff. His IoT project was all about industrial sensors, of which he had an average of 20 on each wire-making machine, spread across six distinct types. Just for a bit of wire!

Since then, my rule of thumb has been that initial estimates of the size and nastiness of a problem, when the world of computing suddenly has to handle data from the world of everything else, will definitely be wrong. Even seasoned IT types should tread carefully, because the old rules assume data provided by humans, which tends to be limited by input speeds (typing, and so on) and gets a free ride from human pre-processing. By that I mean, “press a button when you see a green car” is an easy instruction to give a human – but it turns out to be hugely difficult when the button-presser is a machine.

Taking people through the things that are possible in computation, and the things that lie forever out of reach, remains a concern. The simplest example is network traffic analysis. Everything that traverses a network comes from a computer of some kind, so it’s a known format and presentation, and it isn’t an analogue value – it’s a digital file-dump of predefined bits, like a pile of LEGO on the bedroom floor. Nonetheless, network traffic analysis is like taking a cool drink from a blasting fire-hose.

Or consider this non-tech example. Walmart gives its customers discounts, using vouchers printed in magazines. It turns out to be utterly impossible to figure out which magazines you should buy, and when, to maximise the value of the vouchers. It’s not just difficult to present as an algorithm: this one is marked as “not going to be amenable to analysis at all”. It’s one of those odd limits that isn’t about CPU power or human intelligence.

So when I received an invite to Eindhoven University of Technology, to take a look at its hybrid project to evaluate the aerodynamics of a group of racing cyclists (or peloton), I was genuinely open-minded. This could be a cycle nerd thing, a wind tunnel thing, or a complete red herring thing. As it turned out, it was much more interesting than any of those, because it’s a supercomputer thing.

The question posed by Professor Bert Blocken was easy to understand. Cyclists are aerodynamically messy, with all those flapping limbs, whirling pedals and spoked wheels. Cyclists know this, and development of funny-looking, egg-like carbon fibre fairings goes on apace to solve the physical problem. However, in pro cycling, the regulator is king, and the regulator says leave the draggy naked-bike shape and the rider as they are. This has resulted in the creation of the peloton, because riders and team managers have been reasonably sure – on an empirically assessed basis – that in the middle you get much help from the slipstream of the riders around you. How much, exactly? The sport says 50%, maybe.

The approach taken at Eindhoven was to treat this as a computing job: the relevant discipline is called CFD, or computational fluid dynamics. Much beloved by those with very large computers to sell, this is the field where the more power you can deploy, the more likely it is that your results are accurate. This makes the software you use curiously detached from the precise nature of the problem: you can model the flow of oil in a tube with the same product that Professor Blocken and his team used in this simulation. How did they do it?

Easily described, again. The lead PhD researcher sat on his racing bike, and the researchers 3D-scanned him and the bike. Then they laid out that model in the virtual wind tunnel of the simulation, making 121 copies and arranging them in a peloton-like formation. CFD’s consistency of approach across disciplines then governed their decisions over how many data points they wanted – that is, how many specifically designed volumes of air they wanted to track to make up their model. You can see the 2D interpretation of these regions in the picture of the anonymous cyclist’s helmet and face on this page. As a general rule, the closer to the surface of the object, the smaller the cell used by the model.

So let’s see what this does to my point about people mis-estimating the scale of a computing project that models or receives data from the real world. 121 virtual cyclists, all identical. How many cells were in the resulting CFD model? No cheating. See where your intuition leads you. A few thousand per model, times 121… can’t be that big, right? The answer is three billion. This is currently the world’s largest computational fluid dynamics model, requiring 54 hours of runtime on a Cray supercomputer the size of a music festival toilet.
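It’s worth doing the arithmetic, using only the figures above, because it shows just how far intuition falls short. Taking “a few thousand” as, say, 5,000 cells per rider (my placeholder for the intuitive guess, not a figure from the project):

```latex
% the intuitive guess: a few thousand cells per rider
121 \times 5000 \approx 6 \times 10^{5} \text{ cells in total}
% the reality, working backwards from the published total
\frac{3 \times 10^{9}}{121} \approx 2.5 \times 10^{7} \text{ cells per rider}
```

Intuition undershoots by more than three orders of magnitude – the mis-estimation problem in miniature.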

The assembled journalists were possibly somewhat heat-stroked at the start of the summer of 2018, because the questions floated to and fro between the rules and habits of competitive cycling and the metaphysical limits to computation. The most relevant question came from a sports writer, who pointed out that cycle racing isn’t so regulated that all the riders are identical, and with the figures coming out of both the model and the matching wind-tunnel tests, this might be enough to neutralise the proposed gains. Whirling feet and legs, different physical statures, bike design: each factor is worth a few per cent. The riposte was that both the model and the real-world data point to a rider in the middle, towards the back of the peloton, expending only 10% of the effort required to ride alone.
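For readers who want the link between “effort” and aerodynamics made explicit: at racing speeds nearly all of a rider’s power goes into overcoming air resistance, and the textbook drag relation (standard fluid dynamics, not anything taken from the Eindhoven model) has power scaling with the cube of speed:

```latex
% aerodynamic power: drag force times velocity
P = F_d \, v = \tfrac{1}{2} \, \rho \, C_d A \, v^{3}
% \rho: air density; C_d A: effective drag area; v: speed
```

At the same speed v, needing only 10% of the solo power implies the effective drag area C_dA in the sheltered spot has fallen to roughly a tenth of its solo value – which is exactly the quantity both the simulation and the wind-tunnel runs were estimating.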

This might mean something for cycle race freaks. I confess that while this seemed like a revelation to the sports reporters in the room, I was still boggling over the IT project part of the story. Mostly my astonishment was at the horsepower-to-findings ratio. Two full days on a Cray with a decent amount of connected storage is more time than it takes to do a weather forecast for a large swathe of the Earth’s surface.

I’m sure that Cray would demur and point out that there’s a lot of customisation of the machines for each job. I don’t think that weakens my point at all, because of the opposing, simplifying force of the universal nature of Ansys – the CFD modelling tool used in the project. You could approximate the model with far fewer points on much smaller hardware, but then you wouldn’t get the attention of the CFD community in academia and business. And no doubt there was a bit of grandstanding here – because CFD is a field where the same basic maths can be re-applied in different models of reality. If you get yourself a reputation in one field, you can easily apply what you’ve learned to others.

This is a pretty consistent trend in supercomputing. When you look around at the bigger players, you find they’re bursting with pet projects. Cray sent a couple of people to the press conference in Eindhoven because, while its normal stomping ground is weather forecasting and analytics out on the edge of what’s possible, it still makes sense for the company to show an interest in more esoteric fields of research and computation. A discussion about cycle racing can spread freely across businesses that would otherwise be terrified of losing their competitive edge if they said so much as a word about their own internal modelling projects.

If you’re a guy in a business trying to make a product work better with materials or performance modelling, then I know this looks like an unattainable, extreme, academic exercise with no relevance to your business, problem or, indeed, budget.

For me, there are two takeaways here. The first is that initial estimates are remarkable mainly for their inaccuracy, and this project is a great thought experiment to put in front of those who might not stay awake through the complexities of a fully detailed, business-grade modelling presentation. The other is that the limits to maths are far closer than we think (so there’s proof of the initial estimate problem, too), and therefore the limits to computing are still going to be a problem – and a field rich with opportunities for the foreseeable future.

More unsung heroes

Let’s hear it for the older Mac aficionados. It’s hard to recall how productive people could be, especially in the design world, working on Macs whose CPUs would these days come bottom of a comparative review of smartphones. I still have a PowerPC Mac tower sitting in the basement, saved because it has installs of all the mainstay applications from back in the day: Quark, InDesign, Photoshop, Illustrator... The older versions aren’t licensed like the modern ones, which makes the temptation to keep those old fossils staggering along rather too strong to be denied.

There’s very little reliable information on how long such a machine should last, and what you can do to help it. On the basis of the response I’ve had to my recent “Unsung Heroes” roundup of Windows utilities, and the equally surprising longevity of a piece I wrote on alphr.com about how to revive your ancient iPod, here are a few top efforts for Mac users who want to extend the life of their hardware.

First off, disk duplicators. You can’t be in the life-extension business and be on your original hard disk, and the Mac is no exception. There are two contenders: SuperDuper! (shirtpocket.com) and Carbon Copy Cloner (bombich.com). Students of the software business will have noticed that these two lifelong competitors share a somewhat cantankerous attitude to meaningful website domain names. They also share an odd blind spot, in that neither explains the way Macs format and partition their disks and how this affects the backup and restore process.

This isn’t necessarily about GUID partitions: it’s the general observation that you can’t fully clean up an ex-PC hard disk for use in a Mac, new or old. This applies across SSDs, laptop drives, even the hybrid SSD/spinning types that get called “Fusion drives” by Mac types. They all have to be cleaned before they can be used – by a PC. I believe this tiny gap in the toolbox on Macs is the genesis of the first wave of malware cleanup utilities, because messing about with a few bytes in the boot blocks of a hard disk ought to have been fixed 10 to 15 years ago.

But it hasn’t been. Neither backup utility lets you handle the problem inside its own menus, which drives people to look for any solution that isn’t typing “clean” into the command line interface to the Windows DiskPart utility on some mate’s PC.
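For reference, the incantation everyone is trying to avoid is short, if brutal. A typical session looks something like the sketch below – the disk number is an example only, and DiskPart’s clean really does zero the partition table, so selecting the wrong disk is catastrophic:

```
C:\> diskpart

DISKPART> list disk
(identify the ex-Mac disk by its size before going any further)

DISKPART> select disk 2
(an example number only - triple-check against the list above)

DISKPART> clean
(wipes the partition table and boot blocks, leaving a blank slate)
```

After that, the disk can go back into the Mac, and Disk Utility will happily initialise it.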

Sadly, there are also departures from this sector in areas that remain useful and will be sorely missed. I was a fee-paying customer of Coriolis Systems, mostly for iDefrag and iRamDisk – two utilities that can definitely rescue an Apple machine that’s used every day, but which have plainly suffered from a lack of upkeep themselves. If you look for Coriolis now, you’ll see that the firm is concentrating on audio enhancers. iDefrag is at least still findable.

Simple downloads such as these set expectations among users, and by doing so opened the door to unprincipled developers, who majored on snooping around your machine but didn’t do much to actually help you. While the App Store in macOS solved much of the malware problem almost overnight, that doesn’t help those trying to keep ancient Macs staggering on. Many of the machines I see are restricted from running a late enough release of macOS to give access to the App Store, so you have to fall back on the traditional resources for finding helpful utilities. And this month at least, Google is showing up a new player on the block: macpaw.com.

Trying MacPaw’s cleaner app was a bit of a blast from the past. It folds together the actions of several older utilities. In particular, it gets rid of languages you aren’t using, which is a straight copy of Monolingual. PC users are a tad incredulous when shown how much space on a Mac boot disk is devoted to unused language files, and combining a run of Monolingual with a defragmentation utility could produce remarkable improvements on an ancient Mac. MacPaw’s features seem to be closely aligned to that whole ancient-machine experience: it knows about the spread of junk files that arise from years of use, and makes reassuring statements about cleaning up after apps are removed rather better than the standard processes permit.
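If you’d like to see the scale of the problem before handing it to a third-party cleaner, the language bundles live inside each application as .lproj folders, so a rough tally from Terminal might look like this – assuming the standard app bundle layout, since some applications stash their localisations elsewhere:

```
# rough grand total of localisation bundles across installed applications
du -ch /Applications/*.app/Contents/Resources/*.lproj | tail -1
```

On a long-serving design Mac with the big suites installed, the total can be startling.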

But I tried MacPaw on my old machine – a Mac Pro 2.1 with two four-core Xeons, courtesy of a decommissioned HP server, and an Nvidia Quadro graphics card – and almost immediately took it off again. The reason is one of those problems that dogs everybody in the modern world: actually verifying the cause of a problem. In my case, after using MacPaw, my machine displayed a tendency to slowly but surely ramp up the graphics card fans as the day went on. I’d taken advantage of the fact that many appropriately aged Nvidia cards sold for PCs include Mac-capable firmware too, and so my old machine is quite souped-up for graphics. That means double fans and a more than capable heatsink.

Post-MacPaw, those fans have been putting in a lot of work. Slowly rising graphics card heat is a sign of the modern scourge of cryptocurrency mining, embedded within some component you’ve downloaded. If I’d tried the application on the upstairs Mac, which is some three generations later, the greater efficiency of its graphics card would very likely have masked the effect entirely.

My problem is that, old or new, unsung or otherwise, it’s remarkably difficult to get a good, traceable trail of infection out of any machine these days. I don’t want to have to start monitoring the machine’s network traffic, despite having a hardware LAN tap and the relevant install of Wireshark to hand, because emulating how a regular single-machine user could resolve this dilemma is part of my brief. I’d like to say I’ve found a new breed of unsung hero but, as we go to press, I can’t say for sure whether MacPaw is the source of my problem or – after a few more rounds of cleaning and de-junking – the resolution to it.
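For the record, if I did give in and put the tap to work, the first pass would be something like the capture below, using tshark (the command-line sibling that ships with Wireshark). The ports are an assumption on my part – 3333 and 4444 are common defaults for Stratum mining pools, not anything observed from this machine – and a competent miner can hide inside ordinary TLS anyway:

```
# capture only traffic to common Stratum mining-pool ports (assumed, not observed)
tshark -i en0 -f "tcp port 3333 or tcp port 4444"
```

A clutch of long-lived connections to the same faraway host on ports like these would be a smoking gun; silence would prove nothing.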

Despite the moves made by Apple to resolve this kind of diagnosis gap, with an App Store and digital signatures, it seems to me that life for the older Mac aficionado hasn’t been getting any easier. cassidy@well.com

@stardotpro Steve is a consultant who specialises in networks, cloud, HR and upsetting the corporate apple cart
BELOW Cyclists know they expend less energy riding in a large group, but how much less?
ABOVE The university used computational fluid dynamics to work out the effect of the peloton
BELOW What better location for cycling research than a university in the Netherlands?
ABOVE The quality of third-party utilities for macOS is still too variable for my liking
BELOW Got an old Mac? It isn’t easy to keep it going