PC Pro

JON HONEYBALL

After years of loyal service, it may be time to leave Dropbox. Plus, some words on the blind stupidity of Office 365 and those who misuse it


Don’t tell Intel, but Microsoft is working on a custom CPU design, called E2. Not much is known about it, but researchers have been working on the project for some time. And they’ve ported Windows and Linux to it, along with a bunch of development tools. The big chip firm Qualcomm might be involved, too.

The work is being done at Microsoft Research, which published a paper about it. That paper has since been deleted, but you can grab a copy via the web archive at pcpro.link/288e2. It says:

“At the heart of E2 is an advanced Explicit Data Graph Execution (EDGE) instruction set architecture (ISA), which, unlike conventional ISAs, encodes the data dependencies between instructions, freeing the microarchitecture from rediscovering these dependencies at runtime, and groups instructions into atomic blocks (similar to transactions), providing a larger unit of work, and allowing the microarchitecture to tolerate growing wire delays. These two ISA features enable E2 to utilise a dataflow execution model, providing power-efficient out-of-order execution.”

It also says: “E2 is configurable to provide many physical cores working independently; many physical cores working in parallel to perform the same operations on multiple data sets simultaneously; many physical cores composed together to form logical processors to accelerate single-threads of execution. Core fusion allows E2 to span a wide power/performance spectrum, from power-efficient embedded processors to high-performance server-class processors.”

Microsoft has apparently said that this is all just research, the sort of thing that MSR does all the time. It has no plans to bring this to market any time soon.

However, the words that keep leaping off the page at me are “power-efficient”. Mix in the rumours that Qualcomm is involved, and we have an interesting possible route forward. So let’s spin up a big conspiracy theory, with a spoonful of conspiratorial arm-waving.

Despite the value of the relationship to both parties, there’s always been a love-hate relationship between Microsoft and Intel. Intel was bruised that Windows NT was deliberately designed to be hardware-agnostic, originally running on Intel, MIPS and then Alpha and Motorola. Let’s not ignore the influence of Dave Cutler, head of NT, here – given his previous role at Digital on both Alpha and VMS.

Questions still linger about where the 64-bit x86 CPU design came from – it almost magically appeared from AMD, Intel’s competitor, shortly after Alpha was cancelled for Windows 2000 and while Intel was pushing its Itanium platform. And then there was the whole screw-up by Intel over low-power chipsets, leading to the world moving to the ARM processor design for mobile phones and tablets, which effectively killed attempts by Microsoft to push Windows into that space. Then came the aborted Windows 8 RT port to ARM, which was quietly dropped because the OS, tools and available apps were a mess. And now we have the Snapdragon chipset running x86 code on Windows 10 (see p58) – again, an Intel rival.

Now, almost everything in the previous paragraph can be viewed from many directions, and spin applied to suit any position. But Microsoft and Intel aren’t in a committed, loving and stable relationship. With the move by Microsoft to do its own hardware, in the shape of the Surface family, it makes sense to ask the question “why stick with Intel?” After all, Microsoft has money and resources in abundance: effectively an unlimited amount of both. Rather than working with ARM, why not come up with a custom chipset just for itself? This makes for an excellent Sunday morning conspiracy theory to ponder over a large mug of coffee.

Of course, the reality might just be simpler: it is indeed a research project, with no product deliverable in sight. But if I were running Microsoft, I’d want to follow Apple’s lead into owning all of its own silicon. Where does Microsoft want to be in 2030? That’s the first big question. The second: is Intel the capable partner to deliver this?

Time to drop Dropbox?

I’ve been a Dropbox user for years. We use it in the lab to sync between workstations, laptops and servers, and it provides excellent transport for connecting to multiple site archive boxes, mostly running on Synology. Almost everything of importance is stored in our Dropbox for Business account. But recently, I’ve been getting somewhat annoyed with it.

First, for reasons I’ll explain in a minute, I had to rebuild the Dropbox installation on my main desktop machine. That required pulling down about 2TB of data. This wasn’t a big issue; the lab has a 1Gbit/sec fibre-to-the-premises line.

The process is simple: install the app, log into the account, and sit back while a miracle occurs. Which is fine, except that the reporting tools that appear during this process are truly appalling. It will cheerfully tell you that there are several thousand files outstanding. And then that there are 27 seconds left. Which counts down to 0, and then starts up again, but now at 14 minutes. Or 2 days. Or 9 seconds again. Frankly, this is amateur-hour coding, and Dropbox needs a radically better administrative view of what is actually happening. The lack of any sort of meaningful logging and reporting is a huge hole in the side of the boat when working with both a lot of data and a multi-machine business environment.
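
As an aside, that bouncing countdown is the classic symptom of dividing what’s left by the instantaneous transfer rate. Even a crudely smoothed rate gives a far steadier figure; here’s a rough Python sketch of the idea (the class name and smoothing factor are mine, purely for illustration):

import time

class SmoothedEta:
    # Estimate time remaining from an exponentially smoothed transfer rate,
    # rather than the instantaneous rate that makes countdowns jump around
    def __init__(self, total_bytes, alpha=0.1):
        self.total = total_bytes
        self.alpha = alpha              # lower alpha = steadier estimate
        self.rate = None                # smoothed bytes per second
        self.done = 0
        self.last = time.monotonic()

    def update(self, bytes_done):
        now = time.monotonic()
        elapsed = now - self.last
        if elapsed <= 0:
            return self.eta()
        instant = (bytes_done - self.done) / elapsed
        # Exponential moving average: blend the new sample into the old rate
        self.rate = instant if self.rate is None else (
            self.alpha * instant + (1 - self.alpha) * self.rate)
        self.done, self.last = bytes_done, now
        return self.eta()

    def eta(self):
        # Seconds remaining, or None until we have a usable rate
        return None if not self.rate else (self.total - self.done) / self.rate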

Also a pain is the implementation of shares. We use them all the time when sending data to a customer. Right-click on the ZIP file, choose Copy Dropbox Link, and paste it into the email. The customer can click on the link and download the file. It’s simple, it’s reliable and it works.

And that’s just fine for sharing some photos of your cat with your grandmother. However, in a business context you really want to be able to ensure that the link is valid only for a short period of time, after which it expires. Or is made single-use, so it works only once.

All of that is missing from Dropbox. If you go to the web interface, you can of course set up a share, and there you can set such important time-out values. It’s just that the Mac and Windows clients don’t support that bit of useful functionality.

I spoke to Dropbox, saying that as administra­tor of the Dropbox for Business account, I should at the very least be able to set a default for all share creation. Say, three days and three uses, default across all accounts. Dropbox wriggled and said that this wouldn’t be right for all users, at which point I reminded them what the term “default” meant – other customers could have a different default, including indefinite life for the link, if they so desired. Apparently the idea will be considered, but I’m not holding my breath.
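
In fairness, the expiry capability clearly exists on Dropbox’s servers, and as far as I can tell it’s exposed through the Dropbox API even though the desktop clients ignore it. Here’s a minimal sketch using the official Python SDK – the access token and file path are placeholders, and expiring links need a paid plan, so treat it as illustrative rather than gospel:

from datetime import datetime, timedelta
import dropbox

# Placeholder token and path, purely for illustration
dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")

# Ask for a shared link that stops working after three days
settings = dropbox.sharing.SharedLinkSettings(
    expires=datetime.utcnow() + timedelta(days=3))
link = dbx.sharing_create_shared_link_with_settings(
    "/customer-handover.zip", settings)

print(link.url)  # paste this into the email instead of the client's link

It’s no substitute for doing the job properly in the desktop client, but it does show the plumbing is already there.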

So I’ve been looking at alternatives. The obvious first place was Synology, because it provides the storage infrastructure across all our sites. It has a rather neat tool called Synology Drive, which acts just like Dropbox or equivalent. The storage isn’t in the cloud, but on your local Synology NAS box. It, too, has a Sharing Link facility: right-click and the UI offers a cloud-accessible Share link via gofile.me. You can set a password on the link, and a validity period that can include a specified start date/time and an end date/time. Plus a limit on the number of accesses, too. Frankly, this is far more comprehensive and easier to access than the dumb Dropbox capability.

So I’m pondering: do I move the lab entirely off Dropbox? I’d be moving to the Synology platform, which I like. I could use Synology’s own sync and file replication snapshots to keep the various NAS boxes in sync across all sites. I get a better Share file function. I get better resync and control. And I don’t have to pay the thousand quid per year I’m spending on Dropbox for Business. Oh, and my data stays on my servers, not on Dropbox’s servers in the USA. I’ll be exploring this in more depth over the next month or so, but Dropbox has, in my eyes, dropped the ball and lost its way on multiple fronts. It might well be time to move.

Eight years of Thunderbolt

Some eight years ago, I bought a pair of 12TB (6 x 2TB) Thunderbolt arrays from Promise. They’ve been doing totally reliable work for years, requiring only a couple of disk replacements. The primary one started glitching, throwing up reports that one of the hard disks had a minor error and was restarted within the array. Then the reports became more frequent. I knew something was wrong, because the performance of my iMac had fallen through the floor.

I ordered an overnight delivery of a drive to replace the offending item. When it arrived, I pulled out the dying disk and popped in the new one. RAID rebuild started, and progressed up to 35% complete, where it sat. The minor error issue had now moved to a different disk in the array. At this point, I harrumphed and decided to rebuild the storage around a new array. Looking at the options, I decided to try the new LaCie 6Big array on Thunderbolt 3. This immediately presented a problem: the array is Thunderbolt 3, yet my ageing original iMac 5K is Thunderbolt 2. It turns out that the Apple Thunderbolt 2 to 3 adapter can be used either way around: you can connect a Thunderbolt 2 device to a Thunderbolt 3 computer, or a Thunderbolt 3 device to a Thunderbolt 2 computer.

I’m quite impressed by the LaCie 6Big so far. It has a good management tool, the performance seems solid if not stellar, and it works. Why didn’t I go for the new Promise Thunderbolt 3 unit? It wasn’t available at the 36TB size I wanted at the time. I’ll be wanting to replace the second old Promise unit in the next few months, and will almost certainly go back to Promise for that one. Diversity is a good thing.

Talking of diversity, the new Dell XPS 27 all-in-one desktop unit I mentioned last month has a Thunderbolt 3 port. So I bought a rather useful external Thunderbolt 3 desk multi-port adapter. I’ve found it to be worthwhile, and it seems that Thunderbolt 3 is at last useful on a PC.

And lastly on the subject of Thunderbolt, long-term readers will remember my excitement at finally being able to buy Thunderbolt 2 fibre cables some years ago. I ended up buying three of these cables, and they’ve proved to be very useful.

However, two of them have become somewhat unreliable. At the recent NAB conference in Las Vegas, I managed to speak with Mark Bradley, director of emerging applications at Corning, the maker of the cables. I’ve known Mark since the first days of Thunderbolt, and he’s a fine ambassador for the company’s products. I explained the unreliability issues that I was starting to experience, and he suggested that the cables might be wearing out. Well, not the fibre itself, but the super-clever transceivers built into each plug. Five or six years of continuous operation, especially on a product with a hot chassis such as an iMac or Mac Pro, can cause accelerated wear. Mark kindly offered to replace all three cables, which is a level of support and kindness that goes beyond the normal. The cables arrived this week, and I shall be trying them out shortly.

Blind stupidity in Office 365

A university in London recently appointed a new professor of information technology. The press release stated: “If people don’t understand IT, they will sleep-walk into more data-breaches, privacy violations and technological misapprehensions than already make the front pages. In this series of lectures, I want to challenge those misunderstandings and present a more balanced picture of computer science.”

Well done, sir. Clearly, there isn’t enough understanding about the sorts of data breaches and privacy violations out there. So let’s start with your communications manager, who managed to send out this self-congratulatory pronouncement to hundreds of people, with all of their email addresses listed in the CC field rather than the BCC field.

As you can imagine, this resulted in much banging of my head. I replied, suggesting that it would be helpful if the professor explained what had happened, given that he was an expert on the matter. Now, you’d think the communications manager would say, “oh my goodness, what a stupid error, please let me apologise” and send this out to every recipient of the original email – but this time using the BCC field correctly. But no, clearly that isn’t the sort of communications they like to manage.

Instead, I was told that they had obtained the list from Gorkana, a well-known media database, and that they’d tell Gorkana that I wasn’t to receive information from them in the future. No worries; I doubt I’d lose any sleep at being cut off. So I was somewhat intrigued to receive an email from Gorkana asking if I really did want to have my account closed, as requested by said communications manager. I tersely replied that I did not, copying in the email thread. Gorkana replied, saying that my account wouldn’t be closed and that “we’ll make sure our client understands how they should operate, so hopefully a similar issue won’t happen again”. Which I think means “it’s spanking time”, both for the misuse of the original data set and for the attempt to have my Gorkana account closed.

But it does raise the obvious question: how is it possible, in 2018, to put hundreds of email addresses into Office 365 and splurge out such a mess? It turns out that there’s no setting in Office 365 to effectively limit the number of people who can be on a CC list, or to re-route the email to an internal administrator if something goes wrong. I’ll be emailing my good friends at exclaimer.co.uk to see if this is something they can add into their toolset.

Time Machine stole my space

My Mac Pro has 1TB of super-fast internal storage, so it came as a surprise to discover that some 700GB of it had gone MIA. Some tools suggested that this was hidden space, and that the OS had swiped it for Time Machine snapshots. This is a new feature of the latest version of macOS, and I discovered the issue using the rather good DaisyDisk tool.

Some digging around the net gave me a hint of what to do. Opening up a terminal window and typing “tmutil listlocalsnapshots /” will tell you what Time Machine snapshots are lurking on your disk. If you want to kill them off, use “tmutil deletelocalsnapshots” followed by the date in each snapshot’s name. Magically, my 700GB of storage reappeared. I’d suggest keeping an eye on this.
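
If you’d rather script the clean-up than pick through the dates by hand, a rough Python wrapper around tmutil along these lines does the job – note that it assumes snapshot names of the form com.apple.TimeMachine.YYYY-MM-DD-HHMMSS, and that deletion may need admin rights, so check it against your own machine first:

import subprocess

def local_snapshots(volume="/"):
    # List local Time Machine snapshots and return just their date stamps
    out = subprocess.run(["tmutil", "listlocalsnapshots", volume],
                         capture_output=True, text=True, check=True).stdout
    dates = []
    for line in out.splitlines():
        # Lines look like: com.apple.TimeMachine.2018-05-14-123456
        if "com.apple.TimeMachine." in line:
            dates.append(line.split("com.apple.TimeMachine.")[-1])
    return dates

def delete_snapshot(date):
    # Delete one local snapshot by its date stamp
    subprocess.run(["tmutil", "deletelocalsnapshots", date], check=True)

if __name__ == "__main__":
    for d in local_snapshots():
        print("Deleting snapshot", d)
        delete_snapshot(d)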

@jonhoneyball Jon is the MD of an IT consultancy that specialises in testing and deploying hardware

BELOW I wonder what Microsoft might be keeping under wraps…

ABOVE I’ve been a loyal customer of Dropbox for years, but it might be time for a change

BELOW Why switch from Dropbox? Because Synology Drive does such a good job

BELOW That moment when you realise you’ve cc’ed several hundred people in an email instead of bcc’ing them...

ABOVE The LaCie 6Big array isn’t the fastest around, but the management tool makes up for this
