Whatever happened to FireWire?
USB’s main rival in the early years was FireWire. In 1987, Apple, IBM, and others started working together on a new high-speed interface. By 1995, it was ready, and it was something of a triumph. FireWire – or IEEE 1394, as it is officially known – could manage 400Mb/s, supply up to 1.5A at 30V, and daisy-chain up to 63 hot-swappable devices. Apple added it to its Macs. Sony used it on its first generation of semi-pro digital video cameras. Microsoft and Intel took an interest. It looked as though it was going to be a thing.
On the cusp of success, the wheels started to fall off. The collaborative effort had also produced 261 patents spread across 10 companies. Apple decided it wanted a $1-per-port royalty. A fuss was made, and eventually the fee was cut to 25 cents per end product, to be shared among all the patent holders. The damage was done, though. Intel pulled out in a huff and added USB support to its motherboard chipsets rather than FireWire. USB had reached version 2.0 by this point, and the speed difference had eroded. Motherboard manufacturers could add USB at very little cost, while supporting FireWire meant fitting an extra controller chip and paying royalties. Most didn’t bother. Peripherals went with USB, and that was that.
A faster version – FireWire 800, which doubled the speed in 2002 – couldn’t save it. In 2008, Apple started dropping FireWire from new Macs; 2012 saw the last FireWire-equipped Apple product.
There were other issues that didn’t help: incompatible connectors for each iteration, and confusing names (Sony called its implementation i.LINK, for example). FireWire survives here and there, mostly in digital video, but as a mainstream PC technology, it is dead. Royalty payments can make you a fortune – just ask IBM. However, they are always resented. If the success of a product is in the hands of others, then asking for substantial royalty payments is also asking for trouble.