This is an archive article published on February 20, 2008

Double standards are good


At the time of European unification, I remember reading a report about the problems that unification and standardisation created. One of these problems concerned railway lines: down the years, most railway lines in Europe had been laid with the objective of keeping the Germans out, leading to disparate gauges. Now that this objective had been junked by history, railway lines had to be re-laid. Those who remember travelling to the Northeast several years ago, or even elsewhere, will also recall the nuisance that passengers and freight had to confront because of trans-shipment from broad gauge to metre gauge. International travel is still a nuisance because of the 110 volt/220 volt difference. The first thing one does is look for compatible plugs and adaptors. Had there been common electrical standards, the world, and consumers in particular, would have been better off. As consumers, we want our devices to plug and play, and this becomes even more important in a globalised world. Since standards originate within industry, this standardisation sometimes occurs naturally. At other times, it doesn’t. It hasn’t happened for electrical standards, but PCs and mobile phones don’t confront the voltage problem, though they do confront the plug problem. Haven’t you ever had to charge your mobile in the bathroom?

Compared to electrical standards, consumer electronics have evolved towards harmonisation more readily. Cameras, even digital ones, are a case in point, though memory cards differ and are often mutually incompatible. Nor have mobile phones quite evolved towards common standards. This raises several issues. First, common standards are better from a consumer welfare perspective. However, since technology evolves through industry action and different firms within the same industry are often driven by competitive motives, standards are bound to differ. Therefore, instead of looking for a common standard and straitjacketing technological evolution, we should perhaps be more realistic and accept the existence of multiple standards. But we should want these standards to be compatible and work with each other, and this is the inter-operability issue. Why should I care whether the standards used by my cell phone are the same as those used by my PC or DVD player, as long as they work together? Second, what role do we want the government to play in this? There can be international agreements, but those have little sanctity unless they have government backing. The WTO’s SPS (sanitary and phytosanitary measures) and TBT (technical barriers to trade) agreements are instances. Had there not been WTO agreements to provide the backbone, these would have had little sanctity.

Who would have cared about the International Organization for Standardization (ISO) otherwise? But bringing in the government is fraught with problems. No one can anticipate advances in technology, least of all the government. Nor is it always the case that the ‘best’ technology, however defined, wins in the marketplace. VCRs offer the best-known counter-example: VHS prevailed over the arguably superior Betamax. The average life-cycle for technology is short, even more so for software. But the underlying data may need to last for decades, and one shouldn’t get stuck with a single standard. The sensible course for any government is to offer choice in standards, including choice in inter-operability. Unlike many other countries, that’s the route we adopted for cell phones, and it can’t be anyone’s case that we are worse off. To the best of my understanding, there is a difference between inter-operability and the open source versus proprietary software angle. In any event, the difference between open source and proprietary software is getting blurred. Indian government action reveals a preference for open source, often perceived as free software, though quantification of both costs and benefits remains suspect. But let’s leave that debate aside. Does open source automatically guarantee inter-operability?


Evidently not; why else should an Open Solutions Alliance have been formed to ensure inter-operability across open-source software? Stated differently, inter-operability needs to be guaranteed across both types of software, and this brings one to the ODF (OpenDocument Format) versus OOXML (Office Open XML) debate. The ODF is primarily backed by Sun, IBM and Google, while OOXML is Microsoft’s. The ODF has already been approved by ISO, after a six-year review, while OOXML was rejected as an international standard in September 2007. But there is a review (the ballot resolution meeting) coming up on February 25, 2008, and countries can change their original votes. India, through the BIS (Bureau of Indian Standards), voted against OOXML in September.

Let’s separate the arguments against OOXML. First, since ODF already exists, why do we need a second standard? That’s not convincing: there are several areas (including digital pictures) where more than one standard exists. And the ODF and OOXML don’t quite have the same end-user in mind. Second, costs will be higher with OOXML. This may be overstated, since costs of ODF are underestimated and those of OOXML overestimated. Third, in September, why did Microsoft try to push through a fast-track approval process in 30 days? There simply wasn’t enough time; the document had 6,000 pages. Incidentally, ISO does allow fast-tracking, and more time has elapsed since September.

Fourth, there are technical problems with OOXML. That’s a fair point, and these should be addressed by the European Computer Manufacturers Association (ECMA), which sets global IT and consumer electronics standards and has also recognised OOXML. For OOXML to be recognised on February 25, two-thirds of those voting must agree, and not more than one-fourth should say no. In September 2007, 53 per cent said yes and 26 per cent said no, so both conditions failed. When India votes now, if technical objections have been met, BIS should vote for choice, not for its lack. Yes, there is non-transparency in determining global standards, and often, in SPS and TBT cases, developed-country standards have been pushed down developing-country throats. But that seems to be a non-issue in the present case, despite the US having voted in favour of OOXML in September. And one should be realistic. ISO recommendations aren’t mandatory. India, or any other country, doesn’t have to accept them. Almost certainly, regardless of what ISO decrees, both ODF and OOXML will be around and perhaps will eventually converge. In a long-term sense, the ISO vote is irrelevant. But in the short and medium term, there are problems of transition. In addition, the Indian vote should be judged on its philosophical merits. Has India voted for competition and choice, or not?
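The approval arithmetic described above can be sketched as follows. This is a simplification of the actual ISO/IEC JTC 1 rule (which distinguishes P-member votes from total votes cast); the function name and percentage inputs are illustrative, not part of any ISO procedure:

```python
# Simplified sketch of the two approval conditions quoted in the article:
# at least two-thirds of votes cast must be "yes", and no more than
# one-quarter may be "no". Both must hold for the standard to pass.

def passes_iso_vote(yes_pct: float, no_pct: float) -> bool:
    """Return True only if both approval thresholds are met."""
    return yes_pct >= (2 / 3) * 100 and no_pct <= 25

# September 2007 figures from the article: 53% yes, 26% no.
# 53 falls short of 66.7, and 26 exceeds 25, so both conditions fail.
print(passes_iso_vote(53, 26))  # False
```

On these numbers, OOXML would have needed both a substantial swing of "yes" votes and at least one "no" vote withdrawn at the ballot resolution meeting.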

The writer is a noted economist. bdebroy@gmail.com
