At a press conference in April, the Santa Cruz Operation announced an
incremental step in the movement, begun last October, to develop new
generations of the Unix operating system. This time SCO shared the stage
not with its allies Hewlett-Packard and Novell, but with a group of OEM
systems vendors: Data General, ICL, NCR and Unisys, and with Intel, which
will supply the processors on which the systems will run. Also supporting
the announcement via videotaped statements were OEM partners Compaq, Olivetti
and Siemens Nixdorf, and an array of independent software vendors (ISVs),
most prominently Oracle.
The purpose of the gathering appeared to be to reassure the world that the
development process is going according to plan. All the OEMs present were
already known to be involved. SCO UnixWare version 2.1--the first release
under SCO's aegis--had been released in February. The merged version of
SCO's own Open Server and UnixWare (code-named Gemini) is still scheduled
for next year and the 64-bit system, being developed with HP, for 1998.
There were no surprises at the event, but at this stage in a long-term process
no one wants surprises. Rather, whatever significance there was lay in demonstrating
that the partners are avoiding the wrangling and contradictions that marred
past Unix vendor collaborations.
The announcement focused on Unix systems running on Intel chips and the
goal of establishing a volume market for them. As such, there was no reason
for HP, Sun Microsystems or other major vendors of RISC-based systems to
appear.
The OEM Position
In these days of atrophied hardware margins, second-level systems vendors
such as these OEMs are unwilling to bear the full weight of development
costs that don't demonstrably add value to their products. "Their orientation
is not in technology leadership," says Alok Mohan, SCO president and
CEO. "The cost of developing Unix has been constantly escalating. Economically
it becomes untenable." SCO, its staff enhanced by the experienced Unix
developers inherited from Novell, has taken on that role.
The spin that SCO and others brought to the announcement was that SCO UnixWare
will compete in the "midrange enterprise market." For larger "enterprises,"
this phrase describes primarily the middle tier of three-tier architectures.
Although the participants would not say so, their effort at this point seems
to have as a main thrust the attempt to slow the incursion of Microsoft
Windows NT onto such second-tier (or departmental) servers.
This impression was strengthened by the speakers' vow to attract large numbers
of ISVs to the unified platform. Portions of the Unix industry have long
envied Microsoft's domination of ISV loyalties. Once again, this initiative
will try to attract ISVs to develop popular applications on--or at least
port them to--Unix by promising a single development target that offers
binary compatibility on multiple platforms. As in the past, the viability
of high-volume sales of Unix depends upon application availability.
Although SCO is driving the process, its OEM partners are not merely passive
recipients of the technology. All have substantial investments in their
own Unix variants, even if they don't want to continue to go it alone. (Each
also resells Open Server as an option.) Harmonizing the old with the new
will be a delicate matter. No one will want to give up features they see
as differentiating them in the market.
"There will be a challenge about where people do standards and where they
do value-add. It'll be a tough balance for all vendors," says Philip
Johnson, director of advanced operating environments for International Data
Corp. (IDC) in Mountain View, CA. SCO finds itself in the middle of this
seesaw. "A lot will be determined by how well SCO delivers on commitments
they've made," Johnson says.
Can They All Get Along?
A key question is what the OEMs will contribute to SCO UnixWare from the
higher capabilities of their own established Unix variants. For example,
DG, ICL and NCR all indicated that they want to assist in adding features
for clustering and fail-over, including non-uniform memory access (NUMA).
How will the partners decide which implementations to adopt or combine?
How will the winners sell the losers on the outcome?
Add to this dilemma the necessity each vendor feels to reassure its own
installed base of customers that it will not desert their investments. They
may have to perform what Johnson calls "a juggling act between this
idea of the future and their current product lines." There exists a
danger of sending mixed messages or of having them misunderstood. Ironically,
the only one of the participants that does not also have a serious NT commitment
is SCO itself.
If some of this talk sounds familiar, that's because it is. Many customers
have stopped listening to promises of vendor "coopetition." The
various consortia, with their slow-moving, consensus-based processes, have
failed to stem the tide away from open systems. However, this case may be
different. It is not just "another arm of the standards movement, which
can't deliver products," said Ninian Eadie, ICL group director for
technology, based in London, at the announcement. He insisted that this
initiative will result in a product that is a de facto standard, responsive
to market demand.
Mohan of SCO also emphasizes that this is not a consortium activity. "This
cannot be a democratic process," he says. "There has to be a decision-maker.
We'll consult with the OEM partners, but in the end we will do what is right
for the business case. Everybody has to bend some, and they know that.
"When people are marching in step, it is harder for just one to take
a tangential path," Mohan continues. "The cost of taking a separate
path is higher now."
SCO on the Line
When Novell gave up its role in directing the future of Unix last fall,
SCO gained a centrality it never had before. Many eyes, both friendly and
hostile, will be judging its performance. "This is a bet-your-company
situation for SCO," says Johnson of IDC. "They're getting competition
from NT, and soon they'll be under some pressure from Linux [at the low
end]. SCO has got to move toward the enterprise, and this is the way to
do it."
Mohan takes a more sanguine view of his company's prospects. "We have
bought ourselves a range of possibilities," he says. "Even the
not-so-nice scenarios are better than where we were in the past."
So far, despite evident pressure, the Unix-on-Intel convergence has been
able to set its own pace. But the day of reckoning probably is less than
two years off, and it is in the hands of Intel, not SCO. "A key gate
exists at the shipment of Merced [Intel's next generation of chip],"
Johnson says. "It will ship even if SCO hasn't finished its work."
"We have to be there," Mohan acknowledges. "This is a highly
leveraged R&D model. A lot of the code we're talking about is already
available. We'll make big strides in the next 18 months."
--Jeffrey Bartlett
In these days of online rhapsody, the origins of the Internet in the
matrix of open systems often are taken for granted, if not ignored completely.
Perhaps, then, it is worth asking whether the Internet boom has been good
for open systems.
A chorus of analysts, observers and Unix industry veterans agrees that not
only has the Internet enhanced the momentum of open systems adoption, it
has become the new generation of open systems. To a large extent,
they say, the Internet has carried the open systems baton to a level of
success that few thought possible two years ago.
"It's a massive demonstration of what the power of open systems can
do," says David Bernstein, an independent computing consultant in the
San Francisco Bay Area and a former Unix developer with the Santa Cruz Operation.
"As kind of a giant, distributed, open operating system, it's the ultimate
accomplishment of open systems."
David Smith, research director of the Gartner Group in Nashua, NH, says,
"The Internet is the next bastion of open systems. Some people may
have thought open systems was a niche or a bunch of hype--which it was--but
now people can see some of the benefits without that hype."
It is a fact that Unix and related open systems technologies made the Internet
happen. "The Internet came out of TCP/IP and the interoperability that
was produced by the Unix community," says Doug Michels, executive vice
president and chief technical officer of SCO. "It embodies the spirit
of open systems that we've always tried to achieve. One of the tenets of
open systems has been to define interfaces where portability and interoperability
were guaranteed, and the Internet has made some of those interfaces more
obvious and more important."
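The guaranteed interfaces Michels describes are typified by the Berkeley sockets API, the portable face of TCP/IP. As a hypothetical illustration (not from the article), the sketch below runs a one-shot echo server and client over the loopback interface using only the standard calls — socket, bind, listen, accept, connect, send, recv — that any conforming TCP/IP implementation provides, which is why code written against this interface interoperates across vendors and operating systems.

```python
import socket
import threading

# Server side: bind and listen before starting the client, so the
# client cannot race ahead of the listening socket.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
port = srv.getsockname()[1]
srv.listen(1)

def serve_once():
    """Accept one connection and echo its message back."""
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

t = threading.Thread(target=serve_once)
t.start()

# Client side: the same portable calls work against any TCP/IP stack,
# whether the peer is a Unix box, a PC or anything else on the network.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port))
    cli.sendall(b"hello, open systems")
    reply = cli.recv(1024)

t.join()
srv.close()
print(reply.decode())
```

The point of the sketch is not the echo itself but that neither endpoint needs to know anything about the other beyond the socket interface — exactly the kind of interface-level guarantee the open systems movement aimed for.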
The New Generation
The Internet started with universal messaging--a concept so simple and so
much in demand that today electronic mail is taken for granted. Then easy
access to widely dispersed information became realizable through the interface
of the World Wide Web. Now applications themselves have begun to circulate
over the Web under the auspices of the Java programming language. The result
has been what some call a new universal computing client: the Web browser.
"The browser in effect becomes an environment to which information
can be written at the first revision level, and in time applications can
also be written, so it won't matter whether that browser is running on top
of a Macintosh, a PC or a Unix system," says Philip Johnson, director
of advanced operating environments for International Data Corp. in Mountain
View, CA.
The concept is carried even further by Rikki Kirzner, a director of Blanc
& Otus public relations in San Francisco and formerly an open systems
analyst with Dataquest. "The Internet made open systems irrelevant,
because it's a great leveler," Kirzner says. "It took the operating
system to the next level. In the same way that standardizing on operating
systems made hardware the black box, the Internet has made operating systems
an 'I don't care' situation. Now we don't have to worry about how to talk
to the operating system."
Johnson believes that the introduction of Java was a big step toward the
universal client. "While there's always been an interconnectivity created
by the Internet, what's changed is the Web browser hosting programming languages
and tools that go beyond the browser," he says. "Now you're not
just sharing data but having immediate online access to data. That's a full
generational shift. All of that is transparent and is done in a totally
open fashion, because all clients can get to it with equal accessibility."
Accepting What's Open
These changes also have ramifications for the dynamics of the IT industry.
"I see emerging a universality that we dreamed about," says Michael
Goulde, executive editor of the Patricia Seybold Group in Boston. "The
original goal of Unix was that any user anywhere could get to any data on
any system. Now, in just the couple of years that the Web has been around,
we've already overcome some seemingly insurmountable barriers, because proprietary
interests were not behind what was happening. Instead of being vendor-driven
and driven by proprietary strategic motivation, it was driven by what people
wanted to accomplish, which was supposed to be what open systems was all
about anyway."
A measure of the Internet's power to promote openness is that both Microsoft
and IBM's Lotus division, makers of proprietary PC applications, have been
forced to accommodate Internet standards. Microsoft has adopted TCP/IP as
the core networking protocol for Windows, supports the common Web protocols
and has announced that it will incorporate Java into its Web products. Lotus
Notes documents now can be rendered in an open systems Web format, Hypertext
Markup Language. "The Internet is an incredible phenomenon that even
these larger software companies have given in to," Bernstein says.
That's a dramatic turnaround. A short time ago, Microsoft and the binary
compatibility of its software with the PC threatened to overwhelm Unix,
especially after the introduction of the Unix-like Windows NT operating
system. The Unix world, with its emphasis on programming interfaces, still
had multiple platforms that required tedious porting of applications. "Microsoft
was able to dominate with a proprietary system because of people's desire
to have a binary standard," Bernstein says. "But now that the
network has become the important platform, away from the individual computer,
it's more important to be connected in order to run something. The center
of compatibility has moved away from the machine. That has provided a new
lease on life for Unix systems, because their ability to adapt to the network
is better than any other operating system's."
The result is that the Internet has carried not only the open systems banner
but Unix along with it. "The single-machine binary standard, which
nearly rolled over Unix like a truck, has now gotten rolled over itself
by the synergistic connectivity standard," Bernstein says. If that's
true, the Internet may not have received enough hype, and open systems may
indeed have won the war.
--Don Dugdale