The UNIX evolution: A history of innovation reaches an unprecedented 20-year milestone

Published on 11 April 2016

The next BriefingsDirect expert panel discussion examines the illustrious 20-year history of the UNIX operating system environment as an industry-wide and global standard success story.

It's not often that you reach a multi-decade anniversary in information technology, especially where the technology's relevance remains so high and the promise of more innovation and value remains so strong.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To chart this unprecedented journey of interoperable software and stewardship success, we're joined by a distinguished panel: Andrew Josey, Director of Standards at The Open Group; Darrin Johnson, Director of Solaris Engineering at Oracle; Tom Mathews, Distinguished Engineer, Power Systems at IBM; and Jeff Kyle, Director of Mission-Critical Solutions at Hewlett Packard Enterprise. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: UNIX has been evolving during probably the most dynamic time in business and technology. How is it that UNIX remains so prominent, a standard that has clung to its roots, with ongoing compatibility and interoperability? How has it been able to maintain its relevance in such a dynamic world?

Josey: Thank you, Dana. As you know, UNIX was started at Bell Labs by Ken Thompson and Dennis Ritchie back in 1969. It was a very innovative, very different approach, an approach that has endured over time. We've seen, during that time, a lot of work going on in different standards bodies.


We saw, in the early '80s, the UNIX wars: fractured versions of the operating system, many of them incompatible with each other, and then the standards bodies bringing them together.

We saw efforts such as IEEE POSIX, and then X/Open. Later, The Open Group was formed to bring that all together, when the different vendors realized the benefits of building a standard platform on which you can innovate.

So the standards have added more and more common interfaces, raising the bar upon which you can place that innovation. Over time, we've seen changes like the mid-'90s shift from 32-bit to 64-bit computing.

At that time, people asked, "How will we do that? Will we all do it the same way?" So the UNIX vendors came to what, at that time, was X/Open. We had an initiative called the Large File Summit, and we agreed on a common way to do it. That was a very smooth transition.
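To make that concrete, here is a minimal sketch, added for illustration rather than taken from the discussion, of what the Large File Summit agreement looks like to a programmer: defining one feature-test macro widens off_t to 64 bits, so the same C source reports files past 2 GB correctly even on a 32-bit system, assuming a POSIX-conforming compiler and C library:

    /* Large File Summit transitional API: request a 64-bit off_t
       before any system header is included. */
    #define _FILE_OFFSET_BITS 64

    #include <stdio.h>
    #include <sys/stat.h>

    int main(int argc, char *argv[])
    {
        struct stat st;
        if (argc > 1 && stat(argv[1], &st) == 0) {
            /* st_size is 64 bits wide here, so files larger than
               2 GB report their true size even on 32-bit systems. */
            printf("%s: %lld bytes\n", argv[1], (long long)st.st_size);
        }
        return 0;
    }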

Today, everybody takes it for granted that the UNIX systems are scalable, powerful, and reliable, and this is all built on that 64-bit platform, and multi-processor, and all these capabilities.

That's where we see the standards coming in, preserving that philosophy and providing an enduring, adaptable base, and that's the UNIX platform that's relevant today. We're seeing it in today's virtualization, cloud, and big data, which are also driven by UNIX systems in the back office.

The Open Group involvement

Gardner: So while we're looking at UNIX's 40-year history, we're focusing on the 20-year anniversary of the Single UNIX Specification and the ability to certify against that, and that's what The Open Group has been involved in, right?

Josey: We were given the UNIX trademark by Novell back in, I think it was 1993, and at that point the major vendors came together to agree on a common specification. At the time, its code name was Spec 1170. There were actually 1,168 interfaces in the spec, but we wanted to round up and, apparently, that was also the amount of money spent at the dinner after they completed the spec.


So, we adopted that specification and we have been running certification programs against that.

Gardner: Darrin, with the dynamic nature of our industry now -- with cloud, hybrid cloud, mobile, and a tightening between development and operations -- how is it that UNIX remains relevant, given these things that no one really saw coming 20 years ago?

Johnson: I think I can speak for everybody here when I say that all of our companies provide cloud services, whether it's public cloud, private cloud, or hybrid cloud, and whether it's infrastructure as a service (IaaS), software as a service (SaaS), or any of the other as-a-service options. The interesting thing is that to really be able to provide that consistency and capability to our customers, we rely on a foundation -- and that foundation is UNIX.


So our customers, even though they may start with IBM, have choice. In turn, from a company perspective, instead of having to reinvent the wheel all the time, for the customer or for our own internal development, it allows us to focus on the value-add: the services and capabilities that build upon that foundation of UNIX.

So, something that may be 20 years old, and actually 40 years on from the original version of UNIX, has evolved into such a solid foundation that we can innovate on it.

Gardner: And what's the common thread behind that relevance? Is it the fact that it's consistently certified, that you have assurance that what's running in one place will run in another, on any hardware? How is it that the common spec has been so instrumental in making this a powerful underpinning for so much modern technology?

Josey: A solid foundation is built upon standards, because we can have, as you mentioned, assurance. If you look at the certification process, there are more than 45,000 test cases that give developers and customers assurance that there's going to be determinism. All of the IT people I have talked to say that deterministic behavior is critical, because when it's non-deterministic, things go wrong. Having that assurance enables us to focus on what sits on top of it, rather than on whether the 'ls' command works right or whether we can know how much space is in a file system. Those are givens. We can focus on the innovation instead.
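As a small illustration of those "givens" (a sketch added for readers, not part of the panel's remarks), the file-system query below uses only interfaces defined by the Single UNIX Specification, so it behaves the same on any certified system:

    /* statvfs() is part of the Single UNIX Specification, so this
       free-space check works unchanged on any certified UNIX. */
    #include <stdio.h>
    #include <sys/statvfs.h>

    int main(void)
    {
        struct statvfs vfs;
        if (statvfs("/", &vfs) == 0) {
            unsigned long long free_bytes =
                (unsigned long long)vfs.f_bavail * vfs.f_frsize;
            printf("free space on /: %llu bytes\n", free_bytes);
        }
        return 0;
    }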

Gardner: Over the past decades, UNIX has found itself at the highest echelon of high-performance computing and in high-performance cloud environments. Then it goes down to the desktop, as well as into mobile devices, pervasive micro-devices, and embedded and real-time computing. How has that also benefited from standards, in that you have a common code base up and down the spectrum, from micro to macro?

Several components

Johnson: If you look at the standard, it contains several components, and it's modular in a way that, depending on your need, you can pick a piece of it and support that. Maybe you don't need the complete operating system for a highly scalable environment; maybe you just need a microcontroller. You can pick that part of the standard, so there is consistency at that level, and then that feeds into the development environment in which an engineer may be developing something.

That scales. Let's say you need a lot of other services in a large data center; you still have that consistency throughout. Whether it's Solaris, AIX, HP-UX, Linux, or even FreeBSD, there's a consistency because of those elements of the standard.

Gardner: Developers, of course, are essentially what keep any platform going over time. It's a chicken-and-egg relationship: the more apps, the more relevant the platform; the stronger and more pervasive the platform, the more likely the apps. So, Jeff, for developers, what are some of the primary benefits of UNIX, and how has that contributed to its longevity?

Kyle: As was said, for developers it's the consistency that really matters. UNIX standards develop and deliver consistency. As we look at this, we talk about consistent APIs, consistent command line, and consistent integration between users and applications.


This allows the developers to focus a lot more on interesting challenges and customer value at the application and user level. They don’t have to focus so much on interoperability issues between OSes or even interoperability issues between versions of the single OS. Developers can easily support multiple architectures in heterogeneous environments, and in today’s virtualized cloud-ready world, it’s critical.
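For example (a sketch added for illustration, not taken from the discussion), a developer can query system limits through the standard sysconf() interface and get the same behavior on Solaris, AIX, HP-UX, or Linux, with no per-OS conditional code:

    /* sysconf() is standardized, so these queries need no #ifdefs
       across UNIX implementations. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        long page_size = sysconf(_SC_PAGESIZE);  /* memory page size   */
        long open_max  = sysconf(_SC_OPEN_MAX);  /* per-process fd cap */
        printf("page size: %ld bytes, max open files: %ld\n",
               page_size, open_max);
        return 0;
    }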

Gardner: And while we talk about the past story with UNIX, there's a lot of runway to the future. Developers are now looking at issues around mobile development, cloud-first development. How is UNIX playing a role there?

Kyle: The development that's coming out of all of our organizations, and more, is focused first on cloud. It's focused first on fully virtualized environments. It's not just interoperability with applications, but, as I said before, interoperability between heterogeneous environments and multiple architectures.

In the end, customers are still trying to do the same things that they always have. They're trying to use applications and technology to get data from one place to another, and to more effectively and efficiently use that data to make business decisions. That's happening more and more "mobile-y," right?

I think every HP-UX, AIX, Solaris, and UNIX system out there is fully connected to a mobile world and the Internet of Things (IoT). We're securing it more than most customers realize.

Gardner: Tom, let's talk a little bit about the hardware side, recognizing that cost and risk play a huge part in decision-making for customers and enterprises. What is it about UNIX, now and into the future, that allows a hardware approach that keeps those costs and risks down and makes for such a powerful platform combination?

Scale up

Mathews: The hardware approach for UNIX has traditionally been scale-up. There are a lot of virtues and customer values around scale-up. It's a much simpler environment to administer than a scale-out environment, which is going to have a lot more components and complexity. So that's a big value.


The other core value that is important to many of our customers is that there has been a very strong focus on reliability, availability, and scalability. At the end of the day, those three words are very important to our customers. I know that they're important to the people who run our systems, because having those values allows them to sleep well at night and have weekends with their families. In addition to just running the business, things have to stay up -- and it has been that way for a long time, 7×24×365.

So these three elements -- reliability, availability, and scalability -- have been a big focus, and a lot of that has been delivered through the hardware environment, in addition to the standards.

The other thing that is critical, and this is really a very important area where the standards figure in, is around investment protection. Our customers make investments in middleware and applications and they can’t afford to re-gen those investments continuously as they move through generations of operating systems and so forth.

The standards play into that significantly. They provide the stable environment. In the standards test suite right now, there are something like 45,000 tests for standards conformance. So it's stability, reliability, availability, and serviceability in this investment-protection element.

Gardner: Now, we've looked at UNIX through the lens of developers, hardware, and also performance and risk. But another thing that people might not appreciate is the close relationship between UNIX and the advancement of the Internet and the World Wide Web. The very first web servers were primarily UNIX; it was the de-facto standard. And then the service providers, the folks hosting websites and hosting the Internet itself, were using UNIX for performance and reliability reasons.

So, Darrin, tell us about the network side of this. Why has UNIX been so prevalent along the way when the high-performance networks, and then the very important performance characteristics of a web environment, came to bear?

Johnson: Again, it's about the interconnectedness. Back in my younger years, when you had to interface Ethernet with AppleTalk, or whichever technologies you picked, just the interfacing took so much time and effort.

Any standard, whether it's Ethernet or UNIX, helps bring things together in a way that you don't have to think about how to get data from one point to another. Mobility really is about moving data from one place to another quickly, where you can do transactions in microseconds, milliseconds, or seconds. You want some assurance in the data that you send from one place to another. But it's also about making sure of security, and that's a topic that's really important today.

Knowing that when you have data going from one point to another, it's secured, and that on each node, or each point, security continues -- standards, and making sure that IBM interacts with Oracle and interacts with HPE, really assure our customers. And the people who don't even see the transactions going on can have some level of confidence that they're going to have reliable, high-performance, and secure networks.

Standardization and certification

Gardner: Well, let’s dig a little bit into this notion of standardization certification, of putting things through their conformity paces. Some folks might be impatient going through that. They want to just get out there with the technology and use it, but a level of discipline and making sure that things work well can bear great fruit for those who are willing to go through that process.

Andrew, tell us about the standards process and how it has changed over the past 20 years, not only to continue that legacy of interoperability, but perhaps also to increase the speed and usability of the standards process itself.

Josey: Over those 20 years, we've made quite a few changes in the way that we do the standards development ourselves. It used to be that a group of us would meet behind closed doors in different locations, and there were three such groups of standards developers.

There was an IEEE group, an X/Open group (later to become an Open Group group), and an international standards group. Often, they were the same people who had to keep going to the same meetings and seeing the same people, but wearing different hats. As I said, it was very much behind closed doors.

As it got toward the end of the 1990s, people were starting to say that we were spending too much money doing the same thing, basically producing a pile of standards that were very similar but different. So in late 1997-1998, we formed something that we call the Austin Group.

It was basically The Open Group's members. Sun, IBM, and HP came to The Open Group at that time and said, "Look, we have to go and talk to IEEE, and we have to talk to ISO, about bringing all the experts together in a single place to do the standard." So, starting in 1998, we met in Austin, at the IBM facility -- hence the name, the Austin Group -- and we started on that road.

Since then, we've developed a single set of books. On the front cover, we stamp the designations of it being an IEEE standard, an Open Group standard, and an International Standard. So technical folks only have to go to a single place and do the work once, and then we put it through the adoption processes of the individual organizations.

As we got into the new millennium, we changed our ways as well. We don't physically meet anywhere anymore. We do everything virtually, and we've adopted some of the approaches of open-source projects, for example an open bug tracker (MantisBT).

Anybody can access the bug tracker, file a bug against the standard, and see all the comments that go in against a bug, so we're completely transparent. With the Austin Group, we allow anybody to participate. You don't have to be a member of IEEE or an international delegate anymore to participate.

We've had a lot of input, and continue to have a lot of input, from the open-source community. We've had prominent members of the Linux and open-source communities, such as maintainers of key subsystems like glibc and the commands and utilities. They come to us because they want to get involved; they see the value in standards.

They want to come to a common agreement on how the shell should work, how this utility should work, how they can pull POSIX threads and things into their environments, how they can find those edge cases. We also had innovation from Linux coming into the standard.
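POSIX threads are a good example of that common agreement. The sketch below is an editorial illustration, not from the transcript; it compiles and runs identically on the UNIX systems and Linux distributions that implement the standard:

    /* A minimal POSIX threads program; pthread_create() and
       pthread_join() behave the same wherever the standard is met. */
    #include <pthread.h>
    #include <stdio.h>

    static void *worker(void *arg)
    {
        printf("hello from worker thread %d\n", *(int *)arg);
        return NULL;
    }

    int main(void)
    {
        pthread_t tid;
        int id = 1;
        if (pthread_create(&tid, NULL, worker, &id) == 0)
            pthread_join(tid, NULL);  /* wait for the thread to finish */
        return 0;
    }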

In the mid-2000s, we started to look at this and say that new APIs in Linux should also be in UNIX. So we added, I think, four specifications that we developed based on Linux interfaces from the GNU Project, in the areas of internationalization and common APIs. One thing we have always wanted to do is look at raising that bar of common functionality.

Linux and open-source systems are very much working with the standard as much as anybody else.

Process and mechanics

Johnson: There's something I'd like to add about the process and the mechanics, because in my organization I own it. There are a couple of key points. One is that it's great that we have an organization like The Open Group that not only helps create and manage the standard, but also develops the test suites for certification. So it's one organization working with the community, the Austin Group, and of course IEEE and The Open Group members, to create a certification test suite.

If any one of our organizations had to create or manage that separately, that would be a huge expense. They do that for us; that's part of the service, and they have evolved it and it has grown. I don't know how many tests there were originally, but it has grown to 45,000, and they've made the process more efficient. And it's a collaborative process. If we have an issue, whether it's our issue or a test-suite issue, there's great responsiveness.

So kudos to The Open Group, because they make it easy for us to certify. It's really our obligation to get into that discipline, but if we factor it into the typical quality-assurance process as we release the operating system, whether it's an update or a patch or whatever, then it just becomes routine. By the next major release that you want to certify, you've done most of the heavy lifting. Again, The Open Group makes it really easy to do that.

Mathews: Another element that's important on this cost point goes back to the standards and the cost of doing development. Imagine being a software ISV. Imagine a world where there were no standards; that world existed at one point in time. What that caused is this: ISVs had to spend significant effort to port their software to each platform.

This is because the interfaces and the capabilities on all of those platforms would be different. You would see differences all the way across. Now, with the standards, ISVs basically develop for only one platform: the platform defined by the standards.

The standards have actually encouraged innovation in the software industry, because they made it easier for developers to develop, and less costly for them to deliver their software across a broad range of platforms.

So that’s been crucial. We have three people from the major UNIX vendors on the panel, but there are other players there, too, and the standards have been critical over time for everybody, particularly when the UNIX market was made up of a lot of vendors.

Gardner: So we understand the value of standards and we see the role that a neutral third-party can play to keep those standards on track and moving rapidly. Are there some lessons from UNIX of the past 20 years that we can apply to some of the new areas where standards are newly needed? I'm thinking about cloud interoperability, hybrid cloud, so that you could run on-premises and then have those applications seamlessly move to a public cloud environment and back.

Andrew, starting with you, what is it about the UNIX model and The Open Group certification and standardization model that we might apply to efforts such as OpenStack, Cloud Foundry, or other efforts to make a seamless environment for the hybrid cloud?


Exciting problem

Josey: In our standards process, we're very much able to take on almost any problem, and this would certainly be an exciting one for us to tackle. We're able to bring different parties together, looking for commonality, to try and build consensus.

We get people in the room to talk through the different points of view. What The Open Group is able to do is provide a safe harbor where the different vendors can come in and not be seen as talking in an anti-competitive way, but actually discuss the differences in their implementations and decide on the best common way to go forward in setting a standard.

Gardner: Anyone else on the relationship between UNIX and hybrid cloud in the next several years?

Johnson: I can talk to it a little bit. The real opportunity, and I hope people reading this, especially the OpenStack community, listen, is that true innovation is best done on a foundation. OpenStack is a number of loosely affiliated communities delivering great progress, but there are interoperability gaps, not by intent; it's just that people are moving fast. If some foundation elements can be built, that's great for them, because then we, as vendors, can more easily support the solutions that these communities bring to us, and then we can deliver to our customers.

Cloud computing is the Wild West. We have Azure, OpenStack, and AWS, and we could benefit from some consistency. Now, I know that each of our companies will go to great lengths to make sure that our customers don't see that inconsistency. So we bear the burden of that, but what if we could spend more time helping the communities be more successful rather than, as I mentioned before, reinventing the wheel? There's a real opportunity for that synergy.

Kyle: In hybrid cloud environments, what UNIX brings to customers is security, reliability, and flexibility. So the Wild West comment is very true, but UNIX can present that secure, reliable foundation to a hybrid cloud environment for customers.

Gardner: Let's look at this not just through the lens of technology, but through some of the more intangible human and cultural issues, like trust. It seems to me that, at the end of the day, what makes something successful for as long as UNIX has been successful is enough people, from different ecosystems and different vantage points, having enough trust in the process and the technology, and, through the mutual interdependency of the people in that ecosystem, keeping it moving forward. So let's look at this from the issue of trust, and why we think that will enable UNIX's long history to continue.

Josey: We like to think The Open Group is a trusted party for building standards, and that we hold the specification in trust for the industry and do the best thing for it. We're fully committed to continuing to work in that area. We're basically the secretariat, and so we're enabling our customers to save a lot of cost; we're able to divide up the cost. If The Open Group does something once, that's much cheaper than everybody doing the same thing themselves.

Gardner: Darrin, do you agree with my premise that trust has been an important ingredient that has allowed UNIX to be so successful? How do we keep that going?

One word: Open

Johnson: The foundation of UNIX, even going back to the original development, but certainly since the standards came about, is the one word: open. You can have an open dialogue to which anybody is invited. In the case of the Austin Group, it's everybody. In the case of any of the efforts around UNIX, it's an open process and open involvement, and in the case of The Open Group, which is kind of another open, it's vendor-neutral. Their goal is to find a vendor-neutral solution.

Also, look at it this way. We have IBM, HPE, and Oracle sitting here, and, I'll say, virtually Linux. Other communities that are participating are coming to mutual agreements: this is what we believe is best.

And you know what, it's open to disagreement. We disagree all the time, but in the end, what we deliver and execute on is mutually agreed, so it's open, it's deterministic, and we all agree on it.

If I were a customer, an IT professional, or even a developer, I'd be thinking, "This foundation is something on which I want to innovate, because I can trust that it will be consistent." The Open Group is not going to go away any time soon; it's celebrating 20 years of supporting the standard, and there are going to be another 20 years.

And the great thing is that there is a lot of opportunity to innovate in computer science in general, but the standard is building that foundation, taking advantage of topics like security, virtualization, mobility, and the list goes on. We even have the opportunity to build, in an open way, something that people can trust.

Gardner: Tom, openness and trust, a good model for the next 20 years?

Mathews: It is a good model. Darrin touched on it. If we need proof of it, we have 20 years of proof. The Open Group has brought together major competitors and, as Darrin said, it's always been very open, and people have always, even with disagreement, come to a common consensus. So The Open Group has been very effective at establishing that kind of environment, that kind of trust.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: The Open Group.
