Why is protocol design important for today’s ICT professionals?

The term ‘protocol’ was arguably first applied to computer and communication systems in 1967 (Bartlett), and the idea of protocol engineering was published in 1983 (Piatkowski). These notions are thus, apparently, relatively recent. In our fast-moving world, even quite recent ideas tend to become obsolete quickly. In many cases, however, what becomes obsolete is merely the hype or buzzword status, not the idea itself. In this vein, one might observe: “Protocols are what telecom people used to do back in the good old eighties. Nothing important to worry about now, right?” Wrong!

The idea of a protocol is of sweeping and lasting importance. Very generally, a protocol is a set of rules that govern the behaviour of an entity, be it a technical object, a human, or a social group. The rules may be strict or fuzzy; they may be adhered to or violated; they may concern forms (syntax), meanings (semantics), or goals (pragmatics); and they may be expressed more or less formally or mathematically. Finally, a protocol is both what is described and what is embodied by an object that actually acts according to, or against, that description.
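To make the syntax/semantics distinction concrete, here is a minimal sketch in Python of a toy request/acknowledge protocol entity. Its message format, state names, and rules are all invented for the purpose, not drawn from any real protocol:

```python
# A toy protocol entity; message format, states, and rules are invented
# purely for illustration.
# Syntax: a message is the text "REQ:<n>" or "ACK:<n>" for an integer n.
# Semantics: a responder in state "idle" answers every REQ:n with ACK:n;
# anything else is a protocol violation.

def parse(message: str) -> tuple[str, int]:
    """Syntax check: accept only 'REQ:<n>' or 'ACK:<n>'."""
    kind, sep, num = message.partition(":")
    if sep != ":" or kind not in ("REQ", "ACK") or not num.isdigit():
        raise ValueError(f"ill-formed message: {message!r}")
    return kind, int(num)

class Responder:
    """Embodiment of the semantic rules: a state plus allowed transitions."""
    def __init__(self) -> None:
        self.state = "idle"

    def handle(self, message: str) -> str:
        kind, n = parse(message)
        if self.state == "idle" and kind == "REQ":
            return f"ACK:{n}"  # the rule: acknowledge every request
        raise RuntimeError(f"protocol violation: {kind} in state {self.state}")

print(Responder().handle("REQ:7"))  # -> ACK:7
```

Note how the two layers of rules are kept apart: parse enforces the forms a message may take, while Responder enforces what may be done, and when.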

Some 2,200 years ago the Greek historian-scientist Polybius specified a complex reliable broadcast protocol for military use. Communication protocols enabled the operation of the early wireless optical message-switching system (Chappe’s optical telegraph), addressing problems surprisingly similar to today’s. Signalling protocols were the “nervous system” of the railway networks that emerged in the 19th century; their various imperfections, which cost lives, are studied to this day. Faulty protocol design was blamed for the massive failure of the American telephone system in 1990, which cost $60 million in lost revenue. Protocols, in one guise or another, have been with us for centuries. Dealing with protocols is dealing with the fundamental problems that make complex systems … well … complex. These problems will not just go away; they must be properly addressed, because not addressing them properly has repeatedly proved far too costly. In fact, protocols are so important and ubiquitous that one may speak of a protocol view on systems. Although this is not the only possible “world-view”, it forms a powerful framework for the analysis and design of distributed reactive systems, including those native to ICT. I am not aware of any real-life ICT system whose design and operation could avoid the protocol view altogether.
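As a playful aside, the syntactic layer of Polybius’ scheme is simple enough to sketch today: each letter was signalled as a pair of numbers (shown with torches) locating it in a 5x5 square. The sketch below uses the common Latin-alphabet adaptation of that square, with I and J sharing a cell; it is an illustration of the encoding only, not a reconstruction of the full reliable-broadcast procedure:

```python
# The syntactic core of Polybius' torch-signalling scheme: each letter is
# sent as a (row, column) pair locating it in a 5x5 square. This uses the
# common Latin-alphabet adaptation (I and J share a cell).
from string import ascii_uppercase

LETTERS = ascii_uppercase.replace("J", "")  # 25 letters fill the 5x5 grid

def encode(text: str) -> list[tuple[int, int]]:
    """Map each letter to its 1-based (row, column) signal."""
    signals = []
    for ch in text.upper().replace("J", "I"):
        if ch in LETTERS:
            idx = LETTERS.index(ch)
            signals.append((idx // 5 + 1, idx % 5 + 1))
    return signals

print(encode("HELP"))  # -> [(2, 3), (1, 5), (3, 1), (3, 5)]
```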

It is conceivable to use a stable set of protocol implementations as off-the-shelf black-box components. In this way, all the protocol-related issues could be boxed off and forgotten. Such an idea might be attractive and useful in localized contexts: think of the pre-installed TCP/IP suite. It is, however, completely impractical on a global scale, because ICT is a live domain. Some of its sub-domains (with their protocols) die away; others are born. Whole new ecosystems emerge, such as IoT with its “constrained-resources” paradigm, for which protocols tailored to specific technical circumstances had to be developed. New protocol-related needs keep arising, and it is ICT engineers who have to deal with them.

The IoT setting is also a good illustration of the blurred boundary between “being an ICT professional” and “being involved in the development of ICT artifacts”. Strictly speaking, “professionals” are certified members of a profession, practising for a living. The lay understanding is slightly different: ‘professional’ is equated with ‘responsible, and equipped with the knowledge and skills to account for this responsibility’. IoT is staffed, in large part, by groups of enthusiastic and devoted “amateurs”. Most of them would like to eschew the happy-go-lucky tag and be regarded as professionals. For this to happen, they must be aware that there are important protocol-related knowledge and skills to be mastered, and that, for the sake of responsibility, they will need to actually use that knowledge and those skills. Responsibility inevitably extends over the various uses of a technology. Surprisingly many uses of IoT technology are quite critical, as in health-related and behaviour-change applications, and critical applications require dependable solutions. The peculiarity of protocol engineering is that an ad hoc approach is a very weak basis for making any dependability or correctness claims. Luckily, protocol engineering also provides a set of highly developed concepts, methods, and tools for building trustworthy protocol-intensive systems. Not using them is irresponsible, and thus unprofessional.
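To return to the black-box remark above: in localized contexts the boxing-off really does work, as a few lines of Python riding on a pre-installed TCP/IP suite show. Connection setup, retransmission, ordering, and flow control are all hidden beneath the socket API; the host and request below are mere placeholders for illustration:

```python
# Riding on the pre-installed TCP/IP suite as a black box: connection setup,
# retransmission, ordering, and flow control all happen below this API.
# The host and the HTTP request are placeholders for illustration.
import socket

with socket.create_connection(("example.com", 80), timeout=5) as sock:
    sock.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = sock.recv(4096)

print(reply.split(b"\r\n")[0].decode("ascii"))  # first reply line, e.g. "HTTP/1.1 200 OK"
```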

Within ICT, many schools have developed their own versions of the protocol framework. One particularly influential version is OSI, with its stripped-down notion of a protocol. Everybody knows that “OSI is dead”, i.e., that this particular standardization project has been terminated. If ICT protocols in general are conflated with OSI protocols (a popular misconception), one might be tempted to think that “protocols themselves are dead”, i.e., that they are of only historical interest. In view of what was said above, this is nonsense. Furthermore, the OSI setting remains important for at least one fundamental reason: the pioneering work on protocol engineering was conducted by standardization organizations (ISO, CCITT, ITU-T, ETSI) within the OSI framework. This, together with a well-motivated drive towards formality (the FM/FDT movement), yielded advanced theories, methods, languages, and tools for the development of protocols, in particular for their specification (SDL, MSC) and testing (TTCN). The pragmatics of their use, and some of the terminology (especially in the case of TTCN), remained closely tied to OSI.

What happened after the demise of the OSI project? Instead of burying those OSI-related instruments, their proponents and maintainers developed them further, removing all explicit ties to OSI protocols and introducing various devices for handling even the most “exotic” protocols. This can be clearly seen in the development of the TTCN-3 testing framework and language, which is now used across the whole ICT domain. Apart from a steady stream of improvements and new releases of the established instruments, new (semi-)formal tools and languages keep emerging, such as TPLan and TDL, presented as “typically protocol-oriented”. They have emerged because in ICT there is an ongoing need to deal professionally with protocol-related problems. Fulfilling this need was deemed important, with funding to match.

So much, then, for the importance of protocol design, and of its proper, professional, formal and semi-formal tools, to today’s ICT professionals.
