*Smartphones are great, but …

You may want to read the paper “Brain Drain: The Mere Presence of One’s Own Smartphone Reduces Available Cognitive Capacity” by A. F. Ward et al., published in the Journal of the Association for Consumer Research (April 2017), DOI: 10.1086/691462.

Based on their experiments, the authors claim that “the mere presence of one’s smartphone may reduce available cognitive capacity and impair cognitive functioning, even when consumers are successful at remaining focused on the task at hand.” The “mere presence” means that users “do not interact with or receive notifications from their phones.”

The authors illustrate their findings with the figure below. It shows that, when it comes to “working memory capacity” and “fluid intelligence,” you are better off placing your smartphone farther away from yourself (a bag is better than a desk, and another room is better still).


If you’d like to learn about so-called ambient displays, which allow you to receive information without smartphones or other screen-based computers, consider taking our course AKIR (in Polish).

*FIWARE Context Broker becomes a Connecting Europe Facility (CEF) Building Block

The FIWARE Context Broker, a reusable context-handling component, has been adopted by CEF (Connecting Europe Facility) as a so-called building block:

The Context Broker makes it possible to store and access context information using simple context models. It offers a RESTful API and supports subscriptions and notifications.
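To make “simple context models” concrete, here is a minimal sketch of the kind of payloads the broker’s NGSI v2 RESTful API works with. The entity and subscription below are our own illustration (the attribute names, entity id, and notification URL are invented), not material from the course or the CEF adoption:

```python
import json

# An NGSI v2 context entity: a simple context model for a room sensor.
# (Entity id, type, and attributes are invented for illustration.)
entity = {
    "id": "Room1",
    "type": "Room",
    "temperature": {"value": 23.5, "type": "Number"},
}

# A subscription: ask the broker to notify a service whenever
# Room1's temperature attribute changes.
subscription = {
    "description": "Notify on temperature changes in Room1",
    "subject": {
        "entities": [{"id": "Room1", "type": "Room"}],
        "condition": {"attrs": ["temperature"]},
    },
    "notification": {
        "http": {"url": "http://my-service.example/notify"},  # hypothetical receiver
        "attrs": ["temperature"],
    },
}

# With a broker running (Orion listens on port 1026 by default), these
# JSON documents would be POSTed to /v2/entities and /v2/subscriptions.
print(json.dumps(entity))
print(json.dumps(subscription))
```

Once the subscription is in place, every update to the entity’s temperature triggers an HTTP notification to the registered URL, which is what makes the broker usable as a reusable, loosely coupled building block.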

Notably, we offer a FIWARE Context Broker-based lab exercise in our course AKIR (Context-Aware IoT Applications). The objective is to teach students FIWARE context modelling and to introduce them to the Orion Context Broker. The exercise was developed at ZSUT by Szymon Caban as his BSc thesis.

Why is protocol design important for today’s ICT professionals?

The term ‘protocol’ was arguably first applied to computer and communications systems in 1967 (Bartlett). The idea of protocol engineering was published in 1983 (Piatkowski). These notions are thus relatively recent. In our fast-moving world, even quite recent ideas tend to become obsolete quickly. In many cases, however, what becomes “obsolete” is just the hype or buzzword status, not the idea itself. In this vein, one might observe: “Protocols are what telecom people used to do back in the good old eighties. Nothing important to worry about now, right?” Wrong!

The idea of a protocol is of sweeping and lasting importance. Very generally, a protocol is a set of rules that govern the behaviour of an entity, be it a technical object, a human, or a social group. The rules may be strict or fuzzy; they may be adhered to or violated; they may concern forms (syntax), meanings (semantics), or goals (pragmatics); they may be expressed more or less formally or mathematically. Finally, a protocol is both what is described and what is embodied by an object that actually acts according to, or against, it.

Some 2,200 years ago, the Greek historian and scientist Polybius specified a complex reliable broadcast protocol used for military purposes. Communication protocols enabled the operation of the early wireless optical message-switching communication system (Chappe’s optical telegraph), addressing problems surprisingly similar to today’s. Signalling protocols were the “nervous system” of the railway networks emerging in the 19th century; their various imperfections, which caused loss of life, are studied to this day. Faulty protocol design was blamed for the massive failure of the American telephone system in 1990, which cost $60 million in lost revenue.

Protocols, in one guise or another, have been with us for centuries. Dealing with protocols means dealing with the fundamental problems that make complex systems … well … complex. These problems will not just go away; they must be properly addressed, because failing to address them properly has repeatedly proved far too costly. In fact, protocols are so important and ubiquitous that one may speak of a protocol view on systems. Although this is not the only possible “world-view”, it forms a powerful framework for the analysis and design of distributed reactive systems, including those native to ICT. I am not aware of any real-life ICT system whose design and operation could avoid the protocol view altogether.
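Polybius’s torch-signalling scheme rested on encoding each letter as a pair of small numbers, read off a square letter grid. A minimal Python reconstruction (simplified to the Latin alphabet with I/J sharing a cell, a common modern convention; Polybius used the Greek alphabet):

```python
# Polybius square: a 5x5 grid; each letter becomes a (row, column)
# pair, signalled in antiquity as two groups of raised torches.
ALPHABET = "ABCDEFGHIKLMNOPQRSTUVWXYZ"  # J omitted, folded into I

def encode(text):
    """Encode a message as 1-based (row, column) pairs."""
    pairs = []
    for ch in text.upper().replace("J", "I"):
        if ch in ALPHABET:
            idx = ALPHABET.index(ch)
            pairs.append((idx // 5 + 1, idx % 5 + 1))
    return pairs

print(encode("HELP"))  # [(2, 3), (1, 5), (3, 1), (3, 5)]
```

Small as it is, the scheme already exhibits the core protocol concerns: an agreed alphabet, an agreed encoding, and agreed signalling conventions shared by sender and receiver.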

It is conceivable to use a stable set of protocol implementations as off-the-shelf black-box components. In this way, all protocol-related issues could be boxed off and forgotten. Such an idea may be attractive and useful in localized contexts (think of the pre-installed TCP/IP suite). It is, however, completely impractical on a global scale, because ICT is a live domain. Some of its sub-domains (with their protocols) die away; others are born. Whole new ecosystems emerge, such as the IoT with its “constrained-resources” paradigm, for which protocols tailored to specific technical circumstances had to be developed. New protocol-related needs keep arising, and it is ICT engineers who have to deal with them.

The IoT setting is also a good illustration of the blurred boundary between “being an ICT professional” and “being involved in the development of ICT artifacts”. Strictly speaking, “professionals” are certified members of a profession, practising for a living. The lay understanding is slightly different: ‘professional’ is equated with ‘responsible, and equipped with the knowledge and skills to account for this responsibility’. IoT is staffed, in large part, by groups of enthusiastic and devoted “amateurs”. Most of them would like to eschew the happy-go-lucky tag and be regarded as professionals. For this to happen, they must be aware that there are important protocol-related knowledge and skills to be mastered, and that, for the sake of responsibility, they will need to actually use them. Responsibility inevitably extends over various uses of technology. Surprisingly many uses of IoT technology are quite critical, as in health-related and behaviour-change applications, and critical applications require dependable solutions. The peculiarity of protocol engineering is that an ad hoc approach is a very weak basis for making any dependability or correctness claims. Luckily, protocol engineering also provides a set of highly developed concepts, methods, and tools for building trustworthy protocol-intensive systems. Not using them is irresponsible, and thus unprofessional.

Within ICT, many schools have developed their own versions of the protocol framework. One particularly influential version is OSI, with its stripped-down notion of a protocol. Everybody knows that “OSI is dead” (i.e., that particular standardization project has been terminated). If ICT protocols in general are conflated with OSI protocols (a popular misconception), one might be tempted to think that “protocols themselves are dead”, i.e., that they are of only historic interest. In view of what was said above, this is nonsense. Furthermore, the OSI setting remains important for at least one fundamental reason: the pioneering work on protocol engineering was conducted by standardization organizations (ISO, CCITT, ITU-T, ETSI) within the OSI framework. This, together with a well-motivated drive towards formality (the FM/FDT movement), yielded advanced theories, methods, languages, and tools for the development of protocols, in particular for their specification (SDL, MSC) and testing (TTCN). The pragmatics of their use, and some of the terminology (especially in the case of TTCN), remained closely tied to OSI. What happened after the demise of the OSI project? Instead of burying those OSI-related instruments, their proponents and maintainers developed them further, removing explicit ties with OSI protocols and introducing various devices for handling even the most “exotic” protocols. This can be clearly seen in the development of the TTCN-3 testing framework and language, which is now used across the whole ICT domain. Apart from a steady stream of improvements and new releases of established instruments, new (semi-)formal tools and languages keep emerging, such as TPLan and TDL, presented as “typically protocol-oriented”. They have emerged because in ICT there is an ongoing need to deal professionally with protocol-related problems. Fulfilling this need was deemed important, with funding to match.
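One of TTCN-3’s central devices is the message template with wildcard matching: a test verdict hinges on whether an observed message matches a template in which some fields are fixed and others are “don’t care”. The toy Python analogue below is our own illustration of that idea only; real TTCN-3 has its own syntax and far richer matching mechanisms:

```python
# Toy analogue of TTCN-3 template matching (illustration only).
ANY = object()  # wildcard: matches any value, akin to TTCN-3's '?'

def matches(template, message):
    """Recursively match a message against a template with wildcards."""
    if template is ANY:
        return True
    if isinstance(template, dict):
        return (isinstance(message, dict)
                and all(k in message and matches(v, message[k])
                        for k, v in template.items()))
    return template == message

# Expected-response template: any id is acceptable, but status must be "ok".
t_response = {"id": ANY, "status": "ok"}

print(matches(t_response, {"id": 7, "status": "ok"}))    # True
print(matches(t_response, {"id": 7, "status": "fail"}))  # False
```

Separating the template (what counts as an acceptable message) from the test behaviour (when to expect it) is precisely what lets such frameworks outlive any single protocol suite, OSI included.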

So much for the importance of protocol design, with its proper, professional, formal and semi-formal tools, for today’s ICT professionals.

Technical review meeting of the Horizon 2020 microMole project


On November 27th and 28th, the European Horizon 2020 project microMole held its interim review meeting in Berlin, Germany. During the review, a number of project prototypes were demonstrated, including the crawler robot for installing the microMole ring system (in the photo). CNSD presented a prototype of a battery-powered 6LoWPAN wireless sensor network that collects measurements from pH and conductivity sensors placed in sewers and sends them to a management station.

New speciality at master-level studies

A new speciality of the Master’s programme in Telecommunications was introduced at the Faculty of Electronics and Information Technology in the fall semester of 2017. The speciality, “Computer Networks and Cybersecurity”, is offered by the Institute of Telecommunications. The programme of the new Master’s studies encompasses 7 obligatory and 17 optional courses. The courses originate from all three divisions of the Institute of Telecommunications, which enables students to further profile their education.

The Computer Networks and Services Division offers two obligatory courses, “Softwarised Networks” (PROST) and “Computer Networks Planning” (OAST), and four optional courses: “Computer Networks Software” (OPSYT), “Managing Softwarised Networks” (ZAPST), “Context-Aware IoT Applications” (AKIR), and “Computer Networks Services and Applications” (UAT). All of the courses are entirely new and have been developed around a unifying idea: providing knowledge of the architecture and management of new-generation software-based networks, especially 5G networks, and of the development of 5G-based services and applications, in particular IoE applications. Currently, the courses are offered in Polish.