Technical Debt and the Need for a Multidisciplinary Approach to Cybersecurity

Introduction

Cybersecurity is big business. The Cyber Observer website [1] has a plethora of figures about the increasing costs of cybersecurity breaches and the need for information security professionals. The evidence is incontrovertible: cybersecurity threats plague the online industry in general, and IT departments in particular.

Many such topical websites focus on the ‘standard’ facets of cybersecurity – by which I mean penetration testing, threat analysis, and the management, diagnosis and prevention of malware. The above website [1], however, in listing the eight most common causes of data breaches, gives the following:

  • Weak and Stolen Credentials, a.k.a. Passwords
  • Back Doors, Application Vulnerabilities
  • Malware
  • Social Engineering
  • Too Many Permissions
  • Insider Threats
  • Improper Configuration and User Error [1]

It is interesting to note that over half of the threats in the above list relate to a lack of user education or to technical errors within the software artefact itself. So while a focus on ‘traditional’ cybersecurity skills (the aforementioned facets) is an invaluable means of diagnosing and investigating vulnerabilities, and even of establishing the attack surface of a product or system, it is often not the best way of mitigating such flaws once they are discovered. Although many anti-malware vendors will happily sell you multiple products that hide or patch security holes, the best cure is prevention – by which I mean ensuring that the vulnerabilities do not exist in the first place.

Quality, Security & Technical Debt

Traditionally, software engineering degrees (and to a lesser extent computer science degrees) have not explicitly focused on security and quality, which has inevitably led to a profusion of security-related bugs or faults akin to those in the list above. These bugs ultimately increase the attack surface and drive up costs by becoming part of the system’s technical debt – the debt incurred when an expedient (often temporary) solution is implemented in place of a better one.

Martin Fowler of Thoughtworks wrote an article describing technical debt in terms of a quadrant model [3].

The model positions technical debt decisions along two axes – reckless versus prudent, and deliberate versus inadvertent – giving four quadrants. Fowler makes the point that technical debt is a metaphor, and as such has value in representing the effect of design decisions made under different combinations of these factors. In particular, the model illustrates that technical debt can be incurred through a lack of training and education (the reckless/inadvertent quadrant).
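Since Fowler’s quadrant diagram is not reproduced here, the minimal Python sketch below (purely illustrative) reconstructs it as a simple lookup table, using the example attitudes Fowler associates with each quadrant.

```python
# Illustrative sketch of Fowler's technical debt quadrant as a lookup table.
# The example attitudes are the ones Fowler attaches to each quadrant.
TECH_DEBT_QUADRANT = {
    ("deliberate", "reckless"):  "We don't have time for design",
    ("deliberate", "prudent"):   "We must ship now and deal with consequences",
    ("inadvertent", "reckless"): "What's layering?",
    ("inadvertent", "prudent"):  "Now we know how we should have done it",
}

def classify(intent: str, care: str) -> str:
    """Return the attitude typical of a given combination of intent and care."""
    return TECH_DEBT_QUADRANT[(intent, care)]

print(classify("inadvertent", "reckless"))  # the quadrant most affected by a lack of training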

Another contributing factor to the technical debt phenomenon is the agile development and management process. The triple constraint model [4], often used as part of that process, is the classic project management model illustrating that any project is constrained by cost, scope and time, and that only two of these can be controlled at any instant. That model has helped to justify technical debt taken on in the deliberate/prudent quadrant. However, if the project manager had an understanding of architectural design, they might make a different decision about the cost and impact of that technical debt.

Another factor contributing to the debt challenge is expressed by Davis in ‘Great Software Debates’ [2], where he notes: “Most of us become specialists in just one area. To complicate matters, few of us meet interdisciplinary people in the workforce, so there are few roles to mimic. Yet, software product planning is critical to development success and absolutely requires knowledge of multiple disciplines.” The demands of agile software development, together with the increasing emphasis on cybersecurity, have only accelerated this need.

Education To The Rescue?

The reality is that a myriad of factors contribute to the technical debt – and, by implication, the security – of any software system. A contemporary lead developer, architect or project manager needs a good understanding of as many of those factors as possible, which implies a good breadth of knowledge – and this is what modern cybersecurity degrees, such as those provided by the University of Essex Online (UoEO), amongst others, aim to supply. The UoEO degree, for example, includes modules in secure development, secure systems and a research project, as well as more traditional cybersecurity modules in networks and information systems management, risk management and digital forensics, delivering as wide a base of technical and policy-based knowledge as possible.

This breadth is designed to address many of the issues discussed in this article: looking at the list of data breach causes above, the programme includes modules that address each of them, producing more capable staff with an enhanced appreciation of security and the skills to ‘pay down’ technical debt more quickly – or to avoid incurring it in the first place. The same institution offers a similar conversion masters in computer science that also addresses the need for multidisciplinary project managers.

An investment in this kind of education should be seen by employers as a way of offsetting the technical debt that would otherwise be incurred through a lack of training or through misinformation, as well as a way of mitigating security risks and reducing attack surfaces. From an individual perspective, gaining multidisciplinary qualifications such as these, in growing and in-demand industry segments, should be seen as a worthwhile investment in professional development.

Douglas Millward is a highly experienced computer consultant and technical architect, currently providing consultancy for the University of Essex Online (UoEO) as an SME and visiting lecturer.

References

[1] Cyber Observer. 2020. 29 Must-Know Cybersecurity Statistics For 2020 | Cyber Observer. [online] Available at: <https://www.cyber-observer.com/cyber-news-29-statistics-for-2020-cyber-observer/> [Accessed 14 April 2020].

[2] Davis, A. and Zweig, A., 2015. The Missing Piece of Software Development. Great Software Debates, pp.125-128.

[3] Fowler, M., 2014. Bliki: TechnicalDebtQuadrant. [online] martinfowler.com. Available at: <https://martinfowler.com/bliki/TechnicalDebtQuadrant.html> [Accessed 14 April 2020].

[4] Prince2.com. 2020. Project Management Triangle: A Triple Constraints Overview | UK. [online] Available at: <https://www.prince2.com/uk/blog/project-triangle-constraints> [Accessed 14 April 2020].

What are the issues with the UK adoption of Huawei for 5G infrastructure?

Recently the UK news outlets and media feeds have moved their focus from Brexit onto 5G. One of the biggest topics of conversation has been the use of Huawei as a supplier, and concerns about the risk to national security and intellectual property (IP) of allowing such use. What are the risks inherent in such a deployment, and what mitigations can the UK take to ameliorate those concerns?

Many parts of UK industry, in both the public and private sectors, have argued for some time that the rollout of 5G technology is a crucial step in our evolution to a digital economy, and the reality is that such a transition will need to utilise equipment from the ‘big three’ of cellular communications – that is, Huawei, Nokia or Ericsson. The arguments for Huawei centre on cost and on the fact that its equipment is already present in our existing 3G and 4G infrastructure. The cost of replacing that existing hardware would be punitive, at least according to the UK Government. But is that the only option? Can we avoid throwing the baby out with the bathwater and still assuage the concerns of the anti-Huawei lobbyists?

Background

The components that will be used in the upcoming 5G infrastructure consist of two parts – hardware and software. The first thing to bear in mind is that hardware components on their own pose little risk. Indeed, Huawei itself does not manufacture hardware at the chip level; its branded hardware, be that cell phones or network components, contains thousands of components produced in China and other locations (such as Taiwan), with only a small percentage actually being designed by Huawei.

The main risk comes from the software associated with these components. That software exists at three levels, which are (going from lowest to highest): microcode, firmware and application software. Microcode was invented by Maurice Wilkes, a British computer scientist, in the fifties, although it was popularised by the IBM System/360 in the sixties. Microcode is used to define or augment the instructions provided by a given CPU. Modern x86 processors (such as those developed by Intel and AMD) still use microcode, and updates can be delivered by application software, often the operating system. Intel has used microcode updates to fix bugs in its CPUs, and recent updates were released to try and mitigate the Spectre and Meltdown exploits [1].
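As a small illustration of how visible microcode is from the software side, the Python sketch below (Linux-specific, and assuming an x86 system where the kernel exposes a `microcode` field in /proc/cpuinfo) reads the microcode revision currently reported for each CPU – the kind of value an operator might log and compare before and after an update.

```python
# Minimal sketch: report the CPU microcode revision(s) the Linux kernel sees.
# Assumes an x86 machine where /proc/cpuinfo contains a "microcode" field.
from pathlib import Path

def microcode_revisions(cpuinfo: str = "/proc/cpuinfo") -> set[str]:
    revisions = set()
    for line in Path(cpuinfo).read_text().splitlines():
        if line.lower().startswith("microcode"):
            # Lines look like "microcode : 0xde"
            revisions.add(line.split(":", 1)[1].strip())
    return revisions

if __name__ == "__main__":
    print("Microcode revision(s):", microcode_revisions() or "not reported")
```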

Firmware is the next level up. It provides various functions, from a basic abstraction layer, through APIs (application programming interfaces), to a full-blown OS (such as the Minix-based Intel Management Engine (IME) built into many Intel-supplied PC motherboards) [2]. Firmware is currently receiving a lot of attention from security researchers looking at exploits such as BadUSB (where a humble USB stick/thumb drive can pretend to be another device and hack your hardware as soon as it is plugged in) [3].
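Knowing which firmware is actually deployed is a prerequisite for auditing it. As a hedged illustration, the Python sketch below (Linux-only, relying on the DMI fields the kernel exposes under /sys/class/dmi/id) reads the BIOS/UEFI vendor, version and date so they can be inventoried across a fleet.

```python
# Minimal sketch: read firmware (BIOS/UEFI) details exposed by the Linux
# kernel via sysfs. Assumes /sys/class/dmi/id is present (most x86 systems).
from pathlib import Path

DMI = Path("/sys/class/dmi/id")

def firmware_info() -> dict[str, str]:
    info = {}
    for field in ("bios_vendor", "bios_version", "bios_date"):
        node = DMI / field
        if node.exists():
            info[field] = node.read_text().strip()
    return info

if __name__ == "__main__":
    for key, value in firmware_info().items():
        print(f"{key}: {value}")
```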

My definition of application software is necessarily broad – I include the operating system and applications, plus any scripts and configuration files necessary to allow a device to perform the task(s) it was designed to do. Many appliances and embedded systems (such as network devices, firewalls, routers and suchlike) use open-source software as the main OS, due to considerations around cost and licensing. However, many manufacturers use patched versions of such software, and if those patches are not made public they can become a security risk in themselves.

Mitigating the risk

Mitigations need to be focussed at each of the software layers discussed above. Microcode is the most difficult risk to mitigate: by its nature it is very hardware-specific, and in reality it is unlikely that anyone other than the hardware manufacturers themselves will be able to produce alternative microcode. The only realistic mitigation is to audit any microcode updates and to use automated testing to ensure that changes do not compromise the operation or security of the system.
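To make the auditing step concrete, here is a minimal, hypothetical sketch of the sort of check that could gate a microcode rollout: the vendor-supplied blob is only accepted if its SHA-256 digest matches a value recorded during an independent audit. The file name, digest and directory layout are illustrative only.

```python
# Hypothetical gate for a microcode rollout: accept a vendor blob only if its
# SHA-256 digest matches one recorded by an independent audit process.
import hashlib
from pathlib import Path

# Illustrative allow-list; real digests would come from the audit records.
AUDITED_DIGESTS = {
    "06-8e-09": "<sha256 digest recorded at audit time>",
}

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def approved_for_rollout(blob: Path) -> bool:
    expected = AUDITED_DIGESTS.get(blob.name)
    return expected is not None and sha256_of(blob) == expected

if __name__ == "__main__":
    candidate = Path("updates/intel-ucode/06-8e-09")  # illustrative path
    print("Approved:", approved_for_rollout(candidate))
```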

Firmware can be replaced – indeed, there is an initiative within the open-source movement to provide FOSS (free and open-source software) alternatives to many proprietary firmware solutions, including the system firmware (aka BIOS) for a selection of PC motherboards (the coreboot project, for example). As far as network appliances are concerned, there are several leading solutions for SDN (software-defined networking) that are based on, and available as, open-source code.

Many network component suppliers already utilise open-source solutions as their main operating system and network stack – most of them built on Linux technology.

Finally, the hardware designs themselves should be audited and reviewed. Most of the Huawei SoC (system on a chip) solutions use CPU designs from Arm, a UK-based processor design company that is no longer UK-owned, having been purchased by the Japanese SoftBank Group in 2016. Most of Huawei’s chip designs are created by HiSilicon, one of its subsidiaries.

Conclusion

Network hardware is quickly becoming ubiquitous, and an increasing number of components come from sources in China. Trying to compete by manufacturing components in the UK is a non-starter, and the wrong place to focus our attention and resources. The way to protect both national security and IP is to focus on the weakest link – the software components. As part of its recommendation to the UK Government, the National Cyber Security Centre (NCSC), in highlighting Huawei as an HRV (high-risk vendor), said [4]:

“Our experience has shown that Huawei’s cybersecurity and engineering quality is low and its processes opaque. For example, the HCSEC Oversight Board raised significant concerns in 2018 about Huawei’s engineering processes. Its 2019 report confirmed that “no material progress” had been made by Huawei in the remediation of technical issues reported in the 2018 report and highlighted “further significant technical issues” that had not previously been identified.”

In summary, the UK Government should ensure that any hardware that is purchased has open specifications and designs that can be easily audited, and that it is compatible with FOSS solutions. As discussed above, microcode needs to be freely available and auditable, and test cases should be created and automated. Firmware should be based on open-source software with as few modifications and patches as possible, so that it can be easily supported and updated. Initially there will be a need to utilise the firmware supplied by the manufacturer, but there should be plans in place to replace it with UK-created or, even better, open-sourced code.

As for the application-level code, again it should be open source wherever possible. Many hardware vendors already use open-source code for their consumer-level equipment; this just requires that similar code is used in their enterprise hardware – some of the biggest suppliers do this already.

Finally, some of the responsibility for protecting UK IP from data loss and theft must be devolved to users and corporations. Virtual private networks (VPNs) and data encryption must be used much more widely to ensure that if security issues are found and exploited in the network stack, the data that is carried over those links will still be protected.
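As a small illustration of application-layer encryption (one of the simplest ways for users and corporations to take on that responsibility), the hedged Python sketch below uses the third-party cryptography package’s Fernet recipe: data is encrypted before it ever reaches the network stack, so a compromised link exposes only ciphertext.

```python
# Minimal sketch of application-layer encryption using the 'cryptography'
# package (pip install cryptography). Key management is out of scope here:
# in practice the key must be generated, stored and shared securely.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

payload = b"commercially sensitive design data"
token = cipher.encrypt(payload)          # safe to send over an untrusted link
assert cipher.decrypt(token) == payload  # recovered intact at the far end
print("ciphertext:", token[:32], "...")
```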

A focus on purchasing and deploying open hardware that is compatible with FOSS components, and on utilising UK resources to create an NCSC-audited and approved firmware, operating system and application stack, would make the most of the resources available and provide the optimal solution for protecting UK national security and IP.

Douglas Millward is a highly experienced consultant and systems architect, having worked for some of the largest consulting companies in the world. He is also a qualified higher education lecturer and is currently creating learning materials and delivering courses for Udemy, Kaplan International and the University of Essex Online. Many of the topics covered in this article are covered in a programme of learning he has developed called HANDS-ON, of which the first module covering hardware is available on Udemy now at the link below:

https://www.udemy.com/course/fault-finding-and-troubleshooting-pc-motherboards/?referralCode=86B31DF43E7064CB267F

www.tech-sourcery.co.uk 

References

[1] https://en.wikipedia.org/wiki/Spectre_(security_vulnerability)

[2] https://itsfoss.com/fact-intel-minix-case/ 

[3] This thumbdrive hacks computers. “BadUSB” exploit makes devices turn “evil”

[4] https://www.ncsc.gov.uk/guidance/ncsc-advice-on-the-use-of-equipment-from-high-risk-vendors-in-uk-telecoms-networks#section_5

ICT vs. Computing

History

The 29th of January 2020 marks forty years since the microcomputer revolution made its mark on the UK market. The product responsible was the Sinclair ZX80, the machine that kicked off the UK ‘home micro’ revolution.

The ZX80 was not the first microcomputer – there were many that predated it, including the Micral from France in the early seventies (which worked much more like a minicomputer) and the hobbyist machines from the USA, such as the Altair 8800 and the Apple I. It wasn’t even the first UK microcomputer – it had been preceded by the Scrumpi SC/MP evaluation platform in 1975 and the Nascom 1 kit/system released in 1977 (based on a series of articles published in Wireless World, in a similar marketing model to that used by the aforementioned Altair). The ZX80’s unique selling point was a combination of two features: first, it was available as a complete system including built-in keyboard, case and ROM – it only needed a TV as a display and an (optional) cassette recorder for long-term storage; secondly, it was available fully assembled for less than 100 UK pounds (GBP). The combination of these two factors made the ZX80 unique and set a sufficiently low-cost entry point that many felt it worth taking a gamble on.

The success of the ZX80 led to the development of the ZX81 and ZX Spectrum from Sinclair, plus an explosion of machines from UK-based competitors, not to mention an increase in imports of similar systems from around the world.

Computers in Education

One of the main competitors was the Acorn Atom, which was of a similar specification to the ZX80 (albeit higher priced) and powered by a different CPU. Acorn was a key rival as it had been co-founded by an ex-Sinclair employee, Chris Curry – a rivalry dramatised by the BBC in the ‘Micro Men’ docudrama. Acorn’s main contribution to the revolution was not the Atom, however, but its successor, the Acorn Proton, which became much better known as the BBC Micro. This machine was the lynchpin of the BBC Computer Literacy Project (CLP) [1], an ambitious programme of TV broadcasts and supporting literature built around the capabilities of the hardware, the specification of which the BBC itself defined. The combination of the CLP and the supporting hardware formed a perfect storm: the UK government agreed to part-fund the purchase of computing hardware for schools, provided they were UK-built machines – which effectively meant the Acorn/BBC Micro, Research Machines or the Sinclair Spectrum. Many schools went the Acorn route as it was supported by excellent learning materials from the CLP, making the teachers’ job much easier.

Birth of the UK Software Industry

Both the Sinclair machines and the Acorn systems led to the emergence of the UK software industry. When they were released there was little commercial software available for either system, leading to the birth of cottage software developers writing custom software, supplied on cassette, for both platforms. The fact that these ‘home micros’ booted up into a programming language – BASIC – encouraged users either to develop their own software or to type in the listings that appeared in the proliferation of magazines supporting and promoting the microcomputer revolution. Many of today’s developers – whether games developers or more business-oriented programmers – attribute their choice of career to the introduction to, and education in, computers provided by the CLP and its associated machines.

From Creators to Consumers

By the end of the eighties, the choice (and funding) of equipment had devolved from central government to local education authorities [2]. Driven by pressure from major industry players as well as parents, schools started to favour the industry-standard IBM PC compatibles over home micros. These systems, although more expensive, came with readily available business applications from the likes of Microsoft, Lotus and IBM. This led to computers being treated as business tools, with classroom activities focused on the use of applications rather than on more creative pursuits – in effect, schools were producing students who were computer users and consumers rather than content creators.

The Decline of Computer Science in UK Schools

The IT syllabus was changed to focus on ICT (information and communication technology) rather than computing, and, as Brown et al. [3] note, the reputation of ICT in schools worsened as it came to be seen as a low-value discipline that was dull and unchallenging. That reputation was further eroded by schools using ICT as an easy way to boost their position in the league tables by entering many students for the qualification. The result was confusion among students about the difference between ICT and computer science, and a lack of interest in computer science at more advanced levels. (As an aside, I personally experienced this as a university lecturer in the mid noughties, meeting a number of students who were shocked at how different a computer science degree was from what they had been taught at school.)

Rebooting the Curriculum  

In 2012, many factions – including the UK Government, the Royal Society and UK industry in general – expressed the opinion that the UK curriculum for IT was not fit for purpose and that students were leaving school with inadequate IT skills to meet the needs of industry. The education secretary at the time, Michael Gove, launched an initiative to restructure the curriculum, which led to a new offering released in 2014 that renamed the ICT provision ‘Computing’.

Since then, a number of reviews and reports have consistently criticised the new offering, and a paper by Laura Larke [4] published last year was damning, claiming that teachers were not implementing the new curriculum because they had neither the resources nor the skills to do it justice. Furthermore, an article by Ben Wohl in 2017 [5] argued that the new curriculum had shifted too far, with its emphasis purely on programming and coding, to the detriment of other aspects of computing and computer science.

Reviewing the Options

If both the old curriculum and the new fall short, what should schools be doing to address the requirements of all parties – that is, teachers, students and industry?

If we review the lessons learned from the CLP, it becomes clear that although it encouraged and nurtured a generation of programmers (developers), it also created opportunities for a generation of hardware ‘hackers’: all the early machines provided some form of expansion bus, which allowed third parties to produce interfaces supporting serial communications, printers and even storage devices.

This approach has been resurrected by the Raspberry Pi (RPi) Foundation, which developed a low-cost, single-board computer aimed at encouraging students to engage with computer science again. Just like the home micros of yore, it comes equipped with standard expansion ports (USB and HDMI) as well as a number of GPIO (general-purpose input/output) pins that fulfil the same function as the older hardware expansion bus. This functionality has led to the RPi being used as the basis of several ‘homebrew’ microprocessor projects, as well as contributing to the resurgence of the ‘maker’ culture.
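To give a flavour of how accessible those GPIO pins are, here is a minimal Python sketch using the gpiozero library; it assumes a Raspberry Pi with an LED (and a suitable resistor) wired to GPIO pin 17.

```python
# Minimal sketch: blink an LED attached to GPIO pin 17 of a Raspberry Pi.
# Assumes the gpiozero library is installed and the LED is wired via a resistor.
from time import sleep
from gpiozero import LED

led = LED(17)        # BCM pin numbering

for _ in range(5):   # blink five times, then exit
    led.on()
    sleep(0.5)
    led.off()
    sleep(0.5)
```

A few lines like these are often a student’s first bridge between code and physical hardware, which is exactly the kind of engagement the early expansion buses encouraged.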

The BBC also launched another computer initiative in 2015, known as the BBC micro:bit. This is a self-contained, sub-credit-card-sized single-board computer with a built-in LED display and several sensors and buttons that allow the user to interact with the board directly (i.e. without an external keyboard or joystick). Again, it has an expansion (edge) connector that allows it to be interfaced to external project boards.
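The micro:bit’s MicroPython environment makes those built-in peripherals directly accessible. The short sketch below scrolls a message on the 5x5 LED matrix and reacts to button A; it runs on the board itself (flashed via a micro:bit Python editor), not on a desktop interpreter.

```python
# Minimal micro:bit MicroPython sketch: scroll a greeting, then show a happy
# face while button A is held down. Runs on the micro:bit itself.
from microbit import display, button_a, Image, sleep

display.scroll("Hello")

while True:
    if button_a.is_pressed():
        display.show(Image.HAPPY)
    else:
        display.clear()
    sleep(100)   # milliseconds
```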

Although both of the above boards support hardware expansion and are used in maker-style hardware projects, the emphasis is still very much on code development and programming – and, as noted by Wohl [5], many users want technical experience that does not depend on programming.

The conclusions from both Wohl [5] and Larke [4] are clear: there needs to be a range of computer science topics – a continuum ranging from computer science, with its emphasis on programming; through a computing topic focusing on practical, hands-on, real-world skills that encourage computational thinking while simultaneously addressing the everyday problems of computer use; through to a renewed ICT offering that focuses more on the creative use of COTS (commercial off-the-shelf) and open-source applications. In tandem, there also needs to be investment in CPD (continuing professional development) for teachers, and in the resources to allow the curriculum to be delivered and taught to the appropriate standard.

Alternative Pathways and the Synergistic View

My personal recommendation is that there should be a more infrastructure-focused course – the computing topic mentioned above – that pulls together a grounding in the key aspects of computer systems: hardware, networks, operating systems and security. It should provide an integrated, system-oriented and synergised view that covers current trends, and it should be based on the ubiquitous PC, which still provides an almost universal platform and supports far more applications, operating systems and tools than any other platform available. I am currently developing a programme of modules, available either via Udemy or as face-to-face courses, called HANDS-ON; it includes modules on Hardware, Analytics, Networking, Drivers, Security and Operating Systems. It takes a practical, hands-on approach (as the name suggests) and focuses on fault-finding and troubleshooting techniques in each area. In addition, lesson plans can be made available for teachers, together with ‘teach the teachers’ and CPD support.

The first module (hardware) is now available at the following link, which provides a limited-time preview of the module: https://www.udemy.com/course/fault-finding-and-troubleshooting-pc-motherboards/?couponCode=2F8A240F668D8E0A1A98

The new UK curriculum for computing defines nine topics at Key Stage 3, of which up to seven would be addressed by the modules above.

Douglas Millward is a highly experienced consultant and systems architect, having worked for some of the largest consulting companies in the world. He is also a qualified higher education lecturer and is currently creating learning materials and delivering courses for Udemy, Kaplan International and the University of Essex Online.

www.tech-sourcery.co.uk 

https://www.linkedin.com/pulse/ict-vs-computing-do-changes-uk-national-curriculum-deliver-millward

References

[1] Allen, D. and Lowry, S. (2020). BBC Computer Literacy Project Archive. [online] Computer-literacy-project.pilots.bbcconnectedstudio.co.uk. Available at: https://computer-literacy-project.pilots.bbcconnectedstudio.co.uk/ [Accessed 17 Jan. 2020].

[2] Smith, B. (2015). Computing | Brian Smith. [online] Briansmithonline.com. Available at: https://briansmithonline.com/computing/ [Accessed 17 Jan. 2020].

[3] Brown, N., Sentance, S., Crick, T. and Humphreys, S. (2014). Restart: The Resurgence of Computer Science in UK Schools. ACM Transactions on Computing Education, 14(2), pp.1-22.

[4] Larke, L. (2019). Agentic neglect: Teachers as gatekeepers of England’s national computing curriculum. British Journal of Educational Technology, 50(3), pp.1137-1150.

[5] Wohl, B. (2017). Coding the curriculum: new computer science GCSE fails to make the grade. [online] The Conversation. Available at: https://theconversation.com/coding-the-curriculum-new-computer-science-gcse-fails-to-make-the-grade-79780 [Accessed 17 Jan. 2020].