CHAPTER 8
Technology risk management

Take calculated risks. That is quite different from being rash.

George S. Patton

 

Technology risk management today is very different from what it was ten, or even five, years ago. In our hyper-connected world, the inherent cyber threats of doing business have increased, requiring new risk management practices to protect your company. Risk, security and legal professionals are struggling to keep up as businesses suddenly find themselves dealing with a fundamental shift in their value chains, requiring new strategies and business models to stay competitive. This may sound scary, but you have to remember that the cyber threat pales in comparison with the risk of doing nothing and choosing to ignore the need to embrace digital.

As discussed in part II, data underpins the future growth of your business, which means your organisation must develop a strong understanding of data regulation, specifically that relating to confidentiality and privacy. Legislation surrounding data is constantly changing and can be a minefield, so it’s important your organisation continuously updates policies accordingly and employs or engages expertise in this area. Ensuring growth also requires a fundamental shift in how you approach technology licensing. Perpetual, one-off licences are now an outdated practice; subscriptions, or pay-as-you-go solutions, are the new norm. At the core of this change is the increasing demand for cloud services. Gartner projects a rise in global cloud computing spend from US$67 billion in 2015 to US$162 billion in 2020 — a compound annual growth rate of 19 per cent. Amazon, Microsoft, Google and many others are riding this wave, providing organisations with the opportunity to move from costly, self-managed data centre facilities to cloud-based solutions that treat computing power as a utility.
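For readers who want to check the arithmetic, the implied growth rate follows from the standard compound-annual-growth-rate formula applied to the five years from 2015 to 2020:

\[ \text{CAGR} = \left(\frac{162}{67}\right)^{1/5} - 1 \approx 0.19 = 19\ \text{per cent} \]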

But the move to the cloud does not come without challenges. In a 2016 research report, Cybersecurity Ventures predicted cybercrime will cost the world US$6 trillion annually by 2021, up from US$3 trillion in 2015, and will be ‘more profitable than the global trade of all major illegal drugs combined’. Given the statistics, it’s not surprising that cybersecurity is a hot topic in the boardroom. However, moving to the cloud does not make you more susceptible to cyberattacks. In fact, if executed properly, quite the opposite is true.

Another trend over recent years has been the explosion in the use of open-source software. (Software is open source when its code is freely available, and can be modified and distributed by anyone.) Open-source software plays an important role in many software projects, because it allows organisations to repurpose software that solves someone else’s problems and adapt it to solve their own. For lawyers and risk professionals this can be a headache, because the licensing considerations are extensive, and the technical nuance required to assess them is often outside the training or experience of non-specialists.

How, then, can you approach technology risk management — specifically the key areas of regulatory compliance, cybersecurity, cloud computing and open-source software? In this chapter, we outline topics you should add to your risk agenda. And, as risk management must evolve in line with changing markets and regulations, we advocate an ongoing process for updating technology risk management to protect your brand, public trust and reputation as you grow and innovate.

Navigating the new rules of cybersecurity

With more platforms connecting machines and people than ever before, the risk of security breaches has never been higher. The simple fact is that the greater the number of people connecting with your business and sharing their data, the higher the risk. As we explored in chapters 5 and 6, platforms and systems of intelligence are complex in their engineering. As this complexity increases, so too does the cybersecurity risk.

A joint report conducted by IBM and the Ponemon Institute found that 25 per cent of data breaches were caused by a system glitch, 28 per cent by human error, and 48 per cent by malicious or criminal attacks. Vulnerabilities are inevitable, but putting the right processes in place can dramatically reduce cybersecurity risk. Here, we outline five ways to do that.

ENSURE CONTROL AND CONSISTENCY OF CUSTOMER DATA

If data about your customers is lost or stolen, the resulting reputational damage and cost can cripple your business. Industry reports estimate that over 30 per cent of customers discontinue their relationship with a company after a data breach. For this reason, you should be clear about the way you manage all customer-related data. Having this data scattered across numerous applications can lead to you simply losing track of where it is all stored, and inevitably there will be different standards applied to its safekeeping across the different data stores. Therefore, you need to put sensitive customer data in one system or database, ensuring you then have a consistent approach to managing it. Having all your data in one spot makes it much easier to protect, as you can enforce common standards and practices. You should also consider the way in which sensitive data is managed, whether it is ‘at rest’ (data that’s been written and is not being accessed or transmitted) or ‘in motion’ (data that is in the process of being transferred via the internet or a network between separate storage locations).
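As a minimal sketch of the ‘at rest’ half of that distinction, the snippet below encrypts a customer record before it is written to disk and decrypts it only at the point of use. It assumes the third-party Python package cryptography is installed, and the file name and record contents are illustrative only; data in motion is usually protected separately at the transport layer (for example with TLS) rather than in application code like this.

# Minimal illustration of protecting sensitive data 'at rest':
# encrypt before writing to disk, decrypt only at the point of use.
# Requires the third-party 'cryptography' package.
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service,
# never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

customer_record = b'{"name": "Jane Citizen", "card": "4111-1111-1111-1111"}'

# Data at rest: only the ciphertext is ever written to storage.
with open("customer_record.enc", "wb") as f:
    f.write(cipher.encrypt(customer_record))

# Decrypt only when the record is actually needed.
with open("customer_record.enc", "rb") as f:
    restored = cipher.decrypt(f.read())
assert restored == customer_record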

REFRESH INTERNAL CONTROLS AND POLICIES

Most companies have spent decades implementing control frameworks in an effort to safeguard their internal IT functions. Annual audit and testing processes provide assurance about server security, permissions, access points and the like. However, when you amplify your efforts to connect with customers online, your risk profile changes. As you collect more customer data, you create more opportunities for cyberattacks. This risk is compounded by the bring-your-own-device-to-work movement that many companies are adopting. Refresh your policies and controls to deal with these situations, and increase the frequency of review and assurance activities. There are advisers you can contact and frameworks you can adopt to ensure you have the basics covered to your industry’s standard. Consider, however, whether the bar you or they set is high enough for your business objectives and values. Don’t hold back on enlisting specialists to challenge the concept of ‘best practice’.

TRUST AND RESPECT THE CLOUD

There is a common presupposition that the cloud is riskier than using on-premise servers. This is not true. At every level, the security a major cloud provider can deliver outstrips what you can achieve in your own data centre. The ongoing proliferation of cloud services gives providers a strong commercial incentive to defend against even the most sophisticated hacks and viruses. Trusted cybersecurity brands are actively increasing the sophistication of their threat detection, decryption, and virus or malware removal tools to provide robust protection specific to the cloud, whereas data centre security innovations are not advancing at the same rapid pace.

As a further benefit to your business, the cloud makes it extremely easy to access and launch global software solutions, including security programs, and it’s all done with the click of a button. This agility puts greater emphasis on the design and configuration of systems within the cloud, whereas your on-premise solutions have to be customised and upgraded independently. Cloud vendors offer a wide range of services and tools, and release new ones every week, so having a process in place to ensure the appropriate vetting and configuration of services is vital. Here are some pragmatic ways to achieve this:

  • As you would in the development of software, develop a roadmap of the services you plan to adopt from cloud vendors and communicate this to your team.
  • Regularly (say, once a quarter) step back and review what’s available and how it could be applied to the solutions you are working on.
  • Ensure you have a strategy that allows innovation and experimentation without constraints. Use what is called a ‘sandbox environment’ to test and trial new ideas and initiatives in a simulated setting before going live; for live systems, use a ‘locked down’ environment where your security and risk management controls are well defined (a simple way to express this split is sketched after this list).
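To make the sandbox versus locked-down distinction concrete, here is a minimal sketch of how the rule could be expressed as a simple policy check. The environment names, the list of approved services and the rules themselves are illustrative assumptions only, not a prescription for any particular cloud vendor.

# Illustrative policy guard: what may be provisioned in which environment.
# Environment names, services and rules below are hypothetical examples.
APPROVED_FOR_PRODUCTION = {"managed-database", "object-storage", "load-balancer"}

def can_provision(service: str, environment: str) -> bool:
    """Return True if a service may be provisioned in the given environment."""
    if environment == "sandbox":
        # Sandbox: experimentation is unconstrained, but it should never hold real customer data.
        return True
    if environment == "production":
        # Locked down: only services that have passed security vetting.
        return service in APPROVED_FOR_PRODUCTION
    raise ValueError(f"Unknown environment: {environment}")

print(can_provision("new-analytics-service", "sandbox"))     # True: free to experiment
print(can_provision("new-analytics-service", "production"))  # False: not yet vetted
print(can_provision("object-storage", "production"))         # True: approved service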

SOFTWARE ENGINEERING QUALITY IS CRITICAL

Throughout this book, we have emphasised the importance of responding to the pace of change. Do not try to move faster by compromising on software engineering quality. Neglecting quality could be the end of your business, because you will burn unnecessary time managing avoidable software development risks, including security, performance, scalability and misalignment with your strategy. Software engineering quality can be difficult to define, and is not as simple as determining whether or not the software ‘works’. To judge whether quality is at the right level, engage with your software engineers and other managers to understand how effectively new or updated software is landing in your business. If most of the answers to the following questions are ‘yes’, you likely have a problem. You should then enlist experts to help you work through these challenges.

  • Do your managers report that systems and programs seem harder and more complex than they were before a major software update or new implementation occurred?
  • Does your team fix a bug only to find it reappears in a week or so?
  • Do users report that the performance of software is variable across the application?
  • Do testing efforts seem exhausting, and increasingly complex, even when they relate to minor changes?
  • Do the cost or time estimates associated with changes seem disproportionate to the intuitive level of effort involved?
  • Is your software development team’s velocity decreasing? When you were building the first version of your product, you could develop a new feature quickly, and your team used to build lots of them every iteration. Is their output slower now?
  • How is morale? Is your engineering team easily demotivated by complexity or change?

There’s a big difference between software development and software engineering. The difference relates largely to skill, but generally speaking, software developers focus on writing code with specific functional outcomes in mind, whereas software engineers take a more holistic, logical approach to solving complex problems in the context of scale and adaptability. Don’t skimp on the right level of experience, and invest in working with engineers when you have more complex problems that need to perform and scale.

PEOPLE ARE YOUR GREATEST RISK

Software and data are ultimately controlled or accessed by people. It’s not just underworld hackers you need to worry about. Keep in mind the risk associated with your own staff, including both honest mistakes and deliberately disruptive actions. Having adequate controls in place is vital, but the challenge is to ensure people adhere to them. It’s also important that these controls don’t slow the pace of day-to-day operations with unnecessary protocols; for example, overly restrictive limits on access to sensitive customer-related data, however well intentioned they are as a way to maintain data integrity and security, can impede productivity. You can help reduce the risk of people compromising data security, whether deliberately or unwittingly, through training programs and selective audits.

As interactions with your clients move online, the trust associated with your brand is constantly exposed to cyberattack. As a pre-digital incumbent, your reference point has been your on-premise IT systems, where most of your risk has been managed through a series of well-defined, relatively durable controls that are updated only occasionally. However, as development cycles become shorter and more frequent, redesigning your technology risk controls will be integral to ensuring you can keep up with the speed of the market while safeguarding your business from potential threats.

Open-source software may be your friend

Open source is not the bogeyman the software billionaires would like us to believe. It is software designed to be shared and, as such, is managed by an online community, typically of volunteer software engineers, that facilitates sharing and collective improvement. In fact, open source is now so pervasive in software that it is almost impossible to avoid, but it nevertheless remains a hot topic for lawyers and risk professionals. From a strategic standpoint, you must consider the opportunities that open-source software provides — ‘pre-baked’ solutions that you can adapt to your requirements.

Businesses around the world are using open-source software to reduce costs, apply focus to innovation efforts, and solve problems that it would be impossible or unprofitable for one company to tackle alone. Detection of phishing attacks, analysing genomes and geospatial mapping are just a few of the open-source projects underway to improve corporate and public services.

Open-source software can also be used to build new business models and create digital moats, which may sound counterintuitive to those who are used to the world of patents. In 2017, computer chip maker Nvidia decided to make the designs of one of its chips publicly available under a licensing agreement that supports sharing, even though it used the same designs in some of the chips it was selling. For Nvidia, however, scale and distribution were more valuable than patent protection for a deep-learning chip, and the company decided the patent was not a significant competitive advantage. Having its technology embedded into hardware devices and integrated with third-party software solutions around the world was, in contrast, a massive competitive advantage. By making its design open source, Nvidia has encouraged a host of companies to build out an entire ecosystem around its technology. In other words, Nvidia has created a platform.

While there are many open-source licensing agreements to take into account (around 2500, in fact), don’t be deterred by this. There is only one key point to understand: though open-source software is free to use, it is not free from legal obligations. Depending on the maturity of the open-source project and the size of the community supporting it, it can sometimes be unreliable and could potentially pose a security risk given its open nature. Proper due diligence must be conducted, though large open-source projects are typically well designed, documented and reviewed.

Consider the following six points if you want to properly manage and unlock the benefits of open-source software:

  1. Put in place an open-source management policy that dictates which types of open-source licences are and are not acceptable for your organisation. This is a specialised area and largely the realm of lawyers and risk professionals (a simple automated licence check of the kind such a policy might mandate is sketched after this list).
  2. Think about the use of open-source software from a security standpoint, taking into account the inherent benefits of crowdsourced improvement, but also the drawbacks in the public and open nature of the code.
  3. Ensure you have an effective code approval process so staff who may not appreciate the risks can still access the benefits of open-source software.
  4. Define and implement a policy for open-source software updates. This will provide guidance to your software development teams on what to update, when and why.
  5. Consider that the solution you wish to build may not be the first of its kind, and that an open-source solution very likely already exists or can be modified to meet your needs. Sites like GitHub, which host largely open-source code repositories, are useful references for determining whether a solution already exists.
  6. Consider open-source based business models like Nvidia, but ensure you have the in-house capabilities to support such business models, both technically and legally. Early engagement with risk and legal professionals will help in defining the process you should follow.
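As a concrete illustration of point 1, the sketch below flags installed Python packages whose declared licence is not on an approved list. The approved list, and the decision to flag anything unrecognised for review, are assumptions made purely for illustration; a real policy would be defined with your legal advisers, and most organisations would use a dedicated licence-scanning tool rather than a script like this.

# Illustrative licence check: flag installed Python packages whose declared
# licence is not on the organisation's approved list.
# The approved list below is a hypothetical example only.
from importlib.metadata import distributions

APPROVED_LICENCES = {"MIT", "BSD", "Apache"}

def declared_licence(dist) -> str:
    """Best-effort read of a package's licence from its packaging metadata."""
    licence = dist.metadata.get("License") or ""
    if licence and licence != "UNKNOWN":
        return licence
    # Fall back to trove classifiers, e.g. "License :: OSI Approved :: MIT License".
    for classifier in dist.metadata.get_all("Classifier") or []:
        if classifier.startswith("License ::"):
            return classifier.split("::")[-1].strip()
    return "unspecified"

for dist in distributions():
    licence = declared_licence(dist)
    if not any(approved in licence for approved in APPROVED_LICENCES):
        print(f"REVIEW: {dist.metadata['Name']} declares licence '{licence}'")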

Open source unlocks a wealth of possibilities. Like the platform concept, it allows collaboration and openness, which can accelerate business and customer value. As with anything in the digital realm, it comes with risk that you can overcome and should not see as a deterrent. If you address the six points we’ve just outlined, you can mitigate most of the associated risk.

Busting the myths of cloud computing

Several years ago, the hype around cloud computing was very much just that — hype! Few cloud vendors were able to provide the level of service they advertised. Today this is no longer the case. The benefits you can realise from the cloud are so significant that there is now no excuse not to move your workloads onto the cloud. Gartner predicts that by 2020, ‘a corporate “no-cloud” policy will be as rare as a “no-internet” policy is today’. In fact, Gartner expects that by 2020, more computing power will be sold by cloud providers than is sold and deployed into enterprise data centres.

The cloud brings to the table an extremely cost-effective way to develop and launch software solutions. One of the primary advantages of the cloud is elasticity. When buying on-premise solutions, typically you would purchase a server that could handle peak workloads, plus some additional buffer just to be on the safe side. This meant that for much of the year, these servers were operating well below maximum capacity. As your business grew, you would eventually have to replace these servers with bigger ones, but again you would have to think long term, purchasing servers that could handle future peak workloads. As you can appreciate, if your business is growing quickly, or if you have demanding, cyclical workloads, on-premise can become very costly. The cloud eradicates this problem as you only ever pay for what you use, meaning there is no excess capacity sitting idle. Other advantages of the cloud include global reach, reduced complexity with respect to system management and maintenance, and rapid implementation cycles.
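A back-of-the-envelope comparison makes the elasticity point concrete. Every figure below (peak capacity, buffer, amortised cost, utilisation, hourly rate) is a hypothetical assumption chosen only to illustrate the shape of the calculation, not real pricing from any vendor.

# Hypothetical comparison of provisioning for peak versus paying for what you use.
# All figures are illustrative assumptions, not vendor pricing.
HOURS_PER_YEAR = 24 * 365

# On-premise: buy enough capacity for the annual peak, plus a safety buffer.
peak_capacity_units = 100          # capacity needed at the busiest hour
buffer = 0.25                      # 25% headroom 'just to be safe'
cost_per_unit_per_year = 300.0     # amortised hardware, power, hosting and staff

on_premise_cost = peak_capacity_units * (1 + buffer) * cost_per_unit_per_year

# Cloud: pay per unit-hour actually consumed.
average_utilisation = 0.30         # typical load is 30% of peak
cost_per_unit_hour = 0.05          # pay-as-you-go rate

cloud_cost = (peak_capacity_units * average_utilisation
              * cost_per_unit_hour * HOURS_PER_YEAR)

print(f"Provisioned for peak (on-premise): ${on_premise_cost:,.0f} per year")
print(f"Pay for what you use (cloud):      ${cloud_cost:,.0f} per year")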

Of course there will be challenges, but don’t use them as an excuse for staying away from the cloud. Armed with an understanding of potential problems, you are in a good position to lead the charge. Here, we bust five common myths about cloud computing.

SOME APPLICATIONS DON’T RUN ON THE CLOUD

There are still some, though not many, who say you can’t reconfigure an on-premise solution to work effectively in the cloud, and that removing your dependence on physical infrastructure therefore only results in complexity and duplicated costs. This is simply not true. It is unsubstantiated, old-school IT thinking. If you hear this rebuttal, ask at a technical level what the limitation is and seek a second opinion from an advisory firm, because given the suite of solutions cloud vendors provide, almost anything is possible.

THE LEGAL AND RISK HURDLES ARE TOO HIGH

The best way to tackle this argument is to contrast the legalities and risks of moving to the cloud against where you are today. Consider the rights and obligations you have (or don’t have) in place with your current providers, and compare these with the terms of an agreement with a cloud services leader like Google, Microsoft or Amazon. You will see by comparison that cloud vendors take away the complexity of managing physical infrastructure, offer highly competitive SLAs (service-level agreements) that meet most business requirements and provide best-in-class security, which is almost impossible for an individual enterprise to replicate. In fact, in most cases, you’d be doing yourself a disservice by not moving the majority of your IT systems to the cloud.

IT WILL COST MORE

Take another look at your total cost of ownership equation, factoring in forward replacement cycles and, more importantly, the opportunity cost associated with maintaining complex infrastructure. Sure, there may be a substantial outlay upfront when moving to the cloud. But by focusing too much on the immediate pennies, you might be missing out on huge savings down the road.

WE ALREADY HAVE OUR OWN PRIVATE CLOUD

While running your own private cloud is not a bad position to be in, there are some issues to consider:

  • You still need to manage the underlying infrastructure and equipment.
  • Nothing is really shared, so you cannot utilise the economies of scale leveraged by the large cloud providers, which lead to lower costs, faster innovation cycles and better overall solutions.
  • You have no SLA in terms of availability or any other outcome; performance is managed in-house.
  • You likely have a lower security profile and fewer audit accreditations compared with a cloud vendor.
  • You still need to worry about it — day in, day out.

IT’S ONLY FOR DEVELOPMENT, TESTING AND NEW PROJECTS

Yes, the cloud is great for developing and testing new initiatives, but limiting the cloud to forward-looking projects suggests the infrastructure you have in place today is better than anything else on the market, which is simply not the case. Moving workloads to the cloud allows your IT team to spend less time ‘keeping the lights on’, and more time focusing on innovation and digital pursuits. Why anchor your IT team to managing physical infrastructure when they could be focused on unlocking new revenue streams?

Cloud computing is here, whether or not you are yet comfortable with it. The cloud provides unequivocal advantages, and the arguments against it no longer hold up. If you have not already, it is imperative that you put an effective cloud policy in place and begin to migrate your IT systems as soon as possible. Unlocking the benefits of the cloud is integral to providing a better experience for your customers and driving efficiencies internally. That said, cloud computing is not a competitive advantage — it is a basic delivery model. If you take too long to transition across, you’ll be stuck playing catch-up with your competitors.

