“Change is the law of life. And those who look only to the past or present are certain to miss the future.” – John F. Kennedy

Early in my IT career, I worked as a Novell LAN Administrator for a government contractor. (For the millennials out there, Novell was a network operating system that was very popular in the 1990s, when you still needed to dial in to the internet…so LANs, or “high-speed” Local Area Networks, were the hip, cool thing.)

In a 150-year-old government building, at the end of a long hallway of high-walled cubicles, sat my first real IT mentor, Chris, a Novell guru and an amateur philosopher. Chris taught me how to troubleshoot. One day, I remember struggling with a network printing issue, and I spent a few hours trying to solve the problem before I decided to go ask Chris for help.

Chris looked at me as I went through a dissertation on how I had tried to solve the issue and calmly said, “Assume everything is broken.” I looked back blankly and asked him to just come look at the problem. I wanted the lazy way out: to have Chris fix it so I could move on to less difficult tasks.

Chris didn’t let me off the hook and told me to sit down and think about the problem differently to first find out what works. I went back to my cube and methodically started from the assumption that everything is broken and was able to narrow the issue down to a faulty interface card and solve the problem.

Today, every device is becoming part of a network, from your car to your light bulbs. Phones can create their own LANs or live in the cloud. This hyper-networked world has created a risk management nightmare for enterprises. There is a dizzying array of cybersecurity companies, tools and “buzzwords” for trying to secure this constantly changing environment – AI-driven threat detection and response, security orchestration, identity management, governance, endpoint protection, user behavior analytics, network monitoring – the list goes on and on.

The goal of deploying all this cybersecurity tooling in the enterprise generally seems to be driven by an assumption that everything is at risk (broken)…so we need more actionable data about our devices, networks and people in order to reduce risk to our enterprise, brand, sensitive data, critical assets, etc.

Assuming that everything is broken or vulnerable means that we as an industry may need to rethink “everything,” including our legacy network architecture, which was initially designed around the need for a perimeter, or moat, around the enterprise to protect the crown jewels. Many of the technologies still used to secure network communications today are 20, 30, even 60 years old.

Below are some legacy technology examples:

  • Passwords – the first computer password was developed at MIT in 1961, yet passwords are still used by many enterprises as the sole means of authentication for most or all of their employees. Enterprises are breached regularly because many passwords can be compromised fairly easily, and most enterprises still do not employ multi-factor authentication (MFA) for all network endpoint devices.
  • Firewalls – or packet filters (as 1st-generation firewalls were called) were invented in 1989. Firewalls have been an integral network architecture component over the last 30 years for protecting enterprises from the nefarious actors operating on the public internet. Firewalls have improved greatly over the last 10 years and now provide more granular security for applications and users. The problem is that even newer Next-Generation Firewalls must be integrated with several other technologies, such as multi-factor authentication and user access monitoring and recording, to substantially reduce risk to critical systems and data. This adds complexity and cost that can be too onerous for many under-resourced IT or OT enterprises.
  • VPNs – Virtual Private Networks (VPNs) have been around since 1996 and are still used by enterprises to connect to critical data and systems. A VPN provides a secure channel of communication but does not protect access to individual critical systems. The problem is that when VPN credentials are compromised on an endpoint, the attacker now has a secure channel to exfiltrate data from your enterprise.
  • Jump Servers – have been around since the 1990s and were created for secure access between two dissimilar security zones. Basically, a jump server is a machine that provides a checkpoint from which you can connect to other, more sensitive systems. The problem with jump servers is that many are built on systems that are not hardened or patched regularly, and many expose insecure communication protocols across security zones and out to the internet.
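
To make the MFA point above concrete, here is a minimal sketch of a time-based one-time password (TOTP) generator, the mechanism behind most authenticator apps, following RFC 6238. The function name and signature are my own, not from any particular product:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Return a time-based one-time password (RFC 6238) for a shared secret.

    secret_b32: base32-encoded secret shared between server and authenticator.
    at: Unix timestamp to compute the code for (defaults to "now").
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of time steps since the Unix epoch.
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset taken
    # from the low nibble of the last digest byte, mask the sign bit.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code changes every 30 seconds and is derived from a secret that never crosses the wire, a stolen password alone no longer grants access – which is exactly why MFA blunts the weakness of the 1961-era password.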

These technologies, which are part of most enterprises’ network architectures, were initially designed to assume only specific things were broken or at risk – they were point solutions to solve specific problems with either securing a communications channel or user access to a system.

If you took a poll in the 19th century on what folks needed to get from their farm to town faster, they would have unanimously said a faster horse. They couldn’t have anticipated the automobile – just as computer scientists in the 1990s were not thinking about smart toasters and networked fish tanks. We don’t need better passwords, firewalls, VPNs or jump servers – we need a better holistic secure network architecture.

Zero Trust Networks – A 21st Century Approach

“The art of war teaches us to rely not on the likelihood of the enemy’s not coming, but on our own readiness to receive him; not on the chance of his not attacking, but rather on the fact that we have made our position unassailable.” – Sun Tzu

Legacy network architecture assumes a security perimeter: those outside the network are not trusted, and those inside it are. Zero-trust network architecture recognizes that perimeter security is just one component of establishing trust between a user or machine and its connection to a specific resource.

Zero Trust calls for enterprises to leverage micro-segmentation and granular perimeter enforcement based on users, their locations and other data to determine whether to trust a user, machine or application seeking access to a particular part of the enterprise.

The concept of zero trust was built on the premise that you cannot trust the network, device or user independent of one another. Each individual connection between a user and a system must be authenticated, authorized, encrypted, audited and monitored.
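
The per-connection decision described above can be sketched as a deny-by-default policy check that weighs user, device and location together before granting access to a micro-segment. The resource names and attributes below are illustrative assumptions, not anything prescribed by a zero-trust standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    user: str
    device_id: str
    mfa_verified: bool        # user factor: has the user passed MFA?
    device_compliant: bool    # device factor: patched, managed endpoint?
    location: str
    resource: str

# Hypothetical per-resource policy: which users may reach which
# micro-segment, and from where. Nothing is trusted merely for
# being "inside" the network.
POLICY = {
    "payroll-db":   {"users": {"alice"}, "locations": {"hq", "vpn"}},
    "build-server": {"users": {"alice", "bob"}, "locations": {"hq"}},
}

def authorize(req):
    """Deny by default; allow only an authenticated, compliant, in-policy request."""
    rule = POLICY.get(req.resource)
    if rule is None:
        return False  # unknown resource: no implicit trust
    if not (req.mfa_verified and req.device_compliant):
        return False  # user AND device must both check out
    return req.user in rule["users"] and req.location in rule["locations"]
```

The design point is that every connection is evaluated individually: a valid user on a non-compliant device, or a compliant device in the wrong location, is still denied – in contrast to the perimeter model, where being inside the moat was enough.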

While cybersecurity tools such as threat detection and response are still needed, they will be more effective if they are deployed in conjunction with a zero-trust network architecture.

If you assume everything you are protecting in your enterprise today is broken, you can start testing and validating to see what is working and then build a secure network architecture based on zero-trust principles for 2020 instead of “checking the box” with 20-year-old integrated security technology.