Security Laboratory

Sec Lab: Security Products

In 1995, if you wanted a security product, you downloaded the source and compiled it on your Sun 3; today we buy supported commercial products. This series in the Security Lab introduces you to some of the products out there and, when possible, the movers and shakers on the teams that create them.

An Interview with Bret Jordan, a Security Architect for Identity Engines

By Stephen Northcutt

Bret is a special treasure in the information security community; he is at home in the academic world as well as the so-called "real world" of the commercial marketplace. He has a lot of cryptography domain expertise, so you may want to dust off your SANS Security Essentials or CISSP training materials. We certainly appreciate his taking the time to share OpenSEA and other initiatives with us. Thank you, Bret! How did you first become involved in information security?

My fascination with network security and looking for needles in haystacks began back in the mid-'80s, when I started reverse engineering my Commodore 64 games (I had pages and pages of machine-code printouts all around my room) to try to get them to do things the developers never intended. I became more involved in formal network security, security processes, and risk management while at the University of Utah, as I began the massive undertaking of rebuilding the entire College of Engineering network from the ground up, and then later in 1999-2000 as we prepared for the 2002 Winter Olympic Games.

I know you have been highly involved in both SANS and the CISSP CBK, how did that come to pass?

I was first introduced to the SANS Institute and ISC2 while I was at the University of Utah as Director of Networking in the College of Engineering, but it was not until I was working for a large corporate enterprise as a Senior Security Engineer that I actually took the exams and completed my GCIH (Hacker Techniques, Exploits and Incident Handling), GAWN (Assessing Wireless Networks), GREM (Reverse-Engineering Malware - Hands-On), and CISSP certifications.

These days you are clearly focused on the network authentication problem domain with the extensible authentication protocol, as well as on the work the Trusted Computing Group is doing. How did you get involved in this?

I got my start with 802.1X [1] in 2002-03 at the University of Utah, where we deployed one of the first 802.1X networks (wired and wireless, using TTLS/PAP [2] with an MIT-based Kerberos backend) across a large section of campus. To support this network we also designed and built a federated RADIUS mesh to link all the decentralized departmental account databases, so that everyone could easily use the network with their own authoritative user account via the EduPerson EduID. This infrastructure was also set up in such a way as to allow federation across universities, to assist network access for visiting faculty and students. We documented that groundbreaking effort in a white paper. [3]
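The realm-based routing at the heart of such a RADIUS mesh can be sketched in a few lines. This is a minimal illustration, not the University of Utah's actual configuration; the realm names and server hostnames below are invented:

```python
# Illustrative sketch of federated RADIUS routing: each request is
# forwarded to the departmental server that is authoritative for the
# realm in the user's identity. All names here are hypothetical.

# Hypothetical realm-to-home-server map for a decentralized campus.
REALM_SERVERS = {
    "eng.utah.edu": ("radius.eng.utah.edu", 1812),
    "cs.utah.edu": ("radius.cs.utah.edu", 1812),
    "med.utah.edu": ("radius.med.utah.edu", 1812),
}

DEFAULT_SERVER = ("radius.utah.edu", 1812)  # campus-wide fallback

def route_request(user_name: str):
    """Pick the home RADIUS server for a user like 'alice@eng.utah.edu'."""
    _, _, realm = user_name.partition("@")
    return REALM_SERVERS.get(realm, DEFAULT_SERVER)
```

In practice this dispatch lives inside the RADIUS proxy configuration itself rather than in application code, but the principle is the same: the realm suffix, not the proxy, is authoritative for where credentials are verified.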

Can you share a bit more of your perspective about 802.1X, what is a useful application for this standard?

802.1X is an IEEE standard for authenticating clients to the network. The primary architectural components include a piece of software on the client (called a supplicant), an authenticator (usually a layer 2 switch or wireless access point), and an authentication server (RADIUS server). Through the use of EAP [4] methods (EAP-TTLS, EAP-TLS, EAP-PEAP, EAP-AKA, EAP-SIM, etc.), a user can securely authenticate to the network via strong cryptographic means. With new technologies and standards like TNC [5] in addition to 802.1X, we are now able to get posture information from a client before it is allowed on the network.
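The role separation Bret describes can be caricatured in a few lines of code. This is only a toy model with invented names; a real deployment carries EAP over EAPOL between supplicant and authenticator, and over RADIUS to the authentication server:

```python
# Toy model of the 802.1X roles: the authenticator keeps its port
# closed until the authentication server accepts the supplicant's
# credentials. Class and method names are illustrative only.

class AuthServer:
    """Stands in for the RADIUS server that terminates the EAP method."""
    def __init__(self, credentials):
        self.credentials = credentials  # e.g. {user: password} for TTLS/PAP

    def check(self, identity, secret):
        return self.credentials.get(identity) == secret

class Authenticator:
    """Stands in for the edge switch/AP: the port is closed by default."""
    def __init__(self, auth_server):
        self.auth_server = auth_server
        self.port_open = False

    def eapol_start(self, identity, secret):
        # While the port is closed, only the authentication exchange
        # itself is relayed; all other client traffic is dropped.
        self.port_open = self.auth_server.check(identity, secret)
        return self.port_open

server = AuthServer({"alice@utah.edu": "s3cret"})
switch = Authenticator(server)
assert not switch.port_open                    # closed before any auth
switch.eapol_start("alice@utah.edu", "wrong")  # failure keeps it closed
assert not switch.port_open
switch.eapol_start("alice@utah.edu", "s3cret")
assert switch.port_open                        # success opens the port
```

The key property the model captures is that the port-open decision belongs to the authentication server, not to the client.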

The benefit of an 802.1X network with TNC posture checking versus a standard Web Auth or Server Auth network is that a client is not permitted access to the network, or the ability to talk to any devices other than the authenticator, until it has successfully authenticated and passed the organization’s posture policies. In standard Web Auth or Server Auth networks, a user can jump on and start running Nessus, port scans, and other fun goodies before they have authenticated and before you have guaranteed that the client meets the organization’s minimum posture policy (anti-virus, anti-spyware, firewall, anti-phishing, etc.). DNS text record tunneling is a great way around Web Auth and Server Auth based networks.
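To see why DNS tunneling sidesteps a Web Auth portal, consider how arbitrary bytes can ride inside an ordinary DNS lookup, which such networks typically resolve even for unauthenticated clients. A minimal sketch of the encoding side (the tunnel domain is a hypothetical attacker-controlled zone; real tunneling tools are considerably more elaborate):

```python
# Sketch of DNS tunneling's core trick: pack payload bytes into the
# labels of a query name under a zone the attacker controls, then
# recover them on the authoritative server. Domain is a placeholder.

import base64

TUNNEL_DOMAIN = "t.example.com"  # hypothetical attacker-controlled zone

def encode_query(payload: bytes) -> str:
    """Pack payload bytes into DNS-safe labels under the tunnel domain."""
    b32 = base64.b32encode(payload).decode().rstrip("=").lower()
    # DNS labels are limited to 63 characters each.
    labels = [b32[i:i + 63] for i in range(0, len(b32), 63)]
    return ".".join(labels + [TUNNEL_DOMAIN])

def decode_query(qname: str) -> bytes:
    """Reverse the encoding on the server side of the tunnel."""
    data = qname[: -len("." + TUNNEL_DOMAIN)].replace(".", "").upper()
    data += "=" * (-len(data) % 8)  # restore base32 padding
    return base64.b32decode(data)
```

Because the captive portal's resolver dutifully forwards the query toward the authoritative server for the tunnel zone, data flows out (and answers flow back) with no authentication at all, which is exactly the gap that pre-admission 802.1X closes.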

I have talked to people that don't think 802.1X is that important, that it is just a way to make RADIUS sound like it is fresh and exciting. What is your take?

I am dedicated to the development and success of 802.1X; I personally believe that it is the way to authenticate clients at the edge of the network. I also believe that it solves a large problem we currently have with the overall security of the network. Most networks today deploy what I refer to as the escargot model of security. They are hard and crunchy on the outside (WAN/NAP connection with perimeter-based firewalls with deep packet inspection, SNORT/intrusion detection devices, NTop/netflow analysis tools, traffic logging, etc.) and soft and chewy on the inside. With 802.1X, posture analysis, and a good policy-based authenticator like Identity Engines’ Ignition Server, we can keep the average malicious user and infected client off the network before they even connect. I recognize that 802.1X is not an end-all security solution, but it fills a major part of a layered defense-in-depth security model for an organization.

So you think 802.1X will "stick" as a standard?

Over the next 12 months we will see more and more vendors build 802.1X support into their product lines, and more academic institutions and businesses of all sizes will deploy 802.1X. It is easy to use, easy to configure in the network, and, with the right tools, easy to deploy.

OK, we will see. In the words of Tanenbaum, “The nicest thing about standards is that there are so many of them to choose from.” [6] So you have left the academic world of the University of Utah for the commercial space; can you tell us a bit about that?

I joined Identity Engines as a Security Architect because I believe in the company’s 802.1X vision. I am energized by working with brilliant people and, most importantly, they let me work on challenging projects. It was really important to me to work for a company that is solving real problems and lets me contribute to the overall success of the business. Since joining Identity Engines I have been working on their 802.1X solution and the TNC/IMC connectors (for gathering posture data on the client) with the lead developers of the Open1X XSupplicant. We bring a lot of experience with 802.1X in large heterogeneous and high-performance networks to Identity Engines. My goals for 802.1X are wider adoption, easier-to-use troubleshooting tools, and more interoperability testing across vendors.

This past May, at Interop Las Vegas, the OpenSEA Foundation was announced. I am a founding Contributor Member as well as a member of the Project Management Committee for Open1X, the first official project of the Foundation. OpenSEA represents a collection of networking and security technology companies and university research institutions dedicated to the development and promotion of the open-source 802.1X XSupplicant. The founding organizations include: Extreme Networks, Identity Engines, Infoblox, Symantec Corporation, Tipping Point, Trapeze Networks, and the UK's education and research network (UKERNA).

While the Open1X XSupplicant is an open-source project, Identity Engines is trying to help further the initiative and solidify it as the “Firefox of supplicants.” All of Identity Engines’ work on the supplicant itself will remain open source. The value-add that Identity Engines is going to offer is in the form of proprietary modules, ease of configuration, enterprise support, and other enhancements as time goes on. Linux and Mac OS X users will love the fact that the UI we are writing for Windows is written in Qt; thus, Linux, Mac OS X, and other *nix operating systems will have a fully functional UI that looks, feels, and configures exactly the same as the Windows version (after all, it is the same code with a few #ifdefs here and there). The Open1X XSupplicant will be the first truly cross-platform solution on the market.

Some of the other projects I am working on for Identity Engines are: building an automated test harness for testing 802.1X networks; designing and building various diagnostic, troubleshooting, and attack tools for 802.1X networks; and building a process and infrastructure to handle all the community feedback, testing, and patches to the Open1X XSupplicant.

Thank you, that is great and we wish you and Identity Engines all the luck in the world. Can you tell us just a bit about yourself, what do you do for fun?

When I am not working, I like to spend time with my wife and two children, make candy and chocolates for the neighbors, go on hikes in the mountains by my house, play chess, and study cryptography. In my free time between the family going to sleep and about 4 AM (yes, I do not sleep much), I have been writing a new certificate and CSR management tool. People will really like my approach, as it will actually be usable and not such a pain to implement or use. Professionally, my goals are to get more involved in various standards organizations like TNC/IF-MAP, IETF, and the IEEE. I would like to help build security in, from the foundation up, and really make a difference.