experts opine on various security questions for networkworld

Via Rothman, I read a piece on NetworkWorld in which several security experts are asked for their reactions to various statements. I thought I would give my opinion on those opinions below.

1. There’s security in obscurity.
David Lacey: 7 – I think Lacey’s opinion is a practical one. There really is a measurable deterrent, and even preventive, effect from some level of obscurity. Lacey leaves open some argument on the definition of security, whether obscurity actually stops anything over time, and whether he meant obscurity alone is enough.
Nick Selby: 4 – I understand the point, but it was made poorly. Selby essentially takes the side of saying obscurity offers no real security value over time.
Bruce Schneier: 8 – Schneier gets realistic and points out that, on some level or other, all security is based in some form of obscurity. Even my password is “obscurity,” since it too could be discovered by someone else. I sympathize with this viewpoint as a start down the road Lacey takes. Schneier goes on to say that, despite this, security increases when you can minimize the reliance on obscurity.
Peter Johnson: 5 – Johnson echoes Selby that obscurity has no value over time.
John Pescatore: 8 – Pescatore hits my opinion square: obscurity certainly brings value, but security should not rely solely on obscurity.
Richard Stiennon: 2 – I think Stiennon is talking about security through ignorance/ostrich-hole.
Andrew Yeomans: 7 – Yeomans also sides with obscurity having no value over time. He brings up the great point that once obscurity is lost, it is game over.

Conclusion: I think most agree on three pieces to this. 1) Obscurity offers some value (or is part of the bedrock of security) early on, 2) Obscurity falls in value over time, and 3) decreased reliance on obscurity is better. I wonder if, really, the opinion is that obscurity adds value until it is broken…sort of a, “I’ll get away with it and it’s good, until I’m caught.”
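Schneier’s framing is essentially Kerckhoffs’s principle: design so that only the key needs to stay secret. A toy Python sketch (illustrative only; the shift cipher and hard-coded key are stand-ins, not real crypto) of the difference:

```python
import hashlib
import hmac

# Security through obscurity: a "secret" fixed-shift cipher.
# Once the scheme itself leaks, 26 guesses recover any message.
def obscure_encrypt(msg, secret_shift=7):
    return "".join(chr((ord(c) - 97 + secret_shift) % 26 + 97) for c in msg)

ciphertext = obscure_encrypt("attack")
candidates = ["".join(chr((ord(c) - 97 - s) % 26 + 97) for c in ciphertext)
              for s in range(26)]
assert "attack" in candidates  # broken by trying every shift

# Kerckhoffs-style design: the algorithm (HMAC-SHA256) is public;
# security rests entirely in the key, which can be changed if lost.
key = b"replace-with-a-randomly-generated-key"
tag = hmac.new(key, b"attack", hashlib.sha256).hexdigest()
```

Losing the shift cipher’s obscurity is game over, exactly as Yeomans notes; losing an HMAC key just means rotating to a new one.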

2. Open source software is more secure than closed source.
Andrew Yeomans: 7 – Neither is more secure, but at least you can have some power in your own hands to review code or fix it with open source.
David Lacey: 6 – Neither is more secure.
Bruce Schneier: 6 – Schneier rides the fence by saying open source can potentially be more secure.
Peter Johnson: 2 – Unfortunately, I’m not sure where Johnson was going with this. You can see open source code, but face more hoops when supporting it?
John Pescatore: 5 – Pescatore bounces around as well. Open source typically has less of a formal SDLC, but its openness may deter developers from including junk.

Conclusion: Open source is not magically more secure just because it is open source, but it certainly has potential. And at least with open source, we all can see what we’re getting and maybe even improve upon it. With closed source, we have to trust the creators.

3. Regulatory compliance is a good measure of security.
David Lacey: 8 – Honestly, I find it important that Lacey backs his opinion with his own experience. He might be correct that compliance could indicate a tendency to be more secure. And is that maybe the real value of compliance? Not to be ultimately secure, but to promote a culture that gets there?
Nick Selby: 6 – Not a bad response! But certainly not helpful. 🙂
Richard Stiennon: 7 – I find it dubious that one can be extremely secure but not compliant; the opposite, however, is certainly possible.
Bruce Schneier: 9 – Schneier attacks the regulations themselves, as opposed to the act of achieving compliance. If you’re compliant with a great regulation, then it is a good measure. Really, that’s not a bad approach!
Andrew Yeomans: 9 – I think Yeomans is saying that compliance helps raise the bottom line, but it may misguide people who rely on it too much rather than on their own expertise; some security measures could get removed while others with no value get implemented.
Peter Johnson: 8 – Johnson goes after regulation quality as well, but also acknowledges that even regulations can be followed in varying ways, some poor.
John Pescatore: 8 – Pescatore basically says do your own security first, and then fit into compliance requirements after that. I think this is great…if you know what you’re doing.

Conclusion: The range of initial responses from “No” to “Yes” is interesting, but I think it all comes down to how well the regulations are made. I have to side a bit with Johnson and Pescatore, who seem more inclined to still do their own security measures and fit compliance in as a business afterthought. I mean, really, how specific and secure can we make regulations over an industry that has so many different people, ways of doing things, systems, and homegrown… Yeah. I think compliance will remain a way to raise the bottom line, but for anyone with expertise, it will remain an afterthought.

4. There’s no way to measure security return on investment.
David Lacey: 9 – I really like Lacey’s response. He doesn’t really say you can or cannot get security ROI, but you can analyze your own history and make predictions based on that. It’s still not guaranteed, but at least you can gain some measures.
Bruce Schneier: 6 – Schneier takes the agnostic approach; nothing works now, but someday we might solve this. Not terribly helpful, but maybe realistic.
Andrew Yeomans: 7 – Yeomans makes a distinction I like to make when talking about business security approaches and even ROI: If your *business* is actually security, you’re different than the millions of other businesses. Yeomans then dances on the fringe of enabling business through security (itself very arguable for non-security businesses), so I’m not sure where he’s going there.
John Pescatore: 7 – My only problem with Pescatore’s argument is the difficulty of determining when a security issue is a business need and how to value that. I’d counter that very, very few security spends result in meeting a business need, at least in the cyber aspects.
Richard Stiennon: 4 – I just didn’t buy this. Probably me being dense.

Conclusion: I think we can say that if there is a way to measure security ROI, we don’t know it yet. I’d agree with Schneier’s agnostic approach. However, that’s not really an answer, so I’d side more with Lacey’s approach of analyzing history and trying to be consistent over time, while realizing this isn’t an exact science; it’s more like an educated guess. I would also keep in mind what Yeomans says about security issues becoming minimum requirements for business. Kind of like a roof, or maybe the eventual need for security guards based on your business sector? The more one thinks about security ROI, the more one becomes like Schneier!

5. The Russian cybermafia is to blame for the worst online crime.
Richard Stiennon: 5
David Lacey: 5
Andrew Yeomans: 5
Bruce Schneier: 6
John Pescatore: 7
Peter Johnson: 4

Conclusion: In my mind, it is interesting to think about what is worst or who is worst when it comes to threat profiling, but I sympathize most with Pescatore and even Schneier’s unspoken point: I really don’t know or care, and neither knowing nor caring should change how I secure my assets. There are, however, people who should and do care about threats and tackling them head on, but I’m not in one of those organizations.

6. Antivirus software is essential to prevent malware.
David Lacey: 7
Andrew Yeomans: 7
Bruce Schneier: 7
Peter Johnson: 7
John Pescatore: 7
Richard Stiennon: 7

Conclusion: I think everyone basically says that antivirus software helps, but is not perfect. It is just another piece in a blended approach to security. It is a common best practice and part of everyone’s short list of security “needs.”

7. Outsourcing security is riskier than staying in-house.
David Lacey: 8 – I agree, you lose control and visibility!
Bruce Schneier: 8 – I agree, people are risky!
Peter Johnson: 7 – At first I don’t agree, but really, from a higher level I do have to agree. Basically Johnson says if it is done correctly, then it doesn’t matter which side does it.
John Pescatore: 8 – I agree, for many businesses, an MSSP can make sense!
Richard Stiennon: 5 – I slightly agree. I’m not sure I would say outsourcers can hire better people; some certainly can, but there are plenty of resourceful, talented people to be found in-house. Are they better at reacting? Yes, at a large scale like analyzing malware and issuing signatures, but are they better at reacting in my network, for instance? Maybe not. If they detect something wrong, it is still on in-house staff to do something about it.
Andrew Yeomans: 10 – I agree, and Yeomans has the best holistic comment of the group!

Conclusion: It still comes down to, “It depends.” Yes, you can make gains from outsourcing, such as 24/7 response and leveraging actual specialized experts. I would also throw in that not all outsourcers are the same. Much as Jerry Maguire would say, “Fewer clients, more personal interaction,” outsourcers can fall into the trap of trying to service too many clients with substandard analysts and tools that don’t scale. Extreme example: McAfee’s HackerSafe brand.

8. Biometrics is the best authentication.
John Pescatore: 4 – Sadly, in the movies it seems biometrics is the most-often broken!
Andrew Yeomans: 8 – Yeah, I see biometrics still being a nuisance for large-scale use; see Selby below.
David Lacey: 8 – I agree with Lacey, it is an ideal approach. It is something we all have and it should be unique and not necessarily easy to physically fake (notwithstanding the digital representation of it). But it is not perfected, nor do I know where I stand when it comes to privacy. Going down the biometrics road long enough leads us to DNA sequencing. But that obviously has privacy drawbacks…
Peter Johnson: 5 – I’m not sure I’d go down the road Johnson does, blaming the implementation or the ease of changing it.
Bruce Schneier: 8 – Fair enough!
Nick Selby: 8 – Yeah, biometrics is not a reality for large scales yet, nor in the next 10 years if I may throw down a number.

Conclusion: Biometrics should theoretically be viable, but we’re just not there yet on false positives, security, and ease of use.
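The false-positive worry gets worse with scale. A back-of-the-envelope sketch (the rate and gallery size here are hypothetical, chosen just to show the base-rate effect):

```python
# Hypothetical matcher: 0.1% false accept rate (FAR) per comparison,
# doing 1:N identification against a gallery of 10,000 enrolled users.
far = 0.001
gallery_size = 10_000

# An impostor effectively gets gallery_size independent chances to match,
# so the odds that at least one comparison falsely accepts are:
p_impostor_accepted = 1 - (1 - far) ** gallery_size

print(f"{p_impostor_accepted:.5f}")  # ≈ 0.99995
```

A per-comparison rate that sounds tiny becomes a near-certainty at population scale, which is one reason 1:N biometric identification is so much harder than 1:1 verification.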

9. Digital certificates identify a Web site.
Richard Stiennon: 5 – I obviously don’t reward the joke answers, even if I do guffaw.
Andrew Yeomans: 9 – Good points!
Bruce Schneier: 9 – Exactly! No one really cares or understands it.
David Lacey: 9 – Exactly!
Peter Johnson: 9 – Exactly, and yet still major sites don’t use EV SSL certs. EV SSL was a poor solution but it certainly sounds good to those selling them! Props to the first major web browser that STOPS forcing this.
John Pescatore: 9 – Good points!

Conclusion: This was a poor solution to a problem I don’t think we properly understood. It certainly makes money for the certificate authorities, and I know newer browsers alert on the lack of EV SSL certs, but come on. Digital certs have a place in the enterprise for things like VPNs, but the only people who care about or understand them are the tech experts; everyone else couldn’t care less.

10. Employees can be trained to behave securely and resist social engineering online.
Richard Stiennon: 4
John Pescatore: 5
Andrew Yeomans: 8
Nick Selby: 8
Bruce Schneier: 9
Peter Johnson: 6 – I didn’t really understand this answer.
David Lacey: 8

Conclusion: I think we accept that people are human and will always make human mistakes. We can train and try to raise the bar, but ultimately, even with the best intentions, we will still make mistakes. I think the only place you get a high degree of success with this is in defense/government facilities, where not following the strict rules could result in death. (Social engineer your way past the MPs?)

bonus: Compare and contrast social engineering with security through obscurity. If neither will ever ensure security wholly, do you still give it value because it will never be perfect?

11. Don’t worry, the government has a secret cyber-defense capability.
Nick Selby: 5
David Lacey: 7 – We civilians can only guess at the capabilities of the government, but I would be willing to bet they are at least 5 years ahead of what most of us think, and 10 times more successful than we’d like to believe.
Andrew Yeomans: 7
Bruce Schneier: 6
Peter Johnson: 9 – In fact, I’d go so far as to say the relationship between our privacy and our cyber-defense is inverse: when one goes up, the other goes down. A sad truth, which is why there is so much friction right now between the two as we search for that sweet spot (or at least try to get people to believe we’re at that sweet spot).
John Pescatore: 7

12. The longer the key length, the stronger the encryption.
Andrew Yeomans: 8
David Lacey: 8
Bruce Schneier: 8 – When I read this question, I knew Schneier would be all over this.
Peter Johnson: 8
John Pescatore: 8

Conclusion: The situation is far more complicated than just key length; the algorithm, the implementation, and how the keys are managed matter at least as much.
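To put a number on that: key lengths are only comparable within one algorithm family. A small sketch using commonly cited NIST-style strength equivalences (approximate figures):

```python
# Approximate effective strength, in symmetric-equivalent bits.
# RSA is attacked by factoring, 3DES by meet-in-the-middle, so their
# nominal key lengths overstate the real work an attacker faces.
effective_bits = {
    ("AES", 128): 128,
    ("3DES", 168): 112,   # meet-in-the-middle reduces 168-bit keys to ~112
    ("RSA", 1024): 80,    # NIST SP 800-57 estimate
    ("RSA", 2048): 112,
}

# A 1024-bit RSA key is nominally 8x longer than a 128-bit AES key,
# yet offers far less effective security.
assert effective_bits[("RSA", 1024)] < effective_bits[("AES", 128)]
```

In other words, a “longer key” can still be the weaker one once you cross algorithm boundaries.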