
You Cannot Hack Yourself Secure

Are hacking and penetration testing the answer to your security woes? That’s what you’ll hear from security conference speakers, who focus more on these topics than on any other discipline in cybersecurity. That heavy focus on hacking is misguided.

You can’t hack yourself secure. No matter how much effort you put into breaking your product, application, or system, the security properties of the unit under test will be no better at the conclusion of the engagement. The analysis only identifies potential security issues; it does nothing to fix them. You must invest effort separately to address the problems. Hacking without secure development and design leaves the unit under test with no net gain from the exercise.

Don’t get me wrong: security research, exploit development, and collaborating with vendors to fix issues are not bad things. My beef is that they are neither the only things nor the most critical things to consider. Breaking gets all the press and publicity, but it does not provide the best overall return on investment for your company.

The hyper-focus on hacking is the broken concept. Focus on securing the entire development lifecycle instead.

Go beyond the breakers

Hacking and penetration testing fit into the category of “breaker” activities. To understand what a breaker is, you must explore the concepts of penetration testing, bug bounties, and red teaming.

Penetration testing is the process of evaluating a system or application as an outside attacker would, using every available tool, technique, and mechanism to compromise whatever is in scope for the test. The penetration tester then reports back to the hiring company with the results. Watch out for penetration testing companies that offer discount tests; they are more than likely vulnerability scans disguised as penetration tests.

Crowdsourced penetration testing plays out through bug bounties. Bug bounties let you control the factors and cost of external security testing by setting a budget and then having a group of independent testers scour your solution for security problems. You pay out rewards for those findings that you confirm are real issues with your product or application.

Red teaming is the industry name for a group of folks working together to break into your system from the outside. Red teaming has its roots in the US intelligence community, but the term has since become synonymous with penetration testing efforts.

Hacking conferences are the most popular conferences in security. When you look at the session list for a hacking conference, you see that it’s all about the breaking: session after session highlighting how security researchers exploited a product or system, then helped the vendor patch the problem and ship a fix. Sessions also cover new tools created to help other testers break things.

In the past 90 days, 135 books containing the word “hack” were released in Amazon’s “Computers and Technology” section, and zero containing the words “secure coding.”

That lack of emphasis on secure coding is the problem we face: as an industry, our focus is on breaking, not on building things securely from the very start.

The flaws that penetration testers and hackers find were introduced by the developers who built the system or application in the first place. What if we could influence those people earlier in the process, fix the problems, and leave far fewer issues for the hackers and pen testers to find? That is real security, but it isn’t something that takes the grand stage at conferences.

Secure the lifecycle, people

As a security pro for more than 20 years, I’m not too fond of the focus the industry places on shiny new exploits and hacks. Focus on securing the entire development lifecycle and you’ll save money and resources, while stripping away some of the glam of the exploit cycle.

Let’s consider the cost of a typical penetration test, and then think about what you could do with those resources in the development process. A standard penetration test starts at around $15,000 but can quickly rise above $100,000, depending on the scope. We’ll meet in the middle at $57,500 for purposes of discussion.

What if you invested this money in education instead of another penetration test? How many developers could you educate for that amount? You would need an educational provider that can reach a good percentage of your developer population to see the benefit, but even equipping ten developers with solid application and software security knowledge prepares them to better handle the security challenges in the code they write.
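To make the trade-off concrete, here is a rough back-of-the-envelope sketch in Python. The penetration test figures are the ones quoted above; the per-developer training cost is a purely hypothetical assumption for illustration, not a quoted price, so substitute your own vendor’s numbers.

# Back-of-the-envelope comparison: one penetration test vs. developer education.
# The pen test range comes from the article; the per-seat training cost below is a
# hypothetical placeholder, not a real quote.
pen_test_low = 15_000
pen_test_high = 100_000
pen_test_mid = (pen_test_low + pen_test_high) / 2   # $57,500, the midpoint used above

training_cost_per_developer = 2_500  # hypothetical per-seat cost; yours will vary

developers_trained = pen_test_mid // training_cost_per_developer
print(f"One mid-range pen test (~${pen_test_mid:,.0f}) could instead train "
      f"about {developers_trained:.0f} developers at ${training_cost_per_developer:,} per seat.")

Under those assumptions, the budget for a single mid-range penetration test would cover training for roughly two dozen developers, which is the spirit of the comparison above.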

A tip of the hat to the unsung hero developers

My call to action is one that will likely fall on deaf ears. Hacking and exploits are exciting and, as an industry, we worship the best hackers who find the most significant problems. They take center stage at the biggest conferences and are featured on cable news and in magazines when they discover the next big security problem.

They receive all the attention because risk and difficulty are what sell. Most people will not tune into a cable news broadcast to hear how a developer did the right thing for security: performed a threat model, discovered a potential vulnerability, and corrected it before the code was ever exploitable in production.

Despite all the glitz and glam of the exploit cycle, remember that the unsung heroes are the developers who do the right thing for security. They’ll never get to deliver a keynote at an industry conference. But without them, we’ll never solve our security woes. So I tip my hat to the unsung developers who understand that we cannot hack ourselves secure.