Application Security Podcast

Dr. Anita D’Amico -- Do certain types of developers or teams write more secure code?

March 25, 2021

Show Notes

Dr. Anita D’Amico is the CEO of Code Dx, which provides Application Security Orchestration and Correlation solutions to industry and government. Her roots are in experimental psychology and human factors. Her attention is now focused on enhancing the decisions and work processes of software developers and AppSec analysts to make code more secure. Anita joins us to discuss research she has done answering the question, "do certain types of developers or teams write more secure code?" Being a security culture fanatic, this topic is near and dear for me. We hope you enjoy this conversation with...Dr. Anita D'Amico.

Transcript

SUMMARY KEYWORDS

people, developers, security, code, research, northrop grumman, vulnerabilities, culture, work, warfare, called, thinking, human factors, performance, factor, team, managers, secure, world, safety

SPEAKERS

Chris Romeo, Dr. Anita D’Amico

Chris Romeo  00:00

Dr. Anita D'Amico is the CEO of Code Dx, which provides application security orchestration and correlation solutions to industry and government. Her roots are in experimental psychology and human factors. Her attention is now focused on enhancing the decisions and work processes of software developers and AppSec analysts to help make code more secure. Anita joins us to discuss research she's done answering the question, do certain types of developers or teams write more secure code? Being a security culture fanatic, this topic is near and dear for me. We hope you enjoy this conversation with Dr. Anita D'Amico. You cannot hack yourself secure. Everyone wants to focus on the offensive side of the equation. The challenge is that developers get bored with hacking broken pieces of code after a while. Sure, it's a shiny, cool new thing in the beginning, but how about one year later? At Security Journey, we focus on long-term sustainable security culture with the developers as defenders. Our approach integrates experimentation together with learning. We believe that developers need hands-on experience, but not at the expense of fundamental knowledge. Visit www.securityjourney.com to sign up for a free trial of the Security Dojo or schedule a demo. Hey, folks, welcome to this episode of the Application Security Podcast. I'm flying solo for this episode. This is Chris Romeo, CEO of Security Journey. The topic that we have for today is we're going to ask the question, do certain types of developers or teams write more secure code? I'm joined by Dr. Anita D'Amico, who is someone who has thought a lot about this, and I've had a chance to meet and get to know Anita over her time starting Code Dx and bringing that along. But Anita, our audience is always on the edge of their seat, because they know the first question that I'm always going to ask everybody is, what's your security origin story? Or how did you get into this crazy world that we call application security?

Dr. Anita D’Amico  02:20

Well, my story is probably different than anybody else's. I'm an experimental psychologist by training, my PhD is in experimental psychology. I never took a computer science course in my life. The way I got involved was that I was working at Grumman when Northrop bought Grumman. One of the things that they did was they selected certain people from Grumman to go into a class, I guess you could say it was a class in which they would inculcate us with the Northrop culture. It was a bunch of us that were out in California, and we were learning the entire Northrop Grumman way of doing program management, and they would bring guest speakers. They were like division presidents. The person who was the Executive Vice President of Business Strategy, Dr. James Roche, came into the class. This is the person who made decisions about mergers and acquisitions, the entire vision of the company. By the way, he later on became the Secretary of the Air Force. Okay. He stood there, and this is in the mid 90s, and said, you know what the future of this company is, and we all sat there going, that is your job. He said, it's information warfare. He said, Northrop Grumman is going to be a leader in information warfare. He said, who knows what information warfare is, and we all sat there like, oh, we don't know, and nobody knows this. He said, Well, I'm telling you, this is the future of the company. Then he went and talked about something else. After this two week course was over, I went back to my office, and this was in the early stages of the internet, you didn't just go onto the internet and look up stuff, you had to go to the library. I went through all these Aviation Week weeklies, looking for what is the story about information warfare, and I really couldn't find anything other than these statements, right? I wound up writing a memo to Dr. Roche.
In fact, I was chicken, I had three other people cosign it, but I wrote this memo that said, you said the future of the company is information warfare. Well, I'm sitting here in data systems in the research group, and I would think if anybody knows what information warfare is, it would be us, and I don't know what it is. Anybody I ask, they don't know what it is, but I can guess, and so here are some thoughts. I think it might be this, and not knowing exactly what it is, here are 10 ideas of what we might do. Sent the memo off, and now, this is before everybody was comfortable with email. It went into a manila envelope that you seal up with a little thread, went into the pouch that got sent from New York out to California, and I forgot about it. About a week later, I get an irate call from my new boss. He said, Did you write a letter to Dr. Roche? I said, Yeah. He goes, you didn't run it through channels. I was new to Northrop Grumman, and Grumman was a family organization, where you saw the President of your division and you said, Hey, Jerry. But in Northrop, it was run like a military command, and there was a chain of command. I had leapt over like three levels, I didn't talk to my boss, or my boss's boss, or my boss's boss's boss. I went right to the top and said, you don't know what information warfare is, here's 10 ideas. So I was in hot water, and they convened a video conference. Now this is in the 1990s, and it was sort of like the Get Smart cone of silence. We went into some conference room, and it was really dim, and there were the people in California and I was in the New York group. They had like all the division presidents there, and I'm sitting there and a couple of my co-signers are sitting there. Dr. Roche says, and he's a huge man, a huge man with a booming voice, So, I understand that some people think that we don't know what information warfare is. He said, Well, let me tell you what I think it is.
Then he goes into this whole thing about radars and electronic warfare, and I thought, I am so out of my league. I am just slinking into the chair, and then he said, so I think we need somebody to write a position paper on where the company is in information warfare. Now who might that be? Oh, smarty pants. So he asked me to write a position paper on what information warfare is, and I did. Then he called me to his office in California and he wanted to discuss it personally. Then he said, Here's what I think we need to do. We need to put together an integrated product team, an IPT, remember those days, IPTs? He said, I want you to pick 10 people, any 10 people in the company, and come up with a strategy for information warfare for Northrop Grumman. So I learned about navigation warfare, I didn't even know that there was something called nav war. There were people who were working in that, in electronic warfare, communications warfare, cybersecurity, though people didn't even use that term, they used information assurance. I put together a team of 10 people, and we studied the area and the directions that Northrop Grumman should go in information warfare. That's how I got involved in cybersecurity. It was once again an example of my getting involved in something I knew nothing about. If I had a motto for my career, it would be that ignorance has never stopped me from success.

Chris Romeo  09:24

That'll be the title of the biography.

Dr. Anita D’Amico  09:30

So, I wandered into it, and I have never left.

Chris Romeo  09:35

Yeah, that's definitely a fine way to find your way into it. I bet you were sitting there just quaking in that room. Is this an execution chamber? What's happening here? There are a lot of witnesses. That's incredible though, that speaks to your ability to not be afraid to go and put something on paper, and look what happened. For anybody that's sitting there thinking, I'm afraid to do this because of the consequences, this is an example of how that can work out for you. That's incredible. As CEO of Code Dx, you've come from the world of government into the world of commercial tools, trying to solve problems that a lot of customers have out there as they're working through correlating different results from different tools. That's Code Dx's specialty.

Dr. Anita D’Amico  10:39

We started out as a research project and wound up being a commercial success.

Chris Romeo  10:48

Okay, so you came from a research project that was being done to further the government, trying to help the government be better at solving this problem directly.

Dr. Anita D’Amico  11:01

After I left Northrop Grumman, I started a cybersecurity research division of a software company. The software company was Applied Visions, and I started Secure Decisions. Most people know me from Secure Decisions. We did all kinds of cybersecurity R&D. We worked for DARPA, for IARPA, for the Department of Homeland Security, Air Force Research Lab, Naval Research Lab, and we were working on research, and we would develop proofs of concept and prototypes of new capabilities in cybersecurity. Code Dx started as a Small Business Innovation Research project that was funded by the Department of Homeland Security; we got $99,000 and six months to come up with a proof of concept. Remember my motto, ignorance has never stopped me from success. Ken Proll and I worked on it, and here we are, years later, and we have a commercial company, and we have big customers, both government and commercial, that are buying our products in application security.

Chris Romeo  12:10

That's awesome to hear how your career, that trajectory that you've been on, and how those pieces fit together.

Dr. Anita D’Amico  12:17

And the research part of it. By the way, I did research that led to Code Dx. But the topic that we're going to talk about today, human factors in secure code development, also came from a research project that I started with DARPA. It's always started with some kind of research.

Chris Romeo  12:39

That's a good transition that takes us into the primary thing that we wanted to talk about today. I think it was prior to the pandemic, in that last conference year, that you talked about this at a couple of different big events. This is research that got a lot of attention and got accepted at a lot of big conferences. It's fascinating to me, as someone who has focused much of the last 10 years of my career on security culture, and how do we help developers get better at security? How do we incentivize them? How do we understand them better? It was fascinating the first time I heard you deliver this talk. Let's define, when you say human factors, I think our audience could interpret that a lot of different ways. I'd like to get some definitions. First of all, when you say human factors, what does that mean from your perspective?

Dr. Anita D’Amico  13:37

Human factors are the properties of people and their environment, that affect their performance, the way they do their job. They could be psychological human factors, for example, somebody's ability to focus their attention could affect how well they perform, or whether they have good short term or long term memory, and that's a psychological human factor. Their style of making a decision could be a human factor that affects their performance. Then there are group psychological human factors, for example, how people collaborate, how they resolve conflict, how they communicate, communication styles are human factors that affect job performance. That's the psychological, there's also physiological human factors that affect performance. For example, how tired you are, how much sleep you've had, how much stamina you have, your health - people who have the common cold just perform their jobs less well than they do when they don't have a cold. Then there's something called circadian rhythm. Circadian rhythm is the changes that your body goes through during the course of the day, sometimes like in the afternoons, you feel a little sluggish, whereas in the morning, you're a little bit more bright eyed and bushy tailed, well, that actually has to do with physiological changes in your body, and that affects your performance. Then the last type of human factor is environmental. For example, lighting will affect certain types of job performance, or the temperature, or whether or not you're in a distracting environment. Those are examples of human factors and the human factors that affect performance.

Chris Romeo  15:36

When we think about quality and security of code, how do those fit together? I love the fact that in the research that you've done, you're thinking about quality and security together, because that's been one of those issues where a lot of people don't see the light in that, they don't see that there's such a direct correlation. Can you have a secure piece of code that has low quality? There's that connection there. How do you see quality and security fitting together?

Dr. Anita D’Amico  16:11

Well, I was really interested in looking at human factors and how they affect code quality and code security because I thought it was another way of finding vulnerabilities. Most of the effort on finding vulnerabilities has to do with looking at the code. You run a code analyzer and you find vulnerabilities, or you do a manual code review and you find vulnerabilities. First of all, when you're doing a manual code review, where do you even start? Where do you start looking for the problems? Same thing if you run source code analyzers, where there's all kinds of issues that come out, which ones do you want to look at first? I thought maybe there was something about the people who write the code that could orient us as to where to look for vulnerabilities in the code. Maybe if we knew something about the developer, their experience, the time of day that they committed the code, how many people were on the team, whether they were distracted or not, could we then take that information and use it to orient us as to where to look for vulnerabilities in code? It was a little bit different perspective, and that's how I started getting interested in the human factors that affect secure code development.

Chris Romeo  17:40

I'm curious about the data set now. Tell me a little bit about the depth of data that you were able to collect and where's that data coming from. It'll help us as we start to talk about conclusions that you had from it to understand the source of the data.

Dr. Anita D’Amico  17:58

Well, first of all, everything that I'll talk about today was based on research, scientific research that was done. There's research that's been done, and most of it's been done in software engineering, some of it's been done in other areas, like in team performance. When you're looking for the factors that affect code development, most of the work is actually done on open-source because there's a lot of open-source. There are universities, such as Rochester Institute of Technology, that regularly look at open-source, and they look at the publicly disclosed vulnerabilities. Now, here's the way the research is typically done. You have an open-source repo, like Chromium, or Apache Struts, something like that. We know how many publicly disclosed vulnerabilities there are in Chromium because people report it and it gets logged. The way the researchers typically do the work is that they take all the files in a code repo, like an open-source repo, and they make this pile over here that has no publicly disclosed vulnerabilities, and this pile over here that has publicly disclosed vulnerabilities. Now, I use the words publicly disclosed, because for the pile that doesn't, it doesn't mean those files don't have vulnerabilities, it just means that none have been disclosed. You have these two piles, and what they do is they look for what are the things that maximize the difference between those two piles. What are the real things that discriminate between those piles? The things that I'll talk about today, a lot of it was based on research like that using Chromium, Apache Struts, the Linux kernel, and those types of repos. The other way to do the research is to take proprietary code and to run static analyzers on it that find quality and security issues, and then you look at where all the problems are. Then you say, Okay, now what do we know about the development team? How many people were on the team? Were they doing multiple things at once? What was their environment like? Then you do the research that way.
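The "two piles" method described above can be sketched in a few lines of code. This is an illustrative sketch only, not the researchers' actual analysis; the feature names (num_developers, commit_hour) and the data are hypothetical, and real studies use proper statistical tests rather than a simple gap in averages.

```python
# Sketch of the "two piles" method: split a repo's files into those with and
# without publicly disclosed vulnerabilities, then rank the features that
# best separate the two piles. Feature names and data are hypothetical.

def discriminating_features(files):
    """files: list of dicts, each with a 'vulnerable' flag plus numeric features.
    Returns features ranked by the gap between the two piles' averages."""
    vulnerable = [f for f in files if f["vulnerable"]]
    clean = [f for f in files if not f["vulnerable"]]

    def mean(pile, feature):
        return sum(f[feature] for f in pile) / len(pile)

    features = [k for k in files[0] if k != "vulnerable"]
    gaps = {feat: abs(mean(vulnerable, feat) - mean(clean, feat)) for feat in features}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

files = [
    {"vulnerable": True,  "num_developers": 12, "commit_hour": 1},
    {"vulnerable": True,  "num_developers": 10, "commit_hour": 23},
    {"vulnerable": False, "num_developers": 3,  "commit_hour": 10},
    {"vulnerable": False, "num_developers": 5,  "commit_hour": 9},
]
print(discriminating_features(files))
```

In this made-up data, team size separates the piles more sharply than commit hour, which is the kind of signal the real research looks for at scale.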

Chris Romeo  20:22

Were there surveys that went to these teams, where they could indicate the different kinds of factors that impacted them? Or were you able to collect information about their whole environment? How did you get to some of those human factors that we talked about at the beginning?

Dr. Anita D’Amico  20:39

First of all, most of what I'll talk about today was based on the open-source, but we did do proprietary work as well. There are other companies, like Microsoft, who also did proprietary work, and I'll report on some of that. When we started the research project, it was very opportunistic, it was more or less, what can we collect? This was a project I was working on that was funded by DARPA, and I was at Secure Decisions. In fact, Secure Decisions is still continuing to do this work after I left, and we actually instrumented the environment of some of the developers. We asked them things like, when they came in in the morning, how many hours of sleep did you get last night? What are you listening to? Those types of things. It's very difficult to collect this kind of information, and it was a hard thing to do. Most of the work is based on open-source projects.

Chris Romeo  21:51

Okay. When we think about what you were able to conclude based on certain types of developers or teams, do we have the opportunity to create super developers and super development teams as a result of what you were able to analyze there? Are there some hidden things we can tap into to create super developers that are passionate about security? Is there some secret sauce here that we can share?

Dr. Anita D’Amico  22:23

Well, I would say, rather than focus on one of the super developers, I think it's more what are the things that we shouldn't be doing? What are the things to avoid? There's the things that we think you should avoid, and there are things that people mythically may think are a factor, and maybe they're not. I think at the end, I could give you some ideas on the types of things that managers should keep in mind if they want their team to write good quality and secure code, there are certain things that they should do and certain things that they should have their team avoid doing.

Chris Romeo  23:07

I definitely want to come back to that. Let's look at a couple of case studies. These last 12 months have been strange for everybody. There are a lot of things you could get people to argue about on the internet, but I think if we put that out there, we would just get a bunch of likes; nobody's going to say, No, it wasn't, this was a normal year. Let's think about how software development teams have been working. A lot of teams historically have been co-located, they've been in the same office, have been able to share the whiteboards, and do these types of things. A lot of people believe this is the way development should be done. Things have been done differently over the past year, so what's your perspective on this? Is remote a good thing? Is that something that we should continue even when the world returns back to normal? What's your opinion of this?

Dr. Anita D’Amico  23:51

Well, I'm not gonna give you my opinion on that, I'm going to tell you what the data says.

Chris Romeo  23:56

That's a scientist answer right there.

Dr. Anita D’Amico  24:00

There's one study on this that Microsoft did. They did a study of the post-release failures in the Windows Vista and Office 2010 binaries. It was done a while ago. What they did was they compared the work product of different groups, and they were specifically interested in colocation. They compared the failure rate of code that was developed by teams that were co-located, like in the same office, that were in the same building, that were in the same campus, that were in the same region, or as far away as different continents. They were really interested in how remote is remote. They found no difference in the failure rates between the teams. None. Teams whose people were separated by a continent did not have any higher failure rate than those who were in the same office area. Now, I've presented this research a couple of times, and one of the comments that comes in from time to time is, well, Microsoft has really optimized remote. That's a fair comment. It may be that if you haven't optimized the way you do remote work and collaboration, remote work will be less effective. But that research showed that there really wasn't any difference in whether you were co-located or not. There were factors that actually did predict whether or not they were going to have post-release failures, but it wasn't colocation. There were other human factors.

Chris Romeo  25:59

That's interesting, because when I think about how I would answer that question, purely based on my opinion, and one of the only things I'm an expert in is my own opinion, I would have answered it exactly the opposite way, just based on a gut instinct of saying, of course, when people are together, it's going to be better, because I can just talk over the cube wall or whatever. But the data speaks. That's how we're able to draw conclusions, by saying, hey, the data is telling us a story, based on the analysis that was done.

Dr. Anita D’Amico  26:30

Yeah, well, you just said you could talk across the cube wall. One of the things that's worthwhile studying is whether or not there's good collaboration. Even though you're remote, if you're on Slack, I know our software development team, especially since we have been working all remotely during the pandemic, they're on Slack all the time, and they work as a cohesive unit. There are ways of collaborating without just calling over the cubicle wall.

Chris Romeo  27:09

I think it's easy to see that coming out of the pandemic here, work has been changed forever. Nobody's gonna go back to the way things were exactly on January 1st of 2020, or December, November 2020. There's a new setup for work and that's just going to be the way it is. Now, let me give you another example because I know during the pandemic, a lot of people have been working strange hours because they're juggling, taking care of schooling their children online, they're juggling just a lot of things, maybe caring for their parents, and they're like, I'm gonna squeeze in some work here from 10pm until 2am because I gotta get my job done, but I got a lot of other responsibilities. Does the data tell us that there's an impact on the quality or security of the software they produce if they're working in the middle of the night?

Dr. Anita D’Amico  28:05

Yes, it does. The data shows that late night commits had more bugs in them than the morning commits. There's a study that looked at the commits that were made between midnight and 4am, and found that the number of bugs in them was statistically higher than in the commits that were made between 7am and noon. I remember when we looked at this research, I just looked at it and said, Oh, of course, it's circadian rhythm. I did my doctoral dissertation on the effect of sleep deprivation and circadian rhythm on watchstanding officers on ships. I know from all the research that your body changes over the course of a day, that it has a diurnal cycle. The cycle is that at about six o'clock in the morning, there are certain chemicals in your blood called catecholamines that increase your level of alertness. They go up about six o'clock in the morning and they stay elevated until about two in the afternoon, then they dip, and then they come back up at about four in the afternoon. Now think about that post-lunch nap that you'd like to take. That feeling, Oh, man, it's two o'clock, I need a cup of coffee. Well, it's not because you had a big lunch. It's because the catecholamines in your body dip around two o'clock in the afternoon and come up about four. Then they stay up there until about 10 o'clock at night. From 10 o'clock at night until about six in the morning, they go down dramatically. Now, there is all kinds of research out there that shows that human performance follows your circadian cycle. Whether you are driving a truck or piloting an airplane, or sewing, your performance is going to dip when those chemicals in your body dip. When I saw this work that said that late night commits have more bugs, it was like, Well, yeah, they did it when they were the least alert, physically. That's a physiological human factor that affects their job performance.
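The commit-time comparison described above can be sketched with a simple bucketing of commits by hour, using the study's two windows (midnight to 4am versus 7am to noon). This is an illustrative sketch with made-up commit data, not the study's actual methodology or results.

```python
# Sketch of the commit-time comparison: bucket commits by the hour they were
# made and compare the bug rates of the midnight-4am and 7am-noon windows.
# The commit data here is invented for illustration.

def bug_rate(commits, start_hour, end_hour):
    """Fraction of commits in [start_hour, end_hour) that turned out buggy."""
    window = [c for c in commits if start_hour <= c["hour"] < end_hour]
    if not window:
        return 0.0
    return sum(c["buggy"] for c in window) / len(window)

commits = [
    {"hour": 1,  "buggy": True},  {"hour": 2,  "buggy": True},
    {"hour": 3,  "buggy": False}, {"hour": 8,  "buggy": False},
    {"hour": 9,  "buggy": False}, {"hour": 11, "buggy": True},
]
late_night = bug_rate(commits, 0, 4)   # midnight to 4am window
morning = bug_rate(commits, 7, 12)     # 7am to noon window
print(late_night, morning)
```

A real analysis would, of course, need a large commit history with linked defect data and a statistical test, not a six-commit sample.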

Chris Romeo  30:38

You're basically saying that we should be sleeping from 10pm until 6am every day, if we want to have that optimum performance.

Dr. Anita D’Amico  30:47

That's right, you should be. Now, there are some people who can change their circadian cycles, but they're few and far between. People who are on a permanent shift, some of them are able to change, like if they always work nights, but most of them can't. This actually becomes a physiological and mental stressor on them.

Chris Romeo  31:15

Does age play into this at all? I'm thinking about my technological career. I got into computers when I was really young, a long, long time ago. But I used to be in more of a rhythm where from 10pm until 2am was an optimum time, it felt like an optimum time. Of course, I was young, and maybe I didn't know any better. Does any of the research say that there's an age factor that impacts this?

Dr. Anita D’Amico  31:45

Not that I know of. Even though I believe all the things that I just said, I personally violate this on a regular basis. I find if I'm in a groove, it could be at two o'clock in the morning and I'll be writing, and who knows whether it may not be as good as what I'd write at two o'clock in the afternoon, but I violate this on a regular basis. Getting back to your original question about the change in schedules, there are people working in the pandemic, and they're taking care of their kids, and they're working these odd cycles. There's another piece of research that I think is very relevant to software engineers, and to anybody. That is the amount of hours that you have been awake. It's a very strong effect: once you have been awake for 17 hours or more, your performance goes down, and 17 hours is not that much. I mean, 16 hours is, okay, you're ready to go to bed. If you got eight hours of sleep, that would mean, okay, it's just about bedtime. As soon as you start working past that 16th hour, into the 17th, 18th, 19th hour without sleep, performance goes down. Now, I'm not saying the research was done with software engineers, but in other areas, like driving, piloting an airplane, health care, they found a notable decrement in performance, so there's no reason to believe that it would be different in software engineering. In fact, one of the studies showed that the work that you do after you're awake for 19 hours is about equivalent to working with a blood alcohol level of about 0.05%. You kind of have a buzz on. It's such a strong effect that the Department of Transportation says that a truck driver can't drive more than 11 hours. When you look at the medical community, the medical community has rules for their residents that basically say, you can work no more than 80 hours a week. When you break that down, it comes down to like 11 hours a day. I would caution any software engineer or their manager to make sure that they are not working more than 11 hours a day, because your code's probably going to be affected by it.

Chris Romeo  34:37

I want to get your take on one other situation before we bring it back, because I definitely want to land the plane, there's another aviation reference. I want to land the plane on some of those tips that we have for software developers and managers, but maybe give me some quick data on this other situation first. We always talk about developers and flow, especially in a DevOps world. It might be somebody's in the zone, but flow seems to be the DevOpsy, I don't even know if that's a word, DevOpsy, kind of word. The developers, they're in their flow, they're doing their thing, don't knock them out of that. This is The Social Network, the movie where Mark Zuckerberg had his headphones on and wasn't even acknowledging other people around him because he was so dialed into what he was doing. What's the impact, then, of being pulled in multiple directions, and interruptions, and things that seem like they've become even more of a part of our modern everyday world of work? It's like, I can't get five minutes just to sit here and focus.

Dr. Anita D’Amico  35:47

Well, if you do context switching, it does affect your performance. Look at the open-source world; there are ways of looking at this in an open-source repo. Some of the research is looking at the files that have been written. There's something called unfocused contribution. It's an indicator of how much attention a developer is actually focusing on specific files. Unfocused contribution goes up when a developer is contributing not just to that file, but to a whole bunch of other ones in close succession; they're sharing their time across a bunch of files. It also goes up when there are a lot of unique contributors to a file. What the research shows is that the more unfocused attention on a file, the more insecure the code is. Remember the pile of files that have vulnerabilities and the pile that don't? Well, the pile that has the vulnerabilities is far more likely to contain code that was developed by developers who basically sprinkled their time across a whole bunch of different files. That's an indirect measure of attention. But it kind of speaks to the idea that you have to focus your attention on just one or two things, or else your code is going to suffer.
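An unfocused-contribution-style metric can be computed from plain commit records. The exact formula used in the research isn't given in this conversation; the hypothetical version below combines the two signals mentioned, how many unique developers touch a file and how widely each of those developers spreads their own commits across files.

```python
# Sketch of an "unfocused contribution" style score per file. Higher scores
# mean more unique contributors whose attention is spread across more files.
# This formula and the author/file names are hypothetical illustrations.
from collections import defaultdict

def unfocused_contribution(commits):
    """commits: list of (author, file) pairs. Returns a score per file."""
    files_per_author = defaultdict(set)
    authors_per_file = defaultdict(set)
    for author, path in commits:
        files_per_author[author].add(path)
        authors_per_file[path].add(author)

    scores = {}
    for path, authors in authors_per_file.items():
        # Average number of files each contributing author also touches.
        spread = sum(len(files_per_author[a]) for a in authors) / len(authors)
        scores[path] = len(authors) * spread
    return scores

commits = [
    ("alice", "auth.c"), ("alice", "net.c"), ("alice", "ui.c"),
    ("bob",   "auth.c"), ("bob",   "net.c"),
    ("carol", "crypto.c"),
]
print(unfocused_contribution(commits))
```

Here auth.c scores highest because two contributors touch it and both spread themselves across several files, while crypto.c, owned by a single focused contributor, scores lowest, matching the pattern described above.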

Chris Romeo  37:36

Makes me think, there was an article written a long time ago, back in 2009, that got a lot of attention on Hacker News: Maker's Schedule, Manager's Schedule. It talked about this idea that, it's not as much from a direct interruption perspective, but from the fact that someone who's writing code, someone who's a maker, has a very different approach to the world. For them to be successful, it has to be very different. The manager will stack eight meetings, or 9-10 meetings a day, one after another. If you do that to a maker or a developer, they'll never get anything done. They need blocks of time where they can just go, where they can turn everything else off and focus. It sounds like what the data is saying, from the various studies that you're bringing to us here, is that someone who is trapped in that world of not being able to stop and focus is going to introduce more vulnerabilities in the code that they're writing.

Dr. Anita D’Amico  38:36

Absolutely. We see this, by the way, not just with open-source. We were able to look at four software projects and look at static code analysis. We ran source code analyzers on the code, and then we looked at how that correlated with the level of focus. We found that the less focused the developers, the more they spread themselves around, the more static code analysis findings there were, both quality and security. I think it's a pretty robust human factor that we're looking at there. I would say that developers really should have concentrated time.

Chris Romeo  39:24

As a developer, you think, oh, I want to support my other teammates, I want to be available for communication, when in fact you're actually harming your code and potentially the other people, because they're not stopping and focusing either, because they keep coming to you and people keep asking questions. It's very eye opening. I want to transition and make sure we get enough time, because I want to think through and get your list of what software development managers should be thinking about to care for their people well in regards to making code more secure. But also make it applicable for a software developer, in case their manager's not listening, for someone who could apply these things themselves to make their life better, or if you're a manager, to make your team's life better.

Dr. Anita D’Amico  40:12

I'm going to draw attention to something that I hinted at before. Remember, we talked about the Microsoft study. I said that there were differences, there was a human factor that made a difference in the failure rates between the groups, but it wasn't colocation. What it was, was the number of developers: the more developers on a project, the higher the failure rate. This is something that we see over and over again in the literature, that as you add developers to a team above a certain amount, and there's no single golden number, but the studies that I've looked at seem to have an inflection point at about eight or nine, once you get to that ninth developer, you have a significant impact on the quality and the security of the code. For example, if you take a look at the Linux kernel source code, as soon as you get to nine developers, the code is 16 times more likely to have a vulnerability in it than with fewer developers. 16 times. In Chromium, once you get to that ninth developer, the code is 68 times more likely to have a vulnerability in it than if you have fewer developers. I mean, that's pretty dramatic. They found in the Microsoft study that what made a difference in the failure rates was the number of developers. In answer to your question of what managers should be thinking about in terms of ensuring that their workplace is set up in such a way that you get good quality and secure code, I would say, number one, limit the size of your teams. Try not to have more than nine developers on one thing; break them out into smaller groups. Another thing is that if you have somebody who has committed code after midnight, be sure to check it. I would make a rule in general that people should not work more than 11 hours, and I would strongly discourage them from working on something critical after 11 o'clock at night. Those are a couple of the guidelines.
Another thing that I think is important, and we haven't really talked about it, but I know it's important to you, is security culture. There aren't many research projects that have actually collected data on security culture in software development, but there are in other areas. If you go into health care, for example, and you look at the number of poor outcomes, which in health care, unfortunately, is a funny way of saying somebody got really sick or somebody died, and you look at what contributed to that, you find that if there wasn't a good safety culture, you're more likely to get poor outcomes. When you go in and intervene, and you actually create a safety culture, you reward for a safety culture, the outcomes become much better; the patients get better faster and live longer. You see that, and you also see it in other areas, like aviation, where having a safety culture is important. I don't see any reason why those results wouldn't also apply in software development. I know that's right up your alley.
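[Editor's note: the "check code committed after midnight" guideline above is easy to automate. This is an illustrative sketch only; the `flag_risky_commits` function name, the 11 p.m. cutoff, the 6 a.m. window boundary, and the sample commits are assumptions chosen for this example, not values taken from the research.]

```python
from datetime import datetime, time

def flag_risky_commits(commits, cutoff=time(23, 0), morning=time(6, 0)):
    """Flag commits made late at night for extra review.

    commits: list of (sha, datetime) pairs, e.g. from a git log in the
    author's local time zone. Anything between the evening cutoff and
    early morning gets flagged.
    """
    risky = []
    for sha, ts in commits:
        t = ts.time()
        if t >= cutoff or t < morning:
            risky.append(sha)
    return risky

commits = [
    ("a1b2c3", datetime(2021, 3, 1, 14, 30)),  # mid-afternoon: fine
    ("d4e5f6", datetime(2021, 3, 2, 0, 45)),   # after midnight: flag
    ("0a1b2c", datetime(2021, 3, 2, 23, 15)),  # after 11 p.m.: flag
]
risky = flag_risky_commits(commits)
# → ["d4e5f6", "0a1b2c"]
```

A check like this could run in CI and simply label the flagged commits for a second reviewer, rather than blocking them.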

Chris Romeo  44:29

When you think about the safety culture, it's an interesting parallel to security, because I get an opportunity to work with a utility company in the Midwest, and their safety culture is top notch. What makes their transition to a security culture work is they've realized that if they don't have a strong safety culture, people in their company die, because these are the people out working on the lines and doing those types of things. If they're going to have a safety culture, they can't just have it for the men and women driving out in the trucks repairing lines; it has to be ingrained in everything they do, so that it reaches the people in the field, the people managing the people in the field, the people working in the service center, and the people working on the corporate business side. That's always going to be one of the challenges when we try to say, could we just take a safety culture and bring it to the world of security? I wish we could.

Dr. Anita D’Amico  45:31

Well, why not? I mean because look, all those areas that you just talked about, they're using software. Software is everywhere. If you're in the power grid, if you're working in power, if you're working in healthcare, all those areas, aviation, there's software behind it, and software is actually part of the safety critical system. The development of that safety critical software should be treated with a safety culture. I don't think we think about that enough.

Chris Romeo  46:07

I'm gonna keep working in that direction; that's my goal. I want to see us get to the point where we have the same approach to security culture that you have in aviation or in utilities and power, that same kind of approach. That's always my goal. Another thing, I had to chuckle when I was thinking about the team size finding, the conclusion you had about team size. It made me think about Jeff Bezos, I guess he's not the CEO of Amazon anymore, but he had this thing called the two-pizza rule: no team could be bigger than what two pizzas could feed at lunch. I think that's the same thing. Nine people, what the research has shown, is pretty consistent with what Bezos did at Amazon.

Dr. Anita D’Amico  46:58

Although the developers, I know, that's about six people for two pizzas.

Chris Romeo  47:04

Six people for two pizzas. Maybe we've got to make a slightly modified rule; I was thinking about that. Anita, thank you so much for taking us through and sharing the various things that you've uncovered in this research. I know it's given me a lot to think about. I think there are some really practical things here that I hope a lot of our audience, as software developers, can take back to their daily jobs. Those that are managers and listening to this, hopefully they can take this and say, not only does it make our software more secure, but it makes the lives of our developers better. The practices here mean a better quality of life and better work-life balance, and they just happen to provide more security. There are a lot of positive things here. Thank you so much for taking the time to walk us through this, and we look forward to having another conversation with you in the future.

Dr. Anita D’Amico  47:54

I'd be delighted. Nice to see you again, Chris.

Chris Romeo  47:59

Thanks for listening to the Application Security Podcast. You'll find the show on Twitter @AppSecPodcast or on the web at www.securityjourney.com/application-security-podcast. You can also find Chris on Twitter @edgeroute and Robert @RobertHurlbut. Remember, security is a journey, not a destination.