SecurityMetrics Summit: Penetration Testing FAQs

Watch to learn the most pressing penetration testing FAQs.


Penetration Testing FAQs

Listen to Senior Director of Penetration Testing, Chad Horton (CISSP) discuss the most pressing penetration testing questions he gets, such as: 

  • What is the purpose of a penetration test?
  • What should you scope for your penetration test?
  • How often do you need a penetration test?
  • How is a penetration test performed?
  • How can you prepare for a penetration test?
  • How do you choose the penetration test provider that's best for your organization?

This presentation was part of our recent virtual conference SecurityMetrics Summit, where 18 experts discuss the latest news and best practices for cybersecurity, PCI compliance, and other compliance mandates in 8 presentations.

Transcript of Penetration Testing FAQs

Hi, everyone. Welcome to SecurityMetrics Summit. Thank you for joining me for my presentation on penetration testing FAQs.

My name is Chad Horton, and over the last sixteen years I've been performing penetration tests while leading a team of penetration testers.

Chapter

What do penetration testers actually do?

It seems like one of the most common questions I get when meeting someone new is, "What do you do for a living?" That has proven to be a harder question to answer than one would originally think. You see, I originally started by stating what I do in a straightforward manner: I lead a team of penetration testers.

However, as I quickly learned after telling this to my neighbor, who's a nurse, the term is overloaded and means something completely different in the medical industry. So I tried a different tactic the next time: I lead a team of security assessors. Everyone then assumed that I must install alarm systems during the summer. Again, I changed my approach and tried a trendier term, building off what people see on TV and in the movies.

I said I lead a team of ethical hackers.

Unfortunately, this led to similar confusion, with people responding, "Is there such a thing as an ethical hacker?"

All of this is to point out that much of the jargon in the industry is both overloaded and heavily influenced by Hollywood. As such, even explaining the simple concept of what I do for a living causes a lot of confusion.

When I started performing penetration tests in two thousand five, it was an unheard-of term. Today, the amount of content around the subject has absolutely ballooned, but it often feels like the noise-to-data ratio is so high that many misconceptions still remain. As such, today I want to answer some of the most common questions I receive from both potential and current customers, using metaphors to relatable objects and topics to hopefully maximize understanding.

It is my hope that by the end of this presentation, you feel more confident in talking about your engagement the next time you go to do a penetration test.

Chapter

Presentation outline

To accomplish this goal, I thought about all the questions I've received over the past sixteen years, and most of them really stem from a foundational misunderstanding of what a penetration test is. More often than not, by simply clearing up this confusion, the question itself gets answered. For this reason, I've tried to proactively answer the common questions by focusing this presentation on expanding your foundational knowledge of penetration testing in the following areas.

First off, the purpose of a penetration test; how the scope of a penetration test both supports that purpose and impacts the outcome of the assessment; how a penetration test is performed, and specifically how this differs from a vulnerability scan; and finally, what steps organizations can take to prepare for an upcoming assessment.

I understand that these sections are broad and may not, on the surface, appear to answer the questions you have today that are specific to your organization.

But by expanding your knowledge and understanding of these areas, each of you should feel more confident in talking to a provider like Security Metrics during your next engagement to ensure that your organizational objectives are met.

Chapter

What is the purpose of a penetration test?

Let's dive in by talking about the purpose of a penetration test.

I will be using bridges as a metaphor for digital environments, web servers, mobile applications, and business networks that each of our organizations build, design, and maintain.

With over one million bridges in the United States alone, I would guess that everyone watching this presentation has been on a bridge and understands the importance of stability and security of such a structure.

This understanding will allow us to use concepts that are very visual to gain a better understanding of that which is often abstracted away in the digital world.

If you were to build a bridge today, there are three ways that bridge could fail. You could create an insecure design, where the bridge simply doesn't function as intended.

You could rely on materials that don't have the necessary strength or that lose strength over time, leading to an eventual failure. Or you could create a secure design and choose an appropriate material, but still fail to implement it correctly. We're gonna walk through examples of bridges that have failed for each of these reasons and correlate this to how digital environments fail in the same manner.

To start off, let's discuss how the design of a bridge impacts its stability and security.

Construction on the Tacoma Narrows Bridge began in nineteen thirty eight. It was nearly a mile long, which at the time would make it the third longest bridge in the world. It took two years to build and, unfortunately, it collapsed in just three months.

Efforts to build the bridge kept getting delayed due to insufficient funds to build the original design. As such, an architect on the project proposed a new design, based on recent research, that reduced the depth of the trusses, the stiffening structures that run along the span of the bridge. Instead of the originally proposed twenty-five feet of depth, the architect proposed trusses that were only eight feet deep, thus reducing the cost by nearly half. While the architects knew that these trusses provide the rigidity needed to keep the bridge from swaying side to side, the architect calculated that the shallower depth should still be sufficient for what the bridge needed.

Unfortunately, his calculations were not correct. Whenever the wind would blow, the bridge itself would rock side to side. And on one especially windy day, seen above, the wind rocked the bridge so hard that one side of the concrete deck traveled up thirteen feet and then down twenty-six feet, back and forth. Shortly after, the bridge collapsed entirely.

In the cyber world, design has less to do with physics and more to do with the security of data. Specifically, how do we maintain the confidentiality, integrity, and availability of the data that we house? To address these concerns, system architects must consider questions such as: how will users be restricted to their own dataset?

How will patches or upgrades be rolled out in environments where high availability is a requirement?

How will sensitive information be protected both in transit and at rest?

Just as the design of the Tacoma Narrows Bridge led to its collapse, a design flaw in cybersecurity can likewise lead to failure.

Earlier this year, I performed an assessment for a customer whose application had such a design flaw. It relied on a protocol built on top of HTTP called WebSockets.

These are advertised as fast, lightweight connections that connect users in an asynchronous manner, creating a very snappy response time. However, as outlined in the documentation, the WebSocket protocol itself does not include any form of authentication or authorization framework. It is up to the designer to build their own on top of it. For this particular web application, the designers did not account for this. By simply generating requests manually, my user could access everything. In fact, even unauthenticated users on the application could access all user accounts and all employee accounts, and they could even read, modify, and delete all the data stored in the application.
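To make the missing layer concrete, here is a minimal sketch of the kind of per-message check a WebSocket handler has to supply itself, since the protocol provides none. The token store, records, and handler name below are hypothetical stand-ins, not the customer's actual application.

```python
# Minimal sketch: the WebSocket protocol supplies no authentication or
# authorization of its own, so every message handler has to enforce both.
# The token store, records, and handler name below are hypothetical stand-ins.

SESSIONS = {"token-abc": "alice"}  # session token -> authenticated user
RECORDS = {"alice": {"balance": 120}, "bob": {"balance": 75}}  # user -> private data


def handle_account_message(token: str, requested_user: str) -> dict:
    """Handle a 'get account' message received over a WebSocket connection."""
    user = SESSIONS.get(token)
    if user is None:
        return {"error": "unauthenticated"}  # unknown or missing token
    if user != requested_user:
        return {"error": "forbidden"}  # block cross-account access
    return RECORDS[user]  # only the caller's own data


if __name__ == "__main__":
    print(handle_account_message("token-abc", "alice"))  # {'balance': 120}
    print(handle_account_message("token-abc", "bob"))    # {'error': 'forbidden'}
    print(handle_account_message("bogus", "alice"))      # {'error': 'unauthenticated'}
```

The key point is that the check runs on every message against the authenticated session, not just once when the page loads.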

While it is obvious that an architect overlooking authorization entirely is a problem, design flaws can be far more subtle. A different organization that we had an opportunity to perform an assessment for had gathered all the functional requirements for the application to ensure that each of them was accounted for in the design. One of those requirements was that employees must be able to access customer accounts, often referred to as masquerading, in order to help debug issues that customers may be experiencing.

On the surface, this requirement seems rather benign. However, the chosen design left the application completely vulnerable.

They chose to have the masquerade functionality appear only on pages where employees were accessing them, and used an HTML comment to hide it from all non-employee accounts. As anyone familiar with HTML can tell you, this doesn't do much. By simply examining the HTML, the analyst was able to see the comment, try out the request manually, and proceed to authenticate as every user in the database, including employees.

Once we were authenticated as an employee with admin privileges, we were able to confirm that this simple design flaw had resulted in a complete compromise of the data protected by this application.
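A minimal sketch of the underlying principle, with a hypothetical user table and role names: hiding a feature in an HTML comment is not access control, so the masquerade check has to live on the server and run for every request, regardless of what the UI shows.

```python
# Minimal sketch: hiding a feature in an HTML comment is not access control.
# Authorization has to be enforced server-side on every request.
# The user table and role names below are hypothetical stand-ins.

USERS = {
    "emp-7": {"role": "employee"},
    "cust-1": {"role": "customer"},
}


def masquerade(actor_id: str, target_id: str) -> str:
    """Server-side handler for a masquerade request, regardless of what the UI shows."""
    actor = USERS.get(actor_id)
    if actor is None or actor["role"] != "employee":
        # Customers are rejected even if they discover the hidden request in the HTML.
        return "403 Forbidden"
    # Employees may proceed (ideally with audit logging of the action).
    return f"now acting as {target_id}"


if __name__ == "__main__":
    print(masquerade("emp-7", "cust-1"))   # now acting as cust-1
    print(masquerade("cust-1", "emp-7"))   # 403 Forbidden
```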

Next, let's examine how our choice of materials can impact structure and stability.

The Silver Bridge was built using a common design for bridges. However, the architect decided to use a new treatment for the steel, which unfortunately led to its eventual collapse thirty-nine years later. In this picture, you can see a rough equivalent of the links on a bike chain. These pieces are called eyebars. Most bridges used standard annealed mild steel, which had less strength than the newly developed steel, so as you can see in this picture, architects had to double up and use many eyebars to achieve the desired strength. You can see a total of six in the picture on each side of the joint.

The designer of the Silver Bridge opted to use a new type of steel: high-strength, high-carbon, heat-treated.

Based on his understanding at the time, the steel not only had a higher strength but also a higher resilience to the elements, which led to two false conclusions by the architect: first, that the higher strength meant less redundancy was needed; and second, that the impact the elements could have did not need to be accounted for. The image above demonstrates how the eyebars were arranged on the Silver Bridge. Notice that only two were used on each side of the joint. In December of nineteen sixty seven, on a particularly cold night, the impact the elements could have on this type of steel came into view. A three-millimeter crack that had formed through corrosion over the years led to a brittle fracture along the entire head of one eyebar. Within seconds of this brittle fracture forming, the entire eyebar failed and the bridge collapsed entirely.

In response to this collapse, President Johnson signed a bill creating the National Bridge Inspection Standards, standards still used today, which require any bridge longer than twenty meters to be inspected at a minimum of every two years.

While the length of time physical materials can be relied on is typically much longer than for software, the situation seen in the Silver Bridge occurs in software that we use on a daily basis within our own infrastructure: Linux, Windows, Apache, etcetera.

New software is considered secure or bug free when we first install it. However, vulnerabilities will be found that will require our attention. To visualize how this works in the cyber world, let's examine the life cycle of a specific version of Apache Struts, version two dot five dot ten, and how this or a similar version of the software led to one of the most publicized compromises in North America.

It was originally released on February third two thousand seventeen.

As such, any organization could have downloaded or upgraded to this version and begun to host their application on it. Just over a month later, on March seventh, Apache emailed an emergency notification to all registered organizations that they needed to immediately upgrade to the newest version, version two dot five dot ten dot one. Three days later, it was publicly disclosed that a vulnerability had been discovered that could be used to compromise the entire web server.

Included in this publication was a proof of concept that could be used by organizations like SecurityMetrics to scan for the issue within different environments. As we can see, just thirty-five days after this initial installation, an organization would need to install a patch to ensure the continued reliability of their website.

Unfortunately, Equifax, one of the three major credit bureaus, did not do this. In mid May, two months after the patch was released, attackers exploited this vulnerability on one of Equifax's servers and were able to silently harvest PII, personally identifiable information, for the next two months. It wasn't until July twenty ninth that the attackers were finally discovered and the application was taken offline.
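As a rough illustration of the maintenance side (not of any particular scanner), here is a minimal sketch that flags hosts running a dependency version inside a known-vulnerable range. The host inventory is a hypothetical stand-in, and the range shown follows the published fix for this Struts issue in the 2.5 line (fixed in 2.5.10.1).

```python
# Minimal sketch: flag hosts whose installed dependency version falls inside a
# known-vulnerable range. The inventory below is a hypothetical stand-in for a
# real asset list; the range mirrors the published 2.5.x fix (2.5.10.1).

def parse(version: str) -> tuple:
    """Turn '2.5.10' into (2, 5, 10) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))


VULNERABLE_MIN = parse("2.5.0")
VULNERABLE_MAX = parse("2.5.10")

inventory = {
    "app01.example.internal": "2.5.10",
    "app02.example.internal": "2.5.10.1",
}

for host, installed in inventory.items():
    if VULNERABLE_MIN <= parse(installed) <= VULNERABLE_MAX:
        print(f"{host}: Struts {installed} is in the vulnerable range - patch required")
    else:
        print(f"{host}: Struts {installed} is outside the vulnerable range for this issue")
```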

Just as reliance on a new steel considered secure for long-term use eventually led to the collapse of the Silver Bridge, long-term reliance on any software without proper maintenance can lead to compromise.

Just as the design and materials chosen for a given bridge can impact its stability and security, the way the bridge is configured can have the same level of impact. The I-35W Mississippi River bridge was built in nineteen sixty seven. As traffic increased throughout the decades, this bridge was expanded twice to accommodate modern traffic patterns. Part of this expansion included two separate design reviews by two separate organizations.

While the bridge was generally considered secure, it unfortunately collapsed in two thousand seven.

The cause of the collapse was determined to be the gusset plates, the steel plates that hold the beams together where they come together at a joint.

They were only half an inch thick when the needed thickness was an inch. It is believed that a sourcing flaw by the builder when the bridge was put together, and not a design flaw, was the root cause. Unfortunately, this is a silent failure because it's so hard to detect. As we already discussed, the design of the bridge was independently reviewed twice, and neither review picked up on this flaw. Without someone manually inspecting all the pieces of the bridge, this never would have been discovered.

Unfortunately, due to the number of software dependencies within a given environment, misconfigurations occur fairly frequently. It's not a matter of the software itself being insecure; it's more an issue of how the software is configured and used that causes the problems.

While the types of issues vary, some of the most commonly impactful ones are listed on this slide. It's probably no surprise that the most impactful of these is default credentials. From an attacker's and a pen tester's perspective, nothing is an easier route to compromising data than simply googling the default password for a given piece of software.
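As a rough sketch of how simple that kind of check is, here is a minimal example using the requests library. The management URL and credential pairs are hypothetical, and checks like this should only ever be run against systems you are authorized to test.

```python
# Minimal sketch: try a short list of vendor default credentials against a
# login-protected endpoint using HTTP basic auth. The URL and credential pairs
# are hypothetical; only run checks like this against systems you are
# authorized to test.
import requests

TARGET = "https://device.example.internal/admin"
DEFAULTS = [("admin", "admin"), ("admin", "password"), ("root", "root")]

for username, password in DEFAULTS:
    response = requests.get(TARGET, auth=(username, password), timeout=10)
    if response.status_code == 200:
        print(f"Default credentials accepted: {username}:{password}")
        break
else:
    print("None of the default credentials in this list were accepted.")
```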

I encountered a classic instance where a misconfiguration led to a compromise. On that network, I knew where the application and the database that housed the sensitive data were. However, both of these servers were securely designed, configured, and maintained.

After investing a significant amount of time and not getting anywhere, I began to explore other hosts that I had access to and discovered that I could access the backup server. This server, just like the application and database servers, was also securely designed and properly maintained.

However, the Samba service was configured with read/write access for the null session on the share that contained the backups. So even though I couldn't break into either of those servers directly, I was still able to get to the data by exploiting this simple misconfiguration.
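A minimal sketch of checking for that kind of exposure, assuming the standard smbclient utility is installed; the hostname is a placeholder, and again this should only be run against systems you are authorized to test.

```python
# Minimal sketch: ask smbclient (if installed) to list shares with no password.
# A successful anonymous listing means the null session is allowed. The host
# below is a placeholder; only run this against systems you are authorized to test.
import subprocess

HOST = "backup01.example.internal"

result = subprocess.run(
    ["smbclient", "-N", "-L", f"//{HOST}"],  # -N: no password, -L: list shares
    capture_output=True,
    text=True,
)

if result.returncode == 0:
    print("Null session accepted; shares visible anonymously:")
    print(result.stdout)
else:
    print("Null session rejected or host unreachable:")
    print(result.stderr)
```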

Hopefully, after considering the past few slides, you can see how effective security requires a combination of a secure design with proper materials and a secure configuration with proper maintenance. Whether it's a result of an oversight, lack of funding, laziness, or misunderstanding, mistakes can happen and they will happen. And unfortunately, those types of mistakes can lead to a loss of stability and security within an environment.

Organizations may audit the code or configurations they already use, and they may already have scans running within the environment. But as we saw from the I-35W bridge example, some of these issues cannot be found through traditional methods. The best approach to finding them is hands-on analysis by a trained expert, which is how we get to the purpose of a penetration test. Because it is a simulated attack performed by individuals who specialize in finding design, configuration, and software-related bugs, we can help organizations fix those bugs before an attacker can exploit them. Better yet, we can find bugs that other assessments often miss.

Chapter

What should the scope for my assessment be?

With an understanding of the objective of a penetration test in mind, let's transition to how the scope of an assessment can impact its effectiveness.

As with the previous section, we will continue to use bridges for our metaphor.

Every bridge starts out with a design.

Once the design and source materials are chosen, the bridge can then be built.

Once the bridge is open to the public, they can begin to use it to get from point A to point B. As can be seen here in yellow, the general public only has access to the main bridge.

However, tourists and others with a burning curiosity or an appreciation for architecture, like myself, can pay to get a guided tour of the bridge.

As such, not only can we see the bridge like the general public, but we can also access each of the towers, as you can see listed in blue.

While paying customers have more access than the public, an employee has even more. Along with the same access as the previous two groups, they can also reach all other areas of the bridge, and they can even view the original design plans.

Using this bridge, we can define the different ways an inspector might go about reviewing its security and stability, which correlate to the different types of penetration tests we commonly see. They could simply review the parts of the bridge that are accessible to the general public. This would be considered a black-box assessment; it is the typical perspective of a vulnerability scan and is usually where hackers start before they transition to something like password spraying. Some penetration tests can be performed from this perspective too, but the types of issues discovered would be limited to what the tester has access to. Issues in areas that only authenticated personnel can access would not be possible to identify with this type of test.

The most common type of test that we perform at SecurityMetrics is an authenticated assessment.

This means that we will test using credentials from all non-employee account types.

While expanding the scope to this level will expand the inspector's, and hence the penetration tester's, ability to find bugs, it is still not comprehensive.

However, for most security standards, such as PCI, this is the recommended approach. It all comes down to what attack vectors you're after. Are you concerned about employees, or are you concerned about third parties, customers, and people coming from the general public? The latter is what most people are concerned about, which is why this is the most common approach.

The most comprehensive assessment we typically perform is called a source code assisted assessment.

While this does not give us direct access to employee functionality, which can be mission critical and sometimes sensitive if we're testing in production, it gives us the ability to review how that functionality works: to review the core of how the design was implemented and intended to function. This assessment provides by far the most accurate and actionable findings.

So when people approach me and ask, "What should the scope for my assessment be?", I often respond with three questions. First, what is your motivation?

Is your objective here to verify that your employees can't access more data than they're supposed to? Or are you more concerned about the general public or an attacker coming from a third party?

Second, who are your main threat actors? Going along with what we just talked about, who are you trying to defend against? And finally, what is your budget? I'm gonna be brief on this because we'll actually talk about it a little more in a few slides. Those are the three questions you should consider when looking at your scope.

Chapter

How is a penetration test performed?

Next, I hope to pull back the curtain and talk about how a penetration test is performed. And since the single most common question I get is about the difference between a penetration test and an automated scan, I will also provide a comparison between these two services as we go.

The first step involved in any penetration test is to perform reconnaissance activities, which in layman's terms means understanding what the target environment looks like. To do this, there are typically three steps: automated scanning, a manual walkthrough, and research of the target environment and personnel.

This is the phase that most heavily relies on automated solutions, as the tasks are well suited to what computers do well: they're repetitive in nature and they're signature-based.

Every piece of software has a fingerprint or a way that it responds that's unique to itself.

Over the past few decades, security researchers have been gathering these signatures and creating databases of them, which can be used to identify which pieces of software are running in an environment and, oftentimes, what types of vulnerabilities may exist.
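At its simplest, that fingerprinting can be as small as reading a service banner; a minimal sketch against a placeholder host follows. Real scanners do this at scale against large signature databases.

```python
# Minimal sketch: read the HTTP "Server" header from a host, the simplest form
# of the signature-based fingerprinting that scanners do at scale.
# The host below is a placeholder.
import socket

HOST, PORT = "www.example.com", 80

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(b"HEAD / HTTP/1.1\r\nHost: " + HOST.encode() + b"\r\nConnection: close\r\n\r\n")
    response = sock.recv(4096).decode(errors="replace")

for line in response.splitlines():
    if line.lower().startswith("server:"):
        print("Fingerprint:", line)  # e.g. "Server: Apache/2.4.41 (Ubuntu)"
```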

Armed with this knowledge, the analyst will manually familiarize themselves with the target. For me, this means reviewing the automated scans and then crafting some requests manually to better understand what the technology is, how it is used, and why it functions that way. This is especially important for web and mobile applications, because understanding what each user can do, and what they're intended to do, is at the heart of many of the tests that we're gonna perform.

The last step is to research what has been leaked publicly online. In its earliest forms, this was known as Google hacking. And to this day, it still amazes me what information can be found online.

In one assessment, I was able to find a forum post made by a developer on a website, under his personal account of course, where he asked for help parsing credit card numbers out of a batch file. In an effort to aid any would-be helpers on the forum, he posted a link to a sample, sanitized version of the file he was working on. By simply brute-forcing the path that he provided in that forum, we were able to download the original file, which contained tens of megabytes of transactional data.

The next step is to identify vulnerabilities in the environment.

During this step, the analyst will verify any vulnerabilities reported by the automated tools, then will transition to manually looking for the types of issues we described earlier: design flaws, publicly disclosed issues in the software used by the environment, and misconfigurations.

For web or mobile applications, design flaws consist of vulnerabilities covered in standards such as the OWASP Testing Guide or the OWASP Top Ten. You've probably heard of many of them, such as SQL injection and cross-site scripting.
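For readers less familiar with these classes, here is a minimal sketch of the root cause of SQL injection and its fix, using Python's built-in sqlite3 module and a throwaway in-memory database; the table and values are purely illustrative.

```python
# Minimal sketch: SQL injection comes from building queries by string
# concatenation; parameterized queries are the fix. Uses the standard
# library's sqlite3 module and a throwaway in-memory database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice-secret'), ('bob', 'bob-secret')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the attacker-controlled string becomes part of the SQL itself.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print("Concatenated query returned:", vulnerable)  # every row leaks

# Safe: the value is bound as a parameter and never interpreted as SQL.
safe = conn.execute("SELECT * FROM users WHERE name = ?", (attacker_input,)).fetchall()
print("Parameterized query returned:", safe)  # no rows
```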

Testing for these types of issues requires a meticulous handling of application state, something that we'll see in a few slides is very challenging for scanners.

As such, this type of testing is typically very manually intensive.

While there are plenty of public databases that keep track of vulnerabilities reported in software, only a small minority of these vulnerabilities have proof-of-concept code, and an even smaller subset has code that's benign enough that a scanner could simply add it to its list of checks. For this reason, an analyst not only has to look for a vulnerability with a publicized POC, but also has to modify the code to actually work against the given environment.

Lastly, the analyst will review hardening guides and standards for the software being used and contrast that with what is being seen. This sounds very straightforward, and it is for widely adopted technology like Microsoft IIS and Apache. For lesser-known projects, this type of material may not be widely distributed, and it's for this reason that automated scanners have a well-known blind spot for vulnerabilities and misconfigurations in less widely known and adopted software.

From what we've covered here, you should start to get a sense of the limitations of automated tools and begin to understand the manual work that goes into breaking into environments.

Once an issue has been identified and, when necessary, the POC code has been modified for the environment, the analyst proceeds to exploit the given issue. The goal of exploitation is twofold: number one, to demonstrate actual risk to the environment, and number two, to expand the attack surface.

As anyone who's reviewed vulnerability scanners on a regular basis can attest to, they have a long history with false positives.

I have been privy to many conversations with organizations panicked by their latest VA scan, which stated that exploitation was possible, only to discover that the scanner was simply wrong.

The goal of a penetration test is to provide accurate actionable findings.

As such, the process of exploitation ensures that what gets reported back to the organization effectively represents reality.

Upon successfully exploiting a vulnerability, the penetration tester has obtained access to an additional account, another network, or possibly both. As such, they now pivot, and the whole process starts over again by performing recon from this new perspective.

Chapter

What is the difference between vulnerability scanning and penetration testing?

As we touched on in the past few slides, vulnerability scanners play a crucial role in penetration testing, which often causes confusion about what value is provided by the scanner and what value is provided by the analyst.

Based on my experience, automated scanners play a solid role during the recon and contribute to the start of identification.

However, the rest of the assessment is almost entirely manual, which helps explain why over eighty percent of the high and critical issues I've reported in the past year have come specifically from manual testing and not from an automated tool of any kind. To understand why this is the case, let's talk about some of the limitations of computers.

Object recognition is a very visual task. As such, I'm gonna use it to help explain why automated scanners have such strong limitations.

When we look at a cat or a dog, we intuitively classify it for what it is. For computers, this is a much harder task. Machine learning and deep learning have helped computers make large advances, and on the surface it may feel like we're on the edge of having AI robots walking among us who can interact with the world in ways indistinguishable from a human. Thank you very much, Boston Dynamics.

But as you dive deeper into the subject, you will realize that the industry still has a long way to go.

In two thousand eighteen, researchers at Auburn University wanted to test how good Google's AI object recognition was. They purchased 3D models of common objects, such as the ones you see here (the fire truck, the scooter, and the school bus), and they adjusted the pitch, the yaw, and the roll of these objects.

In other words, they turned the objects to different angles to see how well Google could identify them.

While they found that Google did very well on objects in a typical pose, in good lighting, viewed straight on, the moment variation was added it quickly fell apart.

In the end, it only identified the objects correctly less than three percent of the time. While Google struggled to identify these objects at different angles, our brains intuitively can. The intuitive nature of object recognition also applies to penetration testing.

Minor changes in responses can throw off an automated scanner. But to a penetration tester, those minor changes are negligible.

All this boils down to the core of the issues with Internet security in general. It's a huge landscape that evolves day by day as new protocols and software are developed and new attack techniques are discovered.

To effectively assess security, it takes a solution that has sufficient expertise on a given technology that can understand an implementation in order to look for the types of bugs applicable to the environment.

Manual penetration testing is that solution.

Automated scanners can help assess the security. However, they themselves are not adaptive. They rely on strict heuristics and they cannot apply those heuristics to new environments and new software.

So now that we've talked about the methodology, let's proceed to talk about how you can prepare for a penetration test.

Chapter

How can you prepare for a penetration test?

Before ever contacting a penetration test provider, the first thing you must do is define what the organization is hoping to achieve by performing a penetration test. In other words, you must define the success criteria.

For many organizations, the intent is simply to comply with a given security standard. In many instances, this will dictate what needs to be tested, how it needs to be tested, and what needs to be remediated.

For other organizations, it may be to reduce or manage risk to the organization.

We've seen organizations who make frequent acquisitions of other companies.

Before the acquisition can go through, that organization will contact us to have us perform an assessment to verify that they're not purchasing something that'll eventually be a liability to the organization.

Lastly, we see organizations generally driven by the desire to improve their security posture year over year. Each engagement emulates different attack scenarios or a different threat actor so that they can get a better understanding of what their attack surface looks like.

Knowing your objective will help you make all the critical decisions that we're gonna talk about in the next few slides.

Chapter

How often should you perform a penetration test?

Just as the collapse of the Silver Bridge led to the National Bridge Inspection Standards, which require bridges to be inspected every two years, similar requirements have been included in cybersecurity standards such as the PCI DSS, which requires a penetration test every year or after every major change. However, based on your organization's objectives, you can map out how a penetration test fits into your security efforts. Should it be a one-time thing, basically testing the environment before it goes into production? Should it be yearly, as is the case with PCI and many other standards? Or should it be more frequent?

We see organizations whose ultimate goal is to be acquired by another company. As such, they view a security compromise as one of the biggest threats to that objective.

So they decide to perform quarterly assessments.

Again, your objective will help you define the frequency that's appropriate for you.

Chapter

How much does a penetration test cost?

With the frequency in mind, the organization next needs to define a budget to support such a project.

Just like building a bridge, there is no one standard cost. For a bridge, the cost depends on a lot of factors: the environment it's being built in, the length of the bridge, the materials needed, etcetera.

Similarly, there's no one set price for a penetration test. However, I can say that for an average first-time penetration test against a web application and a series of networks, the cost will probably be around ten thousand dollars per engagement.

If, after establishing a budget, you determine that it's not sufficient to meet the objective you've defined, then you need to consider narrowing the objective to ensure you get a quality test against the smaller scope you can afford, rather than a low-quality test performed against everything you want.

Or you can choose to increase the budget. It should be a balancing act between those two goals: the objective and what the company can ultimately afford. With the objective and frequency in mind, you're ready to choose a provider.

Chapter

How to choose a penetration test provider?

Choosing a provider to perform a penetration test should use the same logic that you use when hiring a new employee. You should discuss their expertise with the technologies you use. If you're a web shop that specializes in ecommerce websites, then it wouldn't make sense to engage a firm that specializes in network assessments.

It makes more sense to go with a company like SecurityMetrics that specializes in identifying and exploiting web-based vulnerabilities.

In the same vein as expertise is experience. Choosing a local web developer who's running a side business as a penetration tester may be cost-effective; however, we frequently see it fail to yield the results the organization is after.

Again, if you desire a high degree of confidence in the test being performed, then engaging with a more mature organization that specializes in penetration testing makes more sense.

Ultimately, it boils down to this: the more you know about an organization, the better the call you can make as to whether or not it's a good fit for your objective.

If you wanna know more about this topic, I recommend reviewing the supplement on penetration testing released by the PCI Council.

Chapter

Conclusion

To wrap up the presentation, let's review the core of what we've discussed here today. We reviewed how design, material, and configuration flaws can impact the security of an environment, and how a penetration test is an assessment that finds bugs in areas often missed by other security tools such as audits and automated scans.

We discussed how the chosen scope of an assessment can impact what issues are discovered and in general can impact the objectives of the organization.

We then reviewed how penetration tests are performed and further contrasted them to vulnerability scans to magnify the differences between these two types of tests.

Lastly, we covered what your organization can do to prepare for a penetration test. Start with your objectives in mind, define how a penetration test fits into your security processes, and define a budget to support this. Armed with that knowledge, we discussed best practices on how to choose a provider. I hope this visual FAQ has expanded your understanding of what a penetration test is. And as a result, I hope you feel more prepared to engage for your next penetration test.

Chapter

Q&A

So, I've been handed a handful of questions from our attendees, and I'm gonna attempt to answer them now on the spot. So here goes nothing.

Chapter

Q&A: How can we be assured that our information is protected?

The first question: "For penetration testing, we will be giving your company the keys to our whole system. How can we be assured that our information is protected?"

This is a great question I see come up with a lot of organizations; they have the sensitivity of, "We've got to keep it within our own organization, where we have control of it."

And for those organizations, there are ways that you can counterbalance this. For example, for organizations sensitive to this, we'll often actually test in a development environment, away from all the customer PII and away from any sensitive information. So even if we find a vulnerability and exploit it, it's against your development environment and doesn't impact production. So you've got options: go against the development environment and you don't even have to worry about it. But secondly, this also lines up with your objectives. If you're concerned about that type of thing, you want to make sure that you engage an organization that has a history of taking security seriously, someone like SecurityMetrics, who has to undergo regular audits to verify that they're protecting data in a secure manner.

Chapter

Q&A: How often do we really need to get a penetration test performed?

So the next question is, how often do we really need to get a penetration test performed?

Hopefully, as you saw, we had a slide where we talked about this during the presentation.

I personally find that once a year is the correct balance for most organizations, especially when you're looking at small to mid-sized organizations.

Organizations that have rapid turnover or rapid deployment, where they're changing critical pieces on a more frequent cadence than once a year, should potentially explore the quarterly or semi-yearly option.

But honestly, if you're engaging with a provider like SecurityMetrics, you can have that type of conversation with them. Explain what your goals are. Explain how frequently you have turnover or you're changing your code base.

They'll be able to help guide you to make that decision. It looks like we have a follow-up to that one.

Chapter

Q&A: Do you need to retest after every significant change to the environment?

"I've heard before that you need to have a retest after every significant change to the environment. What does that really mean?"

K. So that's terminology that comes directly from PCI DSS.

So if you're not associated with that standard, I understand; this is specific to people who are. A significant change, as defined by PCI, is anything that could result in a critical change or a critical vulnerability being introduced into the environment.

Even that's a little abstract, so I'll try to dive in a little deeper. I typically give guidance that it means, for example, operating system upgrades.

Updates are a minor change; upgrades to the operating system are not. So going from Windows ten to Windows eleven would be an upgrade and be considered a significant change. If you're upgrading the OSes on your web servers, I would probably consider getting another penetration test. But for updates and things like that, I wouldn't worry about it. Finding the balance for web applications can be a little more challenging, simply because it's harder to say what is a minor versus a major change when you're dealing with code that you've written.

We typically separate it like this: if you're just changing the front end (the CSS, HTML, and things like that), that's definitely minor. If you're revising a page to change basic functionality, like how text is displayed on the page, again, I would classify that as minor. But if you're reimplementing how you do authentication or authorization, perhaps plugging in a new single sign-on solution from a third party, those types of things would definitely fall under significant change.

Chapter

Q&A: What qualifications and training does one need to become a penetration tester?

Okay. So the next question I'm getting is: what sort of qualifications and training does one need to become a penetration tester?

That's a great question, and it's one that I've debated with people from American Express, Mastercard, and other Fortune 100 companies.

It used to be that we lived in a society where a degree in a given subject was necessary for you to be considered qualified to do that job. In the cybersecurity world, we're seeing that that's just not necessarily the case. For me, when I'm interviewing a penetration tester, I'm looking for someone who has a passion for what they do and who dives deep into the subjects they're learning. So if they're a developer, I'm gonna test how well they can actually develop.

I'm gonna test how well they've actually researched security standards around development. So it's not necessarily about the qualifications or training they've been directly involved in; it's about focusing on the areas they specialize in and seeing how deep they go. Because I often find that if they go deep into the topics they're passionate about, I can add security on top of that and they tend to go just as deep there as well.

There are other sorts of training available. We do Hack The Box; I know a lot of my guys love that type of training. We also do PentesterLab, and there are a few other professional organizations that provide training you can look for out there. But a lot of it can be found open source or on the internet. It's just a matter of doing your research and connecting with a community that can give you that type of direction.

Okay.

Chapter

Q&A: What are the easiest ways for beginner level personnel to conduct vulnerability testing and assessments in small organizations?

I think this is gonna be the last question we have time for, and it is: what are the easiest ways for beginner-level personnel to conduct vulnerability testing and assessments in small organizations?

Big question to leave to the end.

Okay. I'm gonna answer this one in two parts, because I think it's very context specific. Say you are at an organization and they're saying, "Hey, we need to add a security program, so we want you to do vulnerability scanning and we want you to do penetration testing."

The first question I would ask is: how much time am I going to be given to do this? Because ultimately, at the end of the day, do you want to be a jack of all trades and a master of none? Or do you want to master your core responsibility and be able to outsource to and utilize a third party like SecurityMetrics, who is a master of what they're asking for? If they're going to give you the time to do it, then the best thing you can do is start researching different security tools and scanners that can be run inside your organization.

There are plenty of them out there; you'll be able to find them. Try those out and see which user interface and experience works best for your organization.

And then you can run those, review the results, and also start to establish your own pen testing methodology internally.

Now, on the flip side, which unfortunately I find is more common, organizations ask individuals to do it without actually giving them the time to become a master at it. In those situations, I would personally recommend that you find a provider you can partner with who does master that service, and utilize them to perform the services you need. In my experience, that's been the most time- and cost-effective option for organizations.

K. So, I'm gonna cut it off there because we are at time. If you've asked a question that we haven't addressed, I'll just let you know that over the next few days, my team and I will be reaching out to everyone to provide answers.

So our contact information will be provided. Thank you very much for your time. Have a great day.

Why Watch SecurityMetrics Summit?

  • Learn more about PCI compliance, penetration testing techniques, ransomware, security and compliance technologies, and more
  • Hear from the brightest minds in the industry
  • Improve your job skills with the latest cybersecurity strategies

Who Is SecurityMetrics Summit For?

SecurityMetrics Summit 2021 has the latest information you need as a PCI manager, compliance officer, security officer, information officer, IT administrator, or other security professional.

Summit is ideal for those working in universities, retail, government, acquiring banks, and the healthcare industry. If your job includes anything related to compliance, payment card data, or cybersecurity, this is a must-attend event.

Interactive Penetration Testing Timeline Checklist
Download
Get Quote for Penetration Testing
Request a Quote