The weakest link in ensuring security. An interview with Dawid Bałut, Cyber Security Director at TestArmy
Who inside a company is really responsible for security? What challenges does the security industry face, and which companies need testing? What does the testing process look like, and how is it carried out? These questions are answered by Dawid Bałut, who recently joined TestArmy as our Cyber Security Director. During the conversation, Dawid shared his thoughts and talked about TestArmy’s mission. Enjoy the read!
A pentester and bug hunter with extensive experience, who joined the security world more than half a decade ago. Since then he has worked as a Security Architect for Silicon Valley corporations. Every day he builds security systems, trains employees and automates security processes.
You have extensive experience in application security. Tell me, what have you been doing so far, and what will you do at TestArmy?
In my career, I have worked as a network operator, programmer, sysadmin, bug hunter, pentester, security engineer and in managerial positions. For the last 5 years, I’ve been cooperating with an American firm in the security sector that generates tens of millions of dollars every year. As a security architect, I was responsible for building security systems from top to bottom, focusing on process optimization and creating an ecosystem that makes new mistakes less common. In the meantime, I helped other companies and security professionals secure their firms from the inside by implementing DevSecOps and increasing the ROI from penetration tests and Bug Bounty programs.
At TestArmy, we want to help companies get a greater return on their investment in security. The most important element was building an elite team of pentesters, made up of the best experts I have met in my corporate life. We know that there are specialists for whom security is more than just a job, and who take pride in the quality of the services they provide to customers. That’s why we want to create an ideal working environment for them and work together on something that is deeply important to us.
The second element that is very important to me is increased investment in education on the security of code, applications and entire IT systems. As a programmer and someone who has spent the last half decade securing the software engineering process, I know that in addition to knowledge, programmers need useful tools and resources.
It’s good that you raised the Bug Bounty topic. You were one of the first pentesters reporting errors in the applications of companies like Amazon, Apple, eBay and Facebook. Do even tech giants have security problems with their own applications?
Of course. In my career, I have never come across a company that did not have a security vulnerability. Big players usually have problems related to their scale. Corporations such as Facebook or Google produce so much code and so many new applications that they are not able to secure it all with their own hands – which, among other things, is the main reason for launching Bug Bounty programs, in which they pay millions of dollars annually to independent security researchers. Google acquires dozens of smaller companies and their associated products every year. While with the appropriate processes it is possible to secure newly produced code inside the company, a very large amount of work is needed to test products inherited from an acquisition from top to bottom.
Big and small companies share many problems, but quite a few set them apart. For example, small businesses often can’t afford security-savvy developers, while large companies bring in so many new employees each week that they do not always manage to train each one sufficiently.
What challenges do you see facing the cybersecurity industry?
The growing amount of code, emerging new technologies and the increasing pace of delivering new versions of applications to the client bring with them the need to change our approach to security management. I am a huge supporter of using Agile and DevOps in software production, but many companies forget that “new” methodologies carry new threats. Currently, many companies have automated a large part of the processes related to code quality testing. However, very few companies have similar automation for security processes, which means that customers often end up with new features whose security has not been tested at all.
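To make this concrete, here is a minimal sketch of what one automated security check in a delivery pipeline might look like: a function that verifies an application’s HTTP responses carry common security headers. The header list and expected values are illustrative assumptions, not a complete policy, and the function names are hypothetical, not part of any specific tool.

```python
# Illustrative CI-style security check: flag responses that are missing
# common HTTP security headers. The required set below is an example
# policy, not an exhaustive one.

REQUIRED_HEADERS = {
    "Content-Security-Policy": None,       # None means any value is accepted
    "X-Content-Type-Options": "nosniff",   # exact value expected
    "Strict-Transport-Security": None,
    "X-Frame-Options": None,
}


def missing_security_headers(headers: dict) -> list:
    """Return names of required security headers that are absent
    or carry an unexpected value."""
    problems = []
    # Header names are case-insensitive, so normalize before comparing.
    lowered = {k.lower(): v for k, v in headers.items()}
    for name, expected in REQUIRED_HEADERS.items():
        value = lowered.get(name.lower())
        if value is None:
            problems.append(name)
        elif expected is not None and value.lower() != expected.lower():
            problems.append(name)
    return problems


if __name__ == "__main__":
    # Example: a response that sets only two headers.
    sample = {
        "Content-Type": "text/html",
        "X-Content-Type-Options": "nosniff",
    }
    print(missing_security_headers(sample))
```

A check like this, wired into the same pipeline that runs unit tests, fails the build whenever a new release silently drops a protection, which is exactly the kind of regression manual testing tends to miss.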
An ideal example is the Internet of Things market, where the vast majority of products are completely unsecured. The Internet of Things is growing at an incredible pace and we need to contribute more to securing this sector. If we do not, we will endanger users who will not know that hackers can remotely peep into their child’s bedroom through the webcam in a baby monitor, or remotely switch the oven to full power while they are away from home.
What can a company requesting tests expect? What does the process look like?
The process itself looks like this: a client asks us for pentests, and then we work with them to understand whether they really need a pentest. We want our customers to understand what service they are requesting, and we suggest the solution that will yield the largest return for them. Once this is agreed, we send the client a questionnaire to understand their technology stack, collect information about the system and, potentially, access credentials. We walk the client through the process so that they know what to expect during our tests. We agree on the date on which the test is to be carried out, and we assemble a team of pentesters most competent in the technologies used by the client. Then we run the tests, documenting discovered vulnerabilities, and create a final report.
Such a report contains everything from an executive summary for the board to detailed information for programmers. Thanks to this, the management knows what situation their company is in and where they should increase their investments, and the developers know exactly how to reproduce each error and repair it by following our guidelines.
Will you always find some bugs?
The ego of many pentesters would order us to answer “of course”, but I will answer honestly: not necessarily. Whether we find bugs depends on many factors: how much time is spent testing the system, how complicated the application is, and, to a large extent, the quality of the testers who performed the tests before us.
Working for various software companies, I carried out hundreds of penetration tests, and it often happened that when we hired external companies to test an application previously tested by my team, they came back empty-handed. I have also encountered situations where external pentests detected important vulnerabilities that had not been found by another pentest company a week earlier.
In the vast majority of cases, “something” will be found, while the chance of finding a serious error decreases in proportion to how good the company’s security processes are and how many times it has used professional pentesting services in the past. That said, such companies can be counted on one hand; in most pentest scenarios, serious errors are found.
What is usually the weakest point in security, and why is it usually a human?
Because it is people who are responsible for the quality of the code and product, and for the security of the infrastructure and applications used internally by other employees. But when I mention humans as a weak point, I very rarely mean developers. When I talk about the people responsible for quality, I mean company management. They decide how much to invest in security.
Programmers do their job, and if they are forced to produce huge amounts of code, they cannot afford to pause and debate the security of their code. It is also up to the board how much time and money they invest in each employee’s security education. Ten years ago I often thought that the responsibility for emerging errors belongs directly to programmers. After so many years in the industry, I know that if developers had adequate support, most of them would be very keen to write high-quality, secure code.
Are tests only relevant for large companies?
Sometimes it’s quite the contrary. Large companies often have great programmers who know how to write secure code, solid software quality assurance processes, and often an internal security team. In my experience, smaller companies have more holes because they cannot afford qualified specialists. And even if they have such specialists on board, they prefer to devote their time to producing something that brings a return instead of testing security.
Also, the level of security awareness in smaller companies is much lower, which means they do not realize what may happen if they ignore the need to invest in security processes.
What do social engineering tests look like?
Ha, the purpose of social engineering tests is to make them not look like social engineering tests! They rely on exploiting our emotions and abusing human weaknesses in order to gain access to restricted resources. Social engineering tests are unfortunately still the black sheep of the security world. While most people have heard about technical penetration tests, few companies carry out social engineering tests. When employees’ awareness of this type of threat is low, an attacker who wants to break into the company does not have to invest hundreds of hours searching for vulnerabilities in the software.
All they have to do is call a kind accountant who is known for her openness and willingness to help, and persuade her to give up confidential information or passwords, for example by posing as an IT worker who needs this information to do their job. Who refuses an IT specialist offering to optimize your computer for free? All you have to do is hand over your password so that they can log into your machine and “optimize” it.
What do you think about the huge wave of popularity of Bug Bounty programs? Is it really worth investing in them?
It’s worth it, but not for everyone. I would say that most companies should not invest in them because, in order to exploit the potential of Bug Bounty programs, the company and its existing security processes must be very mature.
Bug Bounty programs involve encouraging external security researchers to test the company’s software and systems. If a researcher finds a bug and reports it to the company responsibly, they get paid for it.
The problem with Bug Bounty programs is that they are not cheap in terms of either money or time, they are not completely safe, and they do not solve long-term problems. Most companies should invest their money in improving internal processes for secure software development, in monitoring and security systems, and in solid penetration testing; only after several dozen successful iterations should they think about running a bug bounty program. Bug Bounties should be the icing on the cake; they are not a substitute for any other security initiative.
According to many reports, in 2019 there will be a shortage of around two million security specialists in the labor market. How can we get such a large number of employees, and how can a company manage this deficit?
I do not think that we have such a big problem with the lack of security specialists. I believe that companies simply cannot exploit the potential of people who are already on the market, and they panic instead of taking a practical approach to the problem. The answer is to increase investment in education, to employ juniors and train them inside the company, to identify internal talent, and to loosen employment policies by offering remote work and flexible working hours. Of course, there are many solutions and the problem runs deeper, but there is no point in talking about bigger government initiatives when most companies do not do basic things like those mentioned above.
Generally, good specialists are missing across all specialties of the IT industry. We have a huge number of vacancies for programmers and sysadmins, but also for competent HR and management staff. The main difference is that without proper security, break-ins may bankrupt a company and leak our data. Without programmers or HR, the company will simply grow more slowly, and that is its greatest risk.
TestArmy has been running a wide range of training courses for programmers, testers and UI designers for many years. Recently, we have added security testing workshops and will work on more educational materials, because every company should invest in the development of its employees.
Thank you very much for these incredibly accurate questions. I plan to answer most of them in the form of longer articles on the TestArmy blog because these problems are so important that they deserve careful consideration.
Thank you for reading to the end of the interview. If you have more questions about security, contact us!