Welcome to the third part of my series on legolizing software development. I selected this element, essential by design, because I, like most of us working in the software industry, perceive it as an additional burden.
To prevent my own prejudice from clouding the subject matter, I asked a friend from my college years, Tony Lemmers, a Data Protection Officer, to join me as co-writer in exploring this topic.
With the new global emphasis on privacy and the legal consequences of non-compliance, we no longer have the option to fail fast and learn fast; we cannot get away with a certain number of nines, because the penalty for failing might sink our business. In these politically uncertain times, defending privacy is paramount to the success of every enterprise. The threats and risks to data are no longer theoretical; they are apparent and menacing. Tech decision makers have to step in front of the problem and respond to the challenge. Privacy by design is part of the high-level design of software systems, with emphasis on the system’s structure and the non-functional requirements it needs to meet.
Privacy by design (PbD) is a policy measure that guides software developers to apply inherent solutions to achieve better privacy protection. For PbD to be implemented successfully, it is important to understand developers’ perceptions, interpretations, and practices with respect to informational privacy (or data protection).
We’d like to share our lessons learned about architecture, development environment, and security and privacy considerations. We will also explain some issues we stumbled over and the solutions we chose for them.
In this first part of our exploration of the subject, we start by outlining its two main elements:
- Cultural change
- Non-functional requirements
Analysis of the way of working in past projects revealed an interplay between several forces affecting how developers handle privacy concerns.
We discussed and analyzed the cognitive, organizational and behavioral factors that play a role in developers’ privacy decision making. We identified three key elements:
- Developers use the vocabulary of data security to approach privacy challenges; this vocabulary limits their perception of privacy mainly to third-party threats coming from outside the organization;
- Organizational privacy climate is a powerful means for organizations to guide developers toward particular privacy practices;
- Software architectural patterns frame the privacy solutions used throughout the development process, which may explain developers’ preference for policy-based solutions over architectural ones.
Framing these findings leads us to the factors that influence developers’ privacy practices and allows us to use them as a guide toward a more effective implementation of PbD.
A non-functional requirement (NFR) specifies criteria that can be used to judge the operation of a system, rather than specific behaviors. The plan for implementing PbD NFRs needs to be detailed at the start of designing the basic system architecture, because they are architecturally significant requirements.
The general belief is that the GDPR forbids a lot of things. This is not the case: the GDPR mainly requires us to think before acting. The same holds for development. Here are some simple steps to aid in this process.
- Do you need PII (Personally Identifiable Information)?
If, for example, you need to know whether a user of your application is older than, let’s say, 18, you could ask for their birthday, but that is PII. It is better and simpler to ask ‘Are you older than 18?’
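As a minimal sketch of this kind of data minimization (all names here are hypothetical), the signup form can capture only the yes/no answer, so a birthdate never enters the system:

```python
from dataclasses import dataclass

@dataclass
class SignupForm:
    """Stores the minimal datum needed; a full birthdate would be PII."""
    username: str
    is_over_18: bool

def can_access_restricted_content(form: SignupForm) -> bool:
    # The age-gate decision needs only the boolean, nothing more.
    return form.is_over_18
```

Because the birthdate is never collected, there is nothing extra to protect, store, or delete later.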
- Which legal basis do you use to gather the PII?
GDPR offers six possibilities here:
– Consent,
– Performance of a contract,
– Compliance with a legal obligation,
– Protection of the vital interests of the user,
– Public interest,
– Legitimate interest.
In software development, performance of a contract and consent are the bases most used. Of these two, consent is the weakest, because it can be withdrawn at any moment and has to be presented in a clear and easily accessible way.
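Because consent must be provable, tied to a specific purpose, and withdrawable at any moment, it helps to record it explicitly. A hypothetical sketch of such bookkeeping (the names are ours, not from any particular library):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One user's consent for one specific purpose."""
    user_id: str
    purpose: str                        # e.g. "newsletter"
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Withdrawal must be possible at any moment.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def is_valid(self) -> bool:
        # Processing is only allowed while consent stands.
        return self.withdrawn_at is None
```

Withdrawing consent should be as easy as giving it; here it is a single call, after which the record no longer authorizes processing.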
- Only gather the PII you need and have a basis (see point 2) for.
When you need someone’s address because you’re sending them something, it is not necessary to ask for their gender. The package will arrive without that information, and you have minimized the PII you gather. And the less information you hold on someone, the less you have to protect, store, etc.
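A hypothetical sketch of the shipping example: the record holds only the fields the carrier actually needs, so a field like gender simply cannot be stored:

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class ShippingAddress:
    """Only what is needed to deliver a package; nothing else fits."""
    name: str
    street: str
    postal_code: str
    city: str
    country: str

def pii_fields() -> set:
    # Useful for audits: the complete set of PII this record can hold.
    return {f.name for f in fields(ShippingAddress)}
```

Making the schema this narrow turns data minimization from a guideline into a property of the code.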
- You can use the PII you gathered on someone only for the purpose for which you gathered it.
There is an exemption: if a new purpose is similar to the original purpose, then it is allowed; otherwise, this is one of those few things that are actually forbidden.
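One way to make purpose limitation enforceable in code (a sketch with hypothetical names; deciding whether a new purpose is "similar" remains a legal judgment, not a lookup) is to register the purpose at collection time and check it on every read:

```python
# Purposes registered when each piece of PII was collected (assumed example data).
COLLECTED_FOR = {
    "email": {"order_confirmation"},
}

def read_pii(field_name: str, purpose: str) -> bool:
    """Allow access only for a purpose registered at collection time."""
    allowed = COLLECTED_FOR.get(field_name, set())
    return purpose in allowed
```

A check like this at the data-access layer makes every use of PII for an unregistered purpose an explicit, auditable event instead of a silent default.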
- Take care that the PII you have is up-to-date.
This is not only useful for yourself, but also a requirement of the GDPR. So ask your users on a regular basis (say, once a year) whether their data is correct. It is not only GDPR compliance; it is also good business.
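The annual check can be driven by a simple freshness test (the one-year interval is our working assumption, not a number mandated by the GDPR):

```python
from datetime import date, timedelta

# Assumed policy: re-confirm user data once a year.
VERIFICATION_INTERVAL = timedelta(days=365)

def needs_verification(last_verified: date, today: date) -> bool:
    """True when the user should be asked to confirm their data."""
    return today - last_verified >= VERIFICATION_INTERVAL
```

A scheduled job can run this over all users and queue a "is your data still correct?" prompt for those who are due.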
- PII is like money, so protect it the same (or better) way as you would protect your money.
This protection should be both technical (separate databases for PII, with stronger encryption and pseudonymization) and organizational: people should only have access to the PII they need, and access controls should be in place. The first fine issued in the Netherlands was for a hospital’s lack of proper access control.
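On the technical side, pseudonymization can be as simple as replacing identifiers with keyed hashes, with the key kept in a separately protected store (the key value below is a placeholder, not a recommendation):

```python
import hashlib
import hmac

# In practice this key would come from a secrets manager, stored and
# access-controlled separately from the database holding the pseudonyms.
SECRET_KEY = b"replace-with-key-from-a-secrets-manager"

def pseudonymize(user_id: str) -> str:
    """Deterministic pseudonym: same input maps to the same token."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()
```

Because an HMAC requires the key, the main database on its own cannot be linked back to a person, while systems that legitimately hold the key can still join records for the same user.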
This concludes our introduction to PbD. Feel free to contact us if you feel we can be of help.
This article is part of a series of articles themed Legolizing software development. I will post more on my site lion.nu; follow me on LinkedIn or Facebook if you’d like to read more.