18F’s research practices are grounded in building and maintaining trust. When participants trust us, they are more likely to share full and accurate accounts of their experiences. A large part of maintaining participant trust involves protecting participant privacy.
Disclaimer: This page is intended for internal use. It is shared in the spirit of open source, to prompt conversations around design research as it relates to privacy. GSA has no regulatory authority over any of the laws discussed in this section, so don’t just take our word for it.
Personally Identifiable Information (PII)
We protect participants’ privacy by giving them a say whenever we seek to collect identifying information about them. The Federal government defines personally identifiable information (PII) as “information that can be used to distinguish or trace an individual’s identity, either alone or when combined with other information that is linked or linkable to a specific individual.” (OMB Circular A-130, Obama White House Archives)
What is considered PII depends on context. Recordings of people’s voices, as well as photos and videos of people, are always considered PII; this includes recorded interviews and usability tests. Email addresses, phone numbers, and mailing addresses are sometimes considered PII; when collected, stored, or presented in combination with a first name, they often become PII. Each piece of collected or stored PII increases the risk of privacy violations. When possible, it’s best to refrain from collecting or storing PII at all (if you or your agency partner have questions about storing PII on third-party systems that have not been approved by GSA, consult your partner agency’s Privacy Office). One way to do this is to analyze data collected via the Digital Analytics Program.
The Privacy Act of 1974 [Justice.gov] provides protection to individuals by ensuring that Federal agencies:
- Collect PII only when it is both legally authorized and necessary
- Present Privacy Act Notices whenever they collect PII (in order to allow for informed consent)
- Protect agency-held PII against anticipated threats to its security or integrity that could result in substantial harm, embarrassment, inconvenience, or unfairness to the participant
18F complies with the Privacy Act by following the information practices outlined in our Privacy Impact Assessment for Design Research posted on GSA.gov. The TTS Research Guild [GitHub] works with the GSA Privacy Office to annually review this assessment.
18F teams should protect participant privacy while also encouraging the broader team’s involvement. This is a balancing act. For example, if you were to share recordings of every stakeholder interview outside of the core project team, you would risk violating your stakeholders’ privacy (and trust).
The following guidelines, drawn from our Privacy Impact Assessment for Design Research, help us build trust and protect privacy. This list isn’t exhaustive, but it’s a good place to start:
- Ask key stakeholders to introduce you before conducting interviews with their team
- Build rapport with research participants in advance of the research session — for example, by emailing participants to see if they have any questions about the research, or briefly meeting with participants before the session begins to go over any logistical requirements
- Whenever you collect PII, store it digitally on GSA’s approved systems for PII and actively remove access to PII from anyone not on your project team; and always place paper documents with PII in locked file cabinets
- Store design research administrative data (for example, contact information collected during research participant recruiting) separate from study data (for example, recorded video of a usability test); share research-related records on a need-to-know basis
- Collect the informed consent of anyone who participates in moderated research; we generally do this with a participant agreement
- When scheduling research sessions via Google Calendar, set the event visibility to “Private” (since invitations include the participant’s name and email address)
- Use pseudonyms or participant codes (for example, “Participant 1”) when naming recordings, transcribing audio files, and writing reports
- De-identify research data before analysis, synthesis, and sharing
- Ensure that any included quotes could be attributable to multiple participants, so that no one person can be identified as the source
- If attribution is important, request the participant’s permission before incorporating personally identifying information or directly attributable information (for example, quotes with attribution) into shared analysis, reports, or presentations. Be mindful of who might see the information. Don’t attribute information that could pose any personal or professional risks for the participant (for example, a negative comment about executive leadership included in a report shared with their organization).
- Be mindful of information norms before sharing and presenting research. For example, if you were to shadow an agency’s acquisitions team and notice that the team freely discussed information about a particular vendor (such as its reputation), that doesn’t mean your research artifacts (such as reports) may name the vendor in question or even describe the conversations you observed. In this case, you might ask members of the acquisitions team whether it’s okay to include information about the conversation you observed.
- Periodically inventory and confirm need-to-know access to study data (as defined in our Privacy Impact Assessment for Design Research)
- GSA’s Privacy Office welcomes questions and feedback anytime at firstname.lastname@example.org
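In the spirit of open source, the pseudonym and data-separation guidance above can be sketched in code. This is a minimal, hypothetical illustration — the function names and data are not part of any 18F or GSA tooling — showing how a team might map participant names to codes like “P1” and scrub those names from study notes, while keeping the name-to-code key (administrative data) separate from the de-identified notes (study data).

```python
def assign_codes(names):
    """Map each participant name to a neutral code like 'P1'.

    The returned dict links codes back to PII, so it is
    administrative data: store it separately from study data,
    on approved systems, with need-to-know access only.
    """
    return {name: f"P{i}" for i, name in enumerate(names, start=1)}


def deidentify(text, codes):
    """Replace participant names in study notes with their codes."""
    for name, code in codes.items():
        text = text.replace(name, code)
    return text


# Hypothetical usage: the codes mapping stays with the recruiting
# records; only the de-identified notes go into shared analysis.
codes = assign_codes(["Alice Example", "Bob Example"])
notes = "Alice Example said the form was confusing."
print(deidentify(notes, codes))  # → "P1 said the form was confusing."
```

A real workflow would need more care than this sketch (names appear in many forms, and recordings and transcripts need the same treatment), but the separation it illustrates — the key in one place, the de-identified data in another — is the point.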