American Association for Public Opinion Research

Disclosure Standards

Section III of the AAPOR Code of Professional Ethics & Practice (April 2021) specifies:
III. Standards for Disclosure

Broadly defined, research on public opinion can be conducted using a variety of quantitative and qualitative methodologies, depending on the research questions to be addressed and available resources. Accordingly, good professional practice imposes the obligation upon all public opinion and survey researchers to disclose sufficient information about how the research was conducted to allow for independent review and verification of research claims, regardless of the methodology used in the research. Full and complete disclosure for items listed in Section A will be made at the time results are released, either publicly or to a research client, as the case may be. As detailed below, the items listed in Section B, if not immediately available, will be released within 30 days of any request for such materials. If the results reported are based on multiple samples or multiple modes, the items listed below (as applicable) will be disclosed for each.
A. Items for Immediate Disclosure

  1. Data Collection Strategy: Describe the data collection strategies employed (e.g. surveys, focus groups, content analyses).
  2. Who Sponsored the Research and Who Conducted It. Name the sponsor of the research and the party(ies) who conducted it. If the original source of funding is different from the sponsor, this source will also be disclosed.
  3. Measurement Tools/Instruments. Measurement tools include questionnaires with survey questions and response options, show cards, vignettes, or scripts used to guide discussions or interviews. Include the exact wording and presentation of any measurement tool from which results are reported, any preceding contextual information that might reasonably be expected to influence responses to the reported results, and any instructions to respondents or interviewers. Also included are scripts used to guide discussions and semi-structured interviews and any instructions to researchers, interviewers, moderators, and participants in the research. Content analyses and ethnographic research will provide the scheme or guide used to categorize the data; researchers will also disclose if no formal scheme was used.
  4. Population Under Study. Survey and public opinion research can be conducted with many different populations including, but not limited to, the general public, voters, people working in particular sectors, blog postings, news broadcasts, and an elected official’s social media feed. Researchers will be specific about the decision rules used to define the population when describing the study population, including location, age, other social or demographic characteristics (e.g., persons who access the internet), and time (e.g., immigrants entering the US between 2015 and 2019). Content analyses will also include the unit of analysis (e.g., news article, social media post) and the source of the data (e.g., Twitter, Lexis-Nexis).
  5. Method Used to Generate and Recruit the Sample. The description of the methods of sampling includes the sample design and methods used to contact or recruit research participants or collect units of analysis (content analysis).
    1. Explicitly state whether the sample comes from a frame selected using a probability-based methodology (meaning selecting potential participants with a known non-zero probability from a known frame) or if the sample was selected using non-probability methods (potential participants from opt-in, volunteer, or other sources).
    2. Probability-based sample specification should include a description of the sampling frame(s), list(s), or method(s).
      1. If a frame, list, or panel is used, the description should include the name of the supplier of the sample or list and nature of the list (e.g., registered voters in the state of Texas in 2018, pre-recruited panel or pool).
      2. If a frame, list, or panel is used, the description should include the coverage of the population, including describing any segment of the target population that is not covered by the design.
    3. For surveys, focus groups, or other forms of interviews, provide a clear indication of the method(s) by which participants were selected, recruited, intercepted, or otherwise contacted or encountered, along with any eligibility requirements and/or oversampling.
    4. Describe any use of quotas.
    5. Include the geographic location of data collection activities for any in-person research.
    6. For content analysis, detail the criteria or decision rules used to include or exclude elements of content and any approaches used to sample content. If a census of the target population of content was used, that will be explicitly stated.
    7. Provide details of any strategies used to help gain cooperation (e.g., advance contact, letters and scripts, compensation or incentives, refusal conversion contacts) whether for participation in a survey, group, panel, or for participation in a particular research project. Describe any compensation/incentives provided to research subjects and the method of delivery (debit card, gift card, cash).
  6. Method(s) and Mode(s) of Data Collection. Include a description of all mode(s) used to contact participants or collect data or information (e.g., CATI, CAPI, ACASI, IVR, mail, Web for surveys; paper and pencil, audio or video recording for qualitative research, etc.) and the language(s) offered or included. For qualitative research such as in-depth interviews and focus groups, also include the length of interviews or the focus group session.
  7. Dates of Data Collection. Disclose the dates of data collection (e.g., data collection from January 15 through March 10 of 2019). If this is a content analysis, include the dates of the content analyzed (e.g., social media posts between January 1 and 10, 2019).
  8. Sample Sizes (by sampling frame if more than one frame was used) and (if applicable) Discussion of the Precision of the Results.
    1. Provide sample sizes for each mode of data collection (for surveys, include sample sizes for each frame, list, or panel used).
    2. For probability sample surveys, report estimates of sampling error (often described as “the margin of error”) and discuss whether or not the reported sampling error or statistical analyses have been adjusted for the design effect due to weighting, clustering, or other factors.
    3. Reports of non-probability sample surveys will only provide measures of precision if they are defined and accompanied by a detailed description of how the underlying model was specified, its assumptions validated, and the measure(s) calculated.
    4. If content was analyzed using human coders, report the number of coders, whether inter-coder reliability estimates were calculated for any variables, and the resulting estimates.
  9. How the Data Were Weighted. Describe how the weights were calculated, including the variables used and the sources of the weighting parameters.
  10. How the Data Were Processed and Procedures to Ensure Data Quality. Describe validity checks, where applicable, including but not limited to: whether the researcher added attention checks or logic checks, or excluded respondents who straight-lined or completed the survey under a certain time threshold; any screening of content for evidence that it originated from bots or fabricated profiles; re-contacts to confirm that the interview occurred or to verify the respondent’s identity or both; and measures to prevent respondents from completing the survey more than once. Any data imputation or other data exclusions or replacement will also be discussed. Researchers will provide information about whether any coding was done by software or human coders (or both); if automated coding was done, name the software and specify the parameters or decision rules that were used.
  11. A General Statement Acknowledging Limitations of the Design and Data Collection. All research has limitations, and researchers will include a general statement acknowledging the unmeasured error associated with all forms of public opinion research.
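The design-effect adjustment mentioned under Sample Sizes can be illustrated with a short sketch. This is a hypothetical example, not part of the Code; the function names are my own. It inflates a proportion's margin of error by Kish's approximate design effect due to unequal weighting, deff = n · Σw² / (Σw)².

```python
import math

def kish_design_effect(weights):
    """Kish's approximate design effect of unequal weighting:
    deff = n * sum(w^2) / (sum(w))^2."""
    n = len(weights)
    sw = sum(weights)
    return n * sum(w * w for w in weights) / (sw * sw)

def margin_of_error(p, n, weights=None, z=1.96):
    """95% margin of error for a proportion p from a sample of size n,
    inflated by the weighting design effect when weights are given."""
    deff = kish_design_effect(weights) if weights else 1.0
    return z * math.sqrt(deff * p * (1 - p) / n)

# Unweighted: n = 1000, p = 0.5
print(round(100 * margin_of_error(0.5, 1000), 1))  # 3.1 points

# Unequal weights inflate the effective margin of error
w = [1.0] * 500 + [2.0] * 500
print(round(100 * margin_of_error(0.5, 1000, weights=w), 1))  # 3.3 points
```

Reporting the unadjusted 3.1 points for the weighted sample above would understate the sampling error, which is why the Code asks researchers to say whether the design effect was taken into account.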
B. Additional Items for Disclosure. After results are reported, we will make the following items available within 30 days of any request for such materials:
  1. Procedures for managing the membership, participation, and attrition of the panel, if a pool, panel, or access panel was used. This should be disclosed for both probability and non-probability surveys relying on recruited panels of participants.
  2. Methods of interviewer or coder training and details of supervision and monitoring of interviewers or human coders. If machine coding was conducted, include description of the machine learning involved in the coding.
  3. Details about screening procedures, including any screening for other surveys or data collection that would have made sample or selected members ineligible for the current data collection (e.g., survey, focus group, interview), such as the use of a router in online surveys.
  4. Any relevant stimuli, such as visual or sensory exhibits or show cards. In the case of surveys conducted via self-administered computer-assisted interviewing, providing the relevant screen shot(s) is strongly encouraged, though not required. 
  5. Summaries of the disposition of study-specific sample records so that response rates for probability samples and participation rates for non-probability samples can be computed. If response or cooperation rates are reported, they will be computed according to AAPOR Standard Definitions. If dispositions cannot be provided, the reason(s) will be explained and noted as a limitation of the study.
  6. The unweighted sample size(s) on which one or more reported subgroup estimates are based.
  7. Specifications adequate for replication of indices or statistical modeling included in research reports.
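As a hypothetical illustration of the disposition summaries called for in item 5, the two simplest outcome rates in AAPOR Standard Definitions divide completed interviews (or completes plus partials) by all interviews, non-respondents, and cases of unknown eligibility. The function and variable names below are my own; the disposition counts are invented.

```python
def aapor_rr1(i, p, r, nc, o, uh, uo):
    """AAPOR Response Rate 1: complete interviews (I) over all interviews
    (I + P), non-respondents (refusals R, non-contacts NC, other O), and
    cases of unknown eligibility (unknown household UH, unknown other UO)."""
    return i / (i + p + r + nc + o + uh + uo)

def aapor_rr2(i, p, r, nc, o, uh, uo):
    """AAPOR Response Rate 2: counts partial interviews (P) as respondents."""
    return (i + p) / (i + p + r + nc + o + uh + uo)

# Invented disposition summary for a probability sample
d = dict(i=800, p=50, r=300, nc=250, o=40, uh=100, uo=60)
print(round(aapor_rr1(**d), 3))  # 0.5
print(round(aapor_rr2(**d), 3))  # 0.531
```

Rates such as RR3 and RR4, which estimate what share of the unknown-eligibility cases are actually eligible, require the additional assumptions described in Standard Definitions and are not sketched here.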

C. Access to Datasets
Reflecting the fundamental goals of transparency and replicability, AAPOR members share the expectation that access to datasets and related documentation will be provided to allow for independent review and verification of research claims upon request. In order to protect the privacy of individual respondents, such datasets will be de-identified to remove variables that can reasonably be expected to identify a respondent. Datasets may be held without release for a period of up to one year after findings are publicly released to allow full opportunity for primary analysis. Those who commission publicly disseminated research have an obligation to disclose the rationale for why eventual public release or access to the datasets is not possible, if that is the case.
D. AAPOR Standards Complaint
If any of our work becomes the subject of a formal investigation of an alleged violation of this Code, undertaken with the approval of the AAPOR Executive Council, we will provide additional information on the research study in such detail that a fellow researcher would be able to conduct a professional evaluation of the study.

Frequently Asked Questions about AAPOR’s Standards for Minimal Disclosure