Extension:CentralNotice/Design Research 2017/methods

The goal of our project was to collect feedback on common uses, issues, and feature requests for CentralNotice. We conducted interviews and asked a series of open-ended questions to hear directly from users and potential users.

Participants

We started with the list of recent CentralNotice administrators. We added members of the WMF staff who have either participated in CentralNotice campaigns or managed teams that have used CentralNotice. We also contacted several chapters and community groups who use CentralNotice on a regular basis. The willing participants made up our participant pool.

Interview process

Participants were informed that we intended to record the interview for note-taking purposes. Upon request, we conducted some interviews without a recording.

When someone agreed to participate, we asked them to fill out a demographics survey before the interview. During a one-on-one interview, we took notes based on a specific protocol.

During group interviews, we followed the same types of questions in the same order. However, we also used a survey with the same questions. We would begin by letting participants write down their answers in the survey (usually allowing 5 minutes for this step), and afterward we would discuss the answers. This was to ensure group participants had a chance to respond privately and honestly if they wanted to.

When participants outside the WMF agreed to an interview, we provided a privacy policy statement for the first (demographics) survey [1] and a separate privacy policy and release form for the interview [2]. The WMF legal department wrote both of these documents.

[1] Demographics privacy statement: https://docs.google.com/document/d/1Ju9mW9QPSB0Uk1Ya7sYkxQ2nMr2pS9bZ5fwHH7I49T4/edit

[2] Interview privacy statement and release form: https://docs.google.com/document/d/10XDvM7sjhvSI59Kz2XUXaIAiMgEs_QOi39bVfDmJcsE/edit

Data analysis

At the end of our interviews, we had a lot of qualitative data, usually in the form of long paragraphs or handwritten transcripts. At times we had handwritten transcripts and typed responses (from the in-interview survey) from the same person. As we began to analyze the data, we put these responses side by side in the same spreadsheet. When we considered a single participant's responses, we made sure not to overcount their comments.

We attempted to code our responses [3] and analyze the most prominent themes and larger categories. We started this process by lining up the handwritten interview transcripts with the typed responses from the survey. Both coders (David Strine and Joseph Seddon) read through all the answers and came up with their own lists of categories (and themes within those categories). We then met and combined them into one list for the final coding.

We met and read through each answer together, noting which themes were touched on in each answer. As mentioned before, if we had both handwritten transcripts and in-interview survey data for a participant, we made sure not to overcount their comment on a theme. Also, as happens in conversation, people tend to repeat the same comment when making a single point; we made sure to analyze comments like this and not overcount them. We did this together to reduce individual bias and to make sure each of us understood what each theme meant. The final coded answers looked like this:

             Person 1   Person 2   Person 3   Total
Question 1   answer     answer     answer
Question 2   answer     answer     answer
Question 3   answer     answer     answer
Theme 1      1          1          1          3
Theme 2      1                                1
Theme 3                 1                     1
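The actual tallying was done by hand in the shared spreadsheet shown above. Purely for illustration, here is a minimal sketch in Python of the counting rule we followed: each participant counts at most once per theme, even if the theme appears in both their handwritten transcript and their typed survey answers, or is repeated within one answer. Participant names, theme names, and the data structure are placeholders, not our real data.

from collections import Counter

# Hypothetical coded data: participant -> themes found in their handwritten
# transcript and their typed in-interview survey answers.
coded_responses = {
    "Person 1": {"transcript": ["Theme 1", "Theme 2", "Theme 1"], "survey": ["Theme 1"]},
    "Person 2": {"transcript": ["Theme 1", "Theme 3"], "survey": []},
    "Person 3": {"transcript": ["Theme 1"], "survey": ["Theme 1"]},
}

totals = Counter()
for person, sources in coded_responses.items():
    # Deduplicate across both sources so each participant counts once per theme.
    themes = set(sources["transcript"]) | set(sources["survey"])
    totals.update(themes)

for theme, count in sorted(totals.items()):
    print(theme, count)  # Theme 1: 3, Theme 2: 1, Theme 3: 1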

Defining themes

In order to reduce bias as much as possible, we needed a shared understanding of what each theme meant to us. Here is how we defined them.

Category: Targeting

  • Reach - a general comment on how broad the reach of CN already is
  • General - any mention of targeting abilities without further specifics
  • Request: User - a request or suggestion that more user-specific data should be used to target specific users
  • Request: Content (subsections of content, categories, Wikidata query, triggered by action) - a request or suggestion that more content-specific data should be used to target banners by content
  • Issue: don't know current targeting features - should be self-explanatory
  • Request: cross-platform - communication with the app or other platforms in the future

Category: Banner creation and functionality

  • Resources (Templates, Design resources, Widgets) - any comment about wanting templates or examples for campaigns or banners, or a widget that can help you make a campaign or banner (a creation "wizard")
  • Widget-like - direct interaction with other site functionality (account creation, upload, etc.) - interacting with other technologies in the banner: forms, logins, and other MediaWiki functions in the banner
  • RML - any comment about "remind me later" functionality
  • Branching - user interactions with a banner can lead to different flows/further interactions
  • Surveying - the ability to add surveys or run surveys through CN
  • General UI - any general comment on the UI
  • Issue: campaign config UI - a negative comment on the campaign config UI
  • Issue: banner config UI - a negative comment on the banner config UI
  • Request: clone campaign - a request for a feature to clone an existing campaign
  • Request: detailed campaign data for CN admins - requests from admins to get detailed data on their campaigns; most people outside the WMF don't get much data on their campaign performance at the moment
  • Request: better previews / WYSIWYG editor - a request for a banner editor more like VisualEditor, or for easier previews of banners while editing
  • Request: banner version control - should be self-explanatory
  • Request: API for CN

Category: Sitenotice

  • Lack of functionality - should be self-explanatory
  • General comment - should be self-explanatory

Category: Process

  • Process general comment - a general comment about how to set up banners or campaigns
  • Ease of use - a general comment on ease of use
  • User roles - drafting - the ability to create a banner but not publish it
  • Requests - any comment on how to request a campaign
  • Scheduling/Calendars - any comment on how to view the scheduling of banners
  • Results/data availability for admins - overlaps with "Request: detailed campaign data for CN admins" above
  • What is enabled where and when? - a specific request to understand which campaigns are live where and when
  • Data available for the community - requests from any interested party to get detailed data on campaigns

Category: CentralNotice - miscellaneous

  • Security - a general comment about the security around CN (vandalism, admin processes and rights, etc.)
  • Integration with other communication platforms (esp. Echo) - How CN does or does not interact with other communication tools
  • General negative comment - should be self-explanatory
  • Subscription - User control of banners - The ability of users to opt in or opt out of classes of CN banner (photo contests, local events, etc.)