Closed
Bug 1030152
Opened 11 years ago
Closed 5 years ago
Does knowing an MDN contributor increase initial contributions: A/B Test
Categories
(developer.mozilla.org Graveyard :: General, enhancement)
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: stormy, Unassigned)
Details
(Whiteboard: [specification][type:change])
What feature should be changed? Please provide the URL of the feature if possible.
==================================================================================
Sign up for MDN form.
"Do you know someone who works on MDN?"Yes/No Or "Who referred you to MDN?"
Then we could track the number of new users who actually make a contribution and see whether knowing an MDN person influenced that at all. If it does, that could change our outreach efforts.
We'd only have to do this for long enough to get meaningful data. Then we could remove the extra question from the signup form.
What problems would this solve?
===============================
If we knew that knowing an MDN contributor or being referred by someone you knew greatly increased the likelihood that a new member would make a contribution, we could adjust our outreach strategy.
Who would use this?
===================
Everyone signing up for MDN would see the new field.
Members of DevRel responsible for our outreach strategy would use the info.
What would users see?
=====================
A new field on the signup form that would ask "Do you know someone who works on MDN?" (Yes/No) or "Who referred you to MDN?" (space for a name)
What would users do? What would happen as a result?
===================================================
They would check a box or add a name.
Is there anything else we should know?
======================================
Comment 1•11 years ago
This sounds useful.
Would it be more valuable to ask about this particular factor, e.g. "Did someone refer you?", or to offer a choice of factors, e.g. "What inspired you to sign up?" The first could help us understand /who/ is doing all the referring, which might help us refine that activity. The second could help us learn what factors are most compelling. Which is the intent?
What amount of data would make it meaningful? 100 responses?
Comment 2•11 years ago
Happy to see we are gathering evidence here. We should be sure to also use the right tool for the job.
The variable being studied is familiarity. Familiarity is a property of users rather than a property of interfaces. To study this, we would not show different interfaces to one group of users, but instead show one interface to different kinds of users.
A/B testing measures interface performance, so it would not be the best tool for gathering this kind of evidence. Instead, we should use one interface to determine which users are familiar with the team and which are not, and then collect data on how those two different groups behave. I imagine Google Analytics could help us do that.
We should still compare these groups during the same time period, however. Comparing one group in June to another group in July would result in unreliable data, because so many things would be different (time, design, etc.) in addition to user group.
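To make that concrete, here is a minimal sketch of how the two groups could be compared once the data is collected. All counts below are placeholders, and the two-proportion z-test is just one reasonable choice of comparison:

# Hypothetical analysis of the two user groups; all counts are placeholders.
from statsmodels.stats.proportion import proportions_ztest

# Group A answered "yes, I know an MDN contributor"; Group B answered "no".
contributed = [52, 31]   # new users in each group who later made an edit
signed_up = [400, 410]   # total new users in each group, same time period

stat, p_value = proportions_ztest(contributed, signed_up)
print(f"z = {stat:.2f}, p = {p_value:.4f}")

Any comparable test of two proportions would work; the important part is that both groups are observed over the same time period, as noted above.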
Comment 3•11 years ago
To split test we could:
1. Add a "referrer" field to user profiles (similar to "vouched by" on Mozillians.org)
2. Only show the field to 50% of visitors to the register form (via Optimizely or via a waffle flag; see the sketch after this list)
3. Track the two segments of users over time
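A minimal sketch of what step 2 could look like with django-waffle, assuming a flag set to a 50% rollout. The flag name, form, and field below are made up for illustration, not existing Kuma code:

# Hypothetical registration form; 'registration-referrer-field' is a made-up flag name.
from django import forms
import waffle

class RegistrationForm(forms.Form):
    username = forms.CharField(max_length=30)
    email = forms.EmailField()
    # The experimental question; only the bucketed half of visitors will see it.
    referrer = forms.CharField(label="Who referred you to MDN?",
                               max_length=255, required=False)

    def __init__(self, request, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # flag_is_active() buckets each request according to the flag's
        # percentage rollout, so roughly 50% of visitors get the extra field.
        if not waffle.flag_is_active(request, 'registration-referrer-field'):
            del self.fields['referrer']

The saved answer would then live on the user profile (step 1), and step 3 becomes a query over contributions grouped by that field.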
As for sample size:
For 95% confidence with a +/- 5% margin of error against a population of ~3k registrations per month, we would need a sample of 341 users in each segment. [1]
[1] http://www.surveysystem.com/sscalc.htm
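For reference, the 341 figure follows from the standard sample-size formula with a finite-population correction (z = 1.96 for 95% confidence, p = 0.5 as the most conservative assumption, 5% margin of error, population of ~3,000 registrations per month):

# Sample-size arithmetic behind the 341 figure.
z, p, e, N = 1.96, 0.5, 0.05, 3000

n0 = z ** 2 * p * (1 - p) / e ** 2   # infinite-population sample size, about 384
n = n0 / (1 + (n0 - 1) / N)          # finite-population correction for ~3k/month
print(round(n))                      # 341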
Reporter
Comment 4•11 years ago
We aren't trying to compare people who saw the form against people who didn't. We are A/B testing between users who click yes and users who click no.
Comment 5•11 years ago
Great. I think we agree on the measurements.
I just want to be sure the measurements are collected correctly. This would be a split test (comparing the behavior of two different groups) rather than an A/B test (comparing the effectiveness of two different interfaces) so this will need to be measured using a tool other than Optimizely.
Updated•11 years ago
Severity: normal → enhancement
Comment 6•5 years ago
MDN Web Docs' bug reporting has now moved to GitHub. From now on, please file content bugs at https://github.com/mdn/sprints/issues/ and platform bugs at https://github.com/mdn/kuma/issues/.
Status: NEW → RESOLVED
Closed: 5 years ago
Resolution: --- → WONTFIX
Updated•5 years ago
Product: developer.mozilla.org → developer.mozilla.org Graveyard