A/B testing is a method for determining the best-performing variant by running two versions of the same thing simultaneously, in this case, the hiring workflow. You randomly assign applicants to either the A or the B version of the workflow and, at the end, see which one performed better. Before A/B testing, we recommend establishing a baseline with a single workflow that has live applicants and performance measures. To help you build your first workflow, check out the recommendations in this article on building the Primary Workflow.
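If it helps to picture the mechanic, here is a minimal sketch of random assignment in Python. The function name and applicant IDs are made up for illustration and are not part of any product feature; they simply show how a random number and a threshold produce a roughly even split.

```python
import random

def assign_variant(applicant_id: str) -> str:
    """Assign an applicant to Version A or Version B at random (roughly 50/50)."""
    # Mirrors the 0-100 random integer and "greater than 50" rule described later in this article.
    return "B" if random.randint(0, 100) > 50 else "A"

applicants = ["a-101", "a-102", "a-103", "a-104"]
assignments = {a: assign_variant(a) for a in applicants}
print(assignments)  # e.g. {'a-101': 'A', 'a-102': 'B', ...}
```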
Why A/B Test?
You have built a hiring process, but is it performing as well as it can? Many factors can contribute to the effectiveness of the workflow, such as images, workflow order, and usage of SMS. What counts as effective is also subjective: for some it's the conversion rate, while others want a faster completion time. The advantage of running two versions of the same workflow simultaneously is that you can isolate which factor contributes to which outcome.
Designing an A/B Test
For A/B testing to work, you can only test one factor at a time. That means you should change only one variable in version B; otherwise you will not be able to tell which variable contributed to the performance change. Determine what you want to test. Here are some examples to help you get started.
| Variable | Hypothesis |
| --- | --- |
| SMS Messages | Increase applicant engagement; improve conversion rate and completion time. |
| 4 Follow-Up Messages | Decrease applicant drop-off. |
| Different message verbiage | Increase applicant engagement and improve conversion rate. |
| Testing a new integration (e.g. Native Doc signing or Learning stages) | Decrease user supervision and inbound questions from applicants. |
| Adjust the order of your stages | Increase applicant completion rate. |
| Add a welcome message explaining the process on the Applicant Portal | Decrease support tickets and increase conversion rate. |
Setup
Create Version B Workflow
- Clone the original workflow and update only the variable you're testing.
- Rename the Opening so you know which workflow is Version B.
Randomly Assign Applicants to a Workflow
- Go to the company settings page and, under the Applicants section, check the box for Generate a random integer between 0 and 100 upon creation. Save this change by clicking the Update Applicant Setting button.
- Applicants will now have a data key titled "rand" in their profile.
- For this field to show up in the Rule Stage, go into a Data Collection Stage and add a new custom question (any question type) with "rand" as the data key. Use a Data Collection Stage that is not in use. Create a test applicant, put them through this Data Collection Stage, and complete the rand question as the applicant. The data key `rand` will then be available in the Rule Stage, and you can delete the question and the test applicant.
- Add a Rule Stage to the top of the Version A workflow that routes applicants to the Version B Opening based on the random number assigned to them. We recommend a rule that routes applicants whose random number is greater than 50, so traffic is split roughly down the middle.
- Document what you changed and when you made the change.
- Use the analytics page to compare conversion between the two funnels (see "stage metrics" in this help article for more information); a quick way to compare the two conversion rates is sketched below.
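When you read the results, the question is whether Version B's conversion rate is meaningfully different from Version A's, or whether the gap could be random noise. Here is a rough sketch of a standard two-proportion z-test in Python; the counts are hypothetical and would come from the per-funnel numbers on the analytics page.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error under the null hypothesis
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: (applicants who converted, applicants who entered the funnel).
a_converted, a_total = 180, 500   # Version A
b_converted, b_total = 215, 500   # Version B

print(f"A: {a_converted / a_total:.1%}  B: {b_converted / b_total:.1%}")
print(f"p-value: {two_proportion_z_test(a_converted, a_total, b_converted, b_total):.3f}")
```

A small p-value (commonly below 0.05) suggests the difference is unlikely to be chance; with low applicant volume, let the test run longer before drawing conclusions.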