Homebuyer Education Competitive Product Evaluation

challenge & outcomes

Homebuyer education (HBE) teaches prospective buyers about the potential benefits and risks of owning a home. HBE is typically offered as part of an affordable loan program, often as a requirement.

Our design team evaluated the learning, UX, content, and accessibility of five self-directed online HBE courses. We used the evaluations to:

  • provide tactical recommendations for our HBE partner’s newest course;

  • identify emerging opportunities for the HBE field at large; and

  • propose additional areas of research to pursue with consumers.

my role

I led this project and was responsible for:

  • Managing our overall relationship with our business partner

  • Coordinating with experts from our design team to create our evaluation tool

  • Managing the weekly evaluation sprint cycles (including facilitation of weekly discussions and presenting weekly updates)

  • Leading comparison of evaluations & definition of recommendations

  • Designing and co-presenting the final share-out to our business partners and their stakeholders

Yvonne Tran managed the project and Tyler McLeod contributed to the share-outs with partners. Ashley Wituschek, Kiah Guilford, and Vy Vu rounded out the evaluation team.

evaluation tool

We created a custom evaluation rubric with our business partners and design team experts, based on a heuristic evaluation template our design team had already created. The rubric had four sections, each drawing on existing industry standards: learning, UX, content, and accessibility.

I designed the rubric to support our weekly sprint cycle of evaluations:

  1. M-W: Individual evaluators take the assigned HBE course, rate it against each standard, and leave comments and screenshots to support their scores.

  2. Th: The core team compiles the evaluation scores, comments, and screenshots; calculates score averages and variances; and sends the results to the evaluation team.

  3. F: The evaluation team gathers to discuss initial reactions, what each course did well and poorly, and any standard with wide variance in scores.
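The Thursday compile step can be sketched roughly as follows. This is a minimal illustration, not the project's actual tooling: the rubric standards, evaluator scores, and the variance threshold used to flag items for Friday discussion are all hypothetical.

```python
from statistics import mean, variance

# Hypothetical evaluator scores (1-5) for two rubric standards.
scores_by_standard = {
    "learning: clear objectives": [4, 4, 5, 4, 4],
    "ux: navigation": [2, 5, 3, 1, 4],
}

VARIANCE_THRESHOLD = 1.5  # illustrative cutoff for "wide variance"

def summarize(scores_by_standard, threshold=VARIANCE_THRESHOLD):
    """Return (standard, mean, variance, flagged) rows for the weekly compile."""
    rows = []
    for standard, scores in scores_by_standard.items():
        avg = mean(scores)
        var = variance(scores)  # sample variance across evaluators
        rows.append((standard, round(avg, 2), round(var, 2), var > threshold))
    return rows

for standard, avg, var, flagged in summarize(scores_by_standard):
    note = " <- discuss Friday" if flagged else ""
    print(f"{standard}: avg={avg}, var={var}{note}")
```

Standards where evaluators disagreed widely surface automatically, so the Friday discussion can focus on why scores diverged rather than on spotting the divergence.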

synthesis

After the evaluation team completed evaluations for each of the five courses, the core team compared scores across the courses to evaluate our existing partnership choice.

We analyzed the evaluation data to identify specific areas where the new course had scored lower than other courses. These comparisons became tangible recommendations for the new course.
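A comparison like the one above can be sketched as a simple gap analysis. Everything here is illustrative: the course names, per-standard averages, and the gap threshold are hypothetical stand-ins, not the project's data.

```python
# Hypothetical per-standard average scores (1-5) for each evaluated course.
course_scores = {
    "new course": {"learning": 3.0, "ux": 4.5, "content": 4.0, "accessibility": 2.5},
    "course b":   {"learning": 4.2, "ux": 4.0, "content": 4.1, "accessibility": 3.5},
    "course c":   {"learning": 3.8, "ux": 3.5, "content": 4.4, "accessibility": 3.0},
}

GAP = 0.5  # illustrative margin for flagging a standard as an improvement area

def improvement_areas(scores, target="new course", gap=GAP):
    """Standards where the target course trails the best-scoring other course."""
    peers = {name: s for name, s in scores.items() if name != target}
    areas = []
    for standard, own in scores[target].items():
        best_peer = max(s[standard] for s in peers.values())
        if best_peer - own >= gap:
            areas.append((standard, own, best_peer))
    return areas
```

Each flagged standard pairs the new course's score with the best peer score, which makes the resulting recommendation concrete: here is where a competitor already does better, and by how much.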

We also unearthed three emerging opportunities where the new course could push the HBE industry forward: areas where none of the evaluated courses scored well.

Finally, our evaluation team was not a representative sample of the borrowers we are trying to reach with HBE, so we recommended follow-up research with consumers to understand more about learning impact, voice and tone, and inclusivity and representation.

impact

We provided more than 20 recommendations for improving the new HBE course, some very tactical and others more long-term. We also identified key opportunities for the new course and recommended further consumer research. These recommendations informed the ongoing development of HBE strategy and the roadmap moving forward.