HOA Surveys That Matter

The Courage to Ask Better Questions

When my HOA Board recently announced plans to distribute a “feedback survey,” I was initially optimistic. But instead of thoughtful questions designed to give the Association actionable information, we received vague topics with rating categories that left no room for corresponding feedback. Worse yet, many owners were intentionally excluded. With the management company’s contract renewal in process, we deserve better. Let’s examine the specific problems with this approach and outline what a process that truly values every resident’s voice looks like.

Brought to you by Drew McManus, your neighbor in 7908.

Key Takeaway: the following data collection process is so deeply flawed, it’s actually worse than having no survey at all. It creates an illusion of resident input while silencing many voices, provides false legitimacy to whatever the board decides, and makes future advocacy more difficult.
Survey Review

The Questions We Were Asked vs. What They Should Have Asked

Here are the exact questions about the Management Company from our survey. Owners were asked to rate each on a five-option scale from “very satisfied” to “very dissatisfied,” with no option for “Not Applicable” or “Unknown”:

  1. Availability and responsiveness
  2. Professionalism
  3. Ability to manage staff
  4. Effectiveness of systems/processes
  5. Communication with residents

While these topics have value at a conceptual level, the way they’re asked makes them nearly impossible to answer meaningfully.

Why These Questions Don’t Work

They’re too vague
“Availability and responsiveness” could mean different things to different people. Am I rating how quickly my emails are answered? How fast maintenance requests are handled? Whether someone answers the phone during office hours? Without specificity, our answers won’t help improve service.
They ask about things we can’t see
How can I rate their “ability to manage staff” when I have no visibility into their hiring practices, training, or supervision? I can only judge what I experience – like whether maintenance requests are completed properly.
They combine multiple issues
“Effectiveness of systems/processes” lumps together everything from the online payment portal to how violations are handled. A system might work well for one task but poorly for another.
They use subjective terms
What does “professionalism” actually mean? To me, it might mean returning calls promptly. To you, it might mean wearing business attire. Without clarity, our answers aren’t comparable.
They provide no context
“Communication with residents” doesn’t specify whether we’re evaluating newsletters, emergency notifications, or day-to-day interactions. Each requires different skills and deserves separate evaluation.
They show overt positive bias
The questions are framed to nudge respondents toward agreement. Reverse-scored questions would have produced more balanced results by reducing automatic agreement and encouraging critical consideration.

How These Questions Should Have Been Written

A better approach asks owners about the same topic in several different ways, then mixes and matches the versions equally and at random across surveys.

For example, reverse-scored questions help prevent survey bias by asking about the same topic from a different angle, ensuring respondents are reading carefully and answering consistently rather than automatically giving the same rating to every question.
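
The arithmetic behind reverse scoring is simple to sketch. The snippet below is my own illustration, not part of the survey: the function names and the agreement-scale convention (1 = strongly disagree, 5 = strongly agree) are assumptions. It shows the standard fold, reversed = (max + 1) − raw, and a basic straight-lining check that compares a respondent’s paired answers:

```python
SCALE_MAX = 5  # 5-point agreement scale: 1 = strongly disagree ... 5 = strongly agree

def reverse_score(raw: int) -> int:
    """Fold a negatively worded item back onto the positive direction.

    After folding, a high score always means a good experience, whether
    the statement was "the office responds promptly" (positive wording)
    or "I often experience delays" (negative, reverse-scored wording).
    """
    if not 1 <= raw <= SCALE_MAX:
        raise ValueError(f"expected 1..{SCALE_MAX}, got {raw}")
    return SCALE_MAX + 1 - raw

def is_consistent(positive_item: int, negative_item_raw: int, tolerance: int = 1) -> bool:
    """Detect straight-lining: paired items should roughly agree once folded."""
    return abs(positive_item - reverse_score(negative_item_raw)) <= tolerance

# A satisfied owner agrees with the positive wording (5) and disagrees
# with the negative wording (1) -- consistent once folded:
assert is_consistent(positive_item=5, negative_item_raw=1)
# Marking "5" on everything regardless of wording gets flagged:
assert not is_consistent(positive_item=5, negative_item_raw=5)
```

The point of the fold is that every item ends up on one common scale, so a respondent who answers carefully produces matching paired scores while a respondent who auto-clicks the same rating stands out.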

Any response on the “negative” end of the scale should open a text field encouraging the respondent to provide additional details.

Here’s how straightforward it is to convert the questions we received into ones capable of producing actionable results that support data-driven decision making:

Instead of “Availability and responsiveness”

  • “How satisfied are you with how quickly the office responds to phone calls?”
  • “How satisfied are you with response times to maintenance requests?”
  • “How satisfied are you with the availability of staff during posted office hours?”
  • Reverse-Scored Version: “How often have you experienced delays or unresponsiveness from the management company?” {Never (best score), Rarely, Sometimes, Often, Very frequently (worst score), Not Applicable, I Don’t Know}

Instead of “Professionalism”

  • “How satisfied are you with the courtesy shown by management staff?”
  • “How satisfied are you with staff members’ knowledge when answering your questions?”
  • “How satisfied are you with how conflicts or disagreements are handled?”
  • Reverse-Scored Version: “How often have you observed senior management behaving in ways that seemed unprofessional or inappropriate during interactions with residents?” {Never (best score), Rarely, Sometimes, Often, Very frequently (worst score), Not Applicable, I Don’t Know}

Instead of “Ability to manage staff”

  • “How satisfied are you with the quality of maintenance work?”
  • “How satisfied are you with the consistency of landscaping services?”
  • “How satisfied are you with the responsiveness of on-site personnel?”
  • Reverse-Scored Version: “How often have you noticed projects lacking clear direction from senior management?” {Never (best score), Rarely, Sometimes, Often, Very frequently (worst score), Not Applicable, I Don’t Know}

Instead of “Effectiveness of systems/processes”

  • “How satisfied are you with the online payment system?”
  • “How satisfied are you with the process for submitting maintenance requests?”
  • “How satisfied are you with how rule violations are communicated and enforced?”
  • Reverse-Scored Version: “How often do the company’s systems (e.g., billing, maintenance requests) cause confusion or errors?” {Never (best score), Rarely, Sometimes, Often, Very frequently (worst score), Not Applicable, I Don’t Know}

Instead of “Communication with residents”

  • “How satisfied are you with the clarity of written communications?”
  • “How satisfied are you with the timeliness of updates about community matters?”
  • “How satisfied are you with how well the management company listens to resident concerns?”
  • Reverse-Scored Version: “How often have you felt uninformed about important community matters due to inadequate communication from management?” {Never (best score), Rarely, Sometimes, Often, Very frequently (worst score), Not Applicable, I Don’t Know}
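
The “mix and match equally and at random” idea above can be sketched as a counterbalanced assignment. This is a hypothetical illustration (the function and variant names are mine, not from any real survey tool): each owner gets one wording of a topic, and a round-robin over a shuffled order keeps the variants evenly distributed:

```python
import random
from collections import Counter

def assign_variants(respondents: list[str], variants: list[str], seed: int = 0) -> dict[str, str]:
    """Assign one question variant per respondent, keeping counts as even
    as possible: shuffle who goes first, then deal variants round-robin."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = respondents[:]
    rng.shuffle(shuffled)
    return {who: variants[i % len(variants)] for i, who in enumerate(shuffled)}

# Hypothetical example: four wordings of the "responsiveness" topic
variants = ["phone-response", "maintenance-response", "office-hours", "reverse-scored"]
owners = [f"unit-{n}" for n in range(1, 21)]
assignment = assign_variants(owners, variants)

# With 20 owners and 4 variants, each wording lands with exactly 5 owners;
# only *which* owner sees which wording is random.
counts = Counter(assignment.values())
assert all(count == 5 for count in counts.values())
```

The shuffle supplies the randomness, while the round-robin guarantees the equal split, so no single wording dominates the results by accident.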

Specificity Leads to Actionable Data

Research in survey methodology consistently shows that specific, concrete questions produce more reliable and actionable data than broad, abstract ones. According to the American Association for Public Opinion Research (AAPOR), “Questions should be specific and concrete rather than general and abstract” because respondents interpret vague questions differently, making the results difficult to interpret and act upon.1

The improved questions suggested in this article follow this principle by focusing on specific experiences (e.g., “response times to maintenance requests”) rather than abstract concepts (e.g., “responsiveness”).

Avoiding Double-Barreled Questions

Survey methodologists at Harvard University’s Program on Survey Research note that questions should “ask about one thing at a time.”2 Questions like “Effectiveness of systems/processes” violate this principle by asking respondents to evaluate multiple systems simultaneously.

The revised questions break these down into specific systems (payment portal, maintenance requests, rule enforcement), allowing for precise feedback on each component.

Observable vs. Unobservable Criteria

Pew Research Center guidelines emphasize that survey questions should ask about observable behaviors or experiences rather than characteristics respondents cannot directly observe.3 Residents cannot directly observe “ability to manage staff” but can observe the outcomes like “quality of maintenance work.”

The Importance of Context

Gallup, a leader in survey research, stresses the importance of providing adequate context in survey questions.4 Without context, respondents make assumptions about what’s being asked, leading to inconsistent interpretations.

The improved questions provide clear context (e.g., “clarity of written communications” instead of “communication with residents”).

Best Practices for HOA Surveys

Open-Ended Follow-ups

The Community Associations Institute recommends including open-ended questions as follow-ups to satisfaction ratings.5 This allows residents to explain their ratings and provide specific suggestions for improvement.

Balancing Quantitative and Qualitative Data

Effective community surveys balance quantitative ratings with qualitative feedback. According to research by SurveyMonkey, surveys that include both types of questions receive higher completion rates and provide more useful insights.6

Inclusive Distribution

The Foundation for Community Association Research emphasizes that all stakeholders should have the opportunity to participate in community surveys.7 Limiting distribution to billing emails excludes many legitimate stakeholders whose perspectives are valuable.

References

  1. American Association for Public Opinion Research. (2018). Best Practices for Survey Research. https://www.aapor.org/Standards-Ethics/Best-Practices.aspx
  2. Harvard University Program on Survey Research. (2016). Questionnaire Design Tip Sheet. https://psr.iq.harvard.edu/files/psr/files/PSRQuestionnaireTipSheet_0.pdf
  3. Pew Research Center. (2022). Questionnaire Design. https://www.pewresearch.org/methods/u-s-survey-research/questionnaire-design/
  4. Gallup. (2021). Writing Good Survey Questions. Gallup Methodology Blog.
  5. Community Associations Institute. (2019). Best Practices: Governance. Falls Church, VA: CAI Press. https://foundation.caionline.org/
  6. SurveyMonkey. (2020). Survey Methodology: Best Practices. https://www.surveymonkey.com/mp/survey-methodology/
  7. Foundation for Community Association Research. (2018). Community Association Governance. Falls Church, VA: CAI Press. https://foundation.caionline.org/
The Voices That Got Left Out

The Distribution Problem

Beyond the poorly designed questions, I’ve discovered a deeper issue with how the survey was distributed. Not every owner was given the opportunity to participate, creating serious concerns about the validity of the results and denying those owners their voice and agency.

The genuinely baffling part is this was by design.

For example, my spouse, Holly, received the 2/27/2025 email from Board President Scott Timmerman announcing the survey, but she never received the survey invitation itself. After checking her spam folders, she contacted the office to ask why. Timmerman replied to that initial inquiry but did not acknowledge or reply to her follow-up messages.

Holly

Hi Mr. Timmerman,

I have requested twice for the survey to be re-sent. It is possible my emails from my Gmail account are in your spam folder.

I know my email is working because I have received other notifications from you, including the email mentioning that a survey would be arriving. However, I have not received it, and it is not in the spam folder of any of my accounts. Additionally I am able to receive and accept Survey Monkey notifications, so that is not the issue.

Please send the survey. Both of my email addresses are listed below.

Scott Timmerman

Holly,

The survey was sent to the email we have on file for your unit which is: {email address}.  SurveyMonkey indicates that it was sent and received.  We used the emails for the billing system (they were recently updated – rather than the Rise email list), so multiple owners don’t have multiple responses.  My wife was not able to separately fill out a survey either since my email is the billing email.

Holly

Dear Scott,

The handling of this survey has been unacceptable. Nowhere in the original message did it state that only one response would be allowed per household, which led to misleading expectations. Additionally, the fact that my email was on file for communication but not for participation is frustrating and dismissive. My opinion matters, and right now, my opinion is that this process has been poorly executed.

This approach is fundamentally flawed for several reasons:

It silences co-owners. Many units have multiple owners, each with their own experiences and perspectives. By only sending invites to the billing email, many legal owners were effectively told their opinions don’t matter.

It creates inconsistent representation. Some units with two co-owners reportedly received multiple invites while others didn’t. Similarly, owners of multiple units received just one invite despite having potentially different experiences with each property.

It misunderstands the purpose of feedback. The goal of a community survey isn’t to gather one opinion per unit – it’s to understand the experiences of all community members. Each owner, regardless of billing status, deserves a voice.

This distribution method undermines the entire purpose of seeking community feedback. Any decisions made using these results will be based on a skewed and incomplete picture of owner satisfaction.
Why This Matters

It’s Time for Action

The purpose of this survey isn’t just to collect opinions; it’s to help make a major decision about renewing our management company’s contract. Without clear, specific feedback from all owners, our board cannot make a truly informed choice.

Your voice matters in creating the kind of community we all want to live in. If you believe our community deserves a better survey before making decisions about our management company, please reach out to our Board President, Scott Timmerman, requesting a process that truly values every resident’s voice rather than just checking a box.

Be sure to include your own ideas and suggestions on how to improve the process, such as providing alternatives to email-only distribution, investing in a third-party provider to write and administer the survey, and specifying what the survey is designed to measure and which upcoming decisions the information will inform.

Be respectful, concise, and clear in articulating the negative impact this has had on you and your fellow homeowners. You are welcome to use the example language as-is, but research shows that customized messages have the greatest impact, so consider personalizing the message before you send it.