This 360-degree employee evaluation survey template gathers feedback from peers, managers, and direct reports across core competencies — communication, leadership, and collaboration. It includes 10 questions with rating scales and open-ended follow-ups, taking about 2-3 minutes to complete. Use it for annual reviews, promotion decisions, or leadership development programs where a single manager's perspective isn't enough.
What Questions Are in This 360-Degree Employee Evaluation Survey Template?
This 360-degree employee evaluation survey template includes 10 questions designed to capture multi-rater feedback across three core competencies — plus context behind every rating. Here's what each question does and why it earns its place:
- "What is your full name?" (text field) — Identifies the evaluator. Some teams make this optional to encourage candor, but named feedback tends to be more specific and useful. If you're worried about honesty, pair this with anonymous survey settings instead of skipping the identifier entirely.
- "Please rate his/her communication skills" (rating scale) — Communication is the competency most likely to surface blind spots. Managers often rate communication higher than peers do — that gap is the whole point of running a 360-degree employee evaluation survey template in the first place.
- "Comments or suggestions" (open-ended — after communication) — The rating tells you there's a problem. This tells you what the problem actually is. Pair open-ended responses with thematic analysis to spot patterns across 20+ evaluators without reading each one manually.
- "Please rate his/her leadership skills" (rating scale) — Leadership ratings from direct reports are the most honest data point in any 360 feedback survey. People know exactly whether their manager supports them — they just don't say it in person.
- "Comments or suggestions" (open-ended — after leadership) — Look for specifics here. Vague praise like "great leader" is noise. Actionable feedback like "doesn't delegate enough" or "avoids hard conversations" is signal.
- "Please rate his/her teamplayer skills" (rating scale) — This catches the high-performer who delivers individually but creates friction for everyone around them. It's the question that most self-assessments get wrong.
- "Comments or suggestions" (open-ended — after teamwork) — Collaboration issues hide in open-ended responses. Run these through sentiment analysis to flag negative patterns before they become retention problems.
- "Will you recommend this individual to a colleague?" (yes/no or scale) — Think of this as a peer-level NPS. A "no" here from multiple raters is a stronger signal than any individual rating score. It cuts through noise.
- "What are his/her strengths?" (open-ended) — Strengths feedback is underrated. Most 360s focus on gaps, but knowing what others value about someone shapes better development plans than a list of weaknesses.
- "What are his/her weaknesses?" (open-ended) — The money question. Compare this across rater groups (peers vs managers vs reports) — if the same weakness shows up from all three, it's real and urgent.
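The cross-rater comparison described in the last bullet can be sketched in a few lines of Python. Everything here is illustrative — the record layout, rater-group names, and scores are invented for the example, not part of the template:

```python
from collections import defaultdict
from statistics import mean

# Illustrative records: (rater_group, competency, score on a 1-5 scale)
responses = [
    ("peer", "communication", 3), ("peer", "communication", 3.5),
    ("manager", "communication", 4.5),
    ("report", "communication", 3),
    ("peer", "leadership", 2.5), ("manager", "leadership", 4),
    ("report", "leadership", 2),
]

def group_averages(responses):
    """Average each competency separately per rater group."""
    buckets = defaultdict(list)
    for group, competency, score in responses:
        buckets[(group, competency)].append(score)
    return {key: round(mean(scores), 2) for key, scores in buckets.items()}

def cross_group_gap(averages, competency):
    """Spread between the highest- and lowest-rating groups for one competency."""
    scores = [v for (g, c), v in averages.items() if c == competency]
    return max(scores) - min(scores)

avgs = group_averages(responses)
# A large gap (here, managers at 4.0 vs reports at 2.0 on leadership)
# flags a blind spot worth raising in the debrief.
print(cross_group_gap(avgs, "leadership"))
```

When the same weakness shows up with a small cross-group gap — every group rates it low — that's the "real and urgent" case the bullet describes.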
How to Customize This 360-Degree Feedback Template for Your Team
This template works out of the box for general competency reviews. But the real value comes when you adapt it to your context. Here's how to do it without overcomplicating things:
- Add role-specific competencies — For engineering managers, add a "technical decision-making" rating. For sales leaders, add "cross-functional collaboration with product." Generic competencies get generic answers.
- Adjust rating scales to match your review framework — If your org uses a 5-point scale for performance reviews, use the same scale here. Mixing a 10-point 360 with a 5-point annual review creates confusion when managers try to calibrate. Use the survey builder to match your existing framework exactly.
- Segment by rater relationship — Add a "What is your relationship to this person?" question at the start. This lets you filter results by peer, manager, and direct report — which is where the real 360-degree insights live. Without this segmentation, you're just averaging opinions.
- Use skip logic for conditional questions — If a rater selects "I don't work closely enough to evaluate," skip the remaining competency questions. Forcing uninformed ratings pollutes your data.
Pro tip: Don't go past 15 questions total. Every question beyond that threshold drops your completion rate by roughly 5-8%. Three competencies with rating + open-ended pairs is the sweet spot for most teams.
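To make that drop-off concrete, here is a rough back-of-envelope model. It treats the article's 5-8% figure as a compounding per-question decline past a 15-question threshold — the model shape and the 80% base rate are our assumptions for illustration:

```python
def estimated_completion(base_rate, total_questions, threshold=15, drop=0.065):
    """Compound an approximate 6.5% per-question decline past the threshold."""
    extra = max(0, total_questions - threshold)
    return base_rate * (1 - drop) ** extra

# At an assumed 80% base completion rate, a 10-question survey stays at 80%,
# while a 20-question survey falls to roughly 57%.
print(round(estimated_completion(0.80, 10), 2))
print(round(estimated_completion(0.80, 20), 2))
```

Five extra questions costing roughly a quarter of your respondents is why trimming beats adding.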
Why Parameter-Level Ratings Beat a Single Overall Score
A lot of 360-degree employee evaluation surveys ask one overall question: "Rate this employee's performance." That's a mistake. Here's why this template breaks feedback into competency-level parameters instead:
- Overall scores hide the story — An employee rated 4/5 overall could be a 5 in communication and a 2 in leadership. The average looks fine; the reality is a coaching emergency. Parameter-level ratings from this 360 feedback survey template surface those gaps.
- Development plans need specifics — Telling someone "you scored 3.8 overall" gives them nothing to work with. Telling them "peers rate your teamwork at 2.9 while managers rate it 4.1" gives them a clear blind spot to address.
- Cross-rater comparison gets interesting at the parameter level — The gap between how peers rate communication and how managers rate it is often the most revealing data point in a 360. You can't see that gap with a single score. Check these splits in survey reports to spot patterns fast.
Teams that switched from single-score to parameter-level 360s report spending 40% less time in calibration sessions — because the data already tells the story.
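The "hidden story" argument above is easy to demonstrate with numbers. A minimal sketch, using invented per-competency averages for one employee — the flagging rule (one full point below the employee's own average) is our assumption, not a standard:

```python
from statistics import mean

# Illustrative per-competency averages for one employee (1-5 scale)
scores = {"communication": 5.0, "leadership": 2.0, "teamwork": 4.5}

overall = mean(scores.values())  # ~3.83 - looks acceptable as a single score
# Flag any competency sitting well below the employee's own overall average.
flags = [c for c, s in scores.items() if s <= overall - 1.0]

print(round(overall, 2), flags)
```

A 3.83 overall passes most review thresholds; the leadership flag is the coaching emergency the single score would have buried.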
Common Mistakes That Ruin 360-Degree Evaluations
Running a 360-degree feedback survey is straightforward. Getting useful data from it is not. Here are the failure modes that trip up most teams:
- Letting employees pick their own raters — This is the fastest way to turn a 360 into a popularity contest. People choose raters who like them. Assign raters intentionally: 2-3 peers, 1-2 direct reports, 1 skip-level manager. That's where honest signal lives.
- Collecting ratings without open-ended context — A rating of "3 on leadership" means nothing without knowing why. This template pairs every rating with a comment field for that reason. If your completion rates on open-ended questions are low, shorten the survey rather than dropping the comment fields.
- Running 360s without guaranteeing confidentiality — If raters think their manager will see their individual responses, they'll give safe, useless feedback. Use anonymous survey configuration and communicate the confidentiality policy before sending. Aggregate responses by rater group (peers, reports, managers) — never show individual answers.
- Using 360 results for compensation decisions — The moment people learn 360 data affects pay, they game it. 360-degree evaluations work best for development, not ranking. Keep them separate from your compensation cycle.
The biggest failure mode of all: running the 360 and never sharing results with the employee. If feedback goes into a black hole, raters stop giving honest answers by the second round. Close the loop within 2 weeks. Always.
Where and How to Distribute This 360-Degree Evaluation Survey
The distribution method shapes response quality more than most teams realize. Here's how to deploy this 360-degree employee evaluation survey template for maximum participation:
- Email surveys — The default for most 360 deployments. Send personalized invitations with the employee's name in the subject line and a clear deadline. Response rates jump 15-20% when you include "This takes 2 minutes" in the email body.
- Web-based surveys — Embed the 360 in your intranet or HRIS portal for teams that already live in internal tools. Works well for large-scale 360 programs where you're evaluating 50+ employees in the same cycle.
- Slack or Teams integration — For pulse-style 360s (quarterly or monthly), push survey links directly via Slack. Raters are more likely to respond when the request shows up where they already work, not buried in their inbox.
Pro tip: Set up feedback alerts to notify HR when a rater submits a score below 2 on any competency. Don't wait until the review cycle ends to spot serious issues — flag them in real time.
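That alert logic amounts to a simple post-submission check. The sketch below is a hypothetical implementation — `notify_hr` is a placeholder for whatever channel your HR team actually uses, and the data shapes are assumptions:

```python
def notify_hr(message):
    """Placeholder - wire this to email, Slack, or an HRIS webhook."""
    print(f"[HR ALERT] {message}")

def check_submission(employee, rater_group, scores, threshold=2):
    """Return competencies rated below the threshold and alert HR for each."""
    flagged = [c for c, s in scores.items() if s < threshold]
    for competency in flagged:
        notify_hr(
            f"{rater_group} rated {employee}'s {competency} "
            f"at {scores[competency]} - review before the cycle ends."
        )
    return flagged

check_submission("J. Doe", "peer", {"communication": 4, "leadership": 1})
```

Running the check at submission time, rather than batch-reviewing at cycle end, is what makes the flag actionable.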
Closing the Loop — Acting on 360-Degree Feedback
Collecting multi-rater feedback is the easy part. The hard part — and the part most teams skip — is turning it into action.
- Share results within 2 weeks — Delayed feedback loses context. The longer you wait, the less the employee connects the feedback to specific behaviors. Use AI-powered feedback analytics to auto-summarize themes from open-ended responses so you can deliver results faster.
- Present results by rater group, not individual — Show the employee their peer average, report average, and manager scores separately. The gaps between groups are the most valuable data points. "Your peers rate your communication at 3.2 while your reports rate it 4.5" is a specific, coachable insight.
- Build a development plan with 1-2 focus areas, not 5 — If the 360 surfaces 4 weaknesses, pick the 2 that matter most for the employee's role and growth path. Spreading effort across everything means improving at nothing.
- Re-run the 360 in 6 months — One-time 360s are snapshots. Repeated 360s show whether coaching is working. Set up recurring surveys to track competency scores over time without manually re-launching each cycle.
The goal isn't a perfect score from every rater. It's a trend line that shows growth in the areas that matter.
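A minimal way to express that trend line, assuming each cycle's results are stored as per-competency averages (the cycle labels and scores here are invented for illustration):

```python
# Per-competency averages from two 360 cycles, six months apart (1-5 scale)
cycles = {
    "2024-H1": {"communication": 3.1, "leadership": 2.8, "teamwork": 3.9},
    "2024-H2": {"communication": 3.6, "leadership": 3.4, "teamwork": 3.8},
}

def deltas(before, after):
    """Score movement per competency between two cycles."""
    return {c: round(after[c] - before[c], 2) for c in before}

change = deltas(cycles["2024-H1"], cycles["2024-H2"])
# Positive deltas on the coached competencies show the plan is working;
# a flat or negative delta says revisit the development plan.
print(change)
```

Here the two coached areas moved up while teamwork held steady — exactly the growth-in-the-areas-that-matter signal the paragraph above describes.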
Related Employee Survey Templates
Depending on what you're evaluating and when, these templates complement a 360-degree review program.
360-Degree Employee Evaluation Survey Template FAQ
What is a 360-degree employee evaluation survey?
A 360-degree employee evaluation survey collects performance feedback from multiple sources — peers, direct reports, managers, and sometimes clients — rather than relying on a single manager's assessment. It typically covers competencies like communication, leadership, and collaboration using rating scales paired with open-ended questions.
How many questions should a 360-degree employee evaluation survey template include?
Keep it between 10-15 questions. This template uses 10 — three competency ratings with open-ended follow-ups, plus identifier and recommendation questions. Every question past 15 drops completion rates by 5-8%, and incomplete 360s create biased data that's worse than no data at all.
Should 360-degree feedback surveys be anonymous?
Confidential, not anonymous. Raters should know their individual responses won't be shared with the employee — but HR should be able to identify respondents in case of concerning feedback. Aggregate results by rater group (peer, report, manager) so patterns are visible without exposing individuals.
How often should you run 360-degree evaluations?
Twice a year works for most organizations — once during the formal review cycle and once as a mid-year development check. Running 360s quarterly creates rater fatigue and rarely produces meaningfully different results between cycles. If you need faster feedback loops, use a pulse survey between 360 cycles instead.
Who should be included as raters in a 360-degree feedback survey?
Include 2-3 peers who work closely with the employee, 1-2 direct reports (if applicable), the employee's direct manager, and ideally one skip-level or cross-functional stakeholder. Don't let employees choose their own raters — that introduces selection bias and defeats the purpose of multi-source feedback.
Can you use 360-degree survey results for promotion decisions?
You can, but carefully. 360 data works best as one input alongside performance metrics and manager assessment — not as the sole deciding factor. If employees learn that 360 scores directly determine promotions, they'll lobby raters for favorable reviews and the data quality collapses within two cycles.
What's the difference between a 360-degree evaluation and a standard performance review?
A standard performance review captures one person's perspective — usually the direct manager's. A 360-degree employee evaluation survey collects feedback from multiple relationships, revealing blind spots that a single-source review misses. The most common gap: managers rate leadership higher than direct reports do.