11 Good Reasons Why 360 Degree Feedback Simply Doesn’t Work

Photo by Dreamstime

You might have read about something called 360-degree feedback. Depending on who you read, it gets good, bad, or ugly reviews.

People generally agree performance feedback is a good thing, so what goes wrong? How can feedback from multiple raters possibly be a bad thing?

Why do organizations generally toss it out after a few tries? After the initial shock and awe, why does it usually die on the vine? The reasons are quite simple.

11 critical issues with 360 degree feedback

  1. 360 feedback is silly — Think about it: If you want to get information about your performance, why send a questionnaire to people who don’t know you, don’t care, or don’t see your performance? Isn’t it better to ask customers about customer behavior, subordinates about subordinate behavior, peers about peer behavior, and so forth? You will get better data using a few narrow surveys designed for targeted audiences than one great big one-size-fits-all survey.
  2. Too much, too fast — Ask anyone who ever received a 360 report and they will probably tell you the amount of data was so overwhelming it took a half-day workshop to explain. Changing behavior is difficult. We might want to change ourselves, but everyone else wants us to stay the same. In addition, have you ever tried to develop more than one or two behaviors at a time? Keep it simple.
  3. It’s irrelevant — Why ask someone to rate things that have nothing to do with job performance? If being inquisitive is not important to job performance, then leave it off the survey. The surest way to confuse employees is to ask questions and give feedback about silly items. Base survey items exclusively on job specifics.
  4. Guesstimates — This part is really irresponsible survey design. Some surveys suggest not rating items that don’t apply to the job or about which the rater is unsure. So far, so good — but what about rater subjectivity? Do we need to statistically analyze the data for agreement between raters? Calculate meaningless averages? Asking raters to score behaviors they cannot frequently see or hear produces error-filled data.
  5. Reward and punish — Now, here’s a great way to torpedo feedback. As soon as people learn feedback is used to either punish or reward, the system is toast. Keep feedback on a developmental level. Reward and punish based on job performance. Treat feedback as the road to better performance, and rate the two separately. If your organization cannot separate the two, your 360 will either be dead in the water or encourage vicious infighting.
  6. Organizational bias — External consultants are not experts in any single organization. On the other hand, experience across multiple organizations allows them to recognize strengths, weaknesses, and differences. The trend I see most is “no one here is below average” or, just as silly, “everyone here is seriously lacking.” This is management craziness. In every organization, people range from the top of the food chain to the bottom. Useful feedback is honest, objective, and free of organizational dogma.
  7. Not job related — What do you think happens when people are asked a lot of questions unrelated to someone’s job performance? Right — you get a lot of answers unrelated to doing the job. The result encourages confusion, frustration, and a bunch of ticked-off people. This is a great outcome if you planned to waste someone’s money and time answering surveys. If you want to avoid that problem, either do a formal job analysis to identify specific competencies, or at least talk to the subordinate and his/her manager to get some ideas of what to survey.
  8. Poor planning — It’s simple: Never ask a question if you are unprepared for the answer. Know ahead of time what developmental plans apply to each survey item. Don’t plan? Then don’t ask. And, don’t think recommending a list of books and resources will suffice. It takes an unusual person to engage in self-development. It’s much easier to bury bad news in the bottom drawer and hope it goes away.
  9. Crummy items — Asking questions about someone’s cognitive ability is sure to yield worthless results. Are you asking about intelligence? Accommodation to external factors? Based on what alternatives? Who is best positioned to evaluate the quality? Get the picture? As a general trend, you cannot go wrong designing items following the S.M.A.R.T. goal setting principles. If it’s not specific, measurable, attainable, realistic, and time-bound, then it’s D.U.M.B.
  10. No involvement — I know this is a radical idea, but management is more than a title. It’s a responsibility that involves guiding, coaching, and developing subordinates. Any 360 should be a joint activity between coach and subject, again, using the S.M.A.R.T. principles. Your feedback program is short-lived if the people who benefit most are not involved.
  11. Uncoordinated — Which do you think works better? An organization-wide initiative to accomplish a single goal, or one encouraging everyone to do their own thing? I suggest choosing some kind of common goal such as teamwork, better problem solving, initiative, creativity, goal setting, and so forth. It really doesn’t make a difference as long as everyone is in it together. Within the umbrella item (e.g., teamwork), each individual employee gets to choose the job-specific teamwork elements making the greatest difference to him/her. Group involvement is a great way to encourage group development — and it makes life much easier for the training department.

From a great idea to life support

Let’s summarize: Common sense and best practices require shared group support; manager and subordinate working together; isolating a specific area that affects a specific audience; developing a few critical S.M.A.R.T. items; gathering honest and unbiased feedback; using the data to support planned development activities; developing the skills; and follow-up surveys to check for progress.

Too much work? Management would resist it? You always have a fallback position.


Buy a great big survey with fuzzy generic items that apply to everyone; deliver it up, down, and sideways; summarize the results in a huge report; send subordinates back home with a page of developmental resources; and misuse the results in performance reviews. The first year it will be hailed as a great idea.

The second year, however, it will be on life support. In three years it will be dead, and you will have wasted tens of thousands of dollars, managers and subordinates will be thoroughly irritated with you, and your professional reputation will get another hit.


14 Comments on “11 Good Reasons Why 360 Degree Feedback Simply Doesn’t Work”

  1. Dr. Williams, after reading this post, it seems that the only issue specifically related to 360 degree feedback is #2. All of the other identified issues seem like they could relate to ANY feedback method that is poorly designed. I’ve seen 360 feedback work (this was a small organization of 17), but it was designed well and had great execution. I think that this post might have been better targeted to improvement of feedback methods and performance review in general, and not just singling out 360 feedback. That being said, the issues you listed and the ideas to improve them are very sound.

    1. Actually, Kelli…yes, these are important for all feedback, but if you think about it, they are MOST important for 360 because there is no one there to explain, clarify, or answer questions.

  2. Nothing on that list has to be true or a problem.  The author suggests every 360 survey strategy is the same. Yes, there are “consultants” who take the easy path to make a buck but with adequate planning those problems could easily be avoided.

  3. 360s are like lawnmowers. They are invaluable to most people. But because some boob once misused one, now they come with warning stickers. Make 360s optional for your leaders. Give them to the managers who sincerely want to use objective feedback for development and self-regulation. Stop using them as interventions to single out leaders with derailing behaviors. (For those, just politely ask them to stop being a blowfish.)

  4. Bravo. 360 used evaluatively rather than for development only defeats the purpose entirely and poisons the water for everyone involved. Similarly, dropping a complex 360 report on anyone’s desk and walking away is felonious.

    A well-designed 360 delivers pointed, highly useful information and should be a part of every organization’s developmental tool kit. I like instruments with the flexibility to deliver a core set of statements to all audiences and targeted statements to individual perspectives – this addresses your point #1. It is an art in my opinion to balance brevity and depth while maintaining focus.

    Assuming organizations are smart enough to get everything else right, your point #6 about avoiding organizational bias is crucial, and perhaps the most difficult for many to master. Frankly, all of this seems simple and obvious, and I suspect that’s part of what makes it difficult when it comes time to execute.

    Provocative title, provocative article. I like it!

  5. Dr. Williams, I agree with your article and issues around 360 feedback; however, isn’t the SMART principle more task- and objective-driven, with no E.I. evaluation?

    1. Good question…When setting goals, the SMART acronym emphasizes how to achieve clarity and objectivity. I think this applies just as well for 360 questions. After all, don’t you want to be Specific and evaluate things that are Measurable, Achievable, Realistic, and Timely?  

  6. Poorly designed and executed 360s can be horribly harmful. When they are properly thought out, aligned to specific objectives, and executed properly, though, they can be invaluable. Are they right for pan-enterprise deployment? I don’t necessarily believe so. Are they good tools for examining the underlying effects of a poor-performing team or leadership? I believe so.

  7. This posting takes too narrow a view of 360 by generalizing to all programs based on bad experiences with some. Like ANY process, 360 feedback can be done poorly and wastefully. I notice that Wendell specializes in selection. You could say hiring tests are a waste too… if you implement them poorly. Selection tests don’t work at all… if you give poor instructions, score them wrong, measure irrelevant content, etc. This list of 11 things about 360 reads to me more like a cautionary tale about how 360 can be done poorly. Unfortunately, the author doesn’t recognize that it can be done well. This is why my firm works so hard to educate clients about how to design their 360 program so it works well and doesn’t fall into these design traps. Check us out: http://www.3dgroup.net. None of our clients fall into these traps… because we help avoid them.

    1. I’m not exactly sure what point you are trying to make. I’m a psychometrician…someone who specializes in identifying job skills and measuring them…That can apply to a lot of things…Aside from using my article to promote your own product, exactly which 360 recommendation do you think we can do without? Job relatedness? Specificity? Objective questions? Group and managerial support? Audience-specific surveys? Integrated developmental plans?

  8. Wendell, I’m happy to be more specific and thanks for asking.  Let’s keep it simple.  We’ll start with your items #3, #7 and #9.  Say your co-workers give you low ratings on the item “follows through on commitments.” This is a generic item, right? But wouldn’t you say this is part of every job?  Is it job related?  Well yes, but it isn’t on a job description.  On the other hand, knowing that your peers feel you are not reliable and that you frequently don’t deliver on your commitments to them is incredibly valuable.  Often this happens when you have had no clue about the shortcoming because no one was willing to tell you to your face.  That’s what 360 feedback does so well – it gives people a chance to help you learn how to be more effective with your fellow employees. But wait, you say, you do follow through on commitments, because your direct reports gave you high marks (I know, psychometricians who are not familiar with multi-source data love to think of this as poor reliability, when in fact it is merely a valid rating of behavior that varies by group).  The fact that your direct reports (and maybe even your boss) rated you high on this item may mean you only need to improve your follow through with peers. Perhaps you spend so much time worrying about your boss’s impressions that you neglect your peers and become a bottleneck for firm-wide productivity when they have to rely on your inputs for their work.  So, is helping you to get better at follow through with your peers valuable for your career and your company?  I would say yes.  Testing for cognitive ability won’t help, but properly designed and implemented 360 feedback will.

  9. What exactly does 360-degree feedback mean in terms of work management? The idea seems to be that even if work starts with one individual, it comes back to that same individual after certain processes. What I take from this post, though, is that the strategy will not work for every work-management operation that needs to be synchronized and maintained.

    Take, for example, work management in a service-based industry: a salesperson gets a lead, the overall project is processed by experts who get things done, and after successful completion the delivery is made to the client by some other team. No doubt the work begins and ends with the client, but the management of it is not really 360 degree, because initiation and completion are handled by two different teams. Moreover, the process flow should be perfect and up to date.

    For this kind of procedure, deploying online web-based tools is recommended. With all this in mind, we opted for Replicon’s cloud-based task management tools ( http://www.replicon.com/olp/task-management-software.aspx ), which have made our work management much better.
