Thursday, July 31, 2014

Balancing Transparency and Discretion

I recently became aware of an evaluation process in an intentional community that raised a poignant question about the balance between transparency and discretion, and what it means to create cooperative culture.

The Back Story
In the case of this particular community there exists a Board of Directors that oversees 501(c)(3) aspects of community operations (the portion of community life that donors can receive tax deductions for supporting), and that Board is composed of a mix of community members and non-community members. This is a relatively common arrangement, designed to ensure both: a) that the community's voice is present in Board considerations; and b) that there are outside eyes making sure that the community's educational activities adhere to mission and long-term goals.

Given that this community's vision is to be a model of sustainable living for the wider culture—as is true for many groups—it further makes sense to include non-community members on the Board because of (hopefully) their ability to see more clearly what it will take to bridge between what the community is offering and what the mainstream is open to receiving, which is no simple thing to navigate.

The Lead Up
In this instance there were two particular positions in the community that were hired by the Board, and it came time to evaluate how well those two people were doing in their roles. For the sake of this story let's cleverly refer to them as Position A and Position B. Having gone through iterations of this before (evaluating how well managers and committees are functioning in their roles) the community had developed some years earlier a standard evaluation form for this purpose. To be clear, this form was an internal document that had not been run by the Board for approval because most roles in the community are not subject to Board review.

That said, when it came to evaluate the people in Position A and Position B, the Board was delighted to make use of the evaluation form and process that had already been vetted and was familiar to the community. The norm in this community is that evaluations proceed thus:

o  The Personnel Committee announces that the evaluation is underway (for a set period of time), and sends an electronic link to the job description and the evaluation form. Note that everyone in the community is given a chance to evaluate job performance: that includes the Board, other managers who work alongside this person as a peer, staff who work underneath this manager, and even people in the community who are only occasionally affected by this person's work. While it's common that only a small number fill out evaluations, the net is cast wide.

o  After the comment period ends, Personnel makes sure that copies of all evaluations are sent to the hiring entity (the Board in this case) and to the person being evaluated.

o  The hiring entity then meets with the person being evaluated and discusses what surfaced in the evaluations and decides how best to proceed.

o  At the end of this face-to-face review, both the person being evaluated and someone representing the hiring entity sign a form indicating that this meeting took place and that all parties have seen and had a chance to discuss the points raised in the evaluations. This signed document then gets turned in to Personnel to become part of that person's permanent employee record—which is kept confidential, accessible only to Personnel, the hiring entity, and the person themselves.

It is important to note that this sequence is spelled out in the evaluation form.

The Train Wreck
When Person A was evaluated, only a small number of people filled out forms. While more participation had been hoped for, the process went smoothly. For the most part the feedback was positive and it was not difficult to discuss the ways in which improvement was desired.

Things did not go so well with the evaluation of Person B, which occurred right after evaluating Person A. The number of people filling out evaluations was again small, but this time there was considerably more critical feedback. When Personnel dutifully passed along copies of the evaluations to the person being evaluated, a couple of Board members blew a gasket.

What was Personnel thinking when it blithely shared raw critical comments with Person B? While Personnel was just doing its job—as delineated in the evaluation process—the Board members who were shocked had apparently not digested how evaluations were done in the community, and at least one of them rued the candor with which they had described Person B's shortfalls in their own evaluation. In fact, it seems the complaining Board members didn't even read the evaluation forms, where the process was laid out. Oops!

In fairness to the upset Board members, they were seeing this through the lens of how things are typically done in the mainstream, where critical comments tend to be summarized (and defanged) before being passed along to the person being evaluated. This simultaneously protects the recipient from being overwhelmed by the bow wave of criticism (however large it is), makes evaluators feel safer in being candid, and makes it less likely that bad blood will result between evaluator and employee.

Going the other way, sanitized feedback is more vague (both in terms of the specifics of what has been challenging, and in terms of how it can often be crucial to know who gave comments in order to frame their meaning properly), which blunts its value. In line with its commitment to direct and honest communication—including the hard stuff—the community has intentionally embraced an evaluation process where feedback is passed along unadulterated. (If you can't say it to their face, don't say it.) If the recipient struggles to take it in, the community will provide support (this is not about treating people as piñatas, letting them dangle in the wind while everyone gets free swings).

Finally, passing along evaluations unedited saves the time it takes to craft a sensitive and balanced summary (no one in community complains that there's too little to do) and neatly eliminates the danger of someone inadvertently seeing the unexpurgated evaluations at a later date, thus defusing what might become time bombs. (And don't tell me that never happens.)

The After Grow
What makes this a compelling story is that no one is wrong and there's considerable tenderness about which road to take. Which path leads to a fuller transmittal of critical information and which leads to its most constructive treatment?

While I applaud the community for bravely setting a high bar for communication standards by embracing direct feedback, there's plenty of room to question whether that quashes the expression of concerns. This is a nuanced conversation that needs to include both an assessment of what's possible now, and what we want to be possible in the future. (If you are not living the change you want to be, how will you ever get there?)

I think the community gets high marks for having a full-featured evaluation process, yet a lower grade for weak responsiveness to the call for evaluations. There is also work for the community to do in bringing the Board into greater awareness about the ways in which the community is expressly trying to be different than the mainstream culture (as well as work for the Board to do in reading forms before they fill them out).

Like a lot of things in community, robust evaluations—ones that are accurate, comprehensive, compassionate, and constructive—are a work in progress.
