Administration Evaluation Committee (AEC)

Report

to the University Senate, Senate Assembly and SACUA

on the Administration Evaluations carried out in Fall 2004

Prepared by the AEC members:

Richard Alfred

Dan Burns

Louis D’Alecy

Sugih Jamin

John Lehman (Chair)

Keith Riles

Brett Seabury

Wayne Stark

Michael Thouless

Pete Washabaugh

Don Winsor


March 15, 2005


HYPERLINKS TO REPORT SECTIONS

A. Summary and Recommendations

B. Background to Faculty Evaluation of Administrators

C. Committee Operations

D. Questionnaire Construction

E. Information Technology Implementation and Data Handling

F. Reporting Practices

G. Results of Year 1 Evaluations: Ann Arbor Campus

H. Feedback to AEC

I. Response to feedback and possible improvements

J. Recommended Action Items for Senate or Senate Assembly including Rationales


A.  Summary and Recommendations

In accordance with a charge enacted by the University of Michigan Faculty Senate in March 2004, the Senate and Senate Assembly conducted an evaluation of university administrators during December 2004.  A total of 864 individuals participated in the evaluations (28% of all eligible) and 2511 evaluation forms were submitted (20% of all possible).

In adherence to the timetable stipulated by the Senate, the Administration Evaluation Committee (AEC) of the Senate developed electronic questionnaires, designed and developed software and information systems, and instituted reporting practices novel to faculty governance for the purposes of this evaluation.  Experience with, and feedback received from, the inaugural round of evaluations have been positive and constructive, and future evaluation efforts should be beneficial to faculty governance. 

This report describes the structure and operation of the AEC and the information technology and management systems developed, as well as key results of the evaluations. 

Critical findings identified across all units are as follows:

Complete evaluation results for all administrators are available to all University Senate members at http://aec.umich.edu

The AEC developed a set of recommendations for action by the Senate Assembly, listed in section J of this report.  Among these are the following matters of timing:

In addition, the AEC requests guidance from the Senate Assembly or the Senate about the expected degree of openness in reporting evaluation results.  Presently, the annual results published on the AEC website are accessible to Senate members and Regents of the University of Michigan only.  Other possibilities include permitting access to a larger share of the U-M community, or unrestricted access. 

Back to Top

B.  Background to Faculty Evaluation of Administrators

On 15 March 2004 the University Senate of the University of Michigan mandated periodic evaluation of the university administrators by the faculty.  During Fall Term 2004 the Faculty Senate performed its first evaluation of administrative officers at the Ann Arbor campus, and distributed the results in January 2005.  The president, provost, deans, and department chairs were subject to evaluation by Senate members within their respective units. 

The call for evaluation of U-M administrators began as a grassroots effort stemming from dissatisfaction with the status quo.  That dissatisfaction arose from a lack of accountability by administrators to the governing faculty of the university, and was articulated in a Faculty Perspectives Page article (‘Administrative Accountability,’ The University Record, 12 Jan 04). 

Actions ordered by Resolution of the University Senate on 15 March 2004 included the following:

  1. Creation of a system of periodic evaluation by the faculty of administrative officers including the president, chancellors, provosts, deans, and department chairs.
  2. Creation of a standing committee (Administration Evaluation Committee, or AEC), chaired by the Secretary of the University Senate.
  3. Delegation of authority and responsibility to the AEC for:

(a)   Development of Evaluation Questionnaires;

(b)   Implementation of the evaluation as a web-based procedure that must begin operation by Fall of 2004;

(c)   Reporting the results to the Senate and the U-M Board of Regents.

The Senate articulated the goals and timeline, and charged the Senate Assembly with the specific task of creating a committee to carry out the work.  The Senate Assembly meeting of April 2004 was ruled to be short of a quorum; hence, the Assembly was unable to populate an AEC as required by the University Senate.  The Senate Assembly agenda of September 2004 did not include attention to the AEC.

Elected faculty governance members and faculty governance volunteers formed a provisional AEC over the summer of 2004 to help fulfill the charge from the University Senate.  The structure of the provisional AEC work groups paralleled the actions required by the University Senate.  In October 2004 the Senate Assembly voted to ratify the plans developed by the provisional AEC and installed its members formally as the inaugural AEC.

AEC Structure and Membership

Questionnaire Work Group

Richard Alfred (Education) 

Keith Riles (LSA)

Brett Seabury (Social Work)           

Michael Thouless (Engineering)

IT Work Group

Sugih Jamin (Engineering)

Wayne Stark  (Engineering)

Don Winsor (Engineering)

Reporting Work Group

Louis D’Alecy (Medicine)

Dan Burns (LSA)

Pete Washabaugh (Engineering)  

Back to Top

C. Committee Operations

Division of responsibilities permitted the work groups to function semi-autonomously with general coordination by the committee chair.  Questionnaire and IT groups interacted at the stage when the questions had to be integrated into the electronic instrument.  Reporting and IT groups interacted to implement the desired reporting formats and distribution of information.

The full committee met on three occasions to review status and make strategic or operational decisions that affected overall committee function.  These decisions involved, for example, specific dates for the evaluation, specific dates for release of evaluation results, and dates for communications to Senate members and administrators.

Back to Top

D. Questionnaire Construction

The questionnaire subcommittee was charged with developing a draft set of questions for evaluating administrators at four levels: President, Provost, Dean, and Department Chair.  The Dearborn and Flint Chancellors were to be treated at the same level as the Ann Arbor Provost, but the Year 1 round of evaluations was confined to the Ann Arbor campus.  

The subcommittee first met 23 June 2004 to discuss broad issues and candidate questions. The subcommittee then exchanged follow-up questionnaire drafts iteratively by e-mail and met again on 26 July to finalize questions and recommendations.

At its first meeting, the subcommittee agreed to the following general principles:

·        Questions should be worded so that responses guide administrators toward self-improvement.

·        Statistics on responses should be public and detailed.

The subcommittee also decided to divide the questions into four categories:

 The responses to categories 3) and 4) would be provided to the administrators being evaluated through an automated system, to preserve confidentiality.

Among the questionnaires, the following core questions were to be common to all administrators:

• [administrator] actively promotes an environment for scholarly excellence

• [administrator] actively promotes an environment for teaching excellence

• [administrator] consults the faculty adequately before making important decisions

• [administrator] makes excellent administrative appointments

• [administrator] inspires confidence in leadership overall

The final question was meant to serve as an overall assessment of the administrator’s performance after the specific questions above it had been considered. Based on feedback received from participants in the inaugural evaluation round, future iterations of these questions may change somewhat to sharpen their diagnostic value. 

There were other core questions common to some but not all administrators:

There were two core questions unique to administrative positions:

Topical questions were defined for all administrative levels, except the Department Chair:

Feedback received during the inaugural round indicated that many participating faculty found the diversity question confusing, and that the placement of “About right” in the middle of the possible choices made median response values harder to interpret than for other questions. The wording of the question and its response choices will be revised in future rounds in response to this faculty feedback.

The subcommittee recommended to the AEC that each administrator be given two weeks’ notice of the planned core and topical questions each round, before the administrator’s own optional supplementary questions are due. Feedback received during the inaugural round also indicated that some participating faculty would welcome, in addition, a statement of goals and accomplishments from each administrator.

Concerning question development for future evaluation rounds, the subcommittee recommended that faculty be invited to suggest new topical questions to the AEC for the President, Provost, and Deans. It may be appropriate in the future to consider unit-customized topical questions. The subcommittee is not confident, however, that the AEC is well positioned to consider topical questions at the Department Chair level.

Back to Top

E.  Information Technology Implementation and Data Handling

E1. General design goals

Four design goals were emphasized for the initial implementation of the AEC on-line evaluation system: ease of use, anonymity, security, and minimal cost.  Concerns that any unnecessary burden placed on the users of the system would discourage participation made ease of use a primary goal.  Because many faculty members indicated that any breach of anonymity would discourage participation, strong guarantees of anonymity were also made a primary design goal.  Security was a primary goal to maintain confidence in the integrity of the results.  The goal of minimal cost was imposed by the limited resources (volunteer time and computer hardware) for the initial implementation.

 E2. Implementation

To produce a working prototype on time, existing infrastructure and software were used whenever possible.  A web solution was designed based on the existing CoSign infrastructure.  CoSign is an authentication system for secure web services, developed at the University of Michigan as part of the National Science Foundation Middleware Initiative (http://www.umich.edu/~umweb/software/cosign).  CoSign is now in production use at U of M for a variety of secure services, including access to course rosters, personnel and payroll information, and other sensitive data.  Without such a central authentication system, the AEC would have needed to issue and manage accounts and passwords for over 3,000 Senate members, which would have been impractical with the resources available.

The selection of CoSign influenced other technical design decisions.  For secure data transmission, https (Hypertext Transfer Protocol over Secure Socket Layer) was selected.  This is a widely used standard for transmitting confidential data such as credit card numbers or other sensitive financial data, and it is a prerequisite for the CoSign system.  A Sun Solaris system running Apache web server software was chosen for the web site, as this is currently the best supported web server platform for the CoSign system.

The choice of programming languages for the implementation of the system was made with several criteria in mind.  Widely used languages were preferred to make it relatively easy to hand off operation and maintenance of the system to new people.  Open source systems were preferred for several reasons, including positive experience with the security of open source systems (and negative experience with proprietary systems), the desirability of having a completely open end product that would be straightforward for independent technical experts to review, and the lack of funding to purchase proprietary software.  With these criteria in mind, three high level scripting languages were selected for components of the system.  The website was implemented in PHP (http://www.php.net), supporting tools to prepare configuration data for the website and to process results were written in Python (http://www.python.org), and tools to prepare the faculty affiliations database were written in TCL (http://tcl.sourceforge.net).  All three of these programming languages are free, open source projects that are widely used in both academic and commercial environments.

The software developed for the initial system consists of a web server component plus a number of supporting software tools.  The web server component is a module written in the PHP scripting language that provides all of the functionality necessary for the web based questionnaire system.  Although it is fairly compact (426 lines), it includes functions to support all of the following tasks:

Several additional pieces of support software, not running on the web server, were also developed:

For hardware, a Sun Microsystems Sun Blade 1000 workstation with a single 750 MHz UltraSPARC-III processor and 512 megabytes of memory was selected.  This choice was made based on what was already available, as there was no funding to purchase a dedicated server for the initial run of the AEC system.  The machine chosen was a multipurpose Sun server in the EECS department with some spare capacity, and preliminary work had already been done to set up a CoSign based secure web server for an EECS student honor society.  The student honor society did not have an immediate need for the server during the fall 2004 academic term, so it was made available to the AEC.

E3. Anonymity and security

The goals of achieving both anonymity and security for the AEC evaluation system inherently conflict.  This section gives an overview of the methods used to adequately address both concerns.  Security is necessary to ensure the integrity of the statistical results from the evaluation process.  Questionnaires should be accepted only from authorized faculty members, and each faculty member should evaluate each administrator only once.  This is directly analogous to an election in which controls are needed to ensure that only registered voters can vote and to prevent ballot box stuffing.

Conflict arises between security and anonymity because security can only be achieved if the identity of the participant is known, while anonymity depends on not knowing this.  To resolve this conflict, the AEC system was designed to model a version of a traditional election system with paper secret ballots.  In such an election system, when a voter enters the polling place, one election worker checks identification and verifies presence on the voter registration list.  Once this is done, the election worker issues the voter a numbered paper ballot and records the ballot number. The ballot number is on a removable perforated edge of the ballot.  The voter takes the ballot to a private voting area and completes it.  The completed ballot is then placed in a privacy envelope with only the ballot number exposed and the voter gives it to another election worker.  This election worker tears off the ballot number and transfers the now anonymous ballot from the privacy sleeve into the ballot box without looking at its contents.  The detached ballot number is given to the election worker maintaining the voter list.  This is used to cross that voter off of the list of eligible voters, so that nobody can vote twice.

The electronic system follows this model fairly closely with some minor modifications.  In particular, each administrator questionnaire is treated as a separate ballot.  This allows a faculty member to evaluate different administrators in separate sessions.  Faculty members do not need to reserve a large enough block of their time to be able to complete the evaluations for all of their administrators in a single session.
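
The following minimal Python sketch illustrates the ballot-number model described above (Python was one of the languages used for the AEC’s supporting tools; the production web component was written in PHP).  The class and variable names are hypothetical and are intended only to show the separation between checking eligibility, issuing numbered ballots, and collecting anonymous contents.

import itertools

class BallotIssuer:
    """Checks eligibility and hands out numbered ballots, one per administrator."""

    def __init__(self, eligible_faculty):
        self.eligible = set(eligible_faculty)
        self._numbers = itertools.count(1)
        self.issued = {}  # uniqname -> set of ballot numbers issued to that person

    def issue(self, uniqname):
        if uniqname not in self.eligible:
            raise PermissionError("not an eligible Senate member")
        number = next(self._numbers)
        self.issued.setdefault(uniqname, set()).add(number)
        return number


class BallotBox:
    """Accepts completed ballots; keeps contents only, with the number 'torn off'."""

    def __init__(self):
        self.used_numbers = set()    # master checklist of used ballot numbers
        self.anonymous_ballots = []  # ballot contents with no identifying information

    def submit(self, ballot_number, contents):
        if ballot_number in self.used_numbers:
            return False             # duplicate: reject; the original is unaffected
        self.used_numbers.add(ballot_number)
        self.anonymous_ballots.append(contents)
        return True


# Example: one faculty member evaluates one administrator, then tries to resubmit.
issuer = BallotIssuer(["jdoe", "asmith"])
box = BallotBox()
number = issuer.issue("jdoe")
print(box.submit(number, {"administrator": "dean", "q1": 4}))  # True
print(box.submit(number, {"administrator": "dean", "q1": 4}))  # False (rejected)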

E4. Overview of security implementation

When a person enters the secure area of the AEC web site, they are first redirected to the server that the University of Michigan ITCS (Information Technology Central Services) runs for CoSign secure web authentication.  The AEC web site has been registered with the ITCS CoSign server as a valid secure service within the University of Michigan.  The AEC purchased a web server security certificate to identify the server in an authenticated way to the user's web browser.  This makes the system resistant to someone bringing up a phony web server that claims to be the authentic AEC web site.  The $50 cost of the security certificate was covered by a faculty member.

Once the uniqname has been authenticated by CoSign, the user is returned to the AEC web server using a connection encrypted with SSL, to prevent outside eavesdropping on the content of any questionnaires.  At this point the web server has assurance that it has a secure connection to a properly authenticated faculty member.

Internal to the AEC web system, uncompleted questionnaires have numbers associated with them, analogous to the printed numbers on the tear-off strip at the end of the paper ballots.  There is a master checklist of which questionnaire numbers have been used.  This checklist is implemented as a single file on the AEC web server so that it is impossible to use file timestamps to determine the time at which any particular questionnaire was used.  Eligible faculty members have each been assigned a small block of these questionnaire numbers sufficient to cover all of their administrators plus a small number of spares that can be activated should a missing affiliation be reported during the evaluation process and verified by the AEC.

When a faculty member accesses the web page with the menu of administrators to be evaluated, the system looks up the person’s block of questionnaire numbers and consults the master checklist to see which of those numbers are still unused.  When a faculty member completes a questionnaire and submits it, the system verifies that the questionnaire comes from the authenticated person and that its questionnaire number has not already been used.  Any forged or duplicate questionnaire is rejected and immediately discarded, and the user sees an error message indicating that this particular administrator has already been evaluated.  Although this situation is unlikely to arise in normal operation, a user could cause it by using the browser’s back button after submitting a questionnaire and trying to submit it again.  If a questionnaire is determined to be valid, the web server immediately deletes all identification information from it (i.e., the uniqname of the participant and the questionnaire number) and places it into a folder of collected questionnaires.  Immediately afterward, the questionnaire number is checked off in the master list so that it cannot be reused.  Note that if a questionnaire is rejected as a duplicate, the original questionnaire submitted by that faculty member for that administrator is not affected; it is already in the anonymous collection folder, and there is no way to identify it specifically.
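
As a concrete illustration of the lookup described above, the short Python sketch below returns the administrators a faculty member may still evaluate, given that person’s assigned block of questionnaire numbers and the master checklist of used numbers.  The function and variable names are hypothetical; the production lookup is part of the PHP web module.

def remaining_questionnaires(assigned_block, used_numbers):
    """assigned_block maps administrator -> questionnaire number for one faculty member."""
    return {administrator: number
            for administrator, number in assigned_block.items()
            if number not in used_numbers}

# Example: this person has already evaluated the dean (questionnaire number 1042).
assigned = {"president": 1040, "provost": 1041, "dean": 1042, "chair": 1043}
used_numbers = {1042}
print(remaining_questionnaires(assigned, used_numbers))
# -> {'president': 1040, 'provost': 1041, 'chair': 1043}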

One concern identified when the system was designed was the possibility that file timestamps maintained by the server operating system could be used to defeat the anonymity of the system by correlating file modification times for the questionnaires with times from web server or CoSign system log files.  To prevent this, an auxiliary program was implemented on the server to periodically shuffle the collected questionnaires, destroying any record of when or in what order they were received and defeating any later attempt to use file timestamps to determine which questionnaire came from which faculty member.
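
A minimal sketch of such a shuffler appears below, assuming the collected questionnaires are stored as individual files in a single folder (the folder layout and names are illustrative, not the actual server configuration).  Rewriting every file under a fresh random name gives all of the files new, essentially identical timestamps and destroys any correlation with submission order.

import os
import random
import uuid

def shuffle_collected(folder):
    """Rewrite each collected questionnaire under a new random name, resetting timestamps."""
    names = [n for n in os.listdir(folder) if os.path.isfile(os.path.join(folder, n))]
    random.shuffle(names)
    for name in names:
        path = os.path.join(folder, name)
        with open(path, "rb") as f:
            contents = f.read()
        os.remove(path)
        new_path = os.path.join(folder, uuid.uuid4().hex)
        with open(new_path, "wb") as f:  # the new file gets a fresh modification time
            f.write(contents)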

At no time does the system maintain a list of uniqnames of those who have actually submitted questionnaires.  Such a list could be partially inferred from web server logs, but there are a number of limitations to this.  ITCS logs the authentications through CoSign, but those logs are kept in a highly secure area, and the CoSign system keeps absolutely no record of what was actually done; none of the questionnaire material ever passes through that system.  All it does is provide verification of identity, analogous to checking voter IDs as voters enter the polling place.  Even if someone obtained the logs, they would be of limited use, since they would be cluttered with all the other uses of CoSign (faculty access to CourseTools sites, faculty and staff access to payroll and HR data, faculty and staff access to Wolverine Access course lists, etc.).  If a particular person’s name appeared in the CoSign logs, that would be no proof that the person submitted any AEC questionnaires.  The only statements that could be made from the CoSign logs with certainty would be negative ones: if someone never used any U of M secure web services, this could be verified by their complete absence from the logs.  On the AEC web server the situation is similar, except that only accesses specific to the AEC site would be recorded.  From these logs also, only negative conclusions could be drawn with certainty. 

The single most security sensitive item concerns the internals of the AEC server.  While it is up and running, there exists both a list of the assignments of questionnaire numbers to faculty members and a list of which questionnaire numbers have been used.  As a matter of security policy, we never keep an actual list of uniqnames that have participated.  Given both the questionnaire numbers and the master list of used numbers, it would be technically possible to reconstruct such a list of uniqnames, but we intentionally never do this.  Immediately at the end of the evaluation process, the list of used questionnaire numbers and all web server logs for the AEC web server were destroyed.  This was done before any results were analyzed or reported.  Because these items have been destroyed, nobody will ever be able to say with certainty who actually submitted questionnaires, aside from obvious inferences that can be made in situations such as when a department gets a 100% or 0% response rate.

After the close of the evaluation period, only the folder of collected questionnaires was preserved for use in the report generation.  These questionnaires identify the administrator being evaluated, and in the cases where reports are broken out by unit, the originating unit is also identified.  Neither the uniqname of the person responding nor the questionnaire number is ever recorded in a saved questionnaire.  Once the results were tabulated and the private questions and comments were transmitted to the administrators, the questionnaires were also destroyed.  Thus, no information remains anywhere that could ever be used to break the anonymity of the system.

E5. Future software recommendations

Several lessons learned in developing the initial system should be applied to improve the next generation version:

Because the CoSign authentication system is an integral part of the AEC questionnaire system, maintaining good compatibility with CoSign is critical for future versions of the system.  Although Sun Solaris is currently the preferred hosting platform for CoSign authenticated services, this may change.  Support for Linux is increasing steadily at the University of Michigan, and it may emerge as the preferred platform.

E6. Future hardware recommendations

The initial implementation of the system worked fairly well, although two noteworthy problems came up during the operation of the system.  Late in the evaluation period, a disk drive on the server failed.  The system and its data were easily transferred to a spare drive, so downtime was limited and no results were lost.  The web evaluation system was also found to be more memory intensive than had been anticipated.  The 512 megabytes of main memory on the server were only marginally adequate to support peak user loads.  System response time degraded noticeably during heavy load periods, although it never became unusable as a result of overload.

Based on our experience with the hardware that was loaned for the initial system, several recommendations can be made concerning the selection of future server hardware.  Adequate main memory is the most critical requirement.  The next server used should have at least 1024 megabytes of main memory.  To forestall obsolescence, it is recommended that a server be selected with room for future memory upgrades to at least 4096 megabytes.  Processing speed is not a major concern for this system; the system does very little processing except for a one-time tabulation of the results at the end of the survey period.  Most of the processor load is from the necessary overhead of running a multi-user web server and encrypting all of the traffic.  The single 750 MHz processor in the initial system was found to be entirely adequate.  Hard disk requirements are minimal: the entire evaluation system consumed only about 15 megabytes of disk space.  Preparation of configuration data and processing of results require some additional space, but the total is still only several hundred megabytes.  The Ethernet interface on the server ran at 100 megabits per second, which was entirely adequate; network bandwidth requirements were minimal.  Thus, almost any new system configured as a web server would be adequate in terms of processing speed, hard drive capacity, and network bandwidth.  The most important requirement needing attention is main memory capacity.

Although a specific recommendation of hardware for a dedicated AEC server is beyond the scope of this report, a rough budget estimate of $4,000 should be adequate for hardware.  At current university pricing, this would be sufficient to buy a small rack mounted Sun server configured appropriately for this application.  Before any money is spent to purchase hardware, the AEC IT work group should consult with ITCS staff to determine what web server platform will have the best long-term support for CoSign.

E7. IT contributors

The system design was developed by the IT work group (Sugih Jamin, Wayne Stark, and Don Winsor).  The entire AEC participated in setting the design goals and evaluating prototype implementations.  Sugih Jamin and Don Winsor designed the security model.  Keith Riles collected the faculty affiliation data and the information on the structure of the university organizational units and he developed the TCL program used to process this data into a convenient format.  Don Winsor wrote the HTML pages for the website, the PHP library, and the Python programs to process data.  Several individuals not part of AEC also made valuable contributions.  EECS staff members Laura Falk and Hugh Battley identified a suitable server for the project, assisted with its configuration, and installed the Apache web server with the CoSign module.  ITCS staff provided helpful assistance in resolving issues with getting the EECS server to interact properly with the CoSign server.

Back to Top

F.  Reporting Practices

F1.  Format for Reporting

The AEC website should provide Senate members with access to evaluation data in a variety of user-selectable formats that facilitate inspection of the data in the form of histograms, means, medians, standard deviations of scores (1-5 scale), and response rates.
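
For illustration only, the following Python sketch (not the AEC’s actual reporting code) computes the summary statistics listed above for a single question, given the 1-5 responses received and the number of eligible Senate members in the unit.

import statistics
from collections import Counter

def summarize(scores, eligible_count):
    """scores: integer responses on the 1-5 scale; eligible_count: Senate members in the unit."""
    histogram = {value: 0 for value in range(1, 6)}
    histogram.update(Counter(scores))
    return {
        "histogram": histogram,
        "mean": round(statistics.mean(scores), 2),
        "median": statistics.median(scores),
        "std_dev": round(statistics.stdev(scores), 2),
        "response_rate": round(len(scores) / eligible_count, 2),
    }

# Example: eight responses from a unit with 40 eligible Senate members.
print(summarize([5, 4, 4, 3, 2, 4, 5, 3], eligible_count=40))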

For free-form comments, a secure transcription of the comments in randomized order should be transmitted automatically to the evaluated administrators at the conclusion of the evaluation period. 

For the President, Chancellors and Provost, information should also be reported in aggregate as well as separately by each College or School. 

For Deans, information should also be shown in aggregate as well as separately for each Department with 10 or more full-time faculty. Information from departments with fewer than 10 faculty should be reported in consolidated form.

Executive officers and deans are provided summary evaluation information regarding their subordinate administrators, and the same information is made available to all Senate members through the AEC website.

F2.  Schedule for Reviews

The principles deemed important by the committee were as follows:

An annual schedule of review, conducted during the Fall Term, satisfies these criteria.  The subcommittee did not favor allowances or exceptions in cases of new or retiring incumbents because of the resulting complexity of record keeping and decision making.

Back to Top

G.  Results of Year 1 Evaluations: Ann Arbor Campus

G1. Evaluation Scope

G2. Response Rate

President – 16%

Provost    – 18%

Deans – range from a high of 60% (Social Work) to a low of 7% (Law). Two units with response rates below 15% (Law and Pharmacy) are omitted from the subsequent analysis.

Department chairs – range from a high of 58% (Chemical Engineering) to 0% (nine departments: four from Music, two from Dentistry, and one each from Medicine, Pharmacy, and LS&A).  The 32 departments with response rates below 15% are omitted from the subsequent analysis.

G3. Comments on the Results

G3.1. President

·        The combined results for all units have median responses to all questions ranging from a high of 4.13 to a low of 2.99 (out of 5.00).

·        The highest median response concerns representing the University to the outside constituency; the lowest median response concerns consulting with faculty.

·        Schools and Colleges that gave low marks to their deans tended to give lower marks to the President, too.

G3.2. Provost

·        The combined results for all units have median responses to all questions ranging from a high of 4.04 to a low of 3.15.

·        The highest median response concerns promoting scholarly environment; the lowest concerns consulting with faculty.

·        Schools and Colleges that gave low marks to their deans tended to give lower marks to the Provost, too.

G3.3. Deans

G3.4. Department chairs

G3.5. Emerging trend

Strong faculty opinion indicates that improved consultation with the faculty by administrators at every level would be desirable.

G4. Needs for Survey Improvements

G5. Conclusion

The 2004 evaluations serve as a useful diagnostic for both the university administration and the governing faculty, indicating which administrators are perceived by participating faculty to be most and least effective. Experience obtained during the inaugural round of evaluations provides a basis for improving the instrument and its interpretation in future iterations of the process.  The variations observed in median response scores to different questions indicate that participating faculty gave serious thought to their responses, and those variations provide specific guidance to administrators toward self-improvement. 

Back to Top

H.  Feedback to AEC

The AEC received 25 feedback comments through its website in direct response to the first survey.  Eight of the comments expressed support whereas two were negative. Most of these messages were neutral in their tone; they questioned some aspect of the process and generally offered suggestions for improvements.  The messages ranged in length from a phrase or sentence to more than a page. The topics addressed had in fact all been discussed to some degree by the AEC before implementation.

Some respondents wrestled with the same issues that were debated by the AEC. For instance, some advocated calculating and displaying both percentages and raw numbers of responses. Some advocated discarding results if the sample size was 15 or smaller. Two individuals argued that anything less than a 50% response rate would invalidate the results.  Concern was expressed about scheduling the survey near the end of the term, when faculty are especially busy. Some had questions about the scope of the survey and why it was limited to a subset of University members and administrators. Some of the feedback offered editorial corrections for the website. 

The three most repeated topics involved the question about diversity, the low response rates from certain units, and guarantees about security and anonymity. The diversity question was viewed as ambiguous and therefore of little value.  A few people pointed to the overall response rate and questioned the value of the entire exercise. Some of those comments clearly came from people who were evaluated; a low response rate led them to dismiss both the numerical scores and the written feedback. One person proposed that an incentive system be instituted to encourage participation.

There was enough concern about security and anonymity that some faculty did not participate in the survey other than to provide comments. There was skepticism that security and anonymity could be maintained, and such skepticism probably had an impact on the number of responses. Written remarks to administrators were of particular concern, but even reporting the raw numerical scores concerned one person.  One person expressed concern that his or her identity could be deduced from comments directed to specific administrators and instead opted to report personal experiences of ethnic intimidation by administration in an anonymous comment to the AEC.

Back to Top

I.  Response to feedback and possible improvements

This report contains a detailed section that addresses concerns about anonymity and security.  The software is open for inspection by any interested member of the Senate.  Suggestions for additional improvement are always sincerely welcome. 

To the extent that response rates are limited owing to concerns about intimidation and reprisals, those concerns are now a matter of record, and they should be addressed by both central and unit administrations.  It should be a matter of both policy and practice that Senate members be able to express opinions without adverse consequence from administrators.  There is unequivocal evidence that a sense of intimidation exists, and that is deplorable.

To the extent that response rates are limited because there is doubt whether the evaluation process will improve conditions, it is incumbent on the Senate Assembly and other elected leaders to react to the expressed opinions.  University of Michigan academic administrators are recognized as doing a poor job of consulting adequately with the faculty before making decisions.  Some deans are doing a better job than others; some deans appear to be doing a very poor job in the eyes of many faculty.  This annual survey now offers a metric by which performance and improvement can be measured.

To the extent that response rates are limited because many faculty do not even realize that they are members of the University Senate, faculty governance itself needs to engage in education and outreach to all members of its constituency.

There are many opportunities for technical improvement to the evaluation process.  The question about diversity should be re-examined and either improved or replaced.  An explicit logout function for the secure parts of the AEC website will be implemented.  Additional recommendations and their rationales are listed below in section J.

Back to Top

J.  Recommended action items for Senate or Senate Assembly including rationales

J1.  Faculty evaluation of administrators should be conducted as an annual process during the Fall Term of each year.

Rationale-  An annual schedule will institutionalize the evaluation process in a way that becomes part of the expected academic year routine for both faculty and administrators alike.  This provides the foundation for a longitudinal time series from which trends can be readily identified, and used to guide administration policy and practice.

J2.  The target dates for evaluation should be 1 to 15 November; results should be posted in early December.

Rationale- This timetable avoids the time conflicts that constrain Senate members at both start and end of a term.  Fall Term is selected because it was the precedent established by action of the University Senate.

J3.  The AEC should continue to operate with three subcommittees (Questionnaire, IT, Reporting) that interact freely as needed.

Rationale- These three subcommittees parallel the required functions of the AEC.

J4.  Composition of the AEC should be at least 9 members plus the chair. 

Rationale- Membership of roughly three per subcommittee has proven to be an effective working size for task-oriented activities.  The size of the committee of the whole provides sufficient variety of perspective for fruitful strategic discussions.

J5.  Repopulation of the AEC should occur as follows:  (1) continue with the existing committee membership throughout 2005, to permit the members to upgrade AEC system 1.0 to system 2.0;  (2) in Winter Term 2006 add three new members to the AEC, but retain any old members willing to serve for one more year to help the new members learn the system; (3) in Winter Term 2007 retire up to 6 members of the original committee and add three more new members; (4) achieve steady state in Winter Term 2008, whereby 3 members retire and 3 members are added each year.  Terms of new committee members should be three years, with renewal permitted.

Rationale-  The committee members who designed and executed the initial evaluation system are optimally suited to incorporate lessons learned from the first round into an improved system.  There is a significant degree of specific technical knowledge and familiarity with AEC database systems required for functionality of the committee.  New members would not typically be expected to master this system without study.  One of the key goals for round 2 and system 2.0 is the production of technical documentation that can be used to train new committee members.

J6.  It is essential that at least three members of the AEC possess the requisite computer programming and network service skill to operate the complete system.  System documentation will be produced in the transition from system 1.0 to 2.0.

Rationale- We learned by experience in Round 1 that technical failures can occur at short notice.  We were lucky to diagnose a failing hard drive on the AEC server in the midst of evaluations, and to transfer the system intact to a new machine without losing any data or functionality.  Redundancy of technical competence is essential because the IT component is the most vulnerable point of potential failure.  IT is also the group that interacts with both other subcommittees and translates their requirements into original computer code.

J7.  One or more members of the AEC must be versed in survey techniques.

Rationale- This capability proved invaluable in the design of questions as well as in providing guidance to the rest of the committee about interpretations of responses.

J8.  The AEC requests guidance from the Senate Assembly or the Senate as to whether the annual results published on AEC website should be accessible to Senate members and Regents only, to all members of the U-M community, or without access restriction.

Rationale-  The aggregate responses to core and topical questions from 2004 were the subject of a FOIA (Freedom of Information Act) request in January 2005.  The U-M FOIA office granted the request on the basis that there are no privacy issues involved for these types of summary data.  Both AEC and FOIA officers concur that privacy concerns are adequately addressed by a practice of aggregating together data from units which have fewer than 10 Senate members.  In the past, the FOIA office has also granted access to the results of deans’ evaluations conducted by unit faculty. 

J9.  The AEC recommends that the Senate Assembly conduct a Forum during one of its meetings in Fall Term 2005 to discuss the evaluation and feedback.  Recommended timing is late September to late October.

Rationale-  The advent of 360-degree evaluation of administrators is a landmark event at the University of Michigan, and there will surely be differing opinions about how it can best be used for improvement.  This timing will also raise awareness of the impending Round 2 and, it is hoped, encourage even greater participation.

J10.  Prior to each annual evaluation season, administrators should be invited to submit statements describing their goals and accomplishments as well as individualized questions.

Rationale-  Many Senate members responded NBJ (no basis for judgment) regarding the activities of administrators.  Specific requests were submitted to the AEC from some of these people for increased information flow from their administrators.  This proposal would permit faculty to evaluate activities that the administrators regard as their key accomplishments.

J11.  The Senate Assembly should encourage the Senate members who serve on unit executive committees to offer suggestions about the process that would increase its usefulness within the units. 

Rationale- The purpose of the evaluation process is (1) to encourage continuous improvement of the university by providing effective feedback to the administration, and (2) to promote accountability of administration to the governing faculty rather than strictly to their own administrative superiors.  Senate members who serve on unit executive committees are uniquely situated to highlight areas where broad faculty opinion could be informative. 

J12.  As a constructive follow-up to evaluation results, the Senate Assembly should invite a highly rated dean to visit the Assembly in order for the body to offer its compliments and inquire about practices that inspire faculty praise.

Rationale-  There is a wide disparity of faculty opinion expressed about different deans.  It seems reasonable to recognize a job well done and to encourage others to emulate it.

Back to Top


[1] The responses to the 11th question, related to diversity, were more difficult to quantify because a favorable score was 3.0/5.0 instead of 5.0/5.0, and they are not included in this overview.