Administration Evaluation

Anonymity and security for the AEC evaluation system

The goals of achieving both anonymity and security for the AEC evaluation system inherently conflict. This is an overview of the methods used to adequately address both concerns. Although the need for anonymity in this system is obvious, the need for security deserves some explanation. Security is necessary to ensure the integrity of the statistical results from the evaluation process. Questionnaires should only be accepted from authorized faculty members, and each faculty member should evaluate each administrator only once. This is directly analogous to an election in which controls are needed to ensure that only registered voters can vote and to prevent ballot box stuffing.

The conflict arises in that security can only be achieved if the identity of the participant is known with a good degree of certainty, while anonymity depends on not knowing this. To resolve this conflict, the AEC system was designed to model a version of a traditional election system with paper secret ballots. In such an election system, when a voter enters the polling place, one election worker checks their identification and verifies their presence on the voter registration list. Once this is done, the election worker issues them a numbered paper ballot and records the ballot number. The ballot number is on a removable perforated edge of the ballot. The voter takes the ballot to a private voting area and completes it. The completed ballot is then placed in a privacy envelope with only the ballot number exposed and the voter gives it to another election worker. This election worker tears off the ballot number and transfers the now anonymous ballot from the privacy sleeve into the ballot box without looking at its contents. The detached ballot number is given to the election worker maintaining the voter list; this is used to cross that voter off of the list of eligible voters, so that nobody can vote twice.
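The two-worker protocol above can be sketched in code. This is an illustrative toy model of the paper-ballot process, not part of the AEC system; the class names and data structures are assumptions chosen for clarity.

```python
# Toy sketch of the two-worker secret-ballot protocol (illustrative only).

class RegistrationDesk:
    """First election worker: checks the registration list, issues numbered ballots."""
    def __init__(self, registered_voters):
        self.eligible = set(registered_voters)  # voters who may still vote
        self.next_number = 1
        self.issued = {}                        # ballot number -> voter

    def issue_ballot(self, voter):
        if voter not in self.eligible:
            raise PermissionError("not on the registration list")
        number = self.next_number
        self.next_number += 1
        self.issued[number] = voter
        return number                           # printed on the tear-off strip

    def cross_off(self, number):
        # The detached ballot number comes back; the voter is crossed off.
        voter = self.issued.pop(number)
        self.eligible.discard(voter)

class BallotBox:
    """Second election worker: detaches the number, keeps only the contents."""
    def __init__(self, desk):
        self.desk = desk
        self.ballots = []                       # anonymous ballot contents only

    def accept(self, number, contents):
        self.desk.cross_off(number)             # prevents voting twice
        self.ballots.append(contents)           # the number is never stored with the vote

desk = RegistrationDesk(["alice", "bob"])
box = BallotBox(desk)
n = desk.issue_ballot("alice")
box.accept(n, "vote for X")
```

Note that neither worker alone can link a vote to a voter: the desk sees identities but never ballot contents, and the box sees contents but never identities.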

The electronic system follows this model fairly closely with some minor modifications. In particular, each administrator questionnaire is treated as a separate "ballot". This allows a faculty member to evaluate different administrators in separate sessions; faculty members do not need to reserve a large enough block of their time to be able to complete the evaluations for all of their administrators in a single session. An overview of the specific implementation follows.

When a person enters the secure area of the website, they are first redirected to the server that the University of Michigan ITCS (Information Technology Central Services) runs for secure web authentication. This is the same server used for Wolverine Access and other web services that require authentication; the authentication system it uses is called CoSign. The AEC web site has been registered with the ITCS CoSign server as a valid secure service within the University of Michigan. The AEC web server also has a purchased web server "security certificate" to identify it in an authenticated way to the user's web browser. This makes the system resistant to someone bringing up a phony web server that claims to be the authentic AEC web site. The security certificate was inexpensive, and the cost was covered by a faculty member.

Once a person has their uniqname authenticated by CoSign, they are returned to the AEC webserver using an encrypted connection (to prevent any outside eavesdropping on things like the content of any questionnaires). SSL encryption is used; this is the industry standard for sensitive information such as financial transactions, credit card number transmission, etc. At this point the web server has assurance that it has a secure connection to someone who has authenticated as a faculty member.

Internal to the AEC web system, uncompleted questionnaires have numbers associated with them, analogous to the printed numbers on the tear-off strip at the end of the paper ballots. There is a master checklist of which questionnaire numbers have been used. This checklist is implemented as a single file on the AEC web server, so that it is impossible to use file timestamps to determine the time at which any particular questionnaire was used. Each eligible faculty member has been assigned a small block of these questionnaire numbers, sufficient to cover all of their administrators plus a small number of spares that can be activated should a missing affiliation be reported during the evaluation process and verified by the AEC.
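The bookkeeping just described might look something like the following sketch. This is a hypothetical reconstruction, not the actual AEC code; the function names, block layout, spare count, and file format are all assumptions.

```python
# Hypothetical sketch of questionnaire-number bookkeeping (not the real AEC code).

def assign_blocks(faculty, administrators_of, spares=2):
    """Give each faculty member a contiguous block of questionnaire numbers:
    one per administrator they evaluate, plus a few spares for reported
    missing affiliations."""
    blocks, next_number = {}, 1
    for uniqname in faculty:
        size = len(administrators_of[uniqname]) + spares
        blocks[uniqname] = list(range(next_number, next_number + size))
        next_number += size
    return blocks

def mark_used(checklist_path, number):
    """Append a used number to the single master checklist file.  Because
    everything lives in one file, its timestamp reveals only the most
    recent write, never when any particular number was used."""
    with open(checklist_path, "a") as f:
        f.write(f"{number}\n")

def used_numbers(checklist_path):
    """Read back the set of used questionnaire numbers."""
    try:
        with open(checklist_path) as f:
            return {int(line) for line in f if line.strip()}
    except FileNotFoundError:
        return set()
```

The single-file design choice matters here: per-questionnaire files would each carry a creation timestamp that could later be correlated with server logs.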

When a faculty member accesses the web page with the menu of administrators to be evaluated, the system looks up their block of questionnaire numbers and consults the master questionnaire number checklist to see which of these numbers are still unused. When a faculty member completes a questionnaire and submits it, the system verifies that the questionnaire is from the authenticated person and that it does not carry a questionnaire number that has already been used. Any forged or duplicate questionnaires are rejected at this time; they are immediately discarded in their entirety and an error message is displayed to the user. Although this situation is unlikely to arise in normal operation of the system, a user could cause it by using the "Back" button on their web browser after submitting a questionnaire to try to go back and submit it again. In this case the questionnaire is immediately deleted and the user gets a message: "Your response cannot be accepted since you have already evaluated this administrator."

If a questionnaire is determined to be valid, the web server immediately deletes all identification information from it (the uniqname of the participant and the questionnaire number) and places it into a folder of collected questionnaires. Immediately afterward, the questionnaire number is checked off in the master list of questionnaire numbers so that it cannot be reused. A secondary program on the server periodically "shuffles" the collected questionnaires to destroy any information about the time or order in which they were received. This defeats any later analysis based on file timestamps to determine which faculty member any particular questionnaire came from.
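The submission checks described above can be condensed into a short sketch. Again this is an illustrative model under stated assumptions (in-memory storage standing in for the real folder of files, and invented class and method names), not the actual server code.

```python
# Illustrative model of the submission checks (assumed names, in-memory storage).
import random

class QuestionnaireServer:
    def __init__(self, blocks):
        self.blocks = blocks      # uniqname -> assigned questionnaire numbers
        self.used = set()         # master checklist of used numbers
        self.collected = []       # anonymous questionnaires only

    def unused_numbers(self, uniqname):
        """Numbers from this person's block that are still available."""
        return [n for n in self.blocks.get(uniqname, []) if n not in self.used]

    def submit(self, uniqname, number, answers):
        # Reject forgeries: the number must come from this person's own block.
        if number not in self.blocks.get(uniqname, []):
            return "rejected: forged questionnaire number"
        # Reject duplicates (e.g. resubmission via the browser "Back" button).
        if number in self.used:
            return ("Your response cannot be accepted since you have "
                    "already evaluated this administrator.")
        # Strip all identifying fields before storing, then check the number off.
        self.collected.append({"answers": answers})  # no uniqname, no number
        self.used.add(number)
        return "accepted"

    def shuffle(self):
        # Periodic shuffle destroys any arrival-order information.
        random.shuffle(self.collected)
```

The ordering inside submit is deliberate: the identifying fields are dropped before the questionnaire is stored, so the anonymous copy is the only copy that ever reaches the collection.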

At no time does the system maintain a list of uniqnames who have actually submitted questionnaires. Admittedly, this could be partially inferred from the web server logs, but there are a number of limitations to this. ITCS logs the authentications through CoSign, but the logs are kept in a highly secure area, and the CoSign system keeps absolutely no record of what was actually done; none of the questionnaire material ever passes through their system. All it provides is identity verification (analogous to checking voter IDs as voters enter the polling place). Also, even if somebody obtained those logs, they probably wouldn't be very useful, since they would be cluttered with all the other uses of CoSign (faculty access to CourseTools sites, faculty and staff access to payroll and HR data, faculty and staff access to Wolverine Access course lists, etc.). If a particular person's name appeared in the CoSign logs, that would be no proof that they submitted any AEC questionnaires. The only statements that could be made from the CoSign logs with certainty would be negative ones; if someone never used any U of M secure web services, this could be verified by their complete absence from the logs. In other words, the only statements that could be made with complete confidence would be of the form "Professor X never authenticated to obtain authorization to use a secure U of M web service such as the AEC web site". On the AEC web server, the situation is similar, except that only accesses specific to the AEC site would be recorded. From these logs also, only negative conclusions could be drawn with certainty (again, of the form "Professor X never did anything at all in the secure area of the AEC site"). However, since these logs are under AEC control, they can and will be destroyed immediately at the end of the evaluation process.
The most security-sensitive item concerns the internals of the AEC server; while it is up and running, there exist both a list of the assignments of questionnaire numbers to faculty members and a list of which questionnaire numbers have been used. As a matter of security policy, we never keep an actual list of uniqnames that have participated. Given both the questionnaire number assignments and the master list of used numbers, it would be technically possible to reconstruct such a list of uniqnames, but we intentionally never do this. Immediately at the end of the evaluation process, both the questionnaire number assignments and the master list of used numbers will be completely and thoroughly destroyed, along with all web server logs for the AEC web server. This will be done before any results are reported (or even analyzed). Once these items have been destroyed, nobody will ever be able to say with certainty who the people were who submitted questionnaires, aside from obvious inferences that can be made in situations such as when a department gets a 100% or 0% response rate.

After the close of the evaluation period, only the folder of collected questionnaires will be preserved, to be used in the report generation. These questionnaires contain identification only for the administrator being evaluated (and, in cases where reports are broken down by unit, for the unit the questionnaire came from). Neither the uniqname of the person responding nor the questionnaire number is ever recorded in a saved questionnaire.

As with any system handling confidential information, there is some implicit trust at several points. The user needs to trust that their own computer hasn't been compromised by keystroke-monitoring spyware (although one interesting effect of the implementation is that such spyware wouldn't do much good for SA/A/N/D/SD format questions, since the answers depend on both mouse movements and the positions of items on the screen in the browser, which are very difficult to track or correlate). Keystroke-monitoring spyware would be able to capture written-out comments, of course. There is also implicit trust that the system software has been implemented as we have stated. The AEC IT subcommittee member who was the primary implementor of the web software offers his professional integrity, ethical standards, and reputation in support of this and extends an invitation to anyone familiar with the programming issues to independently inspect and review the code. It is also assumed that exotic and expensive (and not very practical) attacks best left to spy novels have not been carried out. One example would be to break into the computer room at night and install a device on the server that records all computations on the system during the evaluation period. Realistically, any such scheme would cost far more than the information obtained would be worth, and the risk of getting caught would be high for anyone attempting it.

This is a prototype system put together by volunteers, and this writer will be the first to admit that it is in no way perfect. However, it is believed to be a good first version that adequately meets the immediate need, and it was completed with minimal resources.

AEC Information Technology subcommittee
