
Spring 2016 Pilot & Results

A. Purpose

In Fall 2016, the current IDEA process will shift from a combination of paper and web-based evaluations to strictly web-based evaluations. The new structure also changes the pricing model from a per-evaluation cost to an annual fee, which will be significantly greater than the yearly totals paid over the last three years.

Given the pending cost increase, the Professional Development Workgroup (PDW) agreed to evaluate an alternative product. Since e-Campus recently adopted eXplorance Blue and received positive feedback for our online programs, the PDW approved a pilot implementation of eXplorance Blue in Spring 2016.

B. Method

Recruitment

OIE sent invitations to Deans and Department Chairs requesting volunteer faculty and courses for the pilot. In some cases entire departments participated, while other departments asked for faculty volunteers. Some faculty members included all of their courses in the pilot, and others volunteered a selection of their courses. To promote higher completion rates, OIE requested that faculty use only one evaluation tool for a given course, either IDEA or eXplorance Blue, rather than both. Participation details are provided below.

  • All colleges participated, with the highest percentage of courses from Business & Technology, Arts & Sciences, and Health Sciences
  • 20 departments were represented
  • 108 individual faculty members participated
  • 287 courses were included
  • 2,387 students submitted evaluations

Instrument Development

Since eXplorance Blue allows each campus to develop a customized tool, it was necessary to build an instrument for EKU. A faculty workgroup consisting of 11 faculty members and two IR staff was formed and charged with developing the evaluation instrument. The workgroup convened throughout February and March to develop the core question set and an optional question bank, using validated examples from IU Bloomington and the University of Louisville, as well as the IDEA and e-Campus instruments, to guide question development. Due to time constraints, validation procedures were not conducted for the pilot question set; however, reliability and validity testing can be conducted over time and the instrument refined as necessary.

The core question set was sent to all faculty members with confirmed participation in the pilot study. The faculty workgroup evaluated feedback from the pilot faculty and incorporated that feedback into the core question set or the question bank as appropriate. Final drafts of the core question set and question bank were approved by the faculty workgroup and entered into the eXplorance Blue system by IR staff.

Implementation

All pilot faculty received an email invitation on April 12th to complete question personalization. Faculty also received two reminders from the eXplorance Blue system and a personal reminder from OIE about the ability to customize their evaluations. During the question personalization period, faculty could add up to 10 additional questions to the core question set. The additional questions could be chosen from the question bank or written entirely by the faculty member, with a maximum of five custom questions. These limits were established to maintain brevity and encourage completion. More than 60% of pilot evaluations included questions personalized by faculty (N = 176 courses).

Students received email invitations to complete the evaluations on April 25th, with reminders on April 29th and May 4th. The email text was drafted and approved by the faculty workgroup, using language that was more casual than the formal language typically used in such invitations. Additionally, evaluations were available to students through their Blackboard accounts; the evaluation module was visible on the homepage at each Blackboard login after April 25th.

Implementation logistics were the responsibility of the OIE and IR offices. An implementation team including members of IT, e-Campus, Blackboard, IR, and OIE met weekly with the eXplorance Blue team to ensure timely completion of tasks and accuracy of data. Additionally, IR and OIE met as needed with the consultant to troubleshoot issues that arose throughout the project.

C. Response Rates

Response rates are one of the greatest challenges to the use of web-based course evaluations. Paper-based surveys have higher response rates because they are administered to a captive audience. The average response rate for paper-based IDEA in Fall 2014 and Spring 2015 was 79%, while average rates for online IDEA in Fall 2014 and Spring 2015 were 44% and 39%, respectively. The response rate for the e-Campus pilot of eXplorance Blue was 50.75% in Fall 2015.

The overall response rate for the current pilot was 50% (2,387/4,806), with wide variation between departments and courses. Fifteen courses had a 100% response rate, and several more exceeded 90%. Response rates can be heavily influenced by faculty, even without incentives; OIE provided a “top ten” list of suggestions to help improve response rates.
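
For reference, the overall rate cited above is simply the number of submitted evaluations divided by the number of students eligible to respond; the denominator of 4,806 is assumed here to be the total number of students invited to evaluate pilot courses. A minimal sketch of the calculation, using the totals reported in this section, is shown below in Python.

    # Minimal sketch of the overall response-rate calculation for the pilot.
    # Assumes 4,806 is the total number of students invited to evaluate pilot courses.
    submitted = 2387   # students who submitted evaluations
    invited = 4806     # assumed denominator from the report

    rate = submitted / invited
    print(f"Overall pilot response rate: {rate:.1%}")  # prints ~49.7%, reported as 50%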

D. Reports

University, College, Department, and Instructor reports were designed by OIE using the eXplorance Blue report builder. Faculty and administrative assistants received instructor reports, while Department Chairs received a less detailed version of the individual instructor reports for their departments. Deans and Department Chairs received Department, College, and University reports. Invitations to view reports were sent via email, and reports were also accessible through Blackboard.

Turnaround time for reports was less than one week: grades posted on Monday, May 16th, and report notifications were sent on Thursday, May 19th. In comparison, IR received CDs with the IDEA results on June 15th and is still working on distributing them to administrative assistants, who will then share them with faculty.

E. Faculty Feedback

One week after reports were delivered, pilot faculty were asked to participate in a survey regarding their experience using eXplorance Blue. Thirty-three faculty responded, for a 31% response rate. Overall, faculty feedback was positive: the majority of faculty reported that their experience with eXplorance was the same as or better than their experience with IDEA in all areas. The lowest-rated items were student accessibility, format and layout of reports, and response rates. There was some confusion about students’ ability to access the evaluation through Blackboard; this can be remedied with stronger communication regarding the process. As for the format and layout of reports, we received detailed feedback from some faculty about what they liked and what they found confusing, and we can use this feedback to improve reports in the future.

Several of the comments regarding reports, questions, and logistics were very detailed and will be useful in revising and improving the process moving forward. Specifically, the comments will help make reports more useful and guide better communication with faculty about the process. Comments regarding response rates primarily concerned the drop in response rates that comes with a web-based tool and expressed a preference for paper-based evaluations. An increased emphasis on the mobile-friendly nature of eXplorance Blue, along with the possibility of providing class time for students to complete the evaluation on their phones or other mobile devices, may address some of these concerns.

F. Next Steps

In order to maintain the current pricing commitment from eXplorance Blue, a decision must be made by July 12, 2016. If the decision is to move forward, many steps regarding data preparation must be completed prior to the start of the Fall 2016 semester. Additionally, a stronger communication plan for faculty must be developed to address concerns voiced in the faculty survey. Finally, OIE will work with faculty to improve the questions and the question personalization process.

Spring 2016 Pilot Summary

Click here to view the full eXplorance Blue Pilot Study
