3. Method

 

3.1 Introduction

The basic research problem is this:

What factors affect the use of the Internet within a graduate school of education?

In Chapter 1, I stressed that there are no models for Internet diffusion within an educational organization. The process of diffusing the Internet throughout the SOE is an important issue that has occupied my attention and creative efforts for the past four years. The results of this study have important implications for the design of the tools and support structure and for the types of learning activities that they can facilitate.

In Chapter 2, based on empirical research and a review of relevant literature, I identified a set of factors that may be important in building a new model for Internet diffusion throughout an educational organization. These factors group naturally into six clusters, namely:

1. User characteristics and perceptions;

2. Cultural and organizational issues, norms of use, and legitimate activities;

3. Tools, design, and impersonal supports;

4. Social issues: scaffolding, mentoring, communication;

5. Individual learning, adoption, and conceptual change; and

6. Group learning, adoption, and conceptual change.

Ten research questions emerged from these six themes. These were initially presented in Chapter 1, are repeated below in Table 3.1, and will be addressed in this chapter. For each research question, in turn, I examined some suggested lines of inquiry, which included surveys, interviews, focus groups, analysis of electronic documents, and an analysis of an electronic conference that was an optional part of an advanced statistics class.

I realized that some of the questions lent themselves to straightforward answers, such as "To what extent is the Internet used by the SOE?", whereas others did not lend themselves to easy answers. The last set of research questions—those that dealt with individual and group learning, adoption, and conceptual change—seemed to be the most difficult to answer and would require methods that went beyond the usual types of qualitative inquiry that I had used in the past. It was clear to me that I was dealing with a major case study. Based on these suggested lines of inquiry, and following the method that I have used at RMC Research Corporation, I then built a data collection matrix, which is presented in Table 3.2.

For each of the themes and research questions in Table 3.1, I investigated how other researchers had followed similar lines of inquiry and tried to customize them for my particular case study. Section 3.4 explores the six themes in depth, matching data collection tools and strategies with each research question, and suggesting potential instruments and probes. I included this section to show the evolutionary process of my own thinking as I tried to figure out how to conduct a large-scale research project using mixed methods and how to apply them to the specific context of the SOE.

Sampling was a crucial issue. Whereas the 1995 sample of e-mail usage targeted only the instructional technology (IT) population, I wanted to expand the study to encompass the entire SOE. The stratified random sample that I used for the survey would provide me with a "big picture", an overview of major trends. The purposeful sample of early, middle, and late adopters throughout the SOE was meant to capture dissenting as well as approving voices regarding the use of computer-mediated communication (CMC) to enrich teaching and learning. The focus group investigated all six themes as perceived by a socially connected cohort of novice, non-IT users. Finally, the in-depth examination of a wide array of electronic artifacts was meant to substitute for the usual classroom observations that are generally used in a case study of an educational institution. This lent balance to the heavy reliance on self-reported data.

I approached this dissertation project as a case study. Since case studies generalize to propositions rather than to other populations, I generated a set of propositions for each of the research questions, based on informal conversations with my fellow students and faculty members. The results of the study would then allow me to either accept or reject each of these propositions.

As with any research project, I always end up with 100% hindsight. If I had to do this study over, I would have pilot-tested my instruments first, rather than relying on the critiques and testimony of expert reviewers. Additionally, I would not have gathered data I did not intend to use myself solely to share with another dissertating student, who then changed her area of inquiry. This left me without a second rater to provide a measure of inter-rater reliability—an especially important consideration for the coding of the open-ended survey questions and the types of messages posted on the electronic conference.

I begin this chapter on method by listing the research questions, grouped by theme. This table will form the organizing structure for the rest of this report.

 

Table 3.1

Six Themes and Ten Research Questions

Theme 1: User Characteristics and Perceptions
    1A. To what extent is the Internet used by the SOE?
    1B. For what reasons is the Internet used by the SOE?
    1C. What challenges to the use of the Internet are perceived as most important?

Theme 2: Cultural and Organizational Issues, Norms of Use, Legitimate Activities
    2A. How does the incentive structure of the SOE influence the types and levels of use of the Internet?
    2B. What on-line activities are consonant with the administration’s vision of disciplined inquiry, professional engagement, and professional leadership and commitment by faculty and graduate students?

Theme 3: Tools, Design, and Impersonal Supports
    3A. What improvements to the UCD and the SOE network’s human-computer interface (HCI) design and available Internet tools are suggested by new and continuing users?

Theme 4: Social Issues: Scaffolding, Mentoring, Communication
    4A. What changes to the UCD and the SOE’s communication and support structure are thought to be most helpful to overcome barriers and support Internet use?
    4B. How does the way that SOE members are joined to communication channels and other individuals influence their use of the Internet?

Theme 5: Individual Learning, Adoption, and Conceptual Change
    5A. How do activities involving the use of Internet tools impact individual learning, adoption, and conceptual change?

Theme 6: Group Learning, Adoption, and Conceptual Change
    6A. How does individual learning, adoption, and conceptual change influence the other members of the community to which these individuals are culturally linked?

 

 

3.2 Suggested Lines of Inquiry

 

3.2.1 Themes 1, 3, and 4

Questions that relate to Themes 1, 3, and 4—user characteristics and perceptions; tools, design, and impersonal supports; and social issues including scaffolding, mentoring, and communication—were answered through three data collection activities. These comprised an analysis of the 1995 and 1997 survey data; interviews with selected faculty, staff, and students; and open-ended questions posed to a focus group of students. For simplicity, responses from members of both the ILT master’s degree program and the CLT doctoral thread were labeled "IT". As I will explain in Chapter 4, this grouping process may have introduced an error of less than 5% into the results, but it made it much easier to compare the 1995 and 1997 data sets.

Responses of the IT faculty and staff to the 1995 and 1997 survey questions regarding levels of use, reasons for use, barriers to use, and suggested types of personal and impersonal support were compared. This comparison showed the evolution of these themes over the two-year period during which e-mail began to be used in classes and Web tools were made available to students and faculty.

A comparison of responses from the IT and non-IT samples for the 1997 survey revealed differences among these sub-populations. All survey responses were collected and entered into SPSS. Both surveys ended with an open-ended question in which participants were asked to provide any suggestions for improving Internet tools and the environment in which they are used. An analysis of these open-ended responses—especially response types that occurred frequently—shed light on the types of interventions that Internet users felt might be particularly valuable.

Faculty and students to be interviewed in depth were purposefully selected to represent early adopters, early and late majority, and late adopters. (See Appendix A.) Some resisters were also contacted, but they declined to participate in this study. This is unfortunate, because several of them have been quite eloquent in meetings regarding online courses and distributed learning, voicing legitimate concerns that need to be heard.

As Fishman (1997) and Carlson (1970) reported, there is a strong social influence effect on the rate of adoption of educational innovations. A cohort of novice, non-IT users, all members of the School Psychology program, volunteered to serve as a focus group for this study. A study of this socially coherent group lent a nice balance to the 1995 survey, which concentrated specifically on IT students and faculty.

The interviews and focus groups explored participants’ suggestions concerning potential improvements to the design and tools supported by the School’s network, as well as to its communication and support structure. It was especially important to find out whether training, scaffolding, job aids, and other forms of technical support may already exist but may not be publicized through the communication channels to which the faculty, staff, and students currently have access.

Electronic artifacts, including participation in class conferences, student Web pages, papers, and portfolio items, provided some examples of actual use of existing electronic communication and dissemination channels, as distinct from purely self-reported data. Investigation of these artifacts was intended to replace the usual classroom observations and examination of paper-based documents that lend objectivity to a case study.

 

3.2.2 Theme 2

I felt that questions that relate to Theme 2—cultural and organizational issues, norms of use, and legitimate activities—might not lend themselves to easy answers. However, it was important to explore them, especially with the recent cultural and curricular changes occurring in other universities as the use of distributed learning continues to rise. For example, an article in the CSS Internet News described the friction that is spreading among various university departments now that the Internet is changing or even eliminating old organizational schemes. "Universities preach change for the rest of the world, but tend to be very conservative in making changes themselves" (CSS Internet News, November 19, 1997).

Question 2A, which deals with the SOE’s incentive structure, was explored in the focus group and through in-depth interviews, especially with staff members. Question 2B was dealt with in interviews with doctoral students. According to the doctoral handbook, they are expected to practice disciplined inquiry and exhibit professional engagement and leadership as part of their academic pursuits. Examples of student and faculty electronic artifacts, such as Web pages, portfolios, and publications, lent objectivity to the investigation of this theme.

 

3.2.3 Themes 5 and 6

Questions that relate to Themes 5 and 6—individual and group learning, adoption, and conceptual change—were not as concrete in nature and were more difficult to answer. These themes involve learning and adoption on the part of individuals and the groups of which they are a part, sharing a "longitudinal continuity" (Crook, 1994a) of common experiences, beliefs and values, and solving problems together. Individuals are part of cultural systems, which, in turn, characterize learning organizations. Solving problems means inaugurating change, and change shifts the perspectives and paradigms of the members of the system, at all levels.

One way to study learning, adoption, and change is to investigate how participation in electronic conferences complements traditional classroom discussions, enhances or detracts from learning, and helps students to build meaningful representations of content. (See Wilson, Lowry, Koneman, & Osman-Jouchoux, 1994). The complete electronic conference of the spring 1997 Advanced Quantitative Methods class was available for analysis and was ideal for this purpose. To account for those members of the class who did not participate in the electronic conference, follow-up questionnaires and reminders were sent to all class members who completed the course. The questionnaires addressed their perceptions of the conference and barriers to participation.

Besides interviews, the focus group, and an analysis of messages in an electronic conference, I also examined individual students’ home pages and electronic portfolios. These are highly personalized electronic documents that can give good insights into students’ capabilities and perspectives. Electronic portfolios, especially, contain evidence of students’ disciplined inquiry, professional engagement, and professional leadership and commitment—qualities that are considered extremely important in the doctoral program. Home pages tend to reflect the author’s self-concept, topic foci, academic interests, and personal interests.

 

3.3 Data Collection Strategy

The overall research strategy was the case study (Yin, 1994). It is an excellent method for exploring and describing a complex social system in an authentic context. This allowed me to use an inquiry-based approach to examine a variety of evidence without attempting to manipulate any behaviors of the participants.

Starting with the patterns that emerge from the two surveys, I used in-depth interviews, a focus group, and an investigation of electronic artifacts to shed light on why these observed patterns occur, and what improvements to the system the participants may suggest. This variety of data collection methods assisted me in achieving triangulation by balancing self-reported data with observations (Sherry, 1997b). I also hoped to achieve representativeness by exploring the same research questions with different groups of members of the SOE.

The data collection matrix presented in Table 3.2 lists the ten research questions and their corresponding data collection methods. The following six sections delve into each of these six themes in greater depth and illustrate the process of creating the actual data collection instruments.

 

Table 3.2

Data Collection Matrix

1A. To what extent is the Internet used by the SOE?
    1995 Survey, 1997 Survey, Interviews, Focus Group

1B. For what reasons is the Internet used by the SOE?
    1995 Survey, 1997 Survey, Interviews, Focus Group, Electronic Artifacts

1C. What challenges to the use of the Internet are perceived as most important?
    1995 Survey, 1997 Survey, Interviews, Focus Group

2A. How does the incentive structure of the SOE influence the types and levels of use of the Internet?
    Interviews, Focus Group

2B. What on-line activities are consonant with the administration’s vision of disciplined inquiry, professional engagement, and professional leadership and commitment by faculty and graduate students?
    1997 Survey, Interviews, Electronic Artifacts

3A. What improvements to the UCD and the SOE network’s human-computer interface (HCI) design and available Internet tools are suggested by new and continuing users?
    1995 Survey, 1997 Survey, Interviews, Focus Group

4A. What changes to the UCD and the SOE’s communication and support structure are thought to be most helpful to overcome barriers and support Internet use?
    1995 Survey, 1997 Survey, Interviews, Focus Group

4B. How does the way that SOE members are joined to communication channels and other individuals influence their use of the Internet?
    Interviews, Focus Group, Electronic Artifacts

5A. How do activities involving the use of Internet tools impact individual learning, adoption, and conceptual change?
    Interviews, Focus Group, Electronic Artifacts

6A. How does individual learning, adoption, and conceptual change influence the other members of the community to which these individuals are culturally linked?
    Interviews, Focus Group, Electronic Artifacts

 

 

3.4 Six Themes to be Explored in Depth

 

3.4.1 Theme 1: User Characteristics and Perceptions

Rogers (1995) addresses five users’ perceptions: compatibility, complexity, observability, relative advantage, and trialability. Berge (1997), Fishman (1997), and Horton (1994) add several others, namely: mediated writing proficiency (my polarized renaming of written communication apprehension), general representational proficiency (a construct also related to written communication apprehension), comfort level, and expertise using Internet tools. The 1995 and 1997 surveys were the logical starting place to explore the main patterns of individual users’ reasons for use, problems, and concerns. The surveys are presented in Appendixes B and C. Similar work was also done by Pelton and Pelton (1996), using a Varimax rotation factor analysis of 42 Likert scale items that addressed preservice teachers’ attitudes toward technology.

Though surveys may reveal observable, surface characteristics of network usage, a focus group with new users and in-depth interviews with faculty, staff, and students—early and late adopters alike—provided insights into both the concerns and successes of our members. Hall and Hord (1987, p. 65) suggest some probes that may clarify individuals’ stages of concern. From time to time, I used probes of this nature in the interviews.

1. Are you aware of the innovation?

2. Are you using it?

3. How do you feel about it?

4. Any problems or concerns you have about it?

5. What do you think of it?

6. How does it affect you? Others you’re involved with?

7. What is your reaction to it?

8. What is your attitude toward it?

9. Do you have any reservations about it?

10. Would you like any information about it?

Appreciative inquiry, developed by Cooperrider and Srivastva (1987), is a good way to balance success with concern. Developed as both a theory of organizing and a method for changing social systems, appreciative inquiry is considered to be a research method that leads to practical results as well as the development of new social theory. It consists of three parts: searching for the best examples of participants’ experience, creating insight into the forces that lead to peak experiences, and reinforcing and amplifying the elements that contribute to superior performance (Bushe, 1995, p. 3). I used the appreciative inquiry method to create three important exploratory questions that I included in the follow-up questionnaires for the electronic conference participants:

1. What was your most memorable experience with the electronic conference?

2. What caused that experience to be so memorable?

3. How do you envision yourself using electronic conferencing or any other Internet tools to support your own growth and learning in the future?

Hall and Hord (1987, p. 97) also propose a sequence of questions to ascertain the individuals’ levels of use, which are related to their level of expertise:

1. Are you using the innovation?

2. Have you decided to use it and set a date to begin using it?

3. Are you currently looking for information about the innovation?

4. What kinds of changes are you making in your use of the innovation?

5. Are you coordinating your use of the innovation with other users?

6. Are you planning or exploring making major modifications or replacing the innovation?

I listed questions of this general nature as probes for the first three interview and focus group questions, namely:

1. To what extent are you using the Internet?

2. What are your reasons for using the Internet?

3. What problems or concerns do you have about using the Internet?

The final version of the interview and focus group instrument is presented in Appendix D. A preliminary survey of prior experience with Internet tools, which was used with the focus group to assess their level of expertise with computers and CMC, is presented in Appendix E. Appendix F—the questionnaire for the electronic conference participants—includes the three appreciative inquiry questions that were listed above.

 

3.4.2 Theme 2: Cultural and Organizational Issues, Norms of Use, Legitimate Activities

Whereas Hall and Hord are primarily concerned with individual adopters and Rogers deals with a traditional, hierarchically structured, centralized social system, the SOE must be addressed as a set of cultural systems embedded in a hierarchical organization. It must also be addressed as a learning organization in which not only the individuals but also the organization as a whole learns as it adopts.

Both Deming and Schein stress the fact that a system must have an overarching aim or goal. What is the overall aim of the SOE in making the Internet available to students, staff members, and faculty? This is an area that was explored by interviewing key decision-makers among the faculty. The School’s aims and goals should ideally be related to the incentive structure for using e-mail and the Internet, to the amount of persuasion that faculty members impose upon their students, and to the types of use that members of the School consider legitimate and appropriate. This issue will become increasingly important as the SOE begins to develop distance education and distributed learning courses and to deliver them on-line.

Some probes that I considered using in the in-depth interviews with faculty, staff, and policy makers to explore Theme 2 included:

1. What do you consider to be the School’s aim in using the Internet?

2. Do you consider that using Internet tools is part of your job?

3. Do you participate in any on-line conferences or listservs that pertain to your field of inquiry or professional interests? If so, do you find them valuable?

4. Do you feel there is any clear payoff for publishing your research on-line? Why or why not?

5. What do you feel is the appropriate balance between becoming proficient in the use of Internet tools and carrying out your other duties?

6. How do you feel about students doing their research on the Web?

7. What types of use do you consider are most important as you carry out your administrative duties?

8. To what extent and for what reasons do you feel your students should use Internet tools?

9. How important is it to post academic program information on the Web?

Similar questions that deal with students’ perceptions of the School’s incentive structure and their own vision of professional inquiry and information sharing were also appropriate for the interviews. Such questions were listed as probes for the fourth and fifth interview and focus group questions, namely:

4. How does the SOE’s incentive structure influence your use of the Internet?

5. What do you consider to be appropriate or legitimate uses of Internet tools, within the SOE, and why?

An examination of student and faculty Web pages provided insights into their perceptions of on-line disciplined inquiry and professional engagement.

 

3.4.3 Theme 3: Tools, Design, and Impersonal Supports

In this theme, I began to concentrate more on the innovation and the environment in which it is used. Here, technological or environmental factors such as system capacity, speed, reliability, usability, and possibly the HCI, begin to mix with the users’ perceptions. There is a very fine line between users’ perceptions (primarily attentive) and the affordances of the innovation (primarily pre-attentive). I will not emphasize this distinction except to say that the users’ reasons for use, levels of use, and barriers to use are enhanced or hindered by the types of tools they share; the types of representations that these tools support; the ways in which the design, interface, and capacity of the system helps them to carry out tasks that they consider appropriate; and the perceived usefulness of job aids and supports.

Initial information about the usefulness of aids and supports was gleaned from the surveys. I scanned through the users’ comments and suggestions for improvements to the university network and support structure. As patterns emerged, I focused attention on those classes of suggestions that occurred frequently.

Some questions pertaining to Theme 3 that were appropriate to use as probes in the in-depth interviews and the focus group included:

1. What aids or supports would you use most often? Why?

2. Are there better ways to design or redesign existing aids and supports? How would you redesign them?

3. Do you feel that giving out disks or descriptive brochures, or giving demonstrations of the system should be part of class activities?

4. Do you feel that having a computer and modem at home should be a requirement for students in SOE? How about requiring students to have Internet access through a commercial Internet Service Provider (ISP)?

5. What types of support would you like to see on the SOE’s On-line Helpdesk?

6. Are there newer or better Internet tools, or better ways of using them that could make your life easier?

7. Can you suggest any ways that the network’s look and feel could be improved, either by CINS or by the SOE?

It is important to remember that this is a commuter campus. Many students connect to the network from home or work. When the interviews were administered (January-March 1998), free access to the Web was not available through the university. Therefore, the following caveat was added to the sixth interview question:

Currently, you can’t access the full capabilities of the Internet from home without paying for an Internet Service Provider. Since the labs have direct connectivity, it is easier to improve the usability of the Internet in the labs than at home.

Thus, the sixth interview question was broken into two parts:

6a. What improvements would you suggest for the design, interface, Internet tools, and usability of the computers in the UCD labs?

6b. What improvements would you suggest for the design, interface, Internet tools, and usability for your home computer?

 

3.4.4 Theme 4: Social Issues: Scaffolding, Mentoring, Communication

At this point, the inquiry moved from the users’ perceptions of, and interactions with, the system and its impersonal support structure to the social relationships that both Rogers and Hall consider vital for new users. This was stressed by many teachers who participated in the BVIP—they needed "a real person" to answer their questions, to give them help and support when they needed it, to troubleshoot equipment when it broke down, and to help them when they got into trouble. Though some of this can be dealt with by teaching, training, and formal classes, both Hall and Rogers emphasize the importance of client/change agent empathy and developing one-on-one supportive relationships.

Information about these issues came from the interviewees, the focus group participants, and suggestions by survey respondents. Some new users perceived individual help from graduate assistants and on-line consultation with Internet Task Force members as useful. In fact, some students remarked informally that they felt very frustrated when there was no graduate assistant available in the SOE lab who was familiar enough with the hardware and software to assist them when they got into trouble. Part of these students’ frustration was due to the fact that the computers in the lab are directly connected to the UCD servers, whereas their problems with modem settings occurred only when they attempted to connect from their home computers.

Striking a balance in the use of graduate assistants versus other forms of scaffolding is an area that needs to be explored further, especially since users connect primarily from home or work rather than from the SOE lab. Ongoing investigation of the following potential interventions may prove useful: (a) more help from graduate assistants in the lab, (b) workshops on specific topics, (c) individual or group mentoring, (d) Internet fundamentals covered in introductory courses, (e) better communication with technical support personnel, and (f) coaching by more advanced students who have mastered Internet fundamentals and who would be willing to serve as visiting lecturers for a class, seminar, or internship meeting.

The seventh focus group and interview question was worded as follows:

7. What changes to the SOE’s communication and support structure would be most helpful to you for overcoming challenges that you encounter in learning how to use Internet tools appropriately?

Another important aspect of social relationships is defined as the social influence factor. This is the extent to which an individual’s peers or superiors impact their behavior, their attitudes, and their use of tools through the mutual flow of information and sharing of ideas. Person-to-person contact tends to influence the speed at which an innovation is adopted (Carlson, 1970).

Using a variation of Carlson’s original interview question, I included a query in interview question 8 that asked interviewees about the three people they talk to or communicate with most frequently to discuss their ideas, activities, and problems with Internet use. The second half of interview question 8 began with the following sentence: "I would like to find out the three people you talk to most frequently about how you use the Internet for teaching and learning".

 

Though social influence is not a factor that can be controlled, it can shed some light upon one of the important group dynamic processes that exists within a networked community of learners. Both the social influence of peers and colleagues and the way in which individuals are joined to the School’s communication structure were studied through the interviews, the focus group, and the electronic conference.

Here are some probes that I used to explore the social influence factor:

1. To what extent have you learned about Internet use from your colleagues?

2. Has communication and interaction with your peers and colleagues changed the way in which you use the Internet?

3. To what extent do you keep in close contact via e-mail with your classmates? Your advisor? Your professors? Your program committee?

To explore the ways in which the social influence of participants’ colleagues and the flow of information through the SOE’s communication channels affected their use of the Internet, I broke the eighth interview question into two parts:

8a. How does the way in which you are joined to the School’s or your program’s communication channels influence the way in which you use the Internet?

8b. I would like to find out the three people you talk to most frequently about how you use the Internet for teaching and learning. This includes whom you consult with about technology ideas, activities, and problems, [for faculty: about how you design your classes], and how you use the Internet to support your own professional growth.

All participants were assured of confidentiality, both for their own responses and for any names that they might mention as the three people who influenced their own Internet use.

 

3.4.5 Theme 5: Individual Learning, Adoption, and Conceptual Change

Learning, conceptual change, and adoption are intimately related. In-depth interviews revealed the processes that students, staff members, and faculty go through as they pass through the stages of concern and levels of use, and as they begin to transform their perspectives. They also revealed some of the reasons why individuals decided to adopt, or why they chose to intentionally reject the innovation for legitimate reasons of their own.

In informal conversations, some students and faculty members exhibited low self-efficacy, feelings of intimidation by expert users, a sense of mystery regarding the types of computer operations that take place beneath the level of the desktop and icons, and a feeling of uncertainty regarding any clear payoff and intrinsic value of using the Web for communication and dissemination, considering the learning curve that must be traversed for them to use it effectively. On the other hand, students in doctoral seminars who have had successful experiences participating in on-line conferences mentioned that they found them valuable and informative. The focus group with the School Psychology cohort was intended to explore how initial activities involving the use of Internet tools could impact individual learning and conceptual change.

The ninth interview question, which addresses CMC, was worded as follows:

9. You are free to participate in Internet conferences as an active speaker/writer, or to listen in on conversations and learn from what others are sharing. How do you feel that Internet activities such as electronic conferencing and e-mail messaging impact your own learning and conceptual change? Your adoption and use of Internet tools?

The analysis of electronic artifacts was especially important for the fifth and sixth themes. I hoped that individual Web pages and electronic portfolios would reveal the process of growth that students went through as they developed portfolio products of increasing sophistication.

Another important data collection activity was the analysis of all the messages that flowed among the members of an electronic conference. I saved all e-mail messages from the spring 1997 Advanced Quantitative Methods seminar. To compensate for the fact that not all class members participated in the conference, I sent a follow-up questionnaire and several reminders to all students who successfully completed the class.

 

3.4.6 Theme 6: Group Learning, Adoption, and Conceptual Change

Adoption of new tools for legitimate, appropriate purposes occurs within a community of practice that shares a common environment, cultural norms and conventions, and mediating tools and artifacts. Learning, in this viewpoint, is not purely an individual activity. As Mezirow (1991) points out, citing Bowers (1984), social reality is shared, sustained, and continuously negotiated through communication with members of one’s sociocultural group. It is important that individuals learn to negotiate meanings and values critically, rationally, and reflectively, and to act on their understandings intentionally, rather than simply accept those imposed upon them by others (Mezirow, 1991, pp. 1-3).

From this perspective, my investigation explored not only individual perceptions, concerns, and perspectives, but also those of the social learning groups, programs, divisions, and subcultures within the SOE. Often, learning, change, and adoption occur within an individual because an entire class or social group is participating in an electronic conference. The reverse may also be true. Individuals who post messages that answer important questions or provoke lengthy discussion may tend to involve new members in the discussion if they see value in such activities.

Wilson, Lowry, Koneman, and Osman-Jouchoux (1994) found that, often, e-mail conversations would take a different turn from in-class discussions, exploring ideas in depth over the network that could not be covered in class. Ideas that were introduced via e-mail were occasionally addressed by the instructor in follow-up class discussions. Like Berge (1997), Wilson and his colleagues found a differential response toward e-mail by different types of students—a few who were shy in class tended to participate more via e-mail.

 

There were two ways to explore the individual/cultural link. The first was by examining the messages in an electronic conference, especially one centered on performing a legitimate or appropriate task. The second was via the interviews and focus group. These revealed the extent to which individual learning and conceptual change among key members of the group influenced the other members of their cohort.

The tenth and final interview question was open-ended. It was meant to investigate the individual/cultural interaction:

10. As part of a learning community, you’ll find that meaning is often negotiated in dialogue among yourself and your colleagues, face-to-face or electronically. How do you feel your own learning, growth, and change influences other members of your class, your program or division, or the SOE?

The questionnaire sent to the electronic conference participants delved into the same areas that were explored by Wilson, Lowry, Koneman, and Osman-Jouchoux (1994). These were: (a) the ways in which the electronic conference enhanced or detracted from the learning experience, (b) the ways in which any form of anxiety might have affected participation, (c) any limitations or negative aspects of the conference, (d) the ways in which the conference might complement in-class discussions, and (e) the necessary preparation for successful participation. As mentioned previously, the questionnaire included an appreciative inquiry section.

 

3.5 Study Setting and Subjects

 

3.5.1 Survey Sample

Although the initial survey and data analysis of 73 IT students and faculty were carried out in the spring of 1995 (Sherry, 1995), the current phase of the study actually began in the spring of 1997 with a follow-up survey using a revised version of the same instrument. The second survey targeted the entire SOE rather than just the IT cohort.

Participants in the 1997 e-mail survey comprised a stratified random sample of 278 faculty, staff, and graduate students within the SOE. The SOE is housed in a regional urban university whose members are primarily nontraditional students. All students were current or former teachers, counselors, school administrators, corporate trainers, or multimedia designers, enrolled in eleven different academic programs within the SOE. "Other" refers to students who were enrolled in SOE seminars but were not enrolled in a specific program. Table 3.3 presents the 1997 survey sample.

The sample consisted of 20.5% men and 79.5% women. Of the sample, 168 respondents had e-mail accounts, 109 did not, and one did not know. Sampling was done by division rather than by academic program, since cohorts within the two schoolwide programs (Educational Leadership and Innovation, Initial Teacher Education) followed separate threads or emphasis areas. Note that the number of IT participants was particularly high; I purposefully collected data from additional IT participants to facilitate comparison between the 1995 and 1997 surveys for the IT subpopulation.
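To make the sampling procedure concrete, the sketch below shows how such a stratified, division-based draw with a deliberate IT oversample could be reproduced in a modern scripting language. It is an illustration only: the roster file, column names, base sampling rate, and oversampling rate are all hypothetical, as the actual sample was drawn from SOE enrollment records.

```python
import pandas as pd

# Hypothetical roster with one row per SOE member and a "division"
# column whose values match the strata in Table 3.3.
roster = pd.read_csv("soe_roster.csv")

# Sample each division at an assumed base rate, oversampling IT so
# that the 1995 and 1997 IT sub-samples remain comparable.
fractions = {division: 0.15 for division in roster["division"].unique()}
fractions["IT (ILT/CLT)"] = 0.30  # deliberate oversample (assumed rate)

sample = (
    roster.groupby("division", group_keys=False)
          .apply(lambda g: g.sample(frac=fractions[g.name], random_state=1997))
)
print(sample["division"].value_counts())  # compare against Table 3.3
```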

 

Table 3.3

Stratified Sample for 1997 Survey

Division        Respondents (n)   Respondents (%)   Students Enrolled (%)
IT (ILT/CLT)         112               40.3                24.5
SPED                  19                6.8                 8.5
SPSY                  12                4.3                 3.2
ITE                   36               12.9                24.5
ASCD/EDU              13                4.7                 5.4
CPCE                  29               10.4                16.8
EPSY/REM              11                4.0                 5.3
C&P                   13                4.7                 2.2
LLC                   21                7.6                 9.6
OTHER                  7                2.5                 0
STAFF                  5                1.8                 0
TOTAL                278              100                 100

 

 

3.5.2 Interview Sample

The sample of faculty, staff, and students to be interviewed in depth was purposefully selected to maximize variability in access, stages of concern, levels of use, and adopter category. Participants ranged from early adopters to early/late majority to late adopters. It was important to interview several early adopters to find out what struggles they went through as they learned how to use Internet tools effectively, what platforms and tools they preferred, and to what extent they had become members of the global Internet community. It was equally important to interview several late adopters or reluctant users, since they were able to express their valid reasons for resisting the use of the Internet. I wanted to know what these reasons were, because they may impact the design of future interventions or the inclusion of more on-line courses, distributed learning, and interactive forums.

Questions dealing with administrative vision and support were most appropriately directed to two individuals: a faculty member who is in a policy-making position, and one of the Dean’s staff.

The final selection of twelve faculty, students, and staff members is presented in Table 3.4. Selection of specific interviewees was based on informal conversations with faculty, students, and the dissertation committee. The process of validating this selection, based on the actual responses given in the interviews, is presented in Appendix A.

Since participation was voluntary, there were some substitutions in the initial selection of interviewees until the pertinent cells were filled and the interviews were completed. I had originally suggested using two early and two late adopters for both faculty and students, plus one early and one late adopter for staff members, making a total of ten interviewees. However, based on initial interviews and my understanding that this is an inductive study, I decided to adjust the sample to include participants who were at neither extreme, since they may be more representative of typical users.

 

Table 3.4

Purposeful Selection of Participants for In-Depth Interviews

Category        Early Adopters   Early/Late Majority   Late Adopters
Students             1, 2                3, 4                5, 6
Faculty                7                   8                   9
Staff                 10                                      11
Policy Maker          12

 

 

 

3.5.3 Focus Group

A convenience sample of School Psychology (SPSY) students volunteered to participate in a focus group at the beginning of the spring 1998 term. The subjects were all practicing professionals who were enrolled in the internship program at the time. Since this focus group consisted of non-IT students, it was a good way to balance the preponderance of IT faculty and students in the two surveys.

Like the in-depth interviews, the focus group provided information regarding all ten research questions. There was a heavy emphasis on the social and cultural factors within this close-knit cohort of participants. In fact, the SPSY cohort decided to create an electronic conference on CEO, although they had no prior experience with computer conferencing.

 

3.5.4 Analysis of Electronic Artifacts

An investigation of electronic artifacts lent triangulation to this study and offset the emphasis on self-reported data. These artifacts comprised messages downloaded from an electronic conference and examples of student and faculty work.

I carried out an analysis of the Advanced Quantitative Methods seminar electronic conference using the 107 archived messages from the spring 1997 class. These messages were stored on a Zip disk. There were ten students enrolled in the class, of whom about half were active participants in the on-line conference. The students were drawn from all three emphasis areas in the schoolwide doctoral program, with the exception of one student from the Boulder campus.

Samples of student on-line portfolios and scholarly products were available via links from the SOE Web page to Scholarly Publications and Student Home Pages. At the time the investigation was carried out, two doctoral students had approved portfolios on-line. Two more were in the process of constructing on-line portfolios. All but one of the student Web pages belonged to IT students. Most of these pages were used as research management products, a required portfolio item in the doctoral program. Most of the scholarly publications were written by IT faculty and students. Other programs—primarily the High Achieving Classrooms for Minority Students (HACMS) doctoral laboratory participants—were just beginning to link their on-line publications to the SOE Web page.

 

3.6 Data Analysis

 

3.6.1 Survey Data

The data from the 1997 survey were analyzed in the same manner as the 1995 survey (see Sherry, 1997c). The results from the IT sub-sample of the 1997 data were compared with the 1995 data to explore the changes that occurred within IT over two years, during which time the Internet and its associated tools changed significantly.

In 1995, IT students created their own e-mail accounts on the Ouray server. The university did not have a Web server at the time. Two years later, Internet tools were much more varied and transparent to the users. In 1997, students could create e-mail accounts and electronic conferences on CEO. The CEO interface is more user-friendly than PINE and employs a desktop metaphor that is somewhat similar to the Macintosh interface. Students and faculty could also create Web pages on both Carbon and Ouray with customized graphics and hypertext links, thereby enabling them to access and disseminate information, both locally and globally.

Beginning in March 1998, midway through my data collection activities, students and faculty were granted free Web access via the Carbon and Ouray servers. This new service eliminated the inequity in access by financing TCP/IP connectivity with student fees. Moreover, students and faculty were now able to create Web pages on CEO and to access CEO’s full range of Internet tools. With the university network services serving as an Internet service provider (ISP), members of the SOE were finally provided with full Internet and Web access. However, at the time the surveys were conducted, these two options were not yet available.

Questions 1-6 in the 1995 survey, and questions 1-8 in the 1997 survey, addressed demographics. Frequency distributions of the demographic items in Section 1 of both surveys provided a profile of participants’ status in the division or program, access to telecommunications, and usage of e-mail and the Web at the time the surveys were distributed.

The 1997 survey also provided gender information. My original intent was to share these data with another dissertating student who was particularly interested in the gender/technology interaction, in return for cross-checking the coding to furnish some measure of inter-rater reliability. For my study, however, gender information was not used in any of the analyses.

For both the IT and the non-IT samples, I performed a factor analysis with Varimax rotation on the 14 Likert scale items listed under "reasons for use"—question 7 in the 1995 survey and question 9 in the 1997 survey. The analysis showed how the individual reasons for use grouped into factors. Question 8 in the 1995 survey and question 10 in the 1997 survey addressed challenges or barriers to use. I performed a similar factor analysis on the 11 Likert scale items listed under "challenges to use". Question 9 in the 1995 survey and question 11 in the 1997 survey addressed participants’ perceptions of the relative usefulness of suggested supports for training and performance using e-mail and the Web. I reported the frequency distribution of the rank orders of the eight proposed aids and supports in Section 4, "usefulness of aids and supports".
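Although these factor analyses were run in SPSS, the procedure is straightforward to sketch in a modern scripting language. In the sketch below, the data file, the "reason_" column-naming convention, and the choice of four factors are assumptions for illustration only; the appropriate number of factors would be chosen from the data.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

# Assumed layout: one column per Likert item, one row per respondent.
items = pd.read_csv("survey_1997.csv").filter(like="reason_").dropna()

fa = FactorAnalyzer(n_factors=4, rotation="varimax")  # factor count assumed
fa.fit(items)

# Rotated loadings show how the 14 reasons for use group into factors.
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))
print(fa.get_factor_variance())  # variance explained per rotated factor
```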

The final question in each survey collected open-ended comments and suggestions for possible interventions and supports. The 84 responses were typed into an Excel spreadsheet, coded with keywords, and sorted. A frequency distribution of the coded comments revealed the relative importance of factors relating to the six themes under investigation in this study.
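The keyword coding itself was done by hand in Excel. A minimal sketch of the same mechanics, using an illustrative (not actual) keyword scheme, might look like this:

```python
from collections import Counter

# Illustrative keyword scheme; the actual codes were assigned by hand.
KEYWORDS = {
    "training": ("workshop", "training", "course"),
    "support": ("help", "assistant", "mentor"),
    "access": ("modem", "home", "dial", "access"),
}

def code_comment(text):
    """Return every category whose keywords appear in the comment."""
    lowered = text.lower()
    return [category for category, terms in KEYWORDS.items()
            if any(term in lowered for term in terms)]

comments = [
    "More workshops on Web tools, please.",
    "I can never dial in from home without help.",
]
tally = Counter(code for comment in comments for code in code_comment(comment))
print(tally.most_common())  # frequency distribution of coded comments
```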

 

3.6.2 In-Depth Interviews and Focus Group

Interviews with faculty, students, staff members, and one key policy maker were recorded. The cassette tapes were transcribed verbatim by a professional transcribing service. The same procedure was used for the focus group. Each participant signed a consent form that was approved by the Human Subjects Committee. Confidentiality was assured. I contacted each interviewee and offered to provide them with a copy of the transcript for correction and verification. Where tape quality was bad or information was missing, I substituted typed information from my field notes and asked the interviewee to verify its accuracy. I also asked each participant for permission to use his or her statements in the final report.

Although more than twelve students and faculty expressed an interest in being part of this study, the cost of having tapes transcribed professionally limited the sample size for the interviews. With a transcription budget of about $500 and each interview limited to a maximum of 45 minutes, twelve was the maximum number of interviews I could afford.

Following the data analysis protocol in Miles and Huberman (1994), I generated a contact summary sheet after each interview as part of my field notes to refer to if I had any questions about the transcripts. Next, I broke each transcript into ten sections, one section per question. Note that the ten interview questions mirrored my ten research questions. Then I grouped the responses to each question from all of the transcripts, including the focus group, tabbed each section by question number, and stored all of this information in a notebook. Each tabbed section in the transcript notebook thus contained information about one, and only one, research question.

At this point, I went through each section of the notebook, one question at a time. I highlighted important concepts, quotes, and vignettes that I would use when I presented my findings in Chapter 4. From time to time, comments appeared in one question that actually referred to the construct being examined in another question. When this happened, I flagged the comment and cross-referenced it.
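The notebook procedure has a straightforward electronic analogue. The sketch below assumes, purely for illustration, that each transcript is a text file in which responses are delimited by markers such as "Q1:" through "Q10:"; it regroups the responses so that each section holds all answers to one question, just as the tabbed notebook did.

```python
import re
from collections import defaultdict
from pathlib import Path

# Assumed format: each transcript marks responses with "Q1:" ... "Q10:".
sections = defaultdict(list)

for transcript in Path("transcripts").glob("*.txt"):
    text = transcript.read_text()
    # Capture everything between one question marker and the next.
    for match in re.finditer(r"Q(\d+):(.*?)(?=Q\d+:|\Z)", text, re.S):
        number, response = int(match.group(1)), match.group(2).strip()
        sections[number].append((transcript.stem, response))

# Each "tab" now holds every response to one research question.
for number in sorted(sections):
    print(f"Question {number}: {len(sections[number])} responses")
```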

 

3.6.3 Analysis of Electronic Artifacts

Case studies deal with a full variety of evidence, including documents, artifacts, interviews, and observations. For this study, the electronic portfolios and student Web pages constitute the documents and artifacts. The analysis of the Advanced Quantitative Methods on-line conference—another electronic artifact—was intended to substitute for observations in an attempt to achieve triangulation.

 

3.6.3.1 Student Web Pages and Portfolios

Individual student Web pages and portfolios were used for several purposes: (a) as a research management product, a key portfolio product that organizes lists of links relating to research interests; (b) as reflective devices that contain products such as synthesis papers and on-line publications showing insight and thought; and (c) to advertise talents, research interests, and accomplishments to the global Internet community. Student Web pages included lists of published papers, examples of professional leadership and engagement at professional conferences, completed products worthy of graduate students, resources for one’s students, vitae, and the like.

I examined these Web pages to see how closely the content was linked to the student’s course of study, growth in cognitive goals, professional engagement, and personal responsibility for learning over time. I was also interested in finding out why students and faculty were motivated to create their own Web pages, so I e-mailed each author a single question—"What were the reasons (or purposes) why you created your Web page?"—and collected the responses.

 

3.6.3.2 Analysis of an Electronic Conference

In the Spring of 1997, there was an optional CEO conference for the Advanced Quantitative Methods class. As I have mentioned before, all 107 messages in the conference were downloaded, saved, and keyworded. The sender’s identity (ID) code, message subject, response type, date, and other pertinent information were all entered into an Excel spreadsheet to facilitate data analysis.
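The spreadsheet entries were keyed by hand from the archived messages. As a rough illustration of the same tabulation, the sketch below assumes the messages were exported in standard mbox format and uses a crude subject-line rule to assign a response type; the actual response-type coding was done by reading each message.

```python
import csv
import mailbox

rows = []
for msg in mailbox.mbox("aqm_spring1997.mbox"):  # assumed export format
    subject = msg["subject"] or ""
    rows.append({
        "sender_id": msg["from"],  # replaced with anonymous ID codes
        "subject": subject,
        "date": msg["date"],
        # Crude rule for illustration only; hand coding was used in practice.
        "response_type": "reply" if subject.lower().startswith("re:") else "new",
    })

with open("conference_messages.csv", "w", newline="") as out:
    writer = csv.DictWriter(
        out, fieldnames=["sender_id", "subject", "date", "response_type"])
    writer.writeheader()
    writer.writerows(rows)
```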

My intent was to examine students’ attitudes toward using e-mail to facilitate individual and group learning and conceptual change; types of learning strategies that were facilitated by electronic conferencing; and adoption of the Internet as a means to enrich and enhance in-class learning. This conference provided a wealth of information concerning the fifth and sixth themes: individual and group learning, adoption, and conceptual change—the themes that I originally felt would be the most difficult to investigate in this study.

 

3.6.4 Propositions

According to Yin (1994), case studies generalize to propositions, not to other populations. I expected the variation in the use of Internet tools among individuals in the SOE to be related to a number of factors. Tables 3.5 and 3.6 illustrate some propositions that may relate to these factors. These propositions were not simply my personal opinions. Rather, they were gathered from previous research by the Internet Task Force, information gleaned from the 1995 survey data, observations in classes where Internet Task Force members served as mentors, and informal conversations with faculty, staff, and students prior to the current investigation.

 

Table 3.5

Some Propositions that Relate to Questions 1 through 3

Question 1B: For what reasons is the Internet used by the SOE?
Concept: Reasons for Use
Proposition: Primary use is information sharing and dissemination; followed by finding and organizing information; and finally, as a means of communicating with others, provided e-mail communication is perceived as more effective than written messages and phone calls.

Question 1C: What challenges to the use of the Internet are perceived as most important?
Concept: Affective Variables
Proposition: Late adopters are frustrated by the tools and may tend to give up before mastering them. Students may feel uncomfortable writing their thoughts down and presenting them to a public forum such as a seminar conference.

Question 2A: How does the incentive structure of the SOE influence the types and levels of use of the Internet?
Concept: Incentive Structure
Proposition: If there is a lack of incentive structure or persuasion to use the Internet, individuals will not be willing to invest the time to use it. Faculty need to see a clear payoff for the extra time invested, and a balance of that time with other academic activities.

Question 2B: What on-line activities are consonant with the administration’s vision of disciplined inquiry, professional engagement, and professional leadership and commitment by faculty and graduate students?
Concept: Appropriate Activities Consonant with School Norms and Conventions
Proposition: Students are producing electronic portfolios and Web pages as research management products. Faculty tend to use e-mail effectively, but lag behind students in disseminating scholarly products on-line, partly because of a fear of compromising their intellectual property rights.

Question 3A: What improvements to the University and the SOE network’s human-computer interface (HCI) design and available Internet tools are suggested by new and continuing users?
Concept: Design and Tools
Proposition: New users would like to see a more user-friendly interface on the UNIX servers, similar to the usability of CEO messaging. Some students still have difficulty installing CEO on their home computers. If Web access is required, ISP cost could become a problem.

 

 

Table 3.6

Some Propositions that Relate to Questions 4 through 6

Question 4A: What changes to the UCD and the SOE’s communication and support structure are thought to be most helpful to overcome barriers and support appropriate Internet use?
Concept: Communication and Support Structure
Proposition: To get faculty on-line requires more technical support and individual mentoring than is currently available. Students would like to see more graduate assistants in the labs, who are knowledgeable in the tools and applications that they are expected to master and available to assist them.

Question 4B: How does the way that SOE members are joined to communication channels and other individuals influence their use of the Internet?
Concept: Social Influence Factors, Communication Channels
Proposition: Students who are sociometrically close (as in a cohort in a specific program) will tend to adopt as a group because of social influence. Some individuals are not aware of the workshops and training that are currently available to them.

Question 5A: How do activities involving the use of Internet tools impact individual learning, adoption, and conceptual change?
Concept: Individual Conceptual Change
Proposition: Students who participate frequently in on-line conferencing and messaging are actively practicing higher order thinking skills and generative learning activities. Students who "lurk" may also learn by reading the content of the on-line messages, but their ability to generate meaning and their regard for their own and their classmates’ knowledge may not increase as fast as that of the more active participants.

Question 6A: How does individual learning, adoption, and conceptual change influence the other members of the community to which these individuals are culturally linked?
Concept: Group Conceptual Change
Proposition: Faculty who think of themselves as being knowledgeable about technology and making effective use of it have most likely incorporated technology into their sense of who they are and what they do in relation to their jobs, to a greater extent than faculty who are reluctant users of technology. They may not necessarily act as change agents for their peers.


3.7 Strengths and Limitations of the Study

I am aware of the striking parallel between my conceptual framework and Activity Theory. I also found that my research methods were consistent with Jonassen and Murphy's (1998) suggestions, presented in their AECT paper, for analyzing activity systems as learning environments. Jonassen and Murphy offered three specific pieces of "advice to researchers", namely:

1. The research time frame should be long enough to understand the objects of activity, their changes over time, and their relation to objects in other settings;

2. Analysts should pay attention first to broad patterns of activity before considering narrow episodic fragments; and

3. Analysts should use varied data collection methods and points of view to understand the system from different perspectives (p. 14).

I interpreted Jonassen and Murphy's second statement as follows: (a) first use quantitative methods with large values of N to examine broad patterns of activity; (b) then focus on specific concepts with qualitative methods, including in-depth interviews and open-ended questions. This is a method that I have used successfully in my evaluation of the BVIP, and it seemed appropriate to apply it to the current study.

Since this study investigates how the Internet diffuses through the UCD SOE over time and across programs, there are several constraints on the generalizability of the results. The unrepresentative 1995 sample, which was restricted to IT faculty and students, limited the generalizability of the initial findings. The stratified sample of the entire SOE in the 1997 survey was intended to alleviate this problem.

Since there was only a single case, I could not use replication logic to strengthen external validity. Case studies generalize to propositions, not to other populations. Internal validity, however, was strengthened by pattern matching across the diverse data sources shown in the data collection matrix (Table 3.2), leading to possible explanations for the results. Internal validity was further strengthened by disseminating and collecting all 278 of the 1997 survey responses within a two-month time frame.

Construct validity is always a problem when dealing with affective measures. Here, it was strengthened by having the original survey instrument evaluated and revised by a team of experts (the Internet Task Force) who were able to judge the extent to which the sample of items represented the defined domain. These individuals were experts in using Internet tools, had access from home or work, were familiar with the CINS computer systems, and brought multiple perspectives to the evaluation and revision of the survey instrument.

I also attempted to tie my research questions and propositions directly to my conceptual framework, which has both an empirical and a theoretical base. To try to achieve construct validity, I used multiple sources of evidence with converging lines of inquiry. I also suggested that key informants review both their interview transcripts and summary drafts of the interim data analysis, although only one interviewee examined her transcript thoroughly.

Reliability was strengthened by using a predefined protocol for the guided interviews and the focus group. The protocol was highly structured and included a large number of probes that were listed in the instrument itself. At times, these questions and probes were supplemented with additional questions suggested by Hall and Hord (1987). All participants had an opportunity to review the questions prior to the interview or focus group.

For this case study, triangulation of disparate data sources was important. If one data collection activity failed to replicate the results of another that addressed the same research question, then those findings might not be reliable.


3.7.1 Unit of Analysis and Its Boundaries

The main unit of analysis was the SOE, as distinct from other schools within the university and from schools of education at other universities. The SOE is an urban commuter school, so these results may not apply to residential campuses. The university has tended to lag behind other large universities in technology implementation, so the rate of adoption may differ from that of universities that have already integrated Internet and Web use into their course requirements. The SOE is also a graduate school with mature adult learners, so these findings may not apply to undergraduates.

For the focus group, the unit of analysis was the Spring 1998 internship cohort of the School Psychology Program.


3.7.2 Lack of Direct Observations

In addition to interviews and an analysis of artifacts, case studies usually employ direct observations as a data collection method in an attempt to achieve triangulation. Since observations were not appropriate in this case, and would most likely have provided no useful information, I analyzed an on-line conference instead, intending it to compensate for the lack of observations.


3.7.3 Ambiguous Data

Despite my best intentions, there were still ambiguities in some of the questions in the 1997 survey instrument that might have been remedied had I conducted a pilot test of the instrument before distributing it to the entire sample of the SOE. For example, staff and faculty do not have academic advisors, so the question about advisors was irrelevant to them; some wrote "does not apply" rather than circling a number on the Likert scale. Also, my suggestion for "interactive computer demos" referred to interactive computer-aided instruction (CAI), whereas some respondents interpreted it as "interactive demonstrations of computer software by real people", thereby contributing to the unreliability of the item. This distinction only became clear in the in-depth interviews.

Regarding the ranking of aids and supports, some participants still did not read the directions. Some ranked the items, some rated them, and some apparently reversed the scale, even though the directions were highlighted and spelled out in detail. If this instrument is to be used again, the question that deals with the usefulness of aids and supports will have to be reworded.


3.7.4 Non-Exhaustive Collection of Relevant Evidence

Given the constraints of time and resources, it was not possible to interview members of all five of Rogers' adopter categories, ranging from innovators to laggards. Although I had initially contacted some "resisters", especially those who had the power to make decisions, allocate resources, or persuade others of their beliefs, none of the faculty or students whom I had wished to include were willing or able to participate.

I hoped that listening to the voices of a limited number of faculty and students who exemplify the middle and both tails of this spectrum would yield sufficient information to make explicit the tacit beliefs, values, and assumptions of the late adopters; to achieve representativeness among the interviewees; and to permit me to make judicious recommendations based on multiple perspectives.


3.7.5 Limitations on Reliability of Coding Open-Ended Comments

Since I did not have a second rater to check my coding of the open-ended survey comments and the electronic conference messages, reliability was a concern. At RMC Research Corporation, I rely on an independent coder to check a random sample of my analyzed data, and vice versa, a strategy that I had originally planned to use here but that, in the end, was no longer available to me. In place of inter-rater reliability, I substituted stability over time (intra-rater reliability), which still leaves an element of subjectivity in the coding process.

I coded the 1997 survey comments on a first pass, using the types of categories that I had found in the literature (such as time, access, resources, technical support, and expertise) and themes that emerged from the data themselves (such as communication). Then I waited about a month, so that I could approach the data fresh, and recoded all the survey comments. When a given comment did not fit into the same category both times, I examined it more carefully and placed it where I felt it belonged on the second pass. Six of the 84 comments were recoded, yielding an agreement of 78 out of 84, or an intra-rater reliability of approximately 93%.
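The agreement calculation itself is simple enough to express in a few lines of code. The following is a minimal sketch in Python; the category labels and the split of the data are hypothetical stand-ins chosen only to reproduce the 78/84 arithmetic, not the actual survey codes:

    def intra_rater_agreement(pass1, pass2):
        """Proportion of items assigned to the same category on both passes."""
        if len(pass1) != len(pass2):
            raise ValueError("Both passes must code the same items")
        matches = sum(1 for a, b in zip(pass1, pass2) if a == b)
        return matches / len(pass1)

    # Illustrative stand-in data: 84 comments, 6 of which changed
    # category on the second pass (labels are hypothetical).
    first_pass  = ["time"] * 40 + ["access"] * 38 + ["resources"] * 6
    second_pass = ["time"] * 40 + ["access"] * 38 + ["communication"] * 6

    print(f"{intra_rater_agreement(first_pass, second_pass):.0%}")  # -> 93%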

The coding process for the on-line conference was not as straightforward. It took several passes to cut down the number of categories (response types) at the same time as I was attempting to classify the 107 messages into the then-existing categories. The reason for the increased complexity of the coding is this: all of the survey comments were short sentences carrying a single type of message, whereas many of the on-line conference messages were long and contained several types of information. For each message, I tried to identify the most important response type, which was a rather subjective process. On my first pass, I came out with too many categories (about 20), so I collapsed them on a second pass, thereby reducing the number of keywords to 13. For example, any messages that had to do with the assigned journal articles were grouped under "reading evaluations", and any messages that were not directly related to a curriculum topic were grouped under "social interactions". Under the main categories of request, respond, and acknowledge, I saw many separate keywords, such as "ask question" vs. "ask for feedback"; "give clarification" vs. "give feedback" vs. "attempt answer"; and "support/share" vs. "I understand". Moreover, "comments" did not fall under any of these three main headings.
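This collapsing step can be pictured as a keyword-to-category mapping. In the sketch below, the quoted target labels ("reading evaluations", "social interactions", "ask for feedback", and so on) come from the study, but the source keywords and the mapping itself are illustrative assumptions rather than my actual coding scheme:

    # Hypothetical mapping from fine-grained first-pass keywords onto
    # the reduced set of response types; keywords without an entry
    # pass through unchanged.
    collapse = {
        "journal article summary":  "reading evaluations",
        "journal article critique": "reading evaluations",
        "off-topic greeting":       "social interactions",
        "humor":                    "social interactions",
        "request feedback":         "ask for feedback",
        "offer clarification":      "give clarification",
        "try an answer":            "attempt answer",
    }

    def recode(labels):
        """Collapse first-pass labels, leaving unmapped labels as-is."""
        return [collapse.get(label, label) for label in labels]

    messages = ["journal article summary", "humor", "ask question"]
    print(recode(messages))
    # -> ['reading evaluations', 'social interactions', 'ask question']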

Using the same strategy that I had used with the 1997 survey comments, I waited about two weeks and recoded all 107 messages. On this third pass, I attempted to distinguish clearly between "report progress on final project", "attempt answer", "answer question", and similar messages that fell into the "respond" category rather than the questioning or acknowledging categories. At this point, I felt that the subtle differences between these keywords were clear enough that I could consider this third pass my last. Throughout this dynamic and evolving process, I constantly reworked the spreadsheet that I used to calculate the distribution of response types. Because the category set itself changed from pass to pass, I cannot calculate an intra-rater reliability value for this coding process.
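The tally that the spreadsheet performed is easy to reconstruct in outline. The sketch below, with hypothetical labels and counts standing in for the 107 coded messages and 13 response types, shows how a list of final codes becomes a distribution of response types:

    from collections import Counter

    # Hypothetical final codes for illustration only.
    final_codes = (["reading evaluations"] * 4 +
                   ["attempt answer"] * 3 +
                   ["social interactions"] * 2 +
                   ["ask question"])

    distribution = Counter(final_codes)
    total = sum(distribution.values())
    for label, count in distribution.most_common():
        print(f"{label:20s} {count:3d}  {count / total:6.1%}")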

As I have mentioned before, 20/20 hindsight is always revealing. The limitations of this study represent areas that I have been addressing in my ongoing professional research at RMC Research Corporation, especially as principal investigator for the Vermont WEB Project, another innovative project that uses CMC for the critique and improvement of student works-in-progress, and one that I am investigating with similar case study methods.