Sadavage, "A Framework for Analyzing the Alternative Methodologies for Investigating the Effectiveness of Hypermedia." Arachnet Electronic Journal on Virtual Culture v2n04 (September 27, 1994) URL = http://hegel.lib.ncsu.edu/stacks/serials/aejvc/aejvc-v2n04-sadavage-framework

The Arachnet Electronic Journal on Virtual Culture
__________________________________________________________________
ISSN 1068-5723          September 27, 1994          Volume 2 Issue 4

EJVCV2N4 SADAVAGE

A Framework for Analyzing the Alternative Methodologies for Investigating the Effectiveness of Hypermedia

Gary Sadavage
Lehigh University
ges0@Lehigh.EDU

Abstract

This article proposes a framework for analyzing the efficacy of the evaluation methodologies used in examining hypermedia. An examination of current research suggests that more systematic and reliable methods need to be established for utilizing the test instruments at our disposal. An examination of the evaluation practices currently in place may lay an initial foundation from which we as instructional technology professionals can build stronger research paradigms. I encourage researchers to use, in part or in whole, any of the techniques suggested in this article and not to limit themselves to the small scope of the initial framework proposed.

1.0 Introduction

"Each time a new technology is applied to teaching and learning, questions about the fundamental principles and methods arise" (Marchionini, 1990, p. 355). These questions now confront practitioners of educational technology, who must respond to them and transform learning theory into practical applications of instruction (Clark, 1989). The delivery medium of hypermedia promises great hope for putting theory into practice, but at the potential price of raising more questions than it answers. Although hypermedia has been used extensively for information retrieval and collaborative problem-solving applications, "its efficacy is neither established nor without likely problems" (Jonassen & Grabinger, 1990, p. 4). The establishment of sound evaluation techniques would not only serve to validate current practices in applying hypermedia for learning but would begin to forge the link between theory and application.

Although there are many anecdotal reports of hypermedia use, there has been little hard research performed on the features of hypermedia which lead to improved learning (Knussen, Tanner, & Kibby, 1991). The research that is available suggests that strong parallels exist between present evaluation practices in hypermedia and "established" practices from early efforts in computer-aided instruction (Roblyer, Castine, & King, 1988). Researchers such as Quentin-Baxter & Dewhurst (1992) have suggested that, due to the unique nature of hypermedia, current evaluation strategies may not work. "As a fundamentally new concept for presenting computer assisted learning material, hypermedia has stepped into a previously unoccupied niche...Evaluating the effect of learning via hypermedia-based computer-assisted learning [CAL] may be more difficult than for other CAL material..." (Quentin-Baxter & Dewhurst, 1992, p. 179).

The evaluation methodologies which are available to researchers are many and varied, each with advantages and disadvantages (Knussen, Tanner, & Kibby, 1991).
Although there are many variations, the methodologies may generally be described as quantitative or qualitative in nature. It may be helpful at this point to briefly examine some issues of validity regarding each type.

The majority of studies to date have utilized empirical, quantitative designs (Nelson, 1992). These methodologies require that the researcher formulate and propose a hypothesis before beginning the study (Wiersma, 1991). This requires that certain conditions of the learning environment be known and controllable. By its very nature, the hypermedia learning environment is interactive and exploratory. "That is, such an environment requires the input or action of the user, and the user directs his/her own path through the environment" (Thompson, Simonson, & Hargrave, 1992, p. 57). The rapid evolution of hypermedia learning systems may have created a situation in which it is not possible to meet the criteria required to do effective quantitative research (Nelson, 1992).

Naturalistic or qualitative research, in many ways, can be considered the opposite of quantitative or empirical research. "Empirical research, based on scientific empiricism, seeks to explain the cause-and-effect of phenomena...On the other hand, naturalistic research attempts to describe a phenomena as it occurs in its natural setting in order to draw inferences that have explanatory value" (Thompson, Simonson, & Hargrave, 1992, p. 21). These methodologies do not presuppose that a hypothesis or preconceived theory be formed before research is performed (Wiersma, 1991). Since research is conducted in a natural setting, there is no manipulation of variables or externally imposed structure on the learning environment. Knussen, Tanner, & Kibby (1991) have noted the following factors which interact and affect the learning environment:

1) personal factors;
2) environmental or situational factors;
3) the needs and interactions of the users.

This suggests that the results from descriptive studies are dependent upon a particular population in a particular setting and thus may not have a high degree of external validity.

Within both realms of quantitative and qualitative methodologies there are a variety of instruments which may be used to evaluate learning in general. As has already been suggested, there are inherent difficulties with each research strategy which may make it difficult to establish baseline data for hypermediated learning (Marchionini, 1990). The question may be asked, "Are any of these methodologies appropriate for evaluating the unique construction of a hypermedia environment?" The flow and evolution of our knowledge base about hypermedia may be hindered by the limitations of our assessment tools. Driscoll (1984) has stated that the problems our research attempts to answer should not be delimited by the construction of our research paradigms. The solution to these problems may be a new approach to evaluation (Marchionini, 1990). Accordingly, we need to identify which instruments may potentially answer which questions about learning.

2.0 Background: Need for Study

As Clark (1989) has suggested, studies conducted in instructional technology have increased in "...number, ecological validity, treatment duration, and statistical complexity" (p. 57).
He asserts that these quantity indicators do not necessarily support the quality judgments being made, and suggests that more time should be spent on the front end of design, in conceptualizing the research to be performed. He states a need for researchers "...to commit themselves to the time and energy required to thoroughly survey and master all relevant research literature in the problem areas they select for investigation" (p. 65).

Brandon (1988) reviewed the literature on the effects of instructional hardware and software. He asserted that although research on the effectiveness of instructional software has shown some benefits for using computers, the research has been flawed. He claimed that the evaluation techniques used precluded the results from generalizing to other educational interventions in other settings.

Marchionini (1990) suggested a multi-faceted approach for the evaluation of hypermedia-based learning. He proposed five separate but interrelated criteria for evaluating hypermedia software: environmental requirements (e.g., cost and hardware requirements), capability (program features), usability (interface design), reliability, and performance. He also suggested that individual methods may be needed if proper controls for the newness of the medium are to be employed. In addition, "learner-directed (created) performance tests can yield generalizable results if large enough numbers of results can be collected and compared."

Romiszowski (1990) asserted that there are well-tested principles in the more traditional instructional situations. He warns developers who create novel evaluation tools for hypermedia, on the grounds of the newness of the medium, that those tools may not be supported by research. He states that "we are still in the potential solution-seeks-compatible-problem stage of development."

Morariu (1988) stated that in education and training we measure the effectiveness of the instructional design by assessing the difference between what was known prior to the treatment and what was learned afterwards. He questioned the presupposition that the information and tasks in the hypermedia treatment are discernible and measurable. To determine whether they are measurable, "...it is necessary to define the context and content of the learning experiences and their outcome-oriented objectives." He additionally suggested that it may be difficult to apply a measurement mechanism that correlates with the level of learning the user has attained.
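To make Morariu's pre/post comparison concrete, a minimal sketch follows (in Python, a language chosen here purely for illustration). The scores and the normalized-gain formula are hypothetical assumptions, not data or methods from any study cited above; the sketch merely computes the before/after difference he describes.

    # A minimal, hypothetical sketch of Morariu's pre/post comparison.
    # Scores are illustrative (0-100 scale), not taken from any study.

    def raw_gain(pre: float, post: float) -> float:
        """What was learned: posttest minus pretest."""
        return post - pre

    def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
        """Gain as a fraction of the gain still available at pretest."""
        available = max_score - pre
        return raw_gain(pre, post) / available if available else 0.0

    # Hypothetical learners: (pretest, posttest)
    scores = [(40.0, 70.0), (55.0, 80.0), (25.0, 60.0)]
    gains = [raw_gain(pre, post) for pre, post in scores]
    norms = [normalized_gain(pre, post) for pre, post in scores]
    print(f"mean raw gain: {sum(gains) / len(gains):.1f}")
    print(f"mean normalized gain: {sum(norms) / len(norms):.2f}")

Even this trivial calculation presumes what Morariu questions: that the pretest and posttest actually sample the information and tasks the hypermedia treatment contains.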
2.1 Suggested Purpose of the Study

The suggested purpose of this study is to investigate the methodologies employed in evaluating the effectiveness of learning in a hypermedia environment. It does so through a discussion of the context in which a number of assessment instruments have been utilized in various research methodologies; an appraisal of the learning outcomes the instruments are suggested to assess, as identified by Gagné (1988); a summary of the key features of each instrument in determining the learning outcome; and a critical analysis of the appropriateness of each methodology for evaluating the effectiveness of hypermedia for learning.

[Note: The advent of increasingly sophisticated graphics and simulations now makes it possible to replicate the look and feel of many real work environments (Knirk & Christinaz, 1990). This suggests that a hypermedia system may find useful application in such non-traditional computer areas as the teaching of motor skills.]

The proposed research is suggested to be a prescriptive and analytic study. There would be no predetermined hypothesis to be tested. This study "...is not designed to support or refute particular theoretical positions, as would be the case in traditional research, but rather to contribute to recommendations for action" (Lehigh University College of Education, 1989, p. 4). Although there is no explicit question or hypothesis stated, the statement of purpose implies several questions, among them:

o Is any methodology taken as a whole suited for analyzing the effectiveness of learning in a hypermedia environment?

o Are there elements of each methodology which may be useful in answering a specific question about learning in a hypermedia environment?

o Which learning outcomes (as identified by Gagné, 1988) do the assessment instruments address most effectively?

o Which features of each assessment instrument make it useful for evaluating a specific learning outcome?

2.2 Hypermedia Defined

Hypermedia, in broad terms, is the delivery of information in forms that go beyond traditional list management and database report methods (Sculley, 1988). It extends nonlinear representation and access to graphics, sound, animation, and other forms of information transfer (Marchionini, 1988). It encompasses an array of hardware and software which empowers the user to change the pattern of information accessibility (Perry, 1987). In the context of this paper, the term hypermedia will not refer to a particular software or hardware product. A hypermedia system for this study consists of text, audio, graphics, and still and/or motion video controlled by a computer through a software interface which provides non-linear accessibility. A hypermedia environment is simply a hypermedia system in conjunction with a human user and the interactions and learning that result. The quality of learning that takes place in a hypermedia environment relates directly to the interactions between the human and the hypermedia system. The nature of the environment will change with each different human user.
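The definition above (typed media nodes joined by arbitrary links under software control) can be expressed as a simple data structure. The following Python sketch is one possible reading of that definition, with illustrative names throughout; it describes no particular product:

    # A minimal sketch of the hypermedia system defined above:
    # typed media nodes (text, audio, graphics, video) joined by
    # arbitrary links, so any node may be reached non-linearly.
    from dataclasses import dataclass, field

    @dataclass
    class MediaNode:
        node_id: str
        media_type: str      # "text" | "audio" | "graphic" | "video"
        content: str         # payload or file reference
        links: list = field(default_factory=list)  # ids of linked nodes

    class HypermediaSystem:
        def __init__(self):
            self.nodes = {}

        def add_node(self, node: MediaNode) -> None:
            self.nodes[node.node_id] = node

        def link(self, src: str, dst: str) -> None:
            """Non-linear access: any node may point at any other."""
            self.nodes[src].links.append(dst)

        def follow(self, node_id: str) -> list:
            """Nodes reachable in one step; the user chooses the path."""
            return [self.nodes[n] for n in self.nodes[node_id].links]

    # Illustrative use: a text node linking to a diagram and a clip.
    system = HypermediaSystem()
    system.add_node(MediaNode("intro", "text", "Overview of the topic"))
    system.add_node(MediaNode("fig1", "graphic", "diagram.png"))
    system.add_node(MediaNode("clip1", "video", "demo.mov"))
    system.link("intro", "fig1")
    system.link("intro", "clip1")

The essential property is that link() may connect any node to any other, so the user rather than the author determines the traversal order.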
3.0 Where Others Have Tread: Analyzing the Effectiveness of Hypermedia

3.1 The Role of Research Paradigms

Driscoll (1984) suggests that the paradigms that guide our research will naturally delimit the problems we can attempt to solve. In scientific arenas where the field is firmly established, one research paradigm will typically dominate. This is not the case with developing sciences, in which several contrasting paradigms may compete to establish themselves as the dominant methodology. This is stated to be the current situation for the developing science of instructional systems.

Driscoll states that early practitioners in instructional systems drew from the theoretical basis of several fields, including psychology and information systems. This was done so that developmental efforts could be directed at building upon an already established research platform. As the body of research grew, it became apparent that a shift in theoretical and research paradigms would have to occur to address the questions not answered by current methodologies. Methods of experimental or quasi-experimental design did not seem to answer the noncausal questions raised by emerging instructional systems. "It is not always possible in instructional research to randomly assign individual students to treatment conditions or to assign some students to receive a particular treatment which others will not receive" (p. 3). However, a variety of alternative methodologies, ranging from meta-analysis to ethnography, are available which may enable us to investigate problems delimited by other methods. These and other research paradigms may indicate the new direction educational technology should be heading.

3.2 Synergistic Approach to Evaluation

Knussen, Tanner, & Kibby (1991) attack the problem of evaluating hypermedia from a more pragmatic standpoint. They suggest that requirements such as time, personnel, and cost should be factored into the design of the evaluation. However, the primary emphasis should be placed upon the objectives, context, and purpose of the evaluation as determined by the evaluators. They state that, "In general, evaluators will only find what they are looking for, and they must therefore be prepared to change techniques in the light of changing requirements" (p. 22).

Knussen, Tanner, & Kibby additionally suggest that a synergistic approach be taken in selecting an evaluation technique. Certain techniques may be more effective when used in conjunction with complementary techniques (e.g., observation with automated collection of the user's interaction with a system). It is suggested that evaluators not be bounded by conventional thinking in developing an evaluation strategy. An examination of the parameters and aims of the evaluation will facilitate the choice of methodology.

3.3 Contextual and Content Evaluation

Quentin-Baxter & Dewhurst (1992) indicate that the difficulties encountered in evaluating traditional computer assisted learning materials are magnified in the realm of a hypermedia environment. They state that these difficulties are due to the unique construction of hypermedia, which allows users a great deal of control in exploring the information base. Because the learning is individualistic, they note that "...it is difficult to compare learning per se, since it cannot be assumed that students have witnessed the same subject information" (p. 179).

Quentin-Baxter & Dewhurst suggest that a two-pronged approach be taken in evaluating hypermedia materials. One prong consists of evaluating the efficiency of the program with regard to the user interface and functional design. They suggest that an efficient system is one which provides users with the best opportunity to obtain all the relevant subject information that the system is able to offer. The second prong is targeted at examining the subject matter of the hypermedia system. It is suggested that transparent methods of data collection be used to quantify what subject matter information is accessed by the user, and how often.
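One possible reading of such transparent collection is a tally, kept behind the interface, of which subject nodes a user visits and how often. The sketch below is a hypothetical Python illustration; the class and method names are assumptions, not Quentin-Baxter & Dewhurst's instrument:

    # A sketch of transparent data collection: tally which subject
    # nodes a user visits and how often, without interrupting them.
    from collections import Counter

    class AccessLog:
        def __init__(self, all_nodes):
            self.all_nodes = set(all_nodes)   # the full information base
            self.visits = Counter()

        def record(self, node_id: str) -> None:
            """Called by the interface each time a node is displayed."""
            self.visits[node_id] += 1

        def coverage(self) -> float:
            """Fraction of the information base the user has witnessed."""
            return len(self.visits) / len(self.all_nodes)

        def unseen(self):
            """Subject matter the user never accessed."""
            return self.all_nodes - set(self.visits)

    log = AccessLog(["intro", "fig1", "clip1", "summary"])
    for node in ["intro", "fig1", "intro", "clip1"]:
        log.record(node)
    print(log.visits.most_common())   # what was accessed, and how often
    print(f"coverage: {log.coverage():.0%}; unseen: {log.unseen()}")

A coverage figure of this kind speaks directly to their caution that students cannot be assumed to have witnessed the same subject information.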
3.4 Role of Auditing Techniques

Misanchuk & Schwier (1991) discuss the use of audit trails for analyzing the user's learning experience with interactive media. They state that audit trails may prove useful for determining the effects of taking different paths through the instruction. They have experimented with a number of different ways of displaying the information collected from audit trails, including linear, branching, and hypermedia structures. They state that there are three distinct purposes for which audit trails may be used:

1) as data collection devices for formative evaluation in instructional design;

2) as tools for basic research into the instructional design of computer-based instruction and hypermedia;

3) as a means of auditing usage of mediated presentations in a public forum (p. 1).

They acknowledge that "...collecting massive amounts of data is one thing; making sense of the data is quite another" (p. 13).
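An audit trail, in this sense, is a time-stamped record of each user action, written out as the session proceeds. The sketch below is a hypothetical Python illustration of such a trail (the event fields and file format are assumptions, not Misanchuk & Schwier's design); the reconstruction step hints at why making sense of the data is the harder problem:

    # A sketch of an audit trail: a time-stamped record of each
    # navigation event, appended as the session proceeds.
    import json
    import time

    class AuditTrail:
        def __init__(self, user_id: str, path: str = "trail.jsonl"):
            self.user_id = user_id
            self.path = path

        def log(self, node_id: str, action: str) -> None:
            """Record one event, e.g. a link followed or a node closed."""
            event = {
                "user": self.user_id,
                "node": node_id,
                "action": action,       # e.g. "enter", "exit", "search"
                "time": time.time(),
            }
            with open(self.path, "a") as f:
                f.write(json.dumps(event) + "\n")

    def reconstruct_path(path: str = "trail.jsonl"):
        """Replay the trail: which nodes were entered, in what order."""
        with open(path) as f:
            events = [json.loads(line) for line in f]
        return [e["node"] for e in events if e["action"] == "enter"]

    trail = AuditTrail("student_01")
    trail.log("intro", "enter")
    trail.log("fig1", "enter")
    print(reconstruct_path())   # ['intro', 'fig1']

Reductions like reconstruct_path() are only a first step; the analytic burden Misanchuk & Schwier describe lies in the interpretation that follows.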
Bigum & Gilding (1985) state that few studies concentrate on the interaction between learner and computer. They are specifically interested in the way children learn science using computer-based learning programs. They suggest using a video monitoring technique which produces a synchronized record of the computer output and the children's use of the computer program. They have found the following advantages and disadvantages to using this technique (p. 99):

Advantages
----------
1) The technique produces a synchronized record of the program output and students' use of the program.
2) A permanent record can be analyzed carefully and away from the studio environment. Little information is lost.
3) Most of the students' activities and all their conversations are recorded, which results in a more complete record than other techniques produce.

Disadvantages
-------------
1) The technique requires considerable time to set up and prepare if a studio cannot be left in a state that is arranged for this purpose.
2) A TV studio or similar environment is a potential source of distraction to the students.
3) The analysis and reporting of the data on the tapes is not easy and is particularly time consuming.

4.0 Description of General Analytic Methodology

The framework from which analytic research takes place is different from traditional quasi-experimental or qualitative methods. A major facet of analytic research is that the events under study have already occurred in their own setting rather than in one contrived for the research. Special attention must therefore be placed on interpreting the information from selected studies from within the context of the original research setting.

"The context of the event must be emphasized in its interpretation. Interpretation takes on special importance... because the events have occurred, and they occurred before the decision was made to study them. As documents were produced... interpretation was involved in preparing the document. As the researcher uses the document, interpretation again takes place. The researcher discovers data as the search is conducted through documents and other sources" (Wiersma, 1991, p. 203).

The analytic method can therefore be seen to contrast sharply with quantitative and qualitative methods, where the researcher is actively involved in collecting and producing data.

A brief description of the general process for conducting analytic research may be helpful at this point. The following is a synthesis of Wiersma's (1991) model, which describes the process in four major phases:

Phase 1 -> Phase 2 -> Phase 3 -> Phase 4

Phase 1: Identification of the Research Problem
Phase 2: Collection and Evaluation of Source Materials
Phase 3: Synthesis of Information from Source Materials
Phase 4: Analysis, Interpretation, and Formulation of Conclusions

Phase 1 of the model states the research problem in terms of the purpose of the research, without any explicitly stated hypotheses or questions.

Phase 2 consists of collecting documents for the information base, but "...does not consist of simply assembling all available documents that appear to have some relevance to the research problem" (Wiersma, 1991, p. 208). The document in question must survive not only the external criticism (is the document authentic?) but, more importantly, the internal criticism of the researcher (is its content accurate and credible?).

Phase 3 places the relative value of the sources of information under scrutiny. "Central ideas must be pulled together and continuity between them developed" (Wiersma, 1991, p. 211).

Phase 4 is characterized by logical analysis of the information from the documents, interpretation of information from the collected documents, and then formulation of conclusions about the research problem.

4.1 Description of Applied Analytic Methodology

The goal of the proposed research is to investigate the effectiveness of the methodologies employed for evaluating learning in a hypermedia environment. It is hoped that the results of this research will be useful in the preparation of evaluation strategies for future studies. It is also hoped that this research will encourage others to spend more time in the conceptualization stage of their research (Clark, 1989).

The research method consists of identifying and collecting the research studies which meet the criteria for hypermedia as defined earlier. It is important that the study under inspection possess all of the attributes described in the definitions section if this research is to be meaningful. Otherwise the research may potentially suffer from the same criticisms as intra-medium studies; that is, the learner's perception of the learning environment may be affected by the kinds and presentation of information (Thompson, Simonson, & Hargrave, 1992). By selecting the medium carefully and applying the selection standard rigorously, it is hoped that the degree of external validity will increase.

The studies that meet the criteria of the hypermedia definition must also be research based and not anecdotal in nature. As has already been indicated, there have been many reports of hypermedia use but little hard research on the features which lead to improved learning (Knussen, Tanner, & Kibby, 1991). A synthesis of information from studies conducted in a research context will yield more useful results than an assemblage of all available documents (Wiersma, 1991).

The studies which survive the above criteria will serve as the information base for the rest of the research. The general methodology and related assessment instrument(s) for each study will be synthesized and placed in tabular form (see below for the minimal categories to be considered). Special attention will be given to the context in which the assessment instruments were applied and utilized. This will be done to respond to the criticisms of Clark (1986), who called for establishing external validity in meta-analytic type studies. In comparing the methodologies and their related instruments, attention must be given to the who, what, where, and when of the study (Wiersma, 1991).

Categories
----------
Principal Investigator and Date of Study
Target Audience of Instruction
Content of Instruction
Time Spent Using Hypermedia
Methodology(s) Used for Evaluation
Relevant Instrument Used with Each Methodology
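These categories could be captured as one structured record per surviving study. The following Python sketch is hypothetical: the field names simply mirror the categories above, and the sample entry is invented for illustration.

    # A sketch of the tabular synthesis: one record per surviving
    # study, with fields mirroring the categories listed above.
    from dataclasses import dataclass

    @dataclass
    class StudyRecord:
        investigator: str        # principal investigator
        date: int                # date of study
        audience: str            # target audience of instruction
        content: str             # content of instruction
        hours_on_system: float   # time spent using hypermedia
        methodologies: list      # methodology(s) used for evaluation
        instruments: dict        # methodology -> instrument(s) used

    # Hypothetical entry, for illustration only.
    studies = [
        StudyRecord(
            investigator="Smith", date=1991,
            audience="undergraduate biology students",
            content="cell physiology",
            hours_on_system=6.0,
            methodologies=["quasi-experimental"],
            instruments={"quasi-experimental": ["multiple-choice posttest"]},
        ),
    ]

    for s in studies:
        for method in s.methodologies:
            print(s.investigator, s.date, method, s.instruments[method])

Holding the who, what, where, and when in one record per study keeps the contextual information attached to each methodology and instrument as the later analyses proceed.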
Attention will then be focused on analyzing the learning outcomes each methodology purports to assess. The learning outcomes to be identified are those described by Gagné (1988):

Type of Outcome          Examples of Performance
------------------       -----------------------
Intellectual Skill       Demonstrating, using symbols, as in the following:
  Concrete Concept         Identifying a square; the edge of an object
  Defined Concept          Classifying a fortress, using a definition
  Rule                     Demonstrating the procedure of expressing a
                           mixed number as a fraction
  Higher-Order Rule        Generating a rule for finding the length of
                           the diagonal of a rectangle
Cognitive Strategy       Using an efficient method for remembering the
                         contents of a picture
Verbal Information       Stating what happened to the Titanic
Motor Skill              Catching a fly ball
Attitude                 Choosing swimming as exercise

Since Gagné first described the learning outcomes in 1984, much progress has been made in computer technology. It may now be possible to use a hypermedia system to teach such non-traditional computer areas as motor skills (Knirk & Christinaz, 1990). An examination made in light of this new capability may yield surprising results. Care will be taken to cross-reference the results with the contextual information synthesized from the previous categorical analysis. It is suggested that this relationship of information be maintained throughout the study, since each methodology or instrument may be applied successfully to different learning outcomes. By maintaining this dependence, it is hoped that a greater degree of external validity will be achieved (a sketch of such a cross-reference follows the questions below).

The key features of each methodology's instrument(s) relative to determining the learning outcome will then be summarized. The information will be presented in tabular fashion. Care should once again be taken to cross-reference the summary with the results obtained earlier. It is hoped that the final synthesis and interpretation may reveal facets of each instrument not apparent when viewed from within the context of a single study.

Finally, the results of the earlier analysis will be used to determine the appropriateness of each methodology for evaluating the effectiveness of hypermedia for learning. This analysis and interpretation will be presented in narrative fashion. The conclusions reached will, at a minimum, address the questions suggested in the "Purpose of Study" section:

o Is any methodology taken as a whole suited for analyzing the effectiveness of learning in a hypermedia environment?

o Are there elements of each methodology which may be useful in answering a specific question about learning in a hypermedia environment?

o Which learning outcomes (as identified by Gagné, 1988) do the assessment instruments address most effectively?

o Which features of each assessment instrument make it useful for evaluating a specific learning outcome?
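The cross-reference mentioned above might be kept as a simple two-way mapping between instruments and Gagné's outcome categories. The Python sketch below uses invented instrument names purely for illustration:

    # A sketch of the instrument/outcome cross-reference: map each
    # assessment instrument to the Gagné outcomes it was used to
    # assess, then query in either direction. Names are hypothetical.
    OUTCOMES = ["intellectual skill", "cognitive strategy",
                "verbal information", "motor skill", "attitude"]

    instrument_outcomes = {
        "multiple-choice posttest": {"verbal information",
                                     "intellectual skill"},
        "think-aloud protocol": {"cognitive strategy"},
        "attitude survey": {"attitude"},
    }

    def instruments_for(outcome: str) -> list:
        """Which instruments have addressed a given learning outcome?"""
        return [i for i, outs in instrument_outcomes.items()
                if outcome in outs]

    for outcome in OUTCOMES:
        found = instruments_for(outcome) or ["(none in the information base)"]
        print(f"{outcome}: {', '.join(found)}")

A query of this kind makes gaps visible at once; an outcome with no associated instrument (motor skill, in this invented data) marks exactly the sort of non-traditional area the Note in section 2.1 anticipates.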
It is hoped that the analysis of evaluation methodologies and their relative appropriateness for assessing the effectiveness of hypermedia presented in this study will provide guidance to researchers planning evaluation or pure research related to hypermedia and learning.

References

Bigum, C. J., & Gilding, A. (1985). A video monitoring technique for investigating computer-based learning programs. Computer Education, 9(2), 95-99.

Brandon, P. R. (1988, October). Recent developments in instructional hardware and software. Educational Technology, pp. 7-11.

Clark, R. E. (1986). Evidence for confounding in computer-based instruction studies: Analyzing the meta-analysis. Journal of Educational Communications and Technology, 33(4), 249-262.

Clark, R. E. (1989). Current progress and future directions for research in instructional technology. Educational Technology Research & Development, 37(1), 57-66.

Driscoll, M. P. (1984). Alternative paradigms for research in instructional systems. Journal of Instructional Development, 7(4), 2-5.

Jonassen, D. H., & Grabinger, R. S. (1990). Problems and issues in designing hypertext/hypermedia for learning. In D. H. Jonassen & H. Mandl (Eds.), Designing hypermedia for learning. Germany: Springer-Verlag.

Knirk, F. G., & Christinaz, D. (1990, April). Instructional technology adoption in the best adult training organizations. Paper presented at the annual meeting of the American Educational Research Association, Boston, MA.

Knussen, C., Tanner, G., & Kibby, M. (1991). An approach to the evaluation of hypermedia. Computers and Education, 17(1), 13-24.

Lehigh University College of Education. (1989). Procedures for the matriculation of students in the Ed.D. and Ph.D. programs. Bethlehem, PA.

Marchionini, G. (1988, November). Hypermedia and learning: Freedom and chaos. Educational Technology, pp. 8-12.

Marchionini, G. (1990). Evaluating hypermedia-based learning. In D. H. Jonassen & H. Mandl (Eds.), Designing hypermedia for learning. Germany: Springer-Verlag.

Misanchuk, E. R., & Schwier, R. (1991). Interactive media audit trails: Approaches and issues. In Proceedings of selected research presentations at the annual convention of the Association for Educational Communications and Technology (ED 334 996).

Morariu, J. (1988, November). Hypermedia in instruction and training: The power and the promise. Educational Technology, pp. 17-20.

Nelson, A. (1992). A descriptive exploration of an integrated hypermedia training environment: A dissertation proposal. Unpublished report, Lehigh University, Pennsylvania.

Perry, T. S. (1987, November). Hypermedia: Finally here. IEEE Spectrum, pp. 38-39.

Quentin-Baxter, M., & Dewhurst, D. (1992). A method for evaluating the efficiency of presenting information in a hypermedia environment. Computers and Education, 18(1-3), 179-182.

Roblyer, M. D., Castine, W. H., & King, F. J. (1988). Assessing the impact of computer-based instruction: A review of recent research. New York: Haworth.

Sculley, J. (1988, Spring). The relationship between business and higher education: A perspective on the twenty-first century. EDUCOM Bulletin, pp. 20-24.

Thompson, A. D., Simonson, M. R., & Hargrave, C. P. (1992). Alternative research designs. In A. D. Thompson, M. R. Simonson, & C. P. Hargrave (Eds.), Educational technology: A review of the research (pp. 20-22). Washington, DC: Association for Educational Communications & Technology.
Thompson, A. D., Simonson, M. R., & Hargrave, C. P. (1992). Aptitude treatment interaction studies. In A. D. Thompson, M. R. Simonson, & C. P. Hargrave (Eds.), Educational technology: A review of the research (pp. 18-20). Washington, DC: Association for Educational Communications & Technology.

Thompson, A. D., Simonson, M. R., & Hargrave, C. P. (1992). Research on hypermedia. In A. D. Thompson, M. R. Simonson, & C. P. Hargrave (Eds.), Educational technology: A review of the research (pp. 57-61). Washington, DC: Association for Educational Communications & Technology.

Wager, W., & Gagné, R. M. (1988). Designing computer-aided instruction. In D. H. Jonassen (Ed.), Instructional designs for microcomputer courseware. New Jersey: Lawrence Erlbaum.

Wiersma, W. (1991). Research methods in education (5th ed.). Massachusetts: Allyn & Bacon.