Current best practices for creating effective and palatable eLearning content

Kristin Marroletti

OnCourse Learning

Douglas A. Johnson

Western Michigan University


Abstract

 

Given the rapid growth of technology use in business and industry, there is an increasing need to research best practices for producing online learning materials. For behavior analysts, eLearning is a topic worth researching, as it can potentially merge the promise of Skinner's teaching machines with the growing area of organizational behavior management (OBM). This paper presents an overview of some of the published research that addresses three components of generating palatable and effective eLearning: characteristics of eLearning users, the most critical components of instructional design, and how technology can be used effectively to enhance eLearning. The results show that using technology provides aesthetic benefits through multimedia and visual effects as well as instructional benefits that can maximize learner retention of the course material. Understanding the best practices for creating eLearning allows all of these benefits to be realized and put into action.

Keywords: eLearning, internet, computer-assisted instruction, behavior analysis, behavior-based instructional design

 

Resumen

 

Dado el rápido crecimiento del uso de tecnología en negocios e industrias, hay una creciente necesidad para investigar las mejores prácticas para producir materiales de aprendizaje en línea. Para los analistas de la conducta, eLearning es un tema digno de investigar, ya que potencialmente puede fusionar la promesa de las máquinas de enseñanza de Skinner con el área en crecimiento de la gestión de conducta en organizaciones (OBM). El presente artículo muestra un resumen de algunas de las investigaciones publicadas que describen tres componentes para generar contenido eLearning efectivo y agradable: características de los usuarios de eLearning, los componentes críticos del diseño instruccional y como la tecnología puede ser usada de manera efectiva para mejorar el eLearning. Los resultados muestran que el uso de tecnología proporciona tanto beneficios estéticos a través del uso de multimedia y efectos visuales, como beneficios instruccionales que pueden maximizar la retención del material del curso por parte del aprendiz. Entender las mejores prácticas para crear contenido eLearning permite lograr todos estos beneficios y ponerlos en práctica.

Palabras clave: eLearning, internet, instrucción asistida por computadora, análisis de la conducta, diseño instruccional basado en el comportamiento

 

From the early days of its inception, behavior analysis has been seen as a means of improving the welfare of individuals and promoting societal good (Skinner, 1948; 1953). One way of approaching the objective of improving lives on a mass scale would be to analyze the environmental conditions surrounding activities with a high investment of time. Few activities for youth can rival the number of hours invested in education. For adults, the workplace seems to be the chief comparable investment of time. With regard to education, behavior analysis has produced a rich literature of teaching models and approaches (see Heward et al., 2005 or Moran & Malott, 2004 for examples), dating all the way back to B. F. Skinner's experimentation with teaching machines in the 1950s (Skinner, 1954; 1958). Although Skinner's teaching machines did not have the impact he intended (Skinner, 1963), the potential for automated instruction remains today. With regard to the workplace, behavior analysis has also produced a rich literature in the form of organizational behavior management (OBM) (see C. M. Johnson, Redmon, & Mawhinney, 2001 for examples), which has origins dating back to the early 1960s (Aldis, 1961).

The world of business has many concerns; among them is the need to train people in the best ways possible. Organizations routinely invest considerable amounts of time, money, and other resources into training solutions (D. A. Johnson & Rubin, 2011). Many businesses have turned to eLearning as a method for cutting costs, reaching geographically remote trainees, and ensuring consistency in training. For behavior analysts, eLearning is a relatively unexplored topic worth researching, as it can potentially merge the promise of Skinner's teaching machines with the growing area of OBM. Most current eLearning is not optimally effective because it lacks sound instructional design (Cook-Wallace, 2012). Given the rapid growth of technology use in business and industry, there is an increasing need to research best practices for producing online learning materials.

According to an instructional design perspective rooted in behavior analysis (henceforth referred to as "behavior-based instructional design"), one of the most basic types of learning relations is emotional learning (Tiemann & Markle, 1990). Instructional material, including eLearning, should be structured to evoke approach responses and minimize avoidance responses. It has long been noted, perhaps with some lament, that the adoption of instructional materials can be influenced by factors such as entertainment, attractiveness of visuals, and other superficial considerations, rather than effectiveness with learners (Dutch, 2005; Follett, 1985). However, this could also be viewed as an opportunity to take a program that is already effective in terms of learning and make it effective in terms of adoptability as well. This might be achieved by pairing the instructional material with other stimuli likely to elicit a positive emotional response, which may in turn increase the probability of approach behaviors. A more in-depth analysis of the relevant respondent and operant conditioning processes underlying such complex sources of motivation is beyond the scope of the present paper, but an example in the form of the reading program Headsprout® (Layng, Twyman, & Stikeleather, 2004) may help illustrate the potential benefits.
This reading program was designed with behavior-based instructional design principles and as such, it efficiently brings verbal responses under the appropriate control of visual stimuli. Beyond rigorous and empirically supported instructional sequences is another element in the program worth attending to—the simple fact that Headsprout is fun for children who are learning to read. It utilizes entertaining animations and endearing characters to increase the probability of approach behaviors, which is likely an important factor in its popularity and success. The same lesson may hold true for adult learners in the workplace: eLearning that is attractive and engaging may also be more likely to be a) purchased by the company and b) used by its employees. For these reasons, it is important to maintain attending behaviors, not just initial approach behaviors. This is more likely to be achieved when the common learning histories of employees (i.e., characteristic preferences) are described and then capitalized upon during the design stages of eLearning. Finally, it is critical to utilize appropriate stimulus control to ensure that the ultimate training objectives are achieved in the form of a job-ready repertoire. In other words, a trained employee should come to acquire job-relevant knowledge, skills and abilities that he or she did not demonstrate before training. Ideal eLearning requires a successful balance between learner satisfaction and retention of the course material. That is, the best online learning products will successfully utilize attractive visuals and other multimedia, while also providing effective instruction to maximize learning of important information. The remainder of this paper will present a brief overview of some of the published research that addresses three components of generating palatable and effective eLearning, including:

  • What characteristics of eLearning users are relevant for creating successful eLearning?
  • What are the most critical components of instructional design when creating successful eLearning?
  • How can technology be used effectively to enhance eLearning?

Characteristics of eLearning Users

While there are many theories suggesting that eLearning should be tailored to fit the needs and characteristics of various learners, there is often limited evidence to support such claims. For example, one theory popular amongst traditional educators stresses the importance of learning styles when creating high-quality education. Yet the theory that learning styles significantly influence learning outcomes in eLearning is generally unsupported (Pashler, McDaniel, Rohrer, & Bjork, 2008). Furthermore, preferences regarding individual learning styles seem to have little impact on learning in any medium (Landrum & McDuffie, 2010; Pashler et al., 2008). So long as eLearning is founded upon sound instructional principles, any learner who is intellectually capable and motivated will be able to learn the material (Cook, Gelula, Dupras, & Schwartz, 2007). Despite frequent concerns about how Baby Boomers, Gen Xers, or Millennials learn differently from each other, there is little evidence that generational differences greatly affect learning outcomes in eLearning. While some generational differences exist that may impact workplace behavior, they are not significant enough to merit differential instructional design or learning modalities (Reeves & Oh, 2008). However, gender perceptions do appear to play a limited role in learners' decisions to utilize eLearning: men may be more likely to use eLearning when they perceive it to be useful, whereas women may be more likely to use eLearning when they perceive it to be easy to use (Ong & Lai, 2006).

It is also important to develop learning persistence in eLearning users, as many online learning outcomes rely heavily upon the dedication of the learner. While there is evidence of gender differences regarding the qualities that lead to satisfaction with eLearning, research also indicates that when learners believe that eLearning content is useful and easy to navigate, they are generally more satisfied with the eLearning environment (Calli, Balcikanli, & Calli, 2013). In turn, when learners are highly satisfied with eLearning in general, they are likely to be more persistent within the eLearning environment (Joo, Joung, Kim, & Chung, 2012) and much less likely to drop out of an eLearning course (Park & Choi, 2009). Additionally, learners who are capable of interacting with different types of media are more likely to be successful at learning information from those types of media. In essence, learners are only able to learn eLearning content when they know how to use the eLearning interface (Hirumi, 2002). Therefore, it is crucial that eLearning be easy to use as well as relevant to learners' educational and professional goals.

Instructional Design Pedagogy and Principles

In order to ensure that learners understand and retain important information, several instructional elements must be incorporated into eLearning materials. One of the most critical factors for successful learning is overt responding, in which the learner actively interacts with the eLearning interface in some way (Miller & Malott, 1997). Overt responding can vary drastically in terms of how the learner responds (e.g., answering a multiple choice question, writing an essay, filling in a blank, clicking on a picture), and thus how meaningful the learner's response is. Overt learner responding should be frequent (Kritch & Bostow, 1998) and include consistent and frequent practice of the course material (Martin, Klein, & Sullivan, 2007). Frequent overt responding and learner practice have both been found to improve learner satisfaction as well as retention, that is, the learner's ability to demonstrate skills relating to previously learned material at a later time. It is important to note, however, that the most effective overt responding is that which requires the learner to engage meaningfully with the material. For example, learners who are asked to answer fill-in-the-blank questions tend to have higher learning achievement scores than those who are asked to answer multiple choice questions (D. A. Johnson & Rubin, 2011). Thus, if maximizing skill acquisition and mastery is a priority, overt responding should produce meaningful and purposeful active interactions between the interface and the learner. In the terminology of behavior-based instructional design, this means that learners should be required to engage in meaningful responding (Markle, 1990). Simply pressing a button to advance screens in eLearning may be active, but it is not meaningful, since learners are not demonstrating their mastery of the instructional content (D. A. Johnson & Dickinson, 2012).

One useful method for increasing the effectiveness of overt responding and learner practice is to incorporate branching-type review questions over course material (Green, Eppler, Ironsmith, & Wuensch, 2007). Branching-type review questions involve presenting an initial review question to the learner; subsequent questions are then chosen based on the learner's response. If the learner responds correctly to the initial review question (indicating mastery of the information), a new review question covering different information is presented. If the learner responds incorrectly (indicating that more review is needed), a new review question covering the same information is presented. For example, if a learner is studying single-digit and double-digit addition, he may be presented with a single-digit addition problem as a review question. If he answers this initial question correctly, a double-digit addition problem is presented next; if he answers incorrectly, another single-digit addition problem is presented. This process repeats until the learner has correctly answered both single-digit and double-digit addition questions. Branching-type review questions enable learners to continue practicing material that they have not yet mastered, while quickly moving through material they already understand.
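To make this branching contingency concrete, the sketch below (written in Python) implements the single-digit and double-digit addition example just described. The item pools, the one-correct-answer mastery rule, and the function names are illustrative assumptions for this sketch, not part of any published program.

```python
import random

# A minimal sketch of branching-type review questions: the learner keeps
# practicing the current topic until a correct answer, then branches forward
# to the next topic. Item pools and mastery rule are illustrative assumptions.

SINGLE_DIGIT = [(3, 4), (2, 5), (7, 1)]
DOUBLE_DIGIT = [(12, 34), (25, 48), (60, 17)]

def ask(a, b):
    """Present one addition item and report whether the response was correct."""
    try:
        answer = int(input(f"What is {a} + {b}? "))
    except ValueError:
        return False
    return answer == a + b

def run_branching_review():
    for pool in (SINGLE_DIGIT, DOUBLE_DIGIT):
        mastered = False
        while not mastered:
            a, b = random.choice(pool)   # draw an item from the current topic
            if ask(a, b):
                mastered = True          # correct -> branch to the next topic
            else:
                print("Not quite. Let's try another problem on the same topic.")

if __name__ == "__main__":
    run_branching_review()
```

A full course would use richer item pools and mastery criteria, but the contingency is the same: a correct response branches the learner forward to new material, while an incorrect response recycles another item on the same material.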
As with theories about eLearning users, there are several popular claims about instructional design that are unsupported by empirical evidence in the current literature. For instance, while learners may prefer a learner-controlled eLearning course (i.e., one in which the learner controls the interactions and pace of the course), there are no definitive achievement advantages of such instruction over more traditional program-controlled eLearning courses (i.e., courses in which the pace and interactions are pre-programmed and unchanged by the learner). In fact, learner-controlled courses may not be appropriate for all learners, and creating these types of courses involves far more programming time than program-controlled courses (Schnackenberg & Sullivan, 2000). Other research has shown that the distribution of quiz questions within an eLearning environment has little to no effect on learners' performance or reaction to the course (Whittam, Dwyer, & Leeming, 2004). This indicates that while frequent responding and practice are important for successful learning, it does not matter whether responding occurs as three questions every five frames, 12 questions every 20 frames, or any other combination thereof. So long as the learner is overtly responding to material frequently throughout the eLearning environment, the distribution of responding is not likely to greatly influence learner outcomes.

Recommendations for Implementing eLearning Technology

One of the biggest benefits of eLearning is the use of technology to enhance and support learning. The following recommendations pertain to methods for maximizing the potential of current technology and creating eLearning that generates successful learner outcomes.

Using multimedia in eLearning is an extremely common practice today. However, inserting pictures and videos haphazardly into an eLearning course may not provide any additional information to the learner, and may actually hinder learning if not done correctly. Relying too heavily on multimedia, such as presenting auditory, textual, and visual information simultaneously, may inhibit learning due to what is often termed "increased cognitive load" (Kalyuga, Chandler & Sweller, 1999). Cognitive load theory is often used to refer to limitations of working memory and the use of multiple sensory channels (Hollender, Hofmann, Deneke, & Schmitz, 2010). From a behavioral perspective, such hypothesized events and structures are unnecessary inferences drawn from observed patterns of stimulus control. These observations have shown that a dual-mode presentation of information is much more effective when the two modes present different information rather than identical information (e.g., a learner reading words on a screen while simultaneously listening to a narration of those same words). These lines of research have also shown that when two separate but relevant stimuli are presented, such material should be placed in close physical proximity (e.g., the text explaining a diagram should be located near that diagram; Moreno & Mayer, 1999). Pictures appear to be helpful only when they relate directly to relevant information that the learner needs to know, which indicates that multimedia should be used to supplement information rather than merely to increase learner satisfaction with the eLearning environment (Van Genuchten, Scheiter & Schuler, 2012). Similarly, placing videos in eLearning may sometimes lead to better learning outcomes, but this depends on the reason and method for using them (Zhang, Zhou, Briggs & Nunamaker Jr., 2006). For example, that same study provided evidence that using non-interactive videos, where the learner cannot use control buttons to interact with the video as it plays, did not increase learners' test scores or satisfaction ratings. Additionally, students in the study expressed frustration when they could not browse videos or skip to specific information, and stated that this lack of control made them less likely to try to re-watch a video to review information they did not understand. Therefore, it may not be enough to simply place videos or other media into eLearning in order to produce better learning outcomes.

Animations can also be integrated into eLearning in order to make courses more attractive, grab the attention of the learner, and clarify information in the course (Weiss, Knowlton & Morrison, 2002). In order to use animations effectively, the physical properties of the animations (e.g., texture, color, size) should be consistent throughout the course, and animations should be accompanied by an explanation that is either spoken or written, but not both concurrently. Timing methods within eLearning can also be used to enhance skill acquisition and learner retention.
For example, many learners move too quickly through instructional programs, making frequent mistakes and learning less material as a result of this racing pattern of behavior. Postfeedback delays are one timing method in which learners are temporarily prevented from advancing between interactions for as little as five to ten seconds, thus allowing additional time for exposure to the relevant instructional material and programmed feedback. Even this brief delay can significantly improve learner performance in eLearning courses (D. A. Johnson & Dickinson, 2012; see the sketch at the end of this section). Another timing method relates to quizzing and practice within eLearning. Timed quiz takers often take less time to complete quizzes than untimed quiz takers, and they generally demonstrate greater retention of the course material when assessed at a later time (Brothen & Wambach, 2004).

Finally, color and contrast can be simple but powerful tools for making information quickly accessible and understandable to learners. Understanding learners' reported preferences is an effective way to choose colors and formatting for elements and information in eLearning. For example, in a study by Van Schaik and Ling (2003), learners were more accurate and more satisfied when blue links were used against a white background than when black links were used against a white background. This may be because many websites and online environments use blue links against white backgrounds to attract attention quickly to relevant information; as such, the colors may have come to acquire discriminative control for most learners. Instructional designers of eLearning should take this common learning history into account. Additionally, using high contrast between elements in eLearning helps learners to rapidly identify important information (Lohr, 2000). Signaling, the use of visual cues to highlight relevant material, serves as a visual prompt to learners and has a highly positive effect on retention of important information because it increases the probability of attending behaviors (Jamet, 2014). When prompted by some sort of visual stimulus (e.g., color), learners are likely to orient towards appropriate information rather than distractors. Interestingly, prolonged guidance in the form of these visual prompts can also help learners to predict where relevant information is going to be even before the visual stimulus appears to prompt the learner (Jamet, 2014). For example, if relevant information always appears bolded and green at the top of the page, then the learner will eventually begin to look for the important information at the top of the page automatically, even before seeing bolded, green words on the screen.
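As a concrete illustration of the postfeedback delay described above, the sketch below (again in Python) displays feedback and then withholds advancement until a fixed delay has elapsed. The five-second value and the single practice item are illustrative assumptions; this is a minimal sketch of the contingency, not the procedure used by D. A. Johnson and Dickinson (2012).

```python
import time

# A minimal sketch of a postfeedback delay: after feedback is displayed, the
# learner cannot advance to the next frame until the delay has elapsed.
# The 5-second value and the practice item below are illustrative assumptions.

POSTFEEDBACK_DELAY_S = 5  # within the 5-10 s range discussed in the text

def present_frame(question: str, correct_answer: str) -> None:
    response = input(f"{question} ")
    if response.strip().lower() == correct_answer.lower():
        print("Correct.")
    else:
        print(f"Incorrect. The answer is: {correct_answer}")
    feedback_shown_at = time.monotonic()

    input("Press Enter to continue...")
    # Gate advancement: if the learner tries to move on too quickly, hold the
    # frame until the full postfeedback delay has passed.
    remaining = POSTFEEDBACK_DELAY_S - (time.monotonic() - feedback_shown_at)
    if remaining > 0:
        time.sleep(remaining)

if __name__ == "__main__":
    present_frame("What is 25 + 48?", "73")
    print("Next frame would begin here.")
```

In a real course, the same gating would typically be applied to the interface's navigation controls (for example, disabling the "Next" button until the delay elapses) rather than to a console prompt.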

Conclusion

Although there is still much to discover in this area, using technology provides aesthetic benefits through multimedia and visual effects as well as instructional benefits that can maximize skill acquisition and learner retention of the course material. Understanding the best practices for creating eLearning allows all of these benefits to be realized and put into action. Based on this preliminary literature review, recent research on eLearning suggests several best practices, along with some tentative assertions that remain to be confirmed given the limited availability of empirical demonstrations. Creators of eLearning should focus less on characteristics of learners, such as age and learning styles, and instead put more effort into creating eLearning that is easy to use and relevant to learners' educational and professional goals. Overt responding is critical but not always sufficient for effective eLearning. In order to maximize skill acquisition and retention, interactions between the learner and the eLearning interface should be active, frequent, and meaningful. One method of creating these types of interactions is to use branching-type review questions, which allow learners to practice and review material for which they have not yet demonstrated mastery, while quickly moving through material they already understand. When using media to enhance eLearning material, it is helpful to consider dual-mode presentation of information, in which the two modes present different information rather than simultaneously presenting identical information. Media should be used carefully and only when relevant to the eLearning material being presented. Postfeedback delays and timed quizzes are also helpful tools for successful student outcomes. Finally, purposeful decisions regarding visual stimuli can help learners efficiently identify relevant information. Using high contrast between elements in eLearning and using consistent visual prompts allow learners to attend to important information and filter out other distracting stimuli.

Given the popularity and importance of eLearning solutions, behavior analysts should consider becoming more involved in this research to create the next standard for best practices. Much of the current research in this area is guided by inferred hypothetical processes and structures. Explanatory variables such as "mental schemas" and "cognitive load" are commonly cited in the eLearning literature, yet many of these variables simply refer to observed limitations involving controlling stimuli and the learning process. The problem lies in the standard cognitive explanations for the data, which include metaphorical inferences that do not help pinpoint the relevant environmental variables. These metaphors and inferences often drive investigators away from observable processes and environmental manipulations with practical value (Skinner, 1950). Instead of inferring models, behavior analysts can propel this research by investigating and describing behavior-environment relations in concrete and practical terms. The business world is driven by the necessity to achieve results, not inferred structures or processes. As such, behavior analysts may be the ideal researchers to meet this need in eLearning and beyond.

References

  • Aldis, O. (1961). Of pigeons and men. Harvard Business Review, 39, 59-63.
  • Bernard, M., & Lida, B. (2002). A comparison of popular online fonts: Which size and type is best? Usability News, 4(1).
  • Blummer, B. A., & Kritskya, O. (2009). Best practices for creating an online tutorial: A literature review. Journal of Web Librarianship, 3, 199-216.
  • Brothen, T., & Wambach, C. (2004). The value of time limits on internet quizzes. Teaching of Psychology, 31, 62-64.
  • Brown, B. L. (2000). Web-based training. Report No. EDO-CE-00-218. Washington, DC: Office of Educational Research and Improvements.
  • Calli, L., Balcikanli, C., & Calli, F. (2013). Identifying factors that contribute to the satisfaction of students in e-learning. Turkish Online Journal of Distance Education, 14, 85-101.
  • Cook, D. A., & Dupras, D. M. (2004). A practical guide to developing effective web-based learning. Journal of General Internal Medicine, 19, 698-707. doi:10.1111/j.1525-1497.2004.30029.x
  • Cook, D. A., Gelula, M. H., Dupras, D. M., & Schwartz, A. (2007). Instructional methods and cognitive and learning styles in web-based learning: Report of two randomised trials. Medical Education, 41, 897-905. doi:10.1111/j.1365-2923.2007.02822.x
  • Cook-Wallace, M. K. (2012). Testing the significance of core components of online education. The Business Review, 19, 64-70.
  • Dikshit, J., Garg, S., & Panda, S. (2013). Pedagogic effectiveness of print, interactive multimedia, and online resources: A case study of IGNOU. International Journal of Instruction, 6, 193-210.
  • Dominguez, A., Saenz-de-Navarrete, J., de-Marcos, L., Fernandez-Sanz, L., Pages, C., & Martinez-Herraiz, J. (2013). Gamifying learning experiences: Practical implications and outcomes. Computers and Education, 63, 380-392. doi:10.1016/j.compedu.2012.12.020
  • Dutch, S. I. (2005). Why textbooks are the way they are. Academic Questions, 18, 34-48.
  • Ellis, R.A., Ginns, P., & Piggott, L. (2009). E-learning in higher education: Some key aspects and their relationship to approaches to study. Higher Education Research & Development, 28, 303-318. doi:10.1080/07294360902839909
  • Follett, R. (1985). The school textbook adoption process. Publishing Research Quarterly, 1, 19-23.
  • Fox, E. J., & Sullivan, H. J. (2007). Comparing strategies for teaching abstract concepts in an online tutorial. Journal of Educational Computing Research, 37(3), 307-330. doi:10.2190/EC.37.3.e
  • Green, R. S., Eppler, M. A., Ironsmith, M., & Wuensch, K. I. (2007). Review question formats and web design usability in computer-assisted instruction. British Journal of Educational Technology, 38(4), 679-686. doi:10.1111/j.1467-8535.2006.00649.x
  • Gunawardena, C. N., Linder-VanBerschot, LaPointe, D. K., & Rao, L. (2010). Predictors of learner satisfaction and transfer of learning in a corporate online education program. American Journal of Distance Education, 24, 207-226. doi:10.1080/08923647.2010.522919
  • Heward, W. L., Heron, T. E., Neef, N. A., Peterson, S. M., Sainato, D. M., Cartledge, G., …Dardig, J. C. (Eds.). (2005). Focus on behavior analysis in education: Achievements, challenges, and opportunities. Upper Saddle River, NJ: Pearson Education, Inc.
  • Hirumi, A. (2002). A framework for analyzing, designing, and sequencing planned elearning interactions. The Quarterly Review of Distance Education, 3, 141-160.
  • Hollender, N., Hofmann, C., Deneke, M., & Schmitz, B. (2010). Integrating cognitive load theory and concepts of human-computer interaction. Computers in Human Behavior, 26, 1278-1288. doi:10.1016/j.chb.2010.05.031
  • Jamet, E. (2014). An eye-tracking study of cueing effects in multimedia learning. Computers in Human Behavior, 32, 47-53. doi:10.1016/j.chb.2013.11.013
  • Janicki, T., & Liegle, J. O. (2001). Development and evaluation of a framework for creating web-based learning modules: A pedagogical and systems perspective. Journal of Asynchronous Learning Networks, 5, 58-84.
  • Johnson, C. M., Redmon, W. K., & Mawhinney, T. C. (Eds.). (2001). Handbook of organizational performance: Behavior analysis and management. Binghamton, NY: Haworth Press, Inc.
  • Johnson, D. A., & Dickinson, A. M. (2012). Using postfeedback delays to improve retention of computer-based instruction. The Psychological Record, 62, 485-496.
  • Johnson, D. A., & Rubin, S. (2011). Effectiveness of interactive computer-based instruction: A review of studies published between 1995 and 2007. Journal of Organizational Behavior Management, 31, 55-94. doi:10.1080/01608061.2010.541821
  • Joo, Y. J., Joung, S., Kim, N. H., & Chung, H. M. (2012). Factors impacting corporate e-learners’ flow, satisfaction, and learning persistence. International Association for Development of the Information Society, Madrid, Spain.
  • Kalyuga, S., Chandler, P., & Sweller, J. (1999). Managing split-attention and redundancy in multimedia instruction. Applied Cognitive Psychology, 13, 351-371. doi:10.1002/(SICI)1099-0720(199908)13:4<351::AID-ACP589>3.0.CO;2-6
  • Ke, C., Sun, H., & Yang, Y. (2012). Effects of user and system characteristics on perceived usefulness and perceived ease of use for the web-based classroom response system. Turkish Online Journal of Distance Education, 11, 128-143.
  • Kritch, K. M., & Bostow, D. E. (1998). Degree of constructed-response interaction in computer-based programmed instruction. Journal of Applied Behavior Analysis, 31(3), 387-398. doi:10.1901/jaba.1998.31-387
  • Landrum, T. J., & McDuffie, K. A. (2010). Learning styles in the age of differentiated instruction. Exceptionality, 18, 6-17. doi:10.1080/09362830903462441
  • Layng, T. V. J., Twyman, J. S., & Stikeleather, G. (2004). Selected for success: How Headsprout Reading Basics teaches children to read. In D. J. Moran & R. W. Malott (Eds.), Evidence-based educational methods (pp. 171-197). St. Louis, MO: Elsevier/Academic Press.
  • Lohr, L. L. (2000). Designing the instructional interface. Computers in Human Behavior, 16, 161-182. doi:10.1016/S0747-5632(99)00057-6
  • Marchese, T. (2000). Learning and e-learning. Change: The Magazine of Higher Learning, 32, 4-4.
  • Markle, S. M. (1990). Designs for instructional designers. Champaign, IL: Stipes Publishing Company.
  • Martin, F., Klein, J. D., & Sullivan, H. (2007). The impact of instructional elements in computer-based instruction. British Journal of Educational Technology, 38(4), 623-636. doi:10.1111/j.1467-8535.2006.00670.x
  • Miller, M. L., & Malott, R. W. (1997). The importance of overt responding in programmed instruction even with added incentives for learning. Journal of Behavioral Education, 7, 497-503. doi:10.1023/A:1022811503326
  • Moran, D. J., & Malott, R. W. (Eds.). (2004). Evidence-based educational methods. San Diego, CA: Elsevier Academic Press.
  • Moreno, R., & Mayer, R. E. (1999). Cognitive principles of multimedia learning: The role of modality and contiguity. Journal of Educational Psychology, 91, 358-368. doi:10.1037/0022-0663.91.2.358
  • Ong, C., & Lai, J. (2006). Gender differences in perceptions and relationships among dominants of e-learning acceptance. Computers in Human Behavior, 22, 816-829. doi:10.1016/j.chb.2004.03.006
  • Park, J., & Choi, H. J. (2009). Factors influencing adult learners’ decision to drop out or persist in online learning. Educational Technology & Society, 12, 207-217.
  • Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9, 105-119. doi:10.1111/j.1539-6053.2009.01038.x
  • Reeves, T. C., & Oh, E. J. (2008). Do generational differences matter in instructional design? Instructional Technology Forum, University of Georgia. Accessed March 17, 2014.
  • Schlinger, H. D. (1995). A behavior analytic view of child development. New York, NY: Plenum Press.
  • Schnackenberg, H. L., & Sullivan, H. J. (2000). Learner control over full and lean computer-based instruction under differing ability levels. Educational Technology Research & Development, 48(2), 19-35. doi:10.1007/BF02313399
  • Shute, V., & Towle, B. (2003). Adaptive e-learning. Educational Psychologist, 38, 105-114. doi:10.1207/S15326985EP3802_5
  • Skinner, B. F. (1948). Walden Two. Englewood Cliffs, NJ: Prentice-Hall, Inc.
  • Skinner, B. F. (1950). Are theories of learning necessary? Psychological Review, 57, 193-216. doi:10.1037/h0054367
  • Skinner, B. F. (1953). Science and human behavior. New York, NY: The Free Press.
  • Skinner, B. F. (1954). The science of learning and the art of teaching. Harvard Educational Review, 24, 86-97.
  • Skinner, B. F. (1958). Teaching machines. Science, 128, 969-977. doi:10.1126/science.128.3330.969
  • Skinner, B.F. (1963). Reflections on a decade of teaching machines. Teachers College Record, 65, 168-177.
  • Tiemann, P. W., & Markle, S. M. (1990). Analyzing instructional content: A guide to instruction and evaluation. Champaign, IL: Stipes Publishing Company.
  • Van Genuchten, E., Scheiter, K., & Schuler, A. (2012). Examining learning from text and pictures for different task types: Does the multimedia effect differ for conceptual, causal, and procedural tasks? Computers in Human Behavior, 28, 2209-2218. doi:10.1016/j.chb.2012.06.028
  • Van Schaik, P., & Ling, J. (2003). The effect of link colour on information retrieval in educational intranet use. Computers in Human Behavior, 19, 553-564. doi:10.1016/S0747-5632(03)00004-9
  • Weiss, R. E., Knowlton, D. S., & Morrison, G. R. (2002). Principles for using animation in computer-based instruction: Theoretical heuristics for effective design. Computers in Human Behavior, 18, 465-477. doi:10.1016/S0747-5632(01)00049-8
  • Whittam, K. P., Dwyer, W. O., & Leeming, F. C. (2004). Effects of quiz distribution on web-based instruction in an industrial training environment. Journal of Educational Computing Research, 30, 57-68. doi:10.2190/N88A-4CDW-J25G-Y6Y7
  • Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker Jr., J. F. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information and Management, 43, 15-27. doi:10.1016/j.im.2005.01.004
