Evidence-Based Best Practices for Interactive and Motivational Online Learning Environments

Evidence-Based Best Practices for Interactive and
Motivational Online Learning Environments
  • Dr. Curtis J. Bonk
  • Associate Professor, Indiana University
  • President, CourseShare.com
  • http://php.indiana.edu/cjbonk
  • cjbonk@indiana.edu

Are you ready???
Brains Before and After E-learning
And when to use synchronous and asynchronous tools?
Tons of Recent Research
  • Not much of it
  • ...is any good...

Problems and Solutions (Bonk, Wisher, & Lee, in review)
Problems:
  1. Tasks Overwhelm
  2. Confused on Web
  3. Too Nice Due to Limited Shared History
  4. Lack Justification
  5. Hard not to preach
  6. Too much data
  7. Communities not easy to form
Solutions:
  • Train and be clear
  • Structure time/dates due
  • Develop roles and controversies
  • Train to back up claims
  • Students take lead role
  • Use Email Pals
  • Embed Informal/Social

Benefits and Implications (Bonk, Wisher, & Lee, in review)
Benefits:
  1. Shy open up online
  2. Minimal off task
  3. Delayed collaboration richer than real time
  4. Students can generate lots of info
  5. Minimal disruptions
  6. Extensive E-Advice
  7. Excited to Publish
Implications:
  • Use async conferencing
  • Create social tasks
  • Use Async for debates; Sync for help, office hours
  • Structure generation and force reflection/comment
  • Foster debates/critique
  • Find Experts or Practitioners
  • Ask Permission

Basic Distance Learning Finding?
  • Research since 1928 shows that DL students
    perform as well as their counterparts in a
    traditional classroom setting.
  • Per Russell, 1999, The No Significant Difference
    Phenomenon (5th Edition), NCSU, based on 355
    research reports.
  • http://cuda.teleeducation.nb.ca/nosignificantdiffe

Online Learning Research Problems (National
Center for Education Statistics, 1999; Phipps &
Merisotis, 1999; Wisher et al., 1999)
  • Anecdotal evidence; minimal theory.
  • Questionable validity of tests.
  • Lack of control group.
  • Hard to compare given different assessment tools
    and domains.
  • Fails to explain why the drop-out rates of
    distance learners are higher.
  • Does not relate learning styles to different
    technologies or focus on interaction of multiple

Online Learning Research Problems (Bonk & Wisher, in review)
  • For different purposes or domains in our study,
    13% concern training, 87% education
  • Flaws in research designs
  • - Only 36% have objective learning measures
  • - Only 45% have comparison groups
  • When effective, it is difficult to know why
  • - Course design?
  • - Instructional methods?
  • - Technology?

Evaluating Web-Based Instruction Methods and
Findings (41 studies) (Olson & Wisher, in review)
  • there is little consensus as to what variables
    should be examined and what measures of
    learning are most appropriate, making comparisons
    between studies difficult and inconclusive.
  • e.g., demographics (age, gender), previous
    experience, course design, instructor
    effectiveness, technical issues, levels of
    participation and collaboration, recommendation
    of course, desire to take addl online courses.

Evaluating Web-Based Instruction Methods and
Findings (Olson & Wisher, in review)
  • Variables Studied
  • Type of Course: Graduate (18%) vs. undergraduate
    courses (81%)
  • Level of Web Use: All-online (64%) vs.
    blended/mixed courses (34%)
  • Content area (e.g., math/engineering (27%),
    science/medicine (24%), distance ed (15%), social
    science/educ (12%), business (10%), etc.)
  • Attrition data (34%)
  • Comparison Group (59%)

Different Goals
  • Making connections
  • Appreciating different perspectives
  • Students as teachers
  • Greater depth of discussion
  • Fostering critical thinking online
  • Interactivity online

Electronic Conferencing Quantitative Analyses
  • Usage patterns, # of messages, cases, responses
  • Length of case, thread, response
  • Average number of responses
  • Timing of cases, commenting, responses, etc.
  • Types of interactions (1:1, 1:many)
  • Data mining (logins, peak usage, location,
    session length, paths taken, messages/day/week),
    Time-Series Analyses (trends)
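Several of these counts can be computed directly from raw conference logs. A minimal sketch in Python, assuming a hypothetical log of (author, post date, thread id) records; the field layout is illustrative, not any particular conferencing system's export format:

```python
from collections import Counter
from datetime import date

# Hypothetical conference log: (author, post_date, thread_id)
log = [
    ("ann", date(2002, 1, 7), "t1"),
    ("bob", date(2002, 1, 7), "t1"),
    ("ann", date(2002, 1, 8), "t2"),
    ("cat", date(2002, 1, 9), "t1"),
    ("bob", date(2002, 1, 9), "t2"),
]

def messages_per_day(log):
    """Usage pattern: number of messages posted on each day."""
    return Counter(post_date for _, post_date, _ in log)

def avg_responses_per_thread(log):
    """Average number of responses (posts after the starter) per thread."""
    threads = Counter(thread_id for _, _, thread_id in log)
    return sum(n - 1 for n in threads.values()) / len(threads)

print(messages_per_day(log))          # two messages were posted on Jan. 7
print(avg_responses_per_thread(log))  # t1 has 2 responses, t2 has 1 -> 1.5
```

The same grouping pattern extends to logins, session lengths, or peak-usage hours by changing the key extracted from each record.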

Electronic Conferencing Qualitative Analyses
  • General: Observation Logs, Reflective Interviews,
    Retrospective Analyses, Focus Groups
  • Specific: Semantic Trace Analyses, Talk/Dialogue
    Categories (Content talk, questioning, peer
    feedback, social acknowledgments, off task)
  • Emergent: Forms of Learning Assistance, Levels of
    Questioning, Degree of Perspective Taking, Case
    Quality, Participant Categories

Overall frequency of interactions across chat
categories (6,601 chats).
Research on Instructors Online
  • If teacher-centered, less explore, engage,
    interact (Peck & Laycock, 1992)
  • Informal, exploratory conversation fosters
    risk-taking & knowledge sharing (Weedman, 1999)
  • Four Key Acts of Instructors:
  • pedagogical, managerial, technical, social
  • (Ashton, Roberts, & Teles, 1999)
  • Instructors Tend to Rely on Simple Tools
  • (Peffers & Bloom, 1999)
  • Job Varies: Plan, Interaction, Admin, Teaching
  • (McIsaac, Blocher, Mahes, & Vrasidas, 1999)

Study of Four Classes (Bonk, Kirkley, Hara, &
Dennen, 2001)
  • Technical: Train, early tasks, be flexible,
    orientation task
  • Managerial: Initial meeting, FAQs, detailed
    syllabus, calendar, post administrivia, assign
    e-mail pals, gradebooks, email updates
  • Pedagogical: Peer feedback, debates, PBL, cases,
    structured controversy, field reflections,
    portfolios, teams, inquiry
  • Social: Café, humor, interactivity, profiles,
    foreign guests, digital pics, conversations
Network Conferencing Interactivity (Rafaeli &
Sudweeks, 1997)
  • 1. > 50 percent of messages were reactive.
  • 2. Only around 10 percent were truly interactive.
  • 3. Most messages were factual statements or opinions.
  • 4. Many also contained questions or requests.
  • 5. Frequent participators were more reactive than low ones.
  • 6. Interactive messages had more opinions & humor.
  • 7. More self-disclosure, involvement.
  • 8. Attracted to fun, open, frank, helpful,
    supportive environments.

Week 4: Scattered Interaction (no starter)
Collaborative Behaviors (Curtis & Lawson, 1997)
  • Most common were (1) Planning, (2) Contributing,
    and (3) Seeking Input.
  • Other common events were
  • (4) Initiating activities,
  • (5) Providing feedback,
  • (6) Sharing knowledge
  • Few students challenge others or attempt to
    explain or elaborate
  • Recommend using debates and modeling appropriate
    ways to challenge others

Online Collaboration Behaviors by Categories (US
and Finland)
Behavior Category       Finland (%)   U.S. (%)   Average (%)
Planning                     0.0          0.0         0.0
Contributing                80.8         76.6        78.7
Seeking Input               12.7         21.0        16.8
Reflection/Monitoring        6.1          2.2         4.2
Social Interaction           0.4          0.2         0.3
Total                      100.0        100.0       100.0
Dimensions of Learning Process (Henri, 1992)
  • 1. Participation (rate, timing, duration of
    messages)
  • 2. Interactivity (explicit interaction, implicit
    interaction, independent comment)
  • 3. Social Events (statements unrelated to content)
  • 4. Cognitive Events (e.g., clarifications,
    inferencing, judgment, and strategies)
  • 5. Metacognitive Events (e.g., both metacognitive
    knowledge (person, task, and strategy) as well
    as metacognitive skill (evaluation, planning,
    regulation, and self-awareness))

Some Findings (see Hara, Bonk, & Angeli, 2000)
  • Social (in 26.7% of units coded)
  • social cues decreased as semester progressed
  • messages gradually became less formal
  • became more embedded within statements
  • Cognitive (in 81.7% of units)
  • More inferences & judgments than elementary
    clarifications and in-depth clarifications
  • Metacognitive (in 56% of units)
  • More reflections on experience & self-awareness
  • Some planning, evaluation, regulation & self-questioning

Surface vs. Deep Posts (Henri, 1992)
  • Surface Processing
  • making judgments without justification,
  • stating that one shares ideas or opinions already stated
  • repeating what has been said
  • asking irrelevant questions
  • i.e., fragmented, narrow, and somewhat trite.
  • In-depth Processing
  • linked facts and ideas,
  • offered new elements of information,
  • discussed advantages and disadvantages of a
  • made judgments that were supported by examples
    and/or justification.
  • i.e., more integrated, weighty, and refreshing.

Critical Thinking (Newman, Johnson, Webb, &
Cochrane, 1997)
  • Used Garrison's five-stage critical thinking model
  • Critical thinking in both CMC and FTF environments
  • Depth of critical thinking higher in CMC environments
  • More likely to bring in outside information
  • Link ideas and offer interpretations,
  • Generate important ideas and solutions.
  • FTF settings were better for generating new ideas
    and creatively exploring problems.

Unjustified Statements (US)
  • 24. Author: Katherine
  • Date: Apr. 27, 3:12 AM, 1998
  • I agree with you that technology is definitely
    taking a large part in the classroom and will
    more so in the future
  • 25. Author: Jason; Date: Apr. 28, 1:47 PM, 1998
  • I feel technology will never over take the role
    of the teacher...I feel however, this is just
    help us teachers...
  • 26. Author: Daniel; Date: Apr. 30, 0:11 AM, 1998
  • I believe that the role of the teacher is being
    changed by computers, but the computer will never
    totally replace the teacher... I believe that the
    computers will eventually make teaching easier
    for us and that most of the children's work will
    be done on computers. But I believe that there

Indicators for the Quality of Students'
Dialogue (Angeli, Valanides, & Bonk, in review)
Social Construction of Knowledge (Gunawardena,
Lowe, & Anderson, 1997)
  • Five Stage Model
  • 1. Share ideas,
  • 2. Discovery of Idea Inconsistencies,
  • 3. Negotiate Meaning/Areas Agree,
  • 4. Test and Modify,
  • 5. Phrase Agreements
  • In global debate, very task driven.
  • Dialogue remained at Phase I (sharing info)

Social Constructivism and Learning Communities
Online (SCALCO) Scale (Bonk & Wisher, 2000)
  • ___ 1. The topics discussed online had real-world
    relevance.
  • ___ 2. The online environment encouraged me to
    question ideas and perspectives.
  • ___ 3. I received useful feedback and mentoring
    from others.
  • ___ 4. There was a sense of membership in the
    learning community here.
  • ___ 5. Instructors provided useful advice and
    feedback online.
  • ___ 6. I had some personal control over course
    activities and discussion.

Kirkpatrick's 4 Levels
  • Reaction
  • Learning
  • Behavior
  • Results

My Evaluation Plan
Measures of Student Success (Focus groups,
interviews, observations, surveys, exams, records)
  • Positive Feedback, Recommendations
  • Increased Comprehension, Achievement
  • High Retention in Program
  • Completion Rates or Course Attrition
  • Jobs Obtained, Internships
  • Enrollment Trends for Next Semester

1. Student Basic Quantitative
  • Grades, Achievement
  • Number of Posts
  • Participation
  • Computer Log Activity: peak usage, messages/day,
    time on task or in system
  • Attitude Surveys

1. Student High-End Success
  • Message complexity, depth, interactivity, questioning
  • Collaboration skills
  • Problem finding/solving and critical thinking
  • Challenging and debating others
  • Case-based reasoning, critical thinking measures
  • Portfolios, performances, PBL activities

2. Instructor Success
  • High student evals & more signing up
  • High student completion rates
  • Utilize Web to share teaching
  • Course recognized in tenure decisions
  • Varies online feedback and assistance techniques

3. Training: Outside Support
  • Training (FacultyTraining.net)
  • Courses & Certificates (JIU, e-education)
  • Reports, Newsletters, Pubs
  • Aggregators of Info (CourseShare, Merlot)
  • Global Forums (FacultyOnline.com, GEN)
  • Resources, Guides/Tips, Link Collections, Online
    Journals, Library Resources

3. Training: Inside Support
  • Instructional Consulting
  • Mentoring (strategic planning)
  • Small Pots of Funding
  • Facilities
  • Summer and Year Round Workshops
  • Office of Distributed Learning
  • Colloquiums, Tech Showcases, Guest Speakers
  • Newsletters, guides, active learning grants,
    annual reports, faculty development, brown bags

RIDIC5-ULO3US Model of Technology Use
  • 4. Tasks (RIDIC)
  • Relevance
  • Individualization
  • Depth of Discussion
  • Interactivity
  • Collaboration-Control-Choice-Constructivistic-Comm

RIDIC5-ULO3US Model of Technology Use
  • 5. Tech Tools (ULOUS)
  • Utility/Usable
  • Learner-Centeredness
  • Opportunities with Outsiders Online
  • Ultra Friendly
  • Supportive

6. Course Success
  • Few technological glitches/bugs
  • Adequate online support
  • Increasing enrollment trends
  • Course quality (interactivity rating)
  • Monies paid
  • Accepted by other programs

7. Online Program or Course Budget (i.e., how
pay, how large is course, tech fees charged, # of
courses, tuition rate, etc.)
  • Indirect Costs: learner disk space, phone,
    accreditation, integration with existing
    technology, library resources, on-site
    orientation & tech training, faculty training,
    office space
  • Direct Costs: courseware, instructor, help desk,
    books, seat time, bandwidth and data
    communications, server, server back-up, course
    developers, postage

8. Institutional Success
  • E-Enrollments from new students, alumni,
    existing students
  • Additional grants
  • Press, publication, partners, attention
  • Orientations, training, support materials
  • Faculty attitudes
  • Acceptable policies (ADA compliant)

But how to determine the pedagogical quality of
courses and course materials you develop?
Just a Lot of Bonk
  • Variety: tasks, topics, participants,
    accomplishments, etc.
  • Interaction extends beyond class
  • Learners are also teachers
  • Multiple ways to succeed
  • Personalization and choice
  • Clarity and an easy-to-navigate course

Wisher's Wish List
  • Effect size of .5 or higher in comparison to
    traditional classroom instruction.
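For reference, the effect size meant here is the standardized mean difference (Cohen's d): the difference between group means divided by the pooled standard deviation, so d = .5 means the online group outperforms the classroom group by half a standard deviation. A minimal sketch with fabricated scores (not data from any study cited above):

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Cohen's d: standardized mean difference using the pooled SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative (fabricated) exam scores
online = [78, 82, 85, 88, 90]
classroom = [70, 75, 78, 80, 84]
print(round(cohens_d(online, classroom), 2))  # 1.43 for these made-up scores
```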

Quality on the Line: Benchmarks for Success in
Internet-Based Distance Ed (Blackboard & NEA)
  • Teaching/Learning Process
  • Student interaction with faculty is facilitated
    through a variety of ways.
  • Feedback to student assignments and questions is
    provided in a timely manner.
  • Each module requires students to engage
    themselves in analysis, synthesis, and evaluation
    as part of their course assignments.
  • Course materials promote collaboration among
    students.
  • http://www.ihep.com/Pubs/PDF/Quality.pdf

Quality on the Line: Benchmarks for Success in
Internet-Based Distance Ed (Blackboard & NEA)
  • Other Benchmark Categories
  • Institutional Support: incentives, rewards, plans
  • Course Development: processes, guidelines, teams,
    structures, standards, learning styles
  • Course Structure: expectations, resources
  • Student Support: training, assistance, info
  • Faculty Support: mentoring, tech support
  • Evaluation and Assessment: review process,
    multiple methods, specific standards

The Sharp Edge of the Cube: Pedagogically Driven
Instructional Design for Online Education,
Syllabus Magazine, Dec. 2001, Nishikant Sonwalkar
  • five functional learning styles: apprenticeship,
    incidental, inductive, deductive, discovery
  • http://www.syllabus.com/syllabusmagazine/article.a

New Methodology for Evaluation: The Pedagogical
Rating of Online Courses, Syllabus Magazine, Jan.
2002, Nishikant Sonwalkar
  • The Pedagogical Effectiveness Index
  • (1) Learning Styles (see previous page)
  • (2) Media Elements: text, graphics, audio, video,
    animation, simulation
  • (3) Interaction Elements: feedback, revision,
    e-mail, discussion, bulletin
  • http://www.syllabus.com/syllabusmagazine/article.
  • For more info, e-mail Nish@mit.edu

New Methodology for Evaluation: The Pedagogical
Rating of Online Courses, Syllabus Magazine, Jan.
2002, Nishikant Sonwalkar
  • Summative evaluation instrument for rating online courses
  • (1) Content Factors: quality, media, authentic
  • (2) Learning Factors: interactivity, testing
    feedback, collaboration, ped styles
  • (3) Delivery Support Factors: accessible,
    reporting, user management, content
  • (4) Usability Factors: clarity, chunk size
  • (5) Technological Factors: bandwidth, database
    connectivity, server capacity, browser

Best of Online Pedagogical Strategies
Changing Role of the Teacher, The Online Teacher,
TAFE, Guy Kemshal-Bell (April, 2001)
  • From oracle to guide and resource provider
  • From providers of answers to expert questioners
  • From solitary teacher to member of team
  • From total control of teaching environment to
    sharing as a fellow student
  • From provider of content to designer of learning

Dennen's Research on Nine Online
Courses (sociology, history, communications,
writing, library science, technology, counseling)
Good Instructors:
  • Provided regular qual/quant feedback
  • Participated as peer
  • Allowed perspective sharing
  • Tied discussion to grades, other assessments
  • Used incremental deadlines
Poor Instructors:
  • Little or no feedback given
  • Always authoritative
  • Kept narrow focus of what was relevant
  • Created tangential discussions
  • Only used ultimate deadlines

What do we need???

Reflect on Extent of Integration: The Web
Integration Continuum (Bonk et al., 2001)
  • Level 1: Course Marketing/Syllabi via the Web
  • Level 2: Web Resource for Student Exploration
  • Level 3: Publish Student-Generated Web Resources
  • Level 4: Course Resources on the Web
  • Level 5: Repurpose Web Resources for Others
  • Level 6: Web Component is Substantive & Graded
  • Level 7: Graded Activities Extend Beyond Class
  • Level 8: Entire Web Course for Resident Students
  • Level 9: Entire Web Course for Offsite Students
  • Level 10: Course within Programmatic Initiative

2. Reflect on Interactions: Matrix of Web
Interactions (Cummings, Bonk, & Jacobs, 2002)
  • Instructor to Student: syllabus, notes, feedback
  • to Instructor: Course resources, syllabi
  • to Practitioner: Tutorials, articles
  • Student to Student: Intros, sample work, debates
  • to Instructor: Voting, tests, papers
  • to Practitioner: Web links, resumes
  • Practitioner to Student: Internships, jobs
  • to Instructor: Opinion surveys, feedback
  • to Practitioner: Forums, listservs

But there is a Problem
How Bad Is It?
  • Some frustrated Blackboard users who say the
    company is too slow in responding to technical
    problems with its course-management software have
    formed an independent users group to help one
    another and to press the company to improve.
  • (Jeffrey Young, Nov. 2, 2001, Chronicle of Higher
    Education)
Must Online Learning be Boring?
What Motivates Adult Learners to Participate?
Motivational Terms? See Johnmarshall Reeve
(1996). Motivating Others: Nurturing Inner
Motivational Resources. Boston: Allyn & Bacon.
  1. Tone/Climate: Psych Safety, Comfort, Belonging
  2. Feedback: Responsive, Supports, Encouragement
  3. Engagement: Effort, Involvement, Excitement
  4. Meaningfulness: Interesting, Relevant, Authentic
  5. Choice: Flexibility, Opportunities, Autonomy
  6. Variety: Novelty, Intrigue, Unknowns
  7. Curiosity: Fun, Fantasy, Control
  8. Tension: Challenge, Dissonance, Controversy
  9. Interactive: Collaborative, Team-Based, Community
  10. Goal Driven: Product-Based, Success, Ownership

1. Tone/Climate: Ice Breakers
  • Eight Nouns Activity:
  • 1. Introduce self using 8 nouns
  • 2. Explain why you chose each noun
  • 3. Comment on 1-2 peer postings
  • Coffee House Expectations:
  • 1. Have everyone post 2-3 course expectations
  • 2. Instructor summarizes and comments on how they
    might be met
  • (or make public commitments of how they will fit
    into busy schedules!)

2. Feedback: Requiring Peer Feedback
  • Alternatives:
  • 1. Require a minimum number of peer comments and give
    guidance (e.g., what they should do)
  • 2. Peer Feedback Through Templates: give templates
    to complete peer evaluations.
  • 3. Have e-papers contest(s)

3. Engagement: Electronic Voting and Polling
  • 1. Ask students to vote on issue before class
    (anonymously or send directly to the instructor)
  • 2. Instructor pulls out minority point of view
  • 3. Discuss with majority point of view
  • 4. Repoll students after class
  • (Note: Delphi or Timed Disclosure Technique:
    anonymous input till a due date,
    and then post results and
    reconsider until consensus;
    Rick Kulp, IBM, 1999)

4. Meaningfulness: Job or Field Reflections
  • 1. Field Definition Activity: Have students
    interview (via e-mail, if necessary) someone
    working in the field of study and share their
    findings.
  • As a class, pool interview results and develop a
    group description of what it means to be a
    professional in the field

5. Choice: Discussion Starter-Wrapper
  • Starter reads ahead and starts discussion and
    others participate and wrapper summarizes what
    was discussed.
  • Starter-wrapper with roles: same as 1 but include
    roles for debate (optimist, pessimist, devil's
    advocate)
  • Alternative Facilitator-Starter-Wrapper: Instead
    of starting discussion, student acts as moderator
    or questioner to push student thinking and give

6. Variety: Just-In-Time-Teaching
  • Gregor Novak, IUPUI Physics Professor (teaches
    teamwork, collaboration, and effective
  • Lectures are built around student answers to
    short quizzes that have an electronic due date
    just hours before class.
  • Instructor reads and summarizes responses before
    class and weaves them into discussion and changes
    the lecture as appropriate.

7. Curiosity: Electronic Seance
  • Students read books from famous dead people
  • Convene when dark (sync or asynchronous).
  • Present a present-day problem for them to solve
  • Participate from within those characters (e.g.,
    read direct quotes from books or articles)
  • Invite expert guests from other campuses
  • Keep chat open for set time period
  • Debrief

8. Tension: Role Play
  • A. Role Play Personalities:
  • List possible roles or personalities (e.g.,
    coach, optimist, devil's advocate, etc.)
  • Sign up for different role every week (or 5-6 key
  • Perform within roles; refer to different
  • B. Assume Persona of Scholar:
  • Enroll famous people in your course
  • Students assume voice of that person for one or
    more sessions
  • Enter debate topic, respond to debate topic, or
    respond to reading reflections

9. Interactive: Critical/Constructive Friends,
Email Pals, Web Buddies
  1. Assign a critical friend (perhaps based on
  2. Post weekly updates of projects, send reminders
    of due dates, help where needed.
  3. Provide criticism to peer (i.e., what is strong
    and weak, what's missing, what hits the mark) as
    well as suggestions for strengthening.
  4. Reflect on experience.

10. Goal Driven: Gallery Tours
  • Assign Topic or Project
  • (e.g., Team or Class White Paper, Business Plan, Study
    Guide, Glossary, Journal, Model Exam Answers)
  • Students Post to Web
  • Experts Review and Rate
  • Try to Combine Projects

How will you design courses that motivate
thinking? (Sheinberg, April 2000, Learning
Some Final Advice
Or Maybe Some Questions???