news aggregator
Interesting question to which the answer may be 'yes'. Dawn Poulos suggests, "Being static means being stale, and for instructional designers, stale content is the fastest road to irrelevancy." In fact, the discipline is changing, as exemplified by this list of 'aha moments':
- "Aha! Moment No. 1: Reusing Content is a Game Changer"
- "Aha! Moment No. 2: I Can Share My Content Outside the L&D Organization"
- "Aha! Moment No. 3: Collaboration Lets Us Deliver Better Content Faster"
- "Aha! Moment No. 4: Yes, I Really Can Personalize Learning Content"
- "Aha! Moment No. 5: Structure Provides Flexibility"
What do you get if you actually implement these five principles? I would argue that you get a cMOOC. But your mileage may vary. Here's the full report (you will have to pay for it with your social network information).[Link] [Comment]
This is an interesting look at the metaphors used to describe the Learning Management System (LMS), including a reference to a fun paper from 2007 describing the ways people described Blackboard ("The metaphors of ‘tree branches,’ ‘7/11 store,’ ‘river of information,’ ‘fun game,’ and ‘light bulb revolution’ reveal the communication, information, educational, political, and philosophical aspects of Blackboard cyberinfrastructure implementation... the educational usage of Blackboard did not emerge as the most prominent rationality for Blackboard"). Tom Woodward suggests here that "the LMS is a fast-food franchise kitchen. It does exactly what it is meant to do. It is built for people with minimal skills to make cheap food quickly at scale. It isn’t meant to be a training ground so people can move up to gourmet cooking. These skills don’t transfer. You aren’t even meant to graduate to being a line cook at Friday’s." Heh.[Link] [Comment]
Part of the problem with social media is that being social and making money do not mix well together. Witness Eat24's breakup letter to Facebook: "Not to be rude, but you aren’t the smart, funny social network we fell in love with several years back. You’ve changed. A lot. When we first met, you made us feel special. We’d tell you a super funny joke about Sriracha and you’d tell all our friends and then everyone would laugh together. But now? Now you want us to give you money if we want to talk to our friends." Via TNW (http://thenextweb.com/socialmedia/2014/04/13/breaking-facebook-brands-young-users-going/).[Link] [Comment]
*Sigh* "Heartbleed arose inside a version of open-source OpenSSL cryptographic software. Information sitting inside the memory of a server should be encrypted, but a little bit of data could be pulled out under an attack. Most recently, a report emerged alleging that the U.S. National Security Agency had known about Heartbleed for more than two years, and even exploited it. The NSA later denied the allegations." OLDaily and MOOC.ca users are not affected by the OpenSSL bug.[Link] [Comment]
The Australian Open Access Support Group has posted a good series of articles on issues related to paying for open access publication. On this model, commonly called the 'Gold Model', authors or institutions pay publishers fees up front to process and make available the article as open access (by contrast, the 'Green Model' proposes that institutions manage their own article repositories). Many funding agencies, including the NHMRC and ARC in Australia, require that outcomes be published as open access.
Topics covered in the series include "the cost of hybrid, addressing double dipping, a discussion about whether open access funds support open access, and a look at what hybrid actually pays for. There is also an analysis of the membership model for open access publishing with a discussion of the attendant issues relating to managing article processing charges."[Link] [Comment]
The Web Literacy Learning Pathways map is a supplement to the Mozilla Web Literacy Standard. It helps users explore the skills they should work on next to improve their web literacy, based on their current competences. Interest Area: Learning & Society
In celebration of National Library Week, Oxford University Press is making all of its online resources freely available for this week only! Interest Area: Schools Higher Education Training & Work Learning & Society
After more than four years hosted on wordpress.com, using the same template throughout, we have decided to move this blog to a wordpress.org installation. We have taken advantage of the migration to give it a more modern look and improve its usability, simplifying access to the nearly 600 articles and pages published since January 2010, while also integrating it with the Conecta13 web space, on which we are likewise working.
Compared to our previous blog, the most notable changes in design and content organization are a wider screen area devoted to the articles (achieved by removing the left-hand sidebar), a new menu with many options for locating the resources associated with each section, and a footer that displays some relevant information, thereby lightening the sidebar.
The new blog does not show full articles in the general view, making it easier to browse the latest posts; each one can be expanded for detailed reading with the 'Read more' button.
In the coming weeks we are going to reactivate the monthly [e-aprendizaje] newsletter, so we have highlighted the subscription with a banner at the top of the blog which, besides being closable, hides itself as soon as you start scrolling to continue reading.
As for subscriptions, we have asked WordPress.com to migrate the subscribers who existed before the new blog was deployed. However, if you notice that you are not receiving notifications of new articles, I recommend that you subscribe again using your email address.
Of course, if you have any questions about the blog's content or organization, just ask. Featured image | DE MUDANZA… by Javier Martin Espartosa, licensed CC-by-nc-sa
Some days, the internet gods are kind.
On April 9th, I wrote,
We want talking about educational efficacy to be like talking about the efficacy of Advil for treating arthritis. But it’s closer to talking about the efficacy of various chemotherapy drugs for treating a particular cancer. And we’re really really bad at talking about that kind of efficacy. I think we have our work cut out for us if we really want to be able to talk intelligently and intelligibly about the effectiveness of any particular educational intervention.
On the very same day, the estimable Larry Cuban blogged,
So it is hardly surprising, then, that many others, including myself, have been skeptical of the popular idea that evidence-based policymaking and evidence-based instruction can drive teaching practice. Those doubts have grown larger when one notes what has occurred in clinical medicine with its frequent U-turns in evidence-based “best practices.”
Consider, for example, how new studies have often reversed prior “evidence-based” medical procedures.
*Hormone therapy for post-menopausal women to reduce heart attacks was found to be more harmful than no intervention at all.
*Getting a PSA test to determine whether the prostate gland showed signs of cancer for men over the age of 50 was “best practice” until 2012 when advisory panels of doctors recommended that no one under 55 should be tested and those older might be tested if they had family histories of prostate cancer.
And then there are new studies that recommend women have annual mammograms, not at age 50 as recommended for decades, but at age 40. Or research syntheses (sometimes called “meta-analyses”) that showed anti-depressant pills worked no better than placebos.
These large studies done with randomized clinical trials–the current gold standard for producing evidence-based medical practice–have, over time, produced reversals in practice. Such turnarounds, when popularized in the press (although media attention does not mean that practitioners actually change what they do with patients) often diminished faith in medical research leaving most of us–and I include myself–stuck as to which healthy practices we should continue and which we should drop.
Should I, for example, eat butter or margarine to prevent a heart attack? In the 1980s, the answer was: Don’t eat butter, cheese, beef, and similar high-saturated fat products. Yet a recent meta-analysis of those and subsequent studies reached an opposite conclusion.
Figuring out what to do is hard because I, as a researcher, teacher, and person who wants to maintain good health, have to sort out what studies say and how those studies were done from what the media report, and then how all of that applies to me. Should I take a PSA test? Should I switch from margarine to butter?
He put it much better than I did. While the gains in overall modern medicine have been amazing, anybody who has had even a moderately complex health issue (like back pain, for example) has had the frustrating experience of having a billion tests, being passed from specialist to specialist, and getting no clear answers.1 More on this point later.
Larry’s next post—actually a guest post by Francis Schrag—is an imaginary argument between an evidence-based education proponent and a skeptic. I won’t quote it here, but it is well worth reading in full. My own position is somewhere between the proponent and the skeptic, though leaning more in the direction of the proponent. I don’t think we can measure everything that’s important about education, and it’s very clear that pretending that we can has caused serious damage to our educational system. But that doesn’t mean I think we should abandon all attempts to formulate a science of education. For me, it’s all about literacy. I want to give teachers and students skills to interpret the evidence for themselves and then empower them to use their own judgment.
To that end, let’s look at the other half of Larry’s April 9 post, the title of which is “What’s The Evidence on School Devices and Software Improving Student Learning?”

Lies, Damned Lies, and…
The heart of the post is a study by John Hattie, a professor at the University of Auckland (NZ). He has done meta-analyses of an enormous number of education studies, looking at effect sizes, measured on a scale where 0.1 is negligible and 1.0 represents a full standard deviation.
He found that the “typical” effect size of an innovation was 0.4.
To compare how different classroom approaches shaped student learning, Hattie used the “typical” effect size (0.4) to mean that a practice reached the threshold of influence on student learning (p. 5). From his meta-analyses, he then found that class size had a .20 effect (slide 15) while direct instruction had a .59 effect (slide 21). He also found that teacher feedback had an effect size of .72 (slide 32). Moreover, teacher-directed strategies of increasing student verbalization (.67) and teaching meta-cognition strategies (.67) had substantial effects (slide 32).
What about student use of computers (p. 7)? Hattie included many “effect sizes” of computer use from distance education (.09), multimedia methods (.15), programmed instruction (.24), and computer-assisted instruction (.37). Except for “hypermedia instruction” (.41), all fell below the “typical” effect size (.40) of innovations improving student learning (slides 14-18). Across all studies of computers, then, Hattie found an overall effect size of .31 (p. 4).
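For readers who want to see what these numbers mean concretely: an effect size in this sense is a standardized mean difference (Cohen's d), the gap between a treatment group and a comparison group expressed in pooled standard deviations. Here is a minimal sketch of the arithmetic, with entirely invented exam scores; the numbers illustrate the calculation only and say nothing about any real study.

```python
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference between two groups, using the pooled SD."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    # Pooled standard deviation across both groups
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical exam scores, purely to illustrate the arithmetic
redesigned = [78, 82, 85, 88, 90, 75, 84]
traditional = [70, 74, 78, 80, 72, 76, 79]
d = cohens_d(redesigned, traditional)
```

On Hattie's scale, a d of .40 would sit at the "typical" threshold he describes; the point of the sketch is simply that the statistic averages over everything that differs between the two groups, which is exactly what the next paragraph's joke is about.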
The conclusion is that changing a classroom practice can often produce a significant effect size while adding a technology rarely does.
But as my father likes to say, if you stick your head in the oven and your feet in the freezer, on average you’ll be comfortable.
Let’s think about introducing clickers to a classroom, for example. What class are you using them in? How often do you use them? When do you use them? What do you use them for? Clickers in and of themselves change nothing. No intervention is going to be educationally effective unless it gets students to perceive, act, and think differently. There are lots of ways to use clickers in the classroom that have no such effect. My guess is that, most of the time, they are used for formative assessments. Those can be helpful or not, but done this way they are generally more about informing the teacher than directly helping the student.
But there are other uses of clicker technologies. For example, University of Michigan professor Perry Samson recently blogged about using clickers to compare students’ sense of their physical and emotional well-being with their test performance:
Figure 2. Example of results from a student wellness question for a specific class day. Note the general collinearity of physical and emotional wellness.
I have observed over the last few years that a majority of the students who were withdrawing from my course in mid-semester commented on a crisis in health or emotion in their lives. On a lark this semester I created an image-based question to ask students in LectureTools at the beginning of each class (example, Figure 2) that requested their self assessment of their current physical and emotional state.
Clearly there is a wide variation in students’ perceptions of their physical and emotional state. To analyze these data I performed cluster analysis on students’ reported emotional state prior to the first exam and found that temporal trends in this measure of emotional state could be clustered into six categories.
Figure 3. Trends in students’ self-reported emotional state prior to the first exam in class are clustered into six categories. The average emotional state for each cluster appears to be predictive of median first exam scores.
Perhaps not surprisingly, Figure 3 shows that student outcomes on the first exam were very much related to the students’ self-assessment of their emotional state prior to the exam. This result is hard evidence for the intuition that students perform better when they are in a better emotional state.
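Samson's post doesn't say which clustering algorithm he used, but the general technique, grouping students whose day-by-day ratings follow similar trajectories, can be sketched with a naive k-means over short numeric sequences. Everything below is an assumption for illustration: the data is invented, k=3 is used instead of his six clusters, and the algorithm is a generic textbook k-means, not his actual analysis.

```python
import random

def kmeans(series, k, iters=50, seed=0):
    """Naive k-means over equal-length numeric sequences."""
    rng = random.Random(seed)
    centroids = [list(s) for s in rng.sample(series, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in series:
            # Assign each sequence to the nearest centroid (squared distance)
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(s, centroids[j])))
            clusters[i].append(s)
        # Recompute each centroid as the element-wise mean of its cluster
        centroids = [[sum(v) / len(c) for v in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

# Hypothetical 1-5 emotional-state ratings over five class days, per student
students = [
    [4, 4, 5, 4, 4], [4, 5, 4, 4, 5],   # steadily positive
    [2, 2, 1, 2, 2], [1, 2, 2, 1, 2],   # steadily low
    [5, 4, 3, 2, 2], [4, 4, 3, 2, 1],   # declining toward the exam
]
groups = kmeans(students, k=3)
```

The interesting output in a study like Samson's is not the cluster assignments themselves but the correlation between each cluster's trajectory and subsequent exam performance, which is exactly what Figure 3 reports.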
I don’t know what Perry will end up doing with this information in terms of a classroom intervention. Nor do I know whether any such intervention will be effective. But it seems like common sense not to lump his work in with a million billion professors asking quiz questions on their clickers and aggregate it all into an average of how effective clickers are.
To be fair, that’s not Larry’s point for quoting the Hattie study. He’s arguing against the reductionist argument that technology fixes everything—an argument which seems obviously absurd to everybody except, sadly, the people who seem to have the power to make decisions. But my point is that it is equally absurd to use this study as evidence that technology is generally not helpful. What I think it suggests is that it makes little sense to study the efficacy of educational technologies or products outside the context of the efficacy of the practices that they enable. More importantly, it’s a good example of how we all need to get much more sophisticated about reading the studies so we can judge for ourselves what they do and do not prove.

Of Back Mice and Men
I have had moderate to severe back pain for the past seven years. I have been to see orthopedists, pain specialists, rheumatologists, urologists, chiropractors, physical therapists, acupuncturists, and massage therapists. In many cases, I have seen more than one in any given category. I had X-rays, CAT scans, MRIs, and electrical probes inserted into my abdomen and legs. I had many needles of widely varying gauges stuck in me, grown humans walking on my back, gallons of steroids injected into me. I had the protective sheaths of my nerves fried with electricity. If you’ve ever had chronic pain, you know that you would probably go to a voodoo priest and drink goat urine if you thought it might help. (Sadly, there are apparently no voodoo priests in my area of Massachusetts—or at least none who have a web page.) Nobody I went to could help me.
Not too long ago, I had cause to visit my primary care physician, who is a good old country doctor. No specialist certificates, no Ivy League medical school degrees. Just a solid GP with some horse sense. In a state of despair, I explained my situation to him.
He said, “Can I try something? Does it hurt when I touch you here?”
It turns out that I have a condition called “back mice,” also called “episacral lipomas” in the rare instances when it is referred to in the medical literature. I won’t go into the details of what they are, because that’s not important to the story. What’s important is what the doctor said next. “There’s hardly anything on them in the literature,” he said. “The thing is, they don’t show up on any scans. They’re impossible to diagnose unless you actually touch the patient’s back.”
I thought back to all the specialists I had seen over the years. None of the doctors ever once touched my back. Not one. My massage therapist actually found the back mice, but she didn’t know what they were, and neither of us knew that they were significant. It turns out that once my GP discovered that these things exist, he started finding them everywhere. He told me a story of an eighty-year-old woman who had been hospitalized for “non-specific back pain.” They doped her up with opiates and the poor thing couldn’t stand up without falling over. He gave her a couple of shots in the right place, and a week later she was fine. He has changed my life as well. I am not yet all better—we just started treatment two weeks ago—but I am already dramatically better.
The thing is, my doctor is an empiricist. In fact, he is one of the best diagnosticians I know. (And I have now met many.) He knew about back mice in the first place because he reads the literature avidly. But believing in the value of evidence and research is not the same thing as believing that only that which has been tested, measured, and statistically verified has value. Evidence should be a tool in the service of judgment, not a substitute for it. Isn’t that what we try to teach our students?
1. But I’m not bitter.
Fishman, R. (2013) State U Online. Washington, DC: The New America Foundation.
Fishman, R. (2014) Seeking Your Input on Online Consortia and Online Community Colleges. WCET Frontiers.
It would seem obvious that there would be great advantage in building consortia for online courses, so that courses could be shared between institutions, thus saving institutions the cost of developing new courses that are already being offered by other institutions. In particular, when you have a single state system of universities and two year colleges, it seems even more obvious. This is basically the idea behind the new Ontario Online initiative, for universities (Ontario already has a collaborative system, OntarioLearn, a partnership of 24 Ontario community colleges that have pooled their resources to increase online learning options.)
However, credit-based online courses have been around for many years, and yet there are very few successful consortia (Open Universities Australia is one good example.) The University of Florida System is a more recent example, as is the Kentucky Community and Technical College System.
Rachel Fishman’s report, State U Online, was funded by the Bill and Melinda Gates Foundation, and tracks the development of online university consortia in the USA. She identifies five steps that a state can take to build an integrated state-wide online system, and provides case studies of systems and institutions that have reached each ‘level.’
- Clearinghouse: State institutions collaborate to provide a clearinghouse of courses and degrees that a student can easily search. However, students must apply to each institution individually, and credit transfer between institutions is not automatic. Contact North provides such a portal in Ontario.
- Shared contracts: State institutions join together to purchase shared contracts for resources like a common LMS or services such as web conferencing or professional development around online learning. BCcampus operates something similar in British Columbia.
- Shared student services: State systems provide a variety of student support services at all (participating) institutions within the system, such as advising, local study centres, or, even more commonly, proctored examination centres.
- Shared and articulated credentials: State systems have created carefully articulated arrangements that include easy transfer of credit among institutions and shared credentialing. (This would include OntarioLearn.)
- Shared credentials beyond state borders: Several state systems create collaborative inter-institutional and inter-state efforts that take all of the previous steps, and allow students to move freely beyond state borders. Great Plains IDEA is an example from the USA, and Open Universities Australia is another example.
Fishman argues in the report that ‘public institutions should strongly consider adopting a system wide or consortia approach, in a manner that fits their unique contexts‘ and makes seven recommendations that will help strengthen such consortia.
However, in her blog post for WCET Frontiers, where she is asking for input for a new study on consortia in two-year colleges, she acknowledges that ‘[these five] categories may not be as distinct or as linear as I have made them out to be. And for some states, there are many barriers already in place that prevent institutions from even being able to come together and collaborate in the first place.’
The State U Online report should be compulsory reading for politicians and policy makers interested in course sharing and creating consortia.
However, what the report does not adequately address are the economics of online learning. Course sharing is not just about delivery of content, but also about providing learner support. If an institution takes a course from another institution, who will provide that ongoing learner support and assessment? It is the learner support that costs money (at least twice the cost of course development), and it is in the details of who will do the teaching of the online course – and how that gets paid for – that consortia so often break down. A strong and robust business model that ensures the costs of all partners are adequately covered, and that any surplus revenues are appropriately shared, is essential for a successful consortium, but these conditions are very difficult to meet.
Another major barrier is academic distrust of other institutions: ‘Our courses are always good; yours are garbage.’ Also, for obvious reasons, faculty often feel uncomfortable teaching a course designed by someone else, and into the design of which they had no input.
For consortia to work, there has to be a synergy and a mutual respect for the other partners in the consortium. In a large system it is unrealistic to expect automatic transfer of credits between every institution in the system, although some states, such as California and Florida, have gone a long way to building equivalencies between courses in different institutions that facilitate formal credit transfer arrangements, through subject discipline articulation committees. But that is very hard work, takes many years to build, and requires a common vision and mutual respect. That is very hard to achieve in systems that put so much emphasis on competition and rankings.
So yes, consortia are desirable, but it ain’t easy. In the meantime, if you know of any successful online consortia let Rachel Fishman know (and me, too!)
As the shift from location specific learning to untethered learning gathers pace, so the personal device gains increasing importance. Distributed forms of learning are burgeoning, and geographical distance between learners and their parent institutions is less of a problem. This is because learners are intimately familiar with the capabilities of their own devices, and are able to use them to learn in creative and productive ways. 'Bring your own device' is now common place in universities and students no longer need to study in a single location. If students are no longer required to be in the same place as their teachers, several things become apparent.
The first is that traditional spaces such as the lecture hall, computer suite and classroom assume less significance. Some would argue that the millions invested in building traditional rooms for education in the last few years might easily have been spent on more relevant and future-oriented projects. However, physical spaces retain their importance for many universities, and the manner in which they are configured needs to be reappraised. Why, for example, are lecture halls designed with fixed chairs in rows and tiers, thus limiting student interaction? Where do students plug in their laptops and mobile phones when they need to recharge their batteries? Are today's lecture halls designed with enough power sockets within easy reach? Why do we still tether computers to desks, row upon row in computer suites? Does this not simply replicate the style of traditional lecture halls? Does computing still need to take place in a specific location? This is misplaced ICT, and such resources could be deployed more effectively. If learning spaces on university campuses are to remain physical spaces, then a radical overhaul of their design is long overdue.
Secondly, the pedagogy that underpins higher education is in need of reform. Although the traditional lecture still has its place, it is becoming increasingly anachronistic in an age where much knowledge is shifting from objective certainty to subjective provisionality. A number of other effective alternatives are possible when each student owns a mobile, personal learning device to accommodate their individual needs. Learning becomes more self-directed, creating knowledge as well as receiving it. Students become more active and wide-ranging in their learning approach. Collaborative learning activities become more feasible, and can extend beyond traditional times and physical locations. Learning through making, doing and problem solving is extended as students are no longer constrained by class times and space boundaries. Ultimately, the role of the teacher changes, as lecturers assume supportive and facilitative, rather than directive duties. They remain as experts, but acknowledge that their students can also bring knowledge to the learning process, and can also teach each other.
Thirdly, if students are now connecting remotely into campus services, the development of digital content and the provision of better communication channels is required to ensure the success of distributed learning methods. If students study exclusively, or predominantly away from the traditional campus, their prime connection to peers, experts and content will be through their personal devices. If this fails for any reason, students are suddenly separated from their resources and expert support. Universities must therefore ensure that institutional services such as Learning Management Systems and the provision of other centralised software remain stable and accessible at all times.
As the personal device becomes more prevalent among student populations, so universities will need to reappraise their strategies for course provision. One of the most important decisions to be made is to ensure that student expectations are met, whether they attend the traditional campus or not.
Photo by Reader Walker
Personal devices in higher education by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. Posted by Steve Wheeler from Learning with e's
A while back, I wrote about my early experiences as a member of the Digital Working Group for the AAC&U General Education Maps and Markers (GEMs) initiative and promised that I would do my homework for the group in public. Today I will make good on that promise. The homework is to write up an exemplar practice of how digital tools and practices can help support students in their journeys through GenEd.
As I said in my original post, I think this is an important initiative. I invite all of you to write up your own exemplars, either in the comments thread here or in your own blogs or other digital spaces.
The template for the exemplar is as follows:
Evocative Examples of Digital Resources and Strategies that can Improve General Education: What are these cases a case of?
Brief Description of practice:
- In what ways is the practice effective or transformative for student learning? What’s the evidence? How do we know? (If you can tie the practice to any of the outcomes in the DQP and/or the LEAP Essential Learning Outcomes, that would be great.)
- How does the practice reflect the digital world as lived student culture? What are the skills and content associated with the digital practice or environment? How does the practice deepen or shape behavior of students with digital tools and environments with which they may be variously familiar?
- What does it take to make the practice work? What is the impact on faculty time? Does it take a team to design, implement, assess? What are the implications for organizational change?
- How is it applicable to gen ed (if example doesn’t come from gen ed)?
- Are there references or literature to which you can point that is relevant to the practice?
In this practice, every effort is made to move both direct instruction and formative assessment outside of class time. The “flipped classroom” (or “flipped learning”) approach is employed, providing students with instructional videos and other supplemental content. In addition, a digital homework platform is employed, enabling students to get regular formative assessments. In order to give students more time for these activities, the amount of in-class time is reduced, making the course effectively a blended or hybrid course. In-class time is devoted either to class discussion, which is supported by the instructor’s knowledge of the students’ performance on the regular formative assessments, or to group work.
In what ways is the practice effective or transformative for student learning? What’s the evidence? How do we know?
This is a particular subset of a practice that the National Center for Academic Transformation (NCAT) calls “the replacement model”, and they have a variety of course redesign projects that demonstrated improved outcomes relative to the control. For example, a redesign of a psychology Gen Ed course at Missouri State University produced the following results:
- On the 30-item comprehensive exam, students in the redesigned sections performed significantly better (84% improvement) compared to the traditional comparison group (54% improvement).
- Students in the redesigned course demonstrated significantly more improvement from pre to post on the 50-item comprehensive exam (62% improvement) compared to the traditional sections (37% improvement).
- Attendance improved substantially in the redesigned section. (Fall 2011 traditional mean percent attendance = 75% versus fall 2012 redesign mean percent attendance = 83%)
- Over a three-semester period following the redesign, the course DFW rate improved from 24.6% to 18.4% (most of which was because of a significant drop in the withdrawal rate).
One of the investigators of the project, who also was a course instructor, indicated that the quality of class discussion improved significantly as well.
Possible reasons why the practice is effective include the following:
- Teacher/student contact time is maximized for interactivity.
- Regular formative assessments with instant feedback help students to be better prepared to maximize discussion time with the teacher and with peers.
- Feedback from the homework system enables the instructor to walk into class knowing where students need the most help.
- A reduced number of physical class meetings lowers the chance that a student will withdraw due to grade-damaging absences.
How does the practice reflect the digital world as lived student culture? What are the skills and content associated with the digital practice or environment? How does the practice deepen or shape behavior of students with digital tools and environments with which they may be variously familiar?
Students are used to getting their information online. They are also often very effective at “time slicing,” in which they use small increments of time (e.g., when they are on a bus or waiting for an appointment) to get things done. This exemplar practice enables students to do that with the portions of academic work that are suited to it while preserving and actually expanding room for long and deep academic discussion.
What does it take to make the practice work? What is the impact on faculty time? Does it take a team to design, implement, assess? What are the implications for organizational change?
The redesign effort is significant and, because the creation of significant digital resources is involved, is often best done by a team (although that is not strictly necessary). For the purposes of this design, the homework platform need not be cutting-edge adaptive, as long as it provides formative assessments that are consistent with the summative assessments and provides both students and instructors with good, regular feedback. That said, implementing the technology is often not seamless and may take several semesters to work the kinks out. The shift to a flipped classroom also puts new demands on students and may take several semesters for the campus culture to adjust to the new approach.
How is it applicable to gen ed (if example doesn’t come from gen ed)?
This model is often used in Gen Ed. It is particularly appropriate for larger classes where the DFW rate is high and where a significant percentage of the subject matter—at least the foundational knowledge on the lower rungs of Bloom’s taxonomy—can be assessed through software.
Are there references or literature to which you can point that are relevant to the practice?
Hack Education Weekly News: Heartbleed, Data Insecurity, But Hey Lots of Money for Ed-Tech Startups Nonetheless
Kelly, R. (2014). EDUCAUSE and UCF launching blended learning MOOC. Campus Technology, 3 April.
EDUCAUSE and the University of Central Florida are offering a free MOOC called ‘BlendKit2014 – Becoming a Blended Learning Designer’, which will run initially from April 21 to May 27.
It is aimed primarily at faculty and instructional designers, who will come away with best practices for developing design documents, content pages, and peer review feedback tools. In particular it will offer:
- a consideration of key issues related to blended learning and
- practical step-by-step guidance in producing materials for a blended course (e.g., developing design documents, creating content pages, and receiving peer review feedback at one’s own institution).
The course was developed and will be taught by two staff members from the UCF Center for Distributed Learning: associate director Kelvin Thompson and department head Linda Futch.
Participants may also choose to pursue an official “UCF/EDUCAUSE Certified Blended Learning Designer” credential. Those who choose this more rigorous option will submit the materials they develop as part of the free MOOC for a portfolio review. This portfolio review is available for a US$89 fee.
Registration for BlendKit 2014 is open on Canvas Network for the class that begins April 21. Details can be found at www.canvas.net and on Twitter at #BlendKit2014.
It should be noted that UCF has a great deal of experience in this field, having offered blended and fully online courses for many years.