News aggregator

Ed Tech Cybersecurity: Suppose they gave a data breach and nobody came

e-Literate - 2 hours 24 mins ago

It has now been four weeks since Chegg announced a data breach compromising personal information of up to 40 million users. Cue the crickets, because the only coverage in the ed tech press thus far is from EdWeek, which focuses on the K-12 market. That's a shame, because if ed tech companies want a case study to help understand the implications of FBI warnings or the European Union's new General Data Protection Regulation (GDPR), this example from Chegg should be illustrative. The same goes for institutions.

As a recap: on September 19th Chegg discovered a data breach dating back to April, in which "an unauthorized party" accessed a database containing "a Chegg user's name, email address, shipping address, Chegg username, and hashed Chegg password" but no financial information or social security numbers. The company has not disclosed, or is unsure of, how many of the 40 million users had their personal information stolen. On September 25th Chegg notified the SEC about the breach, focusing on guidance for company financials. The company then started notifying users and "certain regulatory authorities" on September 26th.

A "hashed password" is a typical process where the entered password is converted to random-looking cryptographic characters not intended to be decrypted. Subsequent password entries use the same hash again and software compares not the passwords but the hashed passwords to see if they come out identical, thus meaning that most systems do not store actual passwords in the system.

This 2016 article in Wired gives a good overview of hashing and data breaches and notes that the level of compromise depends on the details.

In theory, no one, not a hacker or even the web service itself, should be able to take those hashes and convert them back into passwords. But in practice, some hashing schemes are significantly harder to reverse than others. The collection of 177 million LinkedIn accounts stolen in 2012 that went up for sale on a dark web market last week, for instance, had actually been hashed. But the company used only a simple hashing function called SHA1 without extra protections, allowing almost all the hashed passwords to be trivially cracked. The result is that hackers were able to not only access the passwords, but also try them on other websites, likely leading to Mark Zuckerberg having his Twitter and Pinterest accounts hacked over the weekend.

By contrast, a breach at the crowdfunding site Patreon last year exposed passwords that had been hashed with a far stronger function called bcrypt, the fact of which likely kept the full cache relatively secure in spite of the breach.
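
To make the distinction concrete, here is a minimal sketch in Python of the two approaches (illustrative only: it assumes the third-party bcrypt package and says nothing about Chegg's actual hashing scheme, which has not been disclosed):

```python
import hashlib

import bcrypt  # third-party package, assumed installed: pip install bcrypt

password = b"hunter2"

# Unsalted SHA1 (the scheme LinkedIn used): fast to compute, which is
# exactly the problem - attackers can test billions of guesses per
# second against a stolen hash.
weak_hash = hashlib.sha1(password).hexdigest()

# bcrypt (the scheme Patreon used): salted and deliberately slow, so
# every guess costs an attacker real compute time.
strong_hash = bcrypt.hashpw(password, bcrypt.gensalt(rounds=12))

# Login never compares plaintext passwords, only hashes:
assert hashlib.sha1(b"hunter2").hexdigest() == weak_hash
assert bcrypt.checkpw(b"hunter2", strong_hash)
```

The bcrypt cost factor (the rounds parameter) is the design point: raising it increases the work per guess for an attacker as well as for the login server, which is tolerable once per login but ruinous at cracking scale.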

What is problematic about the Chegg data breach is that no further information has been made public, and the broader ed tech press has shown no interest in digging up answers. We have no idea how serious this breach is, and I do not believe that the users with compromised personal information have had any updates since the initial email blast and associated post.

Less than a week before Chegg discovered the data breach, the FBI put out a warning about ed tech and K-12 schools, but the details could easily apply to higher education.

The FBI is encouraging public awareness of cyber threat concerns related to K-12 students. The US school systems’ rapid growth of education technologies (EdTech) and widespread collection of student data could have privacy and safety implications if compromised or exploited.

EdTech can provide services for adaptive, personalized learning experiences, and unique opportunities for student collaboration. Additionally, administrative platforms for tracking academics, disciplinary issues, student information systems, and classroom management programs, are commonly served through EdTech services.

There is also the GDPR angle described in the EdWeek article.

One of the first to call attention to the Chegg breach was Hill, an education consultant and market analyst for the company MindWires Consulting who posted a blog and a tweet about the SEC disclosure. [snip]

One of the more pressing questions is whether the breach will draw the scrutiny of data-privacy regulators, said Hill in an interview. He pointed to the new rules put in place as part of GDPR, the sweeping European data privacy regulation that took effect earlier this year.

The European policy has come into focus recently with the admission by social media giant Facebook — which has a major presence in schools — that hackers gained access to 50 million of its accounts. European authorities have said they are investigating how many users on the continent were affected, and whether it would trigger GDPR enforcement.

The Facebook breach was no doubt more problematic, as it exposed far more personal information as well as access to Facebook Login, thus compromising third-party platforms. But both data breaches involve consumer-based systems and similar numbers of users. In legal terms, however, GDPR is based on protecting citizens of the European Union. When I asked a Chegg spokesman about the GDPR-based notifications, they replied in general terms.

We actually do have an office in Berlin. Chegg’s customer base is principally US-based, and the core focus of our business is the United States. We are providing notice to the particular regulatory agencies, in the US and internationally - including Europe.

GDPR has been criticized for creating requirements that are impossible to fully comply with, and there are two aspects worth covering here - the Supervisory Authority and notification of a data breach. This article gives a good summary of whom to notify - the Supervisory Authority.

For most companies, choosing a GDPR Lead Supervisory Authority is a straightforward decision. A company based in Paris, France would appoint the supervisory authority in France as the lead supervisory authority. A UK-based company would choose the Information Commissioner’s Office (ICO), which is the supervisory authority for the UK.

For companies that operate in multiple EU member states, the lead supervisory authority would normally be the supervisory authority in the country where the company’s headquarters is or where its main business location is in the EU. More specifically, it would be the Supervisory Authority in the country where the final decisions are made about data collection and processing.

A U.S. company that does not have a base in an EU member state has a problem. If it does not have a base in an EU member state where data processing decisions are made, it will not benefit from the one-stop-shop mechanism. Even if a company has a representative in an EU member state, that does not trigger the one-stop-shop mechanism.

The company must therefore deal with the supervisory authority in every member state where the company is active, through its local representative.

In Chegg's case, presumably the Berlin office allows it to use the one-stop-shop mechanism with a lead authority. But smaller ed tech companies may not have this benefit and may be required to interact with many different country regulators1.

What about notification requirements in the case of a data breach? The relevant section is Article 33 of GDPR where Chegg would be a "controller" [emphasis added].

  • In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.
  • The processor shall notify the controller without undue delay after becoming aware of a personal data breach.
  • The notification referred to in paragraph 1 shall at least:
    1. describe the nature of the personal data breach including where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned;
    2. communicate the name and contact details of the data protection officer or other contact point where more information can be obtained;
    3. describe the likely consequences of the personal data breach;
    4. describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.

In this case, Chegg would have had to notify its Lead Supervisory Authority of the details described above by September 22. According to the SEC form, initial notifications to regulators beyond the SEC started on September 26.
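
As a back-of-the-envelope check on that timeline, here is a small Python sketch (the public record gives only dates, not the hour at which Chegg "became aware"):

```python
from datetime import datetime, timedelta

# Dates from Chegg's disclosures: breach discovered September 19, 2018;
# notification of users and "certain regulatory authorities" began
# September 26.
discovered = datetime(2018, 9, 19)
notified = datetime(2018, 9, 26)

# GDPR Article 33(1): notify the supervisory authority "not later than
# 72 hours after having become aware" of the breach.
deadline = discovered + timedelta(hours=72)

print(deadline.date())             # 2018-09-22
print((notified - deadline).days)  # 4 days past the Article 33 window
```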

Would there be a lawsuit based on this delayed notification? We don't know yet, but one important distinction is that in the EU the process must go through the official data regulators. Article 77 of GDPR specifies these actions.

Without prejudice to any other administrative or judicial remedy, every data subject shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement if the data subject considers that the processing of personal data relating to him or her infringes this Regulation.

In other words, a country regulator must decide whether it wants to pursue action against Chegg. In the US, by contrast, similar complaints or lawsuits can be filed by individuals directly against the breached company. The intention of GDPR is to go after the big tech companies - Google, Facebook, etc. - and Chegg may be too low-profile to warrant close attention. And despite the large numbers involved - up to 40 million users - it is unknown how many are EU citizens.

Will there be further fallout for Chegg beyond the initial flurry of financial news that helped drive down its stock price by 21 percent since the notification? It looks like the biggest issue is job security for US lawyers, as at least four dozen lawsuits seeking class-action status have been filed, with the general theme that the company failed to secure its systems properly or to notify investors of the risks to data security. I have no idea if any of these will stick2, but Chegg's initial focus on SEC and financial notifications seems well-placed.

In the meantime, other ed tech companies would do well to treat this data breach as a case study and an opportunity to figure out how secure their own systems are, and whether they would be able to comply with GDPR (or whether they would be required to do so). More broadly, how many companies collecting personal information adequately protect hashed passwords? How many know what to do in the case of a data breach? Now is the time to find out and take action, before the next event occurs.

I will repeat my call that Chegg needs to more fully disclose the details of the incident to the general public. There has been no new information shared by Chegg based on its investigation. I would add that this subject should get more attention from ed tech press.

  1. Genius system - making the process much more difficult for smaller companies.
  2. These types of lawsuits come out of the woodwork when stock prices drop.

The post Ed Tech Cybersecurity: Suppose they gave a data breach and nobody came appeared first on e-Literate.

Students to Learn App Design with Swift Programming Online

Campus Technology - 22 October, 2018 - 22:20
Genesys Works teamed up with Columbia College Chicago Online to deliver online Swift coding programs for local students, using K-12 curriculum created by Apple for its Everyone Can Code initiative and resources created by the college.

Blockcerts Updates Open Source Blockchain Architecture

Campus Technology - 22 October, 2018 - 19:17
Learning Machine announced updates to the Blockcerts open source project to enable native support for records issuance and verification using bitcoin and Ethereum blockchains.

Mights And Perils In The Moodle Gamification Of Flipped Learning

Moodle News - 22 October, 2018 - 18:51
Every classroom-laden teacher should go ahead and flip it. At least try it out for a week of the school year. Of course, few succeed at climbing Everest without supplemental oxygen. So for now,...

Extreme Redesign Challenge Calls on Students to Submit 3D Designs

THE Journal - 22 October, 2018 - 18:46
College, middle and high school students have the opportunity to win scholarships by showing off their 3D design prowess.

Report: Evidence Lacking on Value of CTE Investments

Campus Technology - 22 October, 2018 - 18:27
A new report has examined whether career and technical education (CTE) deserves all the attention it's getting. The jury's still out.

Report: Evidence Lacking for Current CTE Investments

THE Journal - 22 October, 2018 - 16:34
A new report has examined whether career and technical education (CTE) deserves all the attention it's getting. The jury's still out.

Check Out Troy Patterson’s Friendly Guide To Say ‘Hi’ In Moodle

Moodle News - 22 October, 2018 - 13:51
I think K12 blended learning teachers need quick and visual "How to's" that show unique ways to use Moodle. What do you think about this approach. https://t.co/xtJUxkld4n — Chris...

Gartner Tech Trends for 2019 Include Immersive Experiences, Autonomous Things and Blockchain

THE Journal - 22 October, 2018 - 13:00
IT analyst firm Gartner has named its top 10 trends for 2019, and the "immersive user experience" is on the list, alongside blockchain, quantum computing and seven other drivers influencing how we interact with the world. The annual trend list covers breakout tech with broad impact and tech that could reach a tipping point in the near future.

Grants & Upcoming Events (Week of Oct. 22, 2018)

THE Journal - 22 October, 2018 - 13:00
Preschool Development Grant applications due Nov. 6!

Gartner: Immersive Experiences Among Top Tech Trends for 2019

Campus Technology - 22 October, 2018 - 13:00
IT analyst firm Gartner has named its top 10 trends for 2019, and the "immersive user experience" is on the list, alongside blockchain, quantum computing and seven other drivers influencing how we interact with the world. The annual trend list covers breakout tech with broad impact and tech that could reach a tipping point in the near future.

Upcoming Events, Webinars & Calls for Papers (Week of Oct. 22, 2018)

Campus Technology - 22 October, 2018 - 12:30
Upcoming events include the Educause Annual Conference, T.D.W.I. Orlando Conference, Blockchain in Education East and O.L.C. Accelerate 2018.

Feds launch ‘digital academy’ for public service

OLDaily - 21 October, 2018 - 17:38

Kathryn May, iPolitics, Oct 21, 2018

This article describes a new initiative at the Canada School of Public Service to launch the Canadian Digital Academy "to improve the 'digital acumen' of all levels of public servants who are working to modernize operations, and to deliver the kind of digital services that Canadians expect." Having worked with CSPS on two projects leading to greater digital capability over the last year or so, I am happy to see this development. As Maryantonett Flumian says, "The school appears to be going through a renaissance. It has a very dynamic management team, and that team is shopping for who can provide the best service."

Web: [Direct Link] [This Post]

Machine Teaching, Machine Learning, and the History of the Future of Public Education

Audrey Watters - 20 October, 2018 - 19:00

These are my prepared remarks, delivered on a panel titled “Outsourcing the Classroom to Ed Tech & Machine-learning: Why Parents & Teachers Should Resist” at the Network for Public Education conference in Indianapolis. The other panelists were Peter Greene and Leonie Haimson. I had fifteen minutes to speak; clearly this is more than I could actually fit into that timeframe.

I want to start off my remarks this morning by making two assertions that I hope are both comforting and discomforting.

First, the role that corporations and philanthropists play in shaping education policy is not new. They have been at this a long, long time.

Companies have been selling their products – textbooks, workbooks, maps, films, and so on – to schools for well over a century. Pearson, for example, was founded (albeit as a construction company) in 1844 and over its long history acquired various textbook publishing companies, some of which have also been around since the turn of the twentieth century. IBM, for its part, was founded in 1911 – a merger of three office manufacturing businesses – and it began to build testing and teaching machines in the 1930s. Many companies – and certainly these two in particular – also have a long history of data collection and data analysis.

These companies and their league of marketers and advocates have long argued that their products will augment what teachers can do. Augment, not replace, of course. Their products will make teachers’ work easier, faster, companies have always promised. Certainly we should scrutinize these arguments – we can debate the intentions and the results of “labor-saving devices” and we can think about the implications of shifting expertise and control from a teacher to a textbook to a machine. But I’d argue that, more importantly perhaps, we must recognize that there is no point in the history of the American public education system that we can point to as the golden age of high quality, equitable, commercial-free schooling.

My second assertion: that as long as these companies and their evangelists have been pitching their products to schools, they have promised a “revolution.” (Perhaps it’s worth pointing out here: “revolutions,” to me at least, mean vast and typically violent changes to the social and political order.) So far at least these predictions have always been wrong.

Thomas Edison famously predicted in 1922, for example, “I believe that the motion picture is destined to revolutionize our educational system and that in a few years it will supplant largely, if not entirely, the use of textbooks.” He continued – and I think this is so very revealing about the goals of much of this push for technological reform, “I should say that on the average we get about two percent efficiency out of schoolbooks as they are written today. The education of the future, as I see it, will be conducted through the medium of the motion picture… where it should be possible to obtain one hundred percent efficiency.”

Educational films were going to change everything. Teaching machines were going to change everything. Educational television was going to change everything. Virtual reality was going to change everything. The Internet was going to change everything. The Macintosh computer was going to change everything. The iPad was going to change everything. Khan Academy was going to change everything. MOOCs were going to change everything. And on and on and on.

Needless to say, movies haven’t replaced textbooks. Computers and YouTube videos haven’t replaced teachers. The Internet has not dismantled the university or the school house.

Not for lack of trying, no doubt. And it might be the trying that we should focus on as much as the technology.

The transformational, revolutionary potential of these technologies has always been vastly, vastly overhyped. And it isn’t simply, as some education reformers like to tell it, that it’s because educators or parents are resistant to change. It’s surely in part because the claims that marketers make are often just simply untrue. My favorite ludicrous claim remains that of Knewton’s CEO who told NPR in 2015 that his company was a “mind reading robot tutor in the sky.” I don’t care how much data you collect about students – well, I do care – but that does not mean, as this CEO said at a Department of Education event, that “We literally know everything about what you know and how you learn best, everything.” (My man here does not even know how to use the word “literally.”)

This promised “ed-tech revolution” hasn’t occurred either in part because the predictions that technologists make are so often divorced from the realities of institutional and individual practices, from the cultures, systems, beliefs, and values of schools and their communities. No one wants a machine to read their children’s minds, thank you very much.

There is arguably no better example of this than the predictions made about artificial intelligence. (No surprise, that includes companies like Knewton who like to say they’re using AI – data collection, data analysis, and algorithms – to improve teaching and learning.) Stories about human-made objects having some sort of mental capacity are ancient; they’re legends. (I’m a folklorist. Trust me when I say they’re legends – exaggerated stories that some people do believe to be true.)

The field of artificial intelligence – programmable, digital computers functioning as some sort of electronic “brain” – dates back to the 1950s. And those early AI researchers loved the legend, making grandiose claims about what their work would soon be able to do: in 1965, for example, Herbert Simon said that “machines will be capable, within twenty years, of doing any work a man can do.” In 1970, Marvin Minsky said that “in from three to eight years, we will have a machine with the general intelligence of an average human being.” Fifty, sixty years later, we still don’t.

Sure, there have been some very showy achievements: IBM’s Deep Blue defeated Garry Kasparov at a game of chess. IBM’s Watson won at Jeopardy. IBM loves these sorts of PR stunts, and it continues to market its AI product-line as hinged on Watson’s celebrity – it purports to be the future of “personalized education.” It’s working with Sesame Street, which kills me. But Watson is not remotely close to the “artificial intelligence” that the company, and the industry more broadly, likes to tout. (A doctor using Watson for cancer treatment described it as “a piece of shit.”) Watson is not some sentient, hyper-intelligent entity. It’s not an objective and therefore superior decision-maker. It’s not a wise seer or fortune-teller.

None of AI is. (And I don’t think it ever can or will be.)

Mostly, today’s “artificial intelligence” is a blend of natural language processing – that is, computers being able to recognize humans’ language (either by typing or speaking) rather than being programmed via a computer language – and/or machine learning – that is, a technique that utilizes statistics and statistical modeling to improve the performance of a program or an algorithm. This is what Google does, for example, when you type something like “how many points LBJ” into the search bar, and you get results about LeBron James. Type “what percentage LBJ,” and you get results about how much the 36th president of the United States increased government spending. Google takes the data about what a website contains, along with how people search and what people click on, in part, to determine what to display in those “ten blue links” that show up on the first page of search results.

In some ways, that’s a lot more mundane than the hype about AI. But it’s an AI I bet we all use daily.

That doesn’t mean it’s not dangerous.

To be clear, when I assert that the push for technology is not new and that the claims about technology are overblown, I don’t mean to imply that this latest push for education technology is irrelevant or inconsequential. To the contrary, in part because of the language that computer scientists have adopted – artificial intelligence, machine learning, electronic brains – they’ve positioned themselves to be powerful authorities when it comes to the future of knowledge and information and when it comes to the future of teaching and learning. The technology industry is powerful, politically and economically and culturally, in its own right, and many of its billionaire philanthropists seem hell-bent on reforming education.

I think there’s a lot to say about machine learning and the push for “personalization” in education. And the historian in me cannot help but add that folks have been trying to “personalize” education using machines for about a century now. The folks building these machines have, for a very long time, believed that collecting the student data generated while using the machines will help them improve their “programmed instruction” – this, decades before Mark Zuckerberg was born.

I think we can talk about the labor issues – how this continues to shift expertise and decision making in the classroom, for starters, but also how students’ data and students’ work is being utilized for commercial purposes. I think we can talk about privacy and security issues – how sloppily we know that these companies, and unfortunately our schools as well, handle student and teacher information.

But I’ll pick two reasons that we should be much more critical about education technologies (because I seem to be working in a series of “make two points” this morning).

First, these software programs are proprietary, and we – as educators, parents, students, administrators, community members – do not get to see how the machine learning “learns” and how its decisions are made. This is moving us towards what law professor Frank Pasquale calls a “black box society.” “The term ‘black box,’” he writes, “is a useful metaphor… given its own dual meaning. It can refer to a recording device, like the data-monitoring systems in planes, trains, and cars. Or it can mean a system whose workings are mysterious; we can observe its inputs and outputs, but we cannot tell how one becomes the other. We face these two meanings daily: tracked ever more closely by firms and government, we have no clear idea of just how far much of this information can travel, how it is used, or its consequences.” This, Pasquale argues, is an incredibly important issue for us to grapple with because, as he continues, “knowledge is power. To scrutinize others while avoiding scrutiny oneself is one of the most important forms of power.”

I should note briefly that late last year, New York City passed a bill that would create a task force to examine the city’s automated decision systems. And that hopefully includes the algorithm that allocates spaces for students in the city’s high schools. How to resist: demand algorithmic transparency in all software systems used by public entities, including schools.

The second reason to be critical of AI in ed-tech is that all algorithms are biased. I know we are being told that these algorithms are better, smarter, faster, more accurate but they are, as a recent RSA video put it, “an opinion embedded in math.” (Indeed, anytime you hear someone say “personalization” or “AI” or “algorithmic,” I urge you to replace that phrase with “prediction.”)

Algorithms are biased, in part, because they’re built with data that’s biased, data taken from existing institutions and practices that are biased. They’re built by people who are biased. (Bless your hearts, white men, who think you are the rational objective ones and the rest of us just play “identity politics.”) Google Search is biased, as Safiya Noble demonstrates in her book Algorithms of Oppression. Noble writes about the ways in which Search – the big data, the machine learning – maintains and even exacerbates social inequalities, particularly with regards to race and gender. Let’s be clear, Google Search is very much a “black box.” And again, it’s an AI we use every day.

That Google Search (and Google News and Google Maps and Google Scholar and so on) has bias seems to me to be a much bigger problem than this panel was convened to address. We are supposed to be talking about ed-tech, and here I am suggesting that our whole digital information infrastructure is rigged. I think we’ve seen over the course of the past couple of years quite starkly what has happened when mis- and dis-information campaigns utilize this infrastructure – an infrastructure that is increasingly relying on machine learning – to show us what it thinks we should know. Like I said, it’s not just the technology we should pay attention to; it’s those trying to disrupt the social order.

We are facing a powerful threat to democracy from new digital technologies and their algorithmic decision-making. And I realize this sounds a little overwrought. But this may well be a revolution, and it’s not one that its advocates necessarily predicted, nor is it, I’d wager, one any of us wants to be part of.

The History of the Future of High School

OLDaily - 20 October, 2018 - 14:47

Audrey Watters, Vice, Oct 20, 2018

Audrey Watters nails it in a sentence: “The problem with American high school education, it seems, is not that students haven’t learned the ‘right skills.’ The problem is that the systemic inequality of the school system has ensured that many students have been unable to participate fully in either the economy or, more fundamentally, in democracy.”

Web: [Direct Link] [This Post]

Hack Education Weekly News

Audrey Watters - 19 October, 2018 - 21:15

Each week, I gather a wide variety of links to education and education technology articles. All this feeds the review I write each December on the stories we are told about the future of education. I’m assembling this week’s news roundup on a flight with patchy WiFi, so I’m probably missing a bunch of stories.

(National) Education Politics

Via Education Week: “Betsy DeVos: Lack of Civics Education Draws Students to Ideas Like Socialism.” What is leading them to fascism, Betsy?

Via The Chronicle of Higher Education: “DeVos Calls Democratic Senator’s Public Criticism of Draft Title IX Rules ‘Unbecoming and Irresponsible’.”

Via ProPublica: “GOP Senator Pushed VA to Use Unproven ‘Brainwave Frequency’ Treatment.” That would be Nevada Senator Dean Heller.

“Digital Promise Global has received a three-year, $1 million grant from the National Science Foundation to address equitable access to computational education in public schools,” Education Week reports.

There are more Department of Education stories in the financial aid section, in the accreditation section, and in the “guns are ed-tech” section below.

(State and Local) Education Politics

Via The New York Times and ProPublica: “‘You Are Still Black’: Charlottesville’s Racial Divide Hinders Students.”

Via The Oregonian’s Bethany Barnes: “Portland police inaction on child porn case against teacher ‘concerning’.”

Via New York Magazine: “Mark Zuckerberg Is Trying to Transform Education. This Town Fought Back.” Or at least CZI is trying to convince schools to buy into its Summit learning management system.

Via The New York Times: “Homelessness in New York Public Schools Is at a Record High: 114,659 Students.”

Via Chalkbeat: “Maine went all in on ‘proficiency-based learning’ – then rolled it back. What does that mean for the rest of the country?”

Via The Chronicle of Higher Education: “How This Sociologist’s Research Led a State to Abolish the Death Penalty.” “This sociologist” is University of Washington’s Katherine Beckett. The state, Washington.

Via Chalkbeat: “Eve Ewing explains why some communities just can’t get over school closings.” Buy her book on school closures in Chicago, Ghosts in the Schoolyard.

Immigration and Education

Via Chalkbeat: “School health clinics could take a hit under rule to restrict green cards for immigrants who receive public aid.”

Via Politico: “A proposed rule to change the H–1B visa lottery registration process also could change the educational composition of visa holders, according to a DHS official with knowledge of the regulation.”

Education in the Courts

Plenty of stories in the news this week about the court case alleging bias in Harvard admissions. Via The Chronicle of Higher Education: “Harvard Admissions Trial Opens With Arguments Focused on Diversity.” Via NPR: “Harvard Student Discusses Why She Opposes The University’s Admissions Process.” Also via NPR: “Harvard Student Discusses Why She Supports The University’s Admissions Process.” Via Buzzfeed: “Harvard Wants To Admit Donors’ Kids, Even If That Makes The School More White.” More via The Atlantic and via The Washington Post.

“James Damore is moving his lawsuit against Google out of court,” The Verge reports.

Via The Washington Post: “Jury awards $1.3 million to professor in American University age-discrimination case.”

Via KRQE: “Charter school founder sentenced to 5 years for fraud.” That’s Scott Glasrud, founder of the Southwest Learning Centers chain of charters.

There is more legal wrangling in the for-profit higher ed section, in the financial aid section, and in the “guns are ed-tech” sections below.

The Business of Financial Aid

Via Politico: “Court win for student loan protections a setback for DeVos.”

NPR on “Why Public Service Loan Forgiveness Is So Unforgiving.”

More financial aid stories in the for-profit higher ed section below.

The “New” For-Profit Higher Ed

Via Inside Higher Ed: “In lawsuit filed against Betsy DeVos, Education Corporation of America argues that it needs major financial restructuring but that campuses will have to close without federal student aid.”

Jon Marcus writes in Education Next about the future of for-profit higher ed.

Online Education (and the Once and Future “MOOC”)

Via Forbes: “This Company Could Be Your Next Teacher: Coursera Plots A Massive Future For Online Education.”

There’s more MOOC news in the “business of education” section below.

Kara Swisher interviews 2U CEO Chip Paucek.

Meanwhile on Campus…

Via Inside Higher Ed: “Clark College, a community college in Washington State, has announced that it will call off classes and other activities on Monday, the day that Patriot Prayer, a far-right group whose rallies have been violent, will be holding one on campus.”

“How many colleges and universities have closed since 2016?” asks Education Dive. It’s only interested in for-profits, I guess.

Via Inside Higher Ed: “MIT Announces Plan for $1B Effort on Computing, AI.”

Yes, Guns Are Ed-Tech (and It’s So F*cked Up that I Had to Make This a Category)

Via The Appeal: “Secretive Campus Cops Patrol Already Overpoliced Neighborhoods.”

Via OregonLive.com: “Patriot Prayer, fresh off wild street brawl, to talk guns at Vancouver colleges.” There’s more on this white nationalist group up in the “on campus” section.

Via The Huffington Post: “U.S. Department Of Education Is Sued For Withholding Information On Arming Teachers.”

Via The New York Times: “Mad Magazine’s ABCs of a School Shooting Give It a Boost of Relevance.”

Via the BBC: “Crimea attack: Gun attack at Kerch college kills 19.”

Accreditations and Certifications and Competencies

Via WCET Frontiers: “The Department of Education’s Plans for Overhauling Accrediting and Innovation Regulations.”

Testing

Via the AP: “Math Scores Slide to a 20-Year Low on ACT.”

Via the St. Louis Post Dispatch: “Tests Taken by High School Students 58 Years Ago Could Predict Whether They Get Alzheimer’s.”

Go, School Sports Team!

A sneak peek at Joshua Hunt’s new book in Pacific Standard: “The Secret Betrayal That Sealed Nike’s Special Influence Over the University of Oregon.”

Labor and Management

Via The New York Times: “Original Big Bird, Caroll Spinney, Leaves ‘Sesame Street’ After Nearly 50 Years.”

Via The LA Times: “Teacher who recounted Trump aide eating glue as a child is placed on paid leave.” That would be Stephen Miller, anti-immigrant glue-eater.

The Business of Job Training

Here’s the Boing Boing headline on a new journal article about the Manne seminars at George Mason University: “A data-driven look at the devastating efficacy of a far-right judge-education program.”

Via Inside Higher Ed: “Google Brings Computing Courses to 10 Colleges.”

I’m not sure this article goes in this section, but anyway… Via The New York Times: “Could an Ex-Convict Become an Attorney? I Intended to Find Out.”

This Week in Betteridge’s Law of Headlines

“Can blockchain transform credentialing?” asks eCampus News.

(Reminder: according to Betteridge’s Law of Headlines, “Any headline that ends in a question mark can be answered by the word no.”)

Upgrades and Downgrades

Via Motherboard: “The iPhone’s New Parental Controls Block Searches for Sex Ed, Allow Violence and Racism.”

Coming soon to a “flexible classroom” near you: “Panasonic’s human blinkers help people concentrate in open-plan offices,” says Dezeen.

“How a 5-Decade Old Education Company Reinvented Itself” – Edsurge interviews the CEO of the company formerly known as Curriculum Associates.

Congratulations to Edsurge, which just realized that “free textbooks are not always free.” There is still labor involved. Pay writers.

Sponsored content on Edsurge this week, paid for by MacMillan Learning, by ReadingPlus, by Google, and by Gutenberg Technology. Despite reading like an ad, this apparently is not sponsored content.

Robots and Other Education Science Fiction

“A Humanoid Robot Gave a Lecture in a West Point Philosophy Course,” says Futurism.com. Wait wait wait. The robot presents as a black female? JFC.

(Venture) Philanthropy and the Business of Education Reform

Sponsored content on Edsurge, sponsored by the Chan Zuckerberg Initiative, includes this on wellness programs.

Via EdWeek Market Brief: “Money Is Flowing to Social-Emotional Learning: Allstate Foundation Dedicates $45 Million.”

Via Futurism.com: “This Group is Giving Away Art Robots to Public Schools.” “This group” is the Conru Foundation.

Venture Capital and the Business of Education

Mrs Wordsmith has raised $11 million from Trustbridge Partners, Reach Capital, and Kindred Venture Capital. The literacy startup has raised $13.5 million total.

Devonshire Investors has acquired MOOC startup NovoEd.

Veritas Capital has acquired Cambium Learning Group from Veronis Suhler Stevenson for $14.50 per share.

Inside Higher Ed on Chinese companies buying US colleges.

Data, Surveillance, and Information Security

“More than 500 law students at Georgetown University have signed a petition asking the law school to scrap its new exam software,” Inside Higher Ed reports, due to privacy and security concerns. The software in question: Exam4.

I’m keeping an eye on stories related to genetic testing, because I do hear some low-level rumblings about how this can be tied into the future of “precision education.” So bookmark this and this, I guess.

Research, “Research,” and Reports

Via Inside Higher Ed: “A new report from the Urban Institute used federal data to analyze the ‘mix-match’ between the share of residents with four-year degrees (or some college) and the share of jobs requiring college educations in 387 metropolitan areas. The institute found that mix-matches are common, and that this challenge is unlikely to change soon.”

Edsurge published a story by folks from Entangled Solutions, writing about a report they wrote, funded by the Edsurge funder Omidyar Network. Small world. But anyway, the headline: “As Alternative Higher-Ed Pathways Take Off, We’re Still Forgetting Parent Learners.”

Larry Cuban on “The Standardized Classroom (Part 1).”

Dan Cohen on “What We Learned from Studying the News Consumption Habits of College Students.”

“New research study finds more modest benefits to a Mooresville, N.C., laptop program that was once lauded as a national model,” says The Hechinger Report.

Via Chalkbeat: “What our local education reporters learned when we collaborated with ProPublica to look at equity data.”

Paging the grievance studies trio! Look at this problematic research! Via The New York Times: “Harvard Calls for Retraction of Dozens of Studies by Noted Cardiac Researcher.”

RIP

Bill Gates penned a remembrance for Paul Allen, fellow co-founder of Microsoft: “What I loved about Paul Allen.”

Icon credits: The Noun Project

Online Programs in Continuing Higher Ed Struggle to Retain Students, Find Faculty

Campus Technology - 19 October, 2018 - 17:56
Even as online enrollments continued growing (3 percent last year) and overall higher education enrollments fell, a new survey among deans and directors has uncovered a "mismatch" between the online programs they help deliver and student preferences. These kinds of disconnects, according to a report on the findings, result in "mass-market opportunities that are being lost" by the institutions. The project was undertaken by the Association for Continuing Higher Education and Learning House.

B. F. Skinner: The Most Important Theorist of the 21st Century

Audrey Watters - 19 October, 2018 - 01:00

This talk was presented (virtually) to Nathan Fisk’s class on digital media and learning at the University of South Florida.

I’m currently working on a book called Teaching Machines, and I think that means my thoughts today are rather disjointed – this talk is one part ideas I’m working through for the book, and another part ideas pertaining to this class. I’m not sure I have a good balance or logical coherence between the two. But if you get lost or bored or whatever, know that I always post a copy of my talks online on my website, and you can read through later, should you choose.

Ostensibly “working on a book” means less travel and speaking this fall, although I seem to have agreed to speak to several classes, albeit virtually. I’m happy to do so (and thank you for inviting me into your class), for as an advisor once cautioned me in graduate school when I was teaching but also writing my dissertation, “students will always be more engaging than a blank page.” Students talk back; the cursor in Word just blinks.

I am doing some traveling too, even though I should probably stay at home and write. I’m visiting the archives of the Educational Testing Service in a week or so – maybe you know the organization for its exams, the GRE and TOEFL – to continue my research into the history of education technology. The histories of testing and technology are deeply interwoven. I’m there to look at some of the papers of Ben Wood, an education psychology professor at Columbia and the president of ETS after his retirement from the university. I’m interested, in part, in the work he did with IBM in the 1930s on building machines that could automatically grade multiple choice exams – standardized tests. I don’t want to rehash a talk I gave a couple of weeks ago to a class at Georgetown on why history matters except to say “history matters.” Education technology’s past shapes its present and its future.

Design matters. Engineering matters. But so too does the context and the practices around technology. Culture matters. All of these systems and practices have a history. (That’s one of the key takeaways for you, if you’re taking notes.)

Why does the cursor blink, for example? How does the blink direct and shape our attention? How is the writing we do – and even the thinking we do – different on a computer than on paper, in part because of blinks and nudges and notifications? (Is it?) How is the writing we do on a computer shaped by the writing we once did on typewriters? How is the testing we take, even when on paper, designed with machines in mind?

The book I’m writing is about the pre-history of computers in education, if you will, as I am keen to help people understand how many of the beliefs and practices associated with today’s education technologies predate the latest gadgetry. “Personalized learning,” for example, is arguably thousands of years old, and we can date the idea of individualizing education by using machines back to the 1920s. The compulsion for data collection and for data analysis might seem like something that’s bound up with computers and their capability to extract and store more and more personal data. But collecting data about schools and classrooms, measuring student performance, measuring teacher performance are also practices with very long histories.

A side-note here on the topic of data collection and information accessibility: there’s nothing quite like visiting a museum or archives or library and seeing all the objects that aren’t digitized or even digitize-able to recognize that the people who tell you “you can learn everything on the Internet” are, to put it bluntly, full of shit. Moreover, visiting these institutions and working with their artifacts serves as a reminder about how fragile our records of the past can be. They are fragile as paper; and they are fragile when digital.

I say this as someone who thinks a lot about her digital profile, about the data that she creates, about what she can control and what she cannot. I pay for my own domains to host my scholarship, my “portfolio” – something I would encourage all of you to do. I try to build and run as much of the infrastructure as I can. (You need not do that.) I do so with a lot of intentionality – I don’t have comments on my site, and don’t track who visits my websites, for example – the latter because I think a lot about security and surveillance, the former due to spam and trolls. I post my thoughts, my writing on my own websites, even though social media has really discouraged us from doing this. (Stop and think about the ways in which this occurs. Much like the blinking cursor, there is always intention to the design.) If you go to hackeducation.com or audreywatters.com, you can see hundreds of essays I’ve written; you can see how my thoughts have changed and developed over time.

So while there’s a record of my writing on my websites, elsewhere I have become a “deleter.” Over the past year or so, it’s become very clear to me that, as a woman who works adjacent to technology, as a woman with strong opinions about technology, as a woman with a fairly high profile in her field, that it’s not a bad idea for me to start deleting old social media posts. I delete all my tweets after 30 days; I regularly delete everything I’ve posted to Facebook; I delete old emails. I know full well this doesn’t prevent my data from being used and misused or hacked or stolen; it just makes it harder to take 140 characters I typed in 2011 and rip them from their context. Instructions for making risotto that I sent someone in 2015 – god forbid – will never be part of some Russian conspiracy to alter the course of US politics.

I confess, however, when I visit archives, I do feel bad that I delete things. I feel bad that I haven’t saved a lot of the papers and letters my mum assiduously kept as a record of my childhood and teenage years. When my little brother and I cleaned out my dad’s house a few years ago after he died, I threw a lot of that stuff away. And I worry sometimes about the essays and letters and Very Official Documents that perhaps I should have saved as an adult – not just the papers, but the digital files that are now (or perhaps soon to be) inaccessible because a file format has changed or because a disk became corrupted or because the Internet company that I used to store the stuff has gone out of business.

I felt particularly guilty about all this when I visited the archives of the famed psychologist B. F. Skinner at Harvard University. (Not that I much like the idea of or see the need for having my papers gone through by future scholars. But still.) Skinner’s papers fill 82 containers – almost 29 cubic feet of stuff. Correspondence. Newspaper clippings. Data from his labs. Photographs. Notes. Lectures. An abundance for a researcher like me. I spent a week there in the Harvard University Archives, snapping photos of letters with my iPhone, and I barely scratched the surface.

It would be a mistake to see something like Skinner’s papers as providing unfettered access to his thoughts, his life. An archive is collected and curated, after all. Items are selected for inclusion (either by the individual, by the family, or by the university, for example). These materials tell a story, but that story can give us only a partial understanding.

Even if you, like me, balk at the idea of your papers being housed at a university library, it’s worth thinking about what sort of record you’re leaving behind, what sort of picture it paints about you – about your values, your beliefs, your habits, your interests, your “likes.” I don’t just mean your “papers”; I mean your digital footprint too. I don’t just mean your “legacy”; I mean what data you’re leaving behind now on a day-to-day basis. It’s worth thinking about how digital technologies are designed to glean certain information about you. Not just the letters you’ve written and the data about what, when, where, to whom, and so on, but a whole raft of other metadata that every click generates. It’s worth thinking about how your behavior changes (and does not change) knowing (and not knowing) that this data is being recorded – that someone on Facebook is watching; that someone at Facebook is watching.

We are clicking on a lot of things these days, flashing cursors and otherwise.

There’s a passage that I like to repeat from an article by historian of education Ellen Condliffe Lagemann:


I have often argued to students, only in part to be perverse, that one cannot understand the history of education in the United States during the twentieth century unless one realizes that Edward L. Thorndike won and John Dewey lost.


(I am assuming, I suppose, that you know who these two figures are: Edward L. Thorndike was an educational psychology professor at Columbia University who developed his theory of learning based on his research on animal behavior – perhaps you’ve heard of his idea of the “learning curve,” the time it took for animals to escape his puzzle box after multiple tries. And John Dewey was a philosopher whose work at the University of Chicago Lab School was deeply connected with that of other social reformers in Chicago – Jane Addams and Hull House, for example. Dewey was committed to educational inquiry as part of democratic practices of community; Thorndike’s work, on the other hand, happened largely in the lab but helped to stimulate the growing science and business of surveying and measuring and testing students in the early twentieth century. And this is shorthand for Condliffe Lagemann’s shorthand, I realize, but you can think of this victory in part as the triumph of multiple choice testing over project-based inquiry.)

Thorndike won, and Dewey lost. I don’t think you can understand the history of education technology without realizing this either. And I’d propose an addendum to this too: you cannot understand the history of education technology in the United States during the twentieth century – and on into the twenty-first – unless you realize that Seymour Papert lost and B. F. Skinner won.

(I am assuming here, I admit, that you have done some of what I think is the assigned reading for this course. Namely, you’ve looked at Papert’s The Children’s Machine and you’ve read my article on Skinner.)

Skinner won; Papert lost. Oh, I can hear the complaints I’ll get on social media already: what about maker-spaces? What about Lego Mindstorms? What about PBL?

I maintain, even in the face of all the learn-to-code brouhaha, that multiple choice tests have triumphed over democratically-oriented inquiry. Indeed, clicking on things these days seems increasingly to be redefined as a kind of “active” or “personalized” learning.

Now, I’m not a fan of B. F. Skinner. I find his ideas of radical behaviorism to be rather abhorrent. Freedom and agency – something Skinner did not believe existed – matter to me philosophically, politically. That being said, having spent the last six months or so reading and thinking about the guy almost non-stop, I’m prepared to make the argument that he is, in fact, one of the most important theorists of the 21st century.

“Wait,” you might say, “the man died in 1990.” “Doesn’t matter,” I’d respond. His work remains incredibly relevant, and perhaps insidiously so, since many people have been convinced by the story that psychology textbooks like to tell: that his theories of behaviorism are outmoded due to the rise of cognitive science. Or perhaps folks have been convinced by a story that I worry I might have fallen for and repeated myself: that Skinner’s theories of social and behavioral control were trounced thanks in part to a particularly vicious book review of his last major work, Beyond Freedom and Dignity, a book review penned by Noam Chomsky in 1971. “As to its social implications,” Chomsky wrote, “Skinner’s science of human behavior, being quite vacuous, is as congenial to the libertarian as to the fascist.”

In education technology circles, Skinner is perhaps best known for his work on teaching machines, an idea he came up with in 1953, when he visited his daughter’s fourth grade classroom and observed the teacher and students with dismay. The students were seated at their desks, working on arithmetic problems written on the blackboard as the teacher walked up and down the rows of desks, looking at the students’ work, pointing out the mistakes that she noticed. Some students finished the work quickly, Skinner reported, and squirmed in their seats with impatience waiting for the next set of instructions. Other students squirmed with frustration as they struggled to finish the assignment at all. Eventually the lesson was over; the work was collected so the teacher could take the papers home, grade them, and return them to the class the following day.

“I suddenly realized that something must be done,” Skinner later wrote in his autobiography. This classroom practice violated two key principles of his behaviorist theory of learning. Students were not being told immediately whether they had an answer right or wrong. A graded paper returned a day later failed to offer the type of positive behavioral reinforcement that Skinner believed necessary for learning. Furthermore, the students were all forced to proceed at the same pace through the lesson, regardless of their ability or understanding. This method of classroom instruction also provided the wrong sort of reinforcement – negative reinforcement, Skinner argued, penalizing the students who could move more quickly as well as those who needed to move more slowly through the materials.

So Skinner built a prototype of a mechanical device that he believed would solve these problems – and solve them not only for a whole classroom but ideally for the entire education system. His teaching machine, he argued, would enable a student to move through exercises that were perfectly suited to her level of knowledge and skill, assessing her understanding of each new concept, and giving immediate positive feedback and encouragement along the way. He patented several versions of the device, and along with many other competitors, sought to capitalize on what had become a popular subfield of educational psychology in the 1950s and 1960s: programmed instruction.

We know that story well in education technology. I know I have told it a hundred times. Skinner probably told it many more than that. It’s sort of the archetypal story for ed-tech, if you will. Man sees problem in the classroom; man builds technological solution. Man tries to sell technological solution; schools don’t want it, can’t afford it. The computer comes along; now teaching machines are everywhere. There’s a nice narrative arc there, a nice bit of historical determinism (which, for the record, I do not subscribe to).

The teaching machine wasn’t the first time that B. F. Skinner made headlines – and he certainly made a lot of headlines for the invention, in part because the press linked his ideas about teaching children, as Skinner did himself no doubt, to his research on training pigeons. “Can People Be Taught Like Pigeons?” Fortune magazine asked in 1960 in a profile on Skinner and his work. Indeed, the pigeon work wasn’t Skinner’s first appearance in the news either. The public was arguably already familiar with the name by the time the teaching machine craze occurred in the late 1950s. Although Project Pigeon – his efforts during World War II to build a pigeon-guided missile (yes, you heard that right) – wasn’t declassified until 1958, Skinner’s work training a rat named Pliny had led to a story in Life magazine in 1937, and in 1951 there was a flurry of stories about his work on pigeons. (The headlines amuse me to no end, as Skinner was a professor at Harvard by then, and many of them say things like “smart pigeons attend Harvard” and “Harvard Pigeons are Superior Birds Too.” Fucking Harvard.)

Like Edward Thorndike – and arguably inspired by Edward Thorndike (or at least by other behaviorists working in the field of what was, at the time, quite a new discipline) – Skinner worked in his laboratory with animals (at first rats, then briefly squirrels, and then most famously pigeons) in order to develop techniques to control behavior. Using a system of reinforcements – food, mostly – Skinner was able to condition his lab animals to perform certain tasks. Pliny the Rat “works a slot machine for living,” as Life described the rat’s manipulation of a marble; the pigeons could play piano and ping pong and ostensibly even guide a missile towards a target.

In graduate school, Skinner had designed an “operant conditioning chamber” for training animals that came to be known as the “Skinner Box.” The chamber typically contained some sort of mechanism for the animal to operate – a plate for a pigeon to peck (click!), for example – that would result in a chute releasing a pellet of food.

It is perhaps unfortunate, then, that when Skinner wrote an article for Ladies Home Journal in 1945, describing a temperature-controlled, fully-enclosed crib he’d invented for his and his wife’s second child, the magazine ran it with the title “Baby in a Box.” (The title Skinner had given his piece: “Baby Care Can Be Modernized.”)

Skinner’s wife had complained to him about the toll that all the chores associated with a newborn had taken on her with their first child, and as he wrote in his article, “I felt that it was time to apply a little labor-saving invention and design to the problems of the nursery.” Skinner’s “air crib” (as it eventually came to be called) allowed the baby to go without clothing, save the diaper, and without blankets; and except for feeding and diaper-changing and playtime, the baby was kept in the crib all the time. Skinner argued that by controlling the environment – by adjusting the temperature, by making the crib sound-proof and germ-free – the baby was happier and healthier. And the workload on the mother was lessened – “It takes about one and one-half hours each day to feed, change, and otherwise care for the baby,” he wrote. “This includes everything except washing diapers and preparing formula. We are not interested in reducing the time any further. As a baby grows older, it needs a certain amount of social stimulation. And after all, when unnecessary chores have been eliminated, taking care of a baby is fun.”

As you can probably imagine, responses to Skinner’s article in Ladies Home Journal fell largely into two camps, and there are many, many letters in Skinner’s archives at Harvard from magazine readers. There were those who thought Skinner’s idea for the “baby in a box” bordered on child abuse – or at the least, child neglect. And there were those who loved this idea of mechanization – science! progress! – and wanted to buy one, reflecting post-war America’s growing love of gadgetry in the home, in the workplace, and in the school.

As history of psychology professor Alexandra Rutherford has argued, what Skinner developed were “technologies of behavior.” The air crib, the teaching machine, “these inventions represented in miniature the applications of the principles that Skinner hoped would drive the design of an entire culture,” she writes. He imagined this in his novel Walden Two, a utopian (I guess) novel in which he envisaged a community that had been socially and environmentally engineered to reinforce survival and “good behavior.” But this wasn’t just fiction for Skinner; he practiced this throughout his science and “gadgeteering,” inventing technologies and applying them to solve problems and to improve human behavior – all in an attempt to re-engineer the entire social order and to make the world a better place.

“The most important thing I can do,” Skinner famously said, “is to develop the social infrastructure to give people the power to build a global community that works for all of us,” adding that he intended to develop “the social infrastructure for community – for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”

Oh wait. That wasn’t B. F. Skinner. That was Mark Zuckerberg. My bad.

I would argue, in total seriousness, that one of the places that Skinnerism thrives today is in computing technologies, particularly in “social” technologies. This, despite the field’s insistence that its development is a result, in part, of the cognitive turn that supposedly displaced behaviorism.

B. J. Fogg and his Persuasive Technology Lab at Stanford are often touted by those in Silicon Valley as “innovators” in this “new” practice of building “hooks” and “nudges” into technology. These folks like to point to what’s been dubbed colloquially “The Facebook Class” – a class Fogg taught in which students like Kevin Systrom and Mike Krieger, the founders of Instagram, and Nir Eyal, the author of Hooked, “studied and developed the techniques to make our apps and gadgets addictive,” as Wired put it in a recent article talking about how some tech executives now suddenly realize that this might be problematic.

(It’s worth teasing out a little – but probably not in this talk, since I’ve rambled on so long already – the difference, if any, between “persuasion” and “operant conditioning,” and how each imagines leaving space for freedom and dignity. Rhetorically and practically.)

I’m on the record elsewhere arguing that this framing – “technology as addictive” – has its problems. Nevertheless it is fair to say that the kinds of compulsive behavior we display with our apps and gadgets are being encouraged by design. All that pecking. All that clicking.

These are “technologies of behavior” that we can trace back to Skinner – perhaps not directly, but certainly indirectly, thanks to Skinner’s continual engagement with the popular press. His fame and his notoriety. Behavioral management – and specifically through operant conditioning – remains a staple of child rearing and pet training. It is at the core of one of the most popular ed-tech apps currently on the market, ClassDojo. Behaviorism also underscores the idea that data about how we behave – about what we click – can give programmers insight into how to alter their software and into what we’re thinking.

If we look more broadly – and Skinner surely did – these sorts of technologies of behavior don’t simply work to train and condition individuals; many technologies of behavior are part of a broader attempt to reshape society. “For your own good,” the engineers try to reassure us. “For the good of the world.”

Research Project to Explore Role of Libraries and Student Support Services in CCs

Campus Technology - 18 October, 2018 - 18:36
Starting in November, a research firm will begin the first survey in a series intended to help community college library leaders, chief academic officers and others learn about the barriers their students run into as they attempt to access the library or other support services. Thanks to a $450,000 grant from the Institute of Museum and Library Services, Ithaka S+R will undertake the study to help two-year colleges improve their institutional practices and raise their students’ success rates.

McGraw-Hill Education Adding Audio and Video Capture to Digital Course Materials

Campus Technology - 18 October, 2018 - 18:21
Several of McGraw-Hill Education's digital course materials will now feature embedded audio- and video-capture capabilities from video platform GoReact.
