Although many of the milestones of the digital revolution have sprung directly from the research output of America’s colleges and universities, like Athena from Zeus’s forehead, on the instructional side, American higher education has taken a laid-back approach. Sure, there are more courses in computer science, millions of students taking courses online and MIT just committed $1 billion to build a new college for AI. But a campus-visiting time-traveler from 25 or 50 years ago would find a very familiar setting — with the possible exception of students more comfortable staring at their devices than maintaining eye contact.
This college stasis may be even more surprising to visitors from the transformed workplace. Jobs that made no or marginal use of digital devices 10 years ago now tether workers to their machines as closely as today’s students are glued to their smartphones. Processes that once involved paper are now entirely digital. And experience with relevant function- and industry-specific business software is a stated requirement in many entry-level job descriptions.
This hit home a few weeks ago when I was speaking to an audience of 250 college and university officials. I asked which of their schools provide any meaningful coursework in Salesforce, the No. 1 SaaS platform in American business.
Not one hand went up.
There are many reasons for this. One is that few if any faculty have dedicated their careers to (or even get marginally excited about) equipping students with the skills they need to secure and succeed in their first jobs — and no one’s losing their job (yet) over failure to help students get jobs. Another is the cost of teaching: with strong employer demand for these skills, finding and hiring capable faculty costs more than it does for non-technical subjects. Finally, there’s the rapid pace of change in technology, and the sense that any educational effort will be obsolete in a few years. (Of course, the reality of business software is quite different; foundational platforms like Salesforce have a long shelf life — 10-plus years and counting — and some platforms are expected to last for a generation.)
But the primary reason colleges aren’t educating students on the software they need to launch their careers is the notion that it’s unnecessary because millennials (and now Gen Zers) are “digital natives.”
The idea of digital natives isn’t new. It’s been around for decades: Kids have grown up with digital technologies and so are adept at all things digital. It’s certainly true that today’s college students are proficient with Netflix and Spotify and smartphones. But it’s equally true that the smartphones they’ve grown up with haven’t remotely prepared them to use office phones, let alone career-critical business software.
Eleanor Cooper, co-founder of Pathstream, a startup partnering with higher education institutions to provide business software training, notes that millennials and Gen Zers are “accustomed to Instagram-like platforms which are both intuitive and instantly gratifying. But without exception, we find the user experience of learning business software to be exactly the opposite: instant friction and delayed gratification. Students first face an often multi-hour series of technical steps just to get the software set up before they begin working through tedious button-clicking instructions, which are at best mind-numbing and at worst outdated and inaccurate for the current version of the software.”
In an article in The New Yorker last month, “Why Doctors Hate Their Computers,” Dr. Atul Gawande describes the challenge of implementing Epic, a SaaS platform for managing patient care: “recording and communicating our medical observations, sending prescriptions to a patient’s pharmacy, ordering tests and scans, viewing results, scheduling surgery, sending insurance bills.”
First, there are 16 hours of mandatory training. Gawande “did fine with the initial exercises, like looking up patients’ names and emergency contacts. When it came to viewing test results, though, things got complicated. There was a column of thirteen tabs on the left side of my screen, crowded with nearly identical terms: ‘chart review,’ ‘results review,’ ‘review flowsheet.’ We hadn’t even started learning how to enter information, and the fields revealed by each tab came with their own tools and nuances.”
Business software is really hard, even for digital natives. Today’s students are accustomed to simple interfaces. But simple interfaces are possible only when the function is simple, like messaging or selecting video entertainment. Today’s leading business software platforms don’t just manage a single function. They manage hundreds, if not thousands.
Gawande references a book by IBM engineer Frederick Brooks, The Mythical Man-Month, which sets forth a Darwinian theory of software evolution from a cool, easy-to-use program (“built by a few nerds for a few of their nerd friends” to perform a limited function), to a bigger program “product” that delivers more functionality to more people, to a “very uncool program system.” Gawande points to the example of Fluidity, a program written by a grad student to run simulations of small-scale fluid dynamics. Researchers loved it, and soon added code to perform new features. The software became more complex, harder to use and more restrictive.
And so beyond cumbersome interfaces, the second reason why business software is really hard is that it has become inextricably and tightly wound up with business processes. Salesforce consultants will tell you it’s easier to conform your business practices to Salesforce than to try to customize (or even configure) Salesforce to support the way you do business today. And that’s true for almost all business software. As Gawande notes, “as a program adapts and serves more people and more functions, it naturally requires tighter regulation. Software systems govern how we interact as groups, and that makes them unavoidably bureaucratic in nature.”
Software-defined business practices are increasingly standardized across functions and industries, and highly knowable. And because they’re knowable, hiring managers want to see candidates who know them. So it’s not just about educating students on software; inherent in preparing students on business software is equipping them with industry and/or job-function expertise. And that requires much more than 16 hours of training.
“Why can’t our work systems be like our smartphones — flexible, easy, customizable? The answer is that the two systems have different purposes,” Gawande explained. “Consumer technology is all about letting me be me. Technology for complex enterprises is about helping groups do what the members cannot easily do by themselves — work in coordination.”
The myth of the digital native is convenient for colleges and universities, because it allows them to stay focused on what faculty want to teach rather than what students actually need to learn. But it’s self-centered, superficial and silly. Rather than thinking about technology in terms of Netflix and smartphones, walk down the street and take a look at the software used to manage your college’s admissions, financial aid and human resources functions. Indeed, 95 percent of your graduates will begin their careers in workplaces that look a lot more like those offices than like the faculty lounge. And that’s if they’re lucky. Otherwise they’ll begin their careers in workplaces that look a lot more like Starbucks.
In his article, Gawande notes that despite the many challenges of adapting to working (and living) on a business software platform, software is eating the world for a good reason: to improve outcomes for consumers. The Epic implementation should allow hospitals to scan records to identify patients who’ve been on opioids for more than three months in order to provide outreach and reduce risk of overdose, or to improve care for homeless patients by seeing that they’ve already had three negative TB tests and therefore don’t need to be isolated. “We think of this as a system for us and it’s not,” said the hospital system’s chief clinical officer. “It is for the patients.”
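That opioid-outreach example is, at bottom, a routine data query — exactly the kind of task that business software makes possible once records are captured digitally. Here is a minimal sketch of what such a scan might look like; the record format, field names and 90-day threshold are invented for illustration and are not Epic’s actual schema or logic.

```python
from datetime import date, timedelta

# Hypothetical, simplified prescription records:
# (patient_id, drug_class, start_date, end_date)
prescriptions = [
    ("p1", "opioid", date(2018, 1, 1), date(2018, 5, 15)),   # ~4.5 months
    ("p2", "opioid", date(2018, 3, 1), date(2018, 3, 20)),   # under 3 weeks
    ("p3", "statin", date(2018, 1, 1), date(2018, 12, 1)),   # not an opioid
]

def long_term_opioid_patients(records, min_days=90):
    """Return IDs of patients with an opioid prescription spanning min_days or more."""
    flagged = set()
    for patient_id, drug_class, start, end in records:
        if drug_class == "opioid" and (end - start) >= timedelta(days=min_days):
            flagged.add(patient_id)
    return flagged

print(long_term_opioid_patients(prescriptions))  # only p1 exceeds three months
```

The query itself is trivial; the hard part, as the article argues, is the organizational work of getting clinicians to enter the data consistently in the first place.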
These improved outcomes are synonymous with the data analytics revolution — a revolution that has colleges and universities excited about new programs and increased enrollment. But all the additional data to improve these outcomes needs to be captured first. And that’s done with complex business software. So it’s unfair, or at least hypocritical, of colleges and universities to attempt to pick the fruit of big data without first sowing the seeds. And sowing the seeds entails a serious investment in preparing students with the technical and business process knowledge they’ll need to use the software that makes big data possible.
from TechCrunch https://ift.tt/2PDQpwz