IF YOU'RE UNDER 18 and you're looking to acquire the latest and greatest 21st-century skills, you might want to try to figure out a way into Thomas Jefferson High School for Science and Technology in Alexandria, VA.
The highly selective public magnet school requires that all of its 1,800 students take at least one computer science course during their freshman year, a course that familiarizes them with programming languages such as Java and C. Afterward, many students go on to take additional courses to prepare for the Advanced Placement exam in computer science, which they take at the end of their sophomore year.
And that's just as a warmup.
In post-AP electives, students can study everything from artificial intelligence to parallel programming, in addition to learning programming languages such as PHP, Perl, C++, Matlab, and Mason. By the time Jefferson students graduate, it's possible that they could know more programming languages than many 22-year-olds with bachelor's degrees in computer science.
"Our kids leave here with specific technology skills, but also with a great capacity for learning," says Shane Torbert, one of the school's computer science teachers. "Once you learn a couple of programming languages, it becomes that much easier to learn others and expand your abilities."
Jefferson High's curriculum is much more advanced than the average high school's, but its approach to preparing students for life in the real world is worth considering. At a time when educators are talking about emphasizing the skills graduates must have to compete in the 21st century, some observers believe that schools are not in tune with what a 21st-century skill truly is, and that not enough of them are teaching students the very technologies they need to get ahead.

Just what are 21st-century skills? Most conversations on the topic focus on concepts such as creativity and innovation, critical thinking and problem solving, and communication and collaboration. But are those skills something students can easily show off on a resume or college application, or in a job interview? Probably not.
"Job applicants would be hard-pressed to demonstrate problem-solving excellence or critical-thinking prowess through an application packet or a one-time meeting," says Michael Schmidt, director of education and community development for the Ford Motor Company Fund. "You can get a sense of these skills from work samples, but by and large they take time for people to showcase, and usually can't be demonstrated until after the person has the chance to join the company and shine."
The consensus from the education and business worlds seems to be that the best-equipped new graduates possess both abstract cognitive skills and practical technology know-how, and that having one without the other is a shortcoming. Ken Kay, president of the Partnership for 21st Century Skills, contends that 21st-century skills aren't so much about mastering one particular technology as they are about using technology to master a skill.
"Understanding specific technologies is important, but technologies change," Kay says. "Our definition of literacy is the ability to use any kind of technology to innovate, collaborate, and communicate. From there, everything else is extra."
As Kay suggests, hard-core tech skills and less demonstrable cognitive abilities can go hand in hand. In fact, the former are enhanced by the latter, as Jeffrey Yan sees it. Yan, CEO of Digication, an e-portfolio vendor in Providence, RI, says that when he's on the lookout for new employees, he tries to hire fresh-out-of-college candidates who can demonstrate the holy trinity of skill sets: critical thinking, problem solving, and programming.
Specifically, Yan says he likes to hire people with knowledge of HTML, C++, Java, and Ajax; people who have the ability to build web and database applications that can grow over time. …