Teaching with Tech: How New Tools Change the Way Students Learn

The professor is in your spectacles.

She is not actually in the room with you and the other dozen students, but through the lenses of the stylish smart glasses you’re wearing, she appears to be standing in front of you, explaining Newton’s Third Law of Motion.

A small bird suddenly appears in her open hand, and as it flaps its wings and slowly rises, the professor explains that for every action there is an equal and opposite reaction. Then she asks if anyone in the class has any questions.

It’s not science fiction; it’s science fact. And it’s coming soon to a classroom near you.

“If you look a few years into the future, I think there will be some component of augmented reality coming into play with teaching,” noted Donald Davendra, chair of CWU’s Department of Computer Science. “It’s becoming so affordable that perhaps it will become part of the course fee, like a lab fee.”

Davendra and Szilard Vajda, an assistant professor in Computer Science, are enthusiastic ambassadors for new technologies and systems that will transform the way students learn and how instructors teach.

“The technology that we use is basically about the software and the hardware,” Davendra explained. “We use a lot of advanced software, we use parallel computing, we use machine learning, and we use a lot of technology based on that.”

One big change from years past is that computer science students no longer work on physical computers.

“We don’t have any more computers, physical computers,” Vajda said. “We have virtual machines. We don’t even see them. We perceive them as machines, but nowhere is there physical hardware.”

Vajda and the other instructors in his department still make use of many cutting-edge technologies, though. For example, students in the department are learning things like mind-mapping (a way of brainstorming thoughts organically without worrying about order and structure), machine learning, robotics, virtual reality, augmented reality, artificial intelligence, video game design, medical imaging, and computer programming languages.

In fact, one of the future trends, according to Davendra, is teaching students to program with software built around modules and graphical interfaces, reducing the need to write code by hand. He also sees learning about augmented reality, machine learning, and virtual reality becoming increasingly important for students.

Davendra said one of the more impressive tools CWU students can utilize right now in their research is the university’s supercomputer, Turing, named after Alan Turing, a famous English mathematician. Turing consists of a cluster of linked, high-performance power systems enhanced by accelerators and other technology that make it operate 10 to 100 times faster than a typical desktop computer.
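What “parallel computing” means in practice is easier to see in a small sketch: split one big calculation into chunks and let several workers handle the chunks at once, the same divide-and-conquer idea a cluster like Turing applies at far greater scale. The example below is purely illustrative, using Python’s standard multiprocessing module on a single machine; a real cluster would spread the work across many nodes with a scheduler or MPI, and the numbers are arbitrary.

# Illustrative only: sum the squares of the first ten million integers by
# splitting the range into chunks and computing the chunks in parallel.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 8
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:      # one worker process per chunk
        total = sum(pool.map(partial_sum, chunks))
    print(total)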

Turing is only the beginning, however. Vajda said another technological revolution, quantum computing, will probably be commonplace in less than a decade. For certain kinds of problems, quantum computers promise to vastly outpace even today’s fastest supercomputers.
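Quantum programming can already be tried on an ordinary laptop. The short example below builds a two-qubit “Bell state” circuit with Qiskit, an open-source toolkit chosen here purely for illustration; the article does not say which tools CWU might eventually use.

# Illustrative only: a minimal two-qubit entangling circuit in Qiskit.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out
print(qc)                    # prints a text drawing of the circuit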

Computer Science lecturer Rosemary Salter stands beside "Meccy," her robotic doppelganger.

One of the more interesting current applications of technology in the classroom is “Meccy,” a robotic device made by Double Robotics that Computer Science lecturer Rosemary Salter has used to teach classes. It resembles a tall, thin version of a Segway transporter device with a computer tablet affixed to the top. As Meccy moves around a room, the user’s face appears on the tablet.

“This is what you would call a telepresence robot. Basically, it makes it like there’s a person in the room,” she said. “It’s very real; it’s not just a machine. It really does feel to them [my students] like there’s somebody in the room.”

Salter said she decided to utilize the robot instructor—which costs about $4,000 and was funded from the Multimedia Education Center’s revenue budget—in her classes at CWU-Sammamish last year because there was only an hour break between a class she taught in Ellensburg and one she was scheduled to teach in Sammamish.

“You can’t get to Sammamish in an hour, not even driving 80 miles per hour,” she said. “I saw this robot they [Multimodal Learning] were testing in the hallway and I latched onto her and used her as my replacement in the classroom.”

Salter said in addition to being a useful teaching tool, the telepresence robot—controlled remotely from a desktop computer or laptop—is being used in medical applications, such as connecting a doctor to patients in remote areas.

Robots are also center stage in classes taught by Computer Science professor Adriano Cavalcanti, who operates CWU’s robotics laboratory. Cavalcanti’s students not only program a NAO autonomous humanoid robot but also use a brainwave-reading helmet that allows them to direct the robot with their thoughts.

“The OpenBCI [helmet] is a set of sensors on a cap that basically reads the person’s brainwaves,” he explained. “When you are thinking about something, your brain makes patterns or brainwaves. Basically, we integrate all the readings from the brainwave with machine learning and then, after training it, we have a pattern for certain actions or thoughts.”
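In outline, that pipeline can be sketched as follows: labeled brainwave feature windows train a classifier, and each new window is then translated into a robot command. The features, labels, and commands below are hypothetical stand-ins rather than the lab’s actual OpenBCI setup, and scikit-learn appears only as a convenient example of the machine-learning step.

# Illustrative only: train a classifier on (made-up) EEG feature windows and
# map its predictions to robot commands.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))            # 200 windows, 16 band-power features
y = rng.integers(0, 2, size=200)          # 0 = "rest", 1 = "move forward"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# At run time, each new window would be classified and turned into an action.
command = {0: "stand still", 1: "walk forward"}[int(clf.predict(X[:1])[0])]
print("robot command:", command)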

In addition to allowing the user to give commands to a robot, the helmet has other applications, such as allowing a blind person to surf the Internet, he said.

“I think that the trend is integrating robotics with cloud computing with brain computer interfaces,” Cavalcanti continued. “Ten years ago, robotics was robotics, cellphones were cellphones, servers were servers, and biomedical tools and devices were biomedical tools and devices. They were all separated topics. Today, what we see is everything is converging and is coming together.”

One of those providing support services for much of the new technology is Nat Nickel, a senior media technician in the Multimodal Learning department. Nickel oversees 3-D scanners, virtual reality devices, and augmented reality equipment. His job is to educate faculty on the potential uses of new technology, particularly related to virtual reality (VR) and augmented reality (AR).

“I’m not sure how many faculty members are aware of it or familiar with it,” he explained. “Our job has been to reach out to the departments and say, ‘Hey, we have this [technology] available to you. Here are some ways you could think about using it in your classes.’”

Nickel said many faculty members have tended to use VR as a kind of “field trip” for students by taking them to the CWU Virtual Reality Lab, which opened in the past year, to experiment with a particular program, such as Google Earth.

But Nickel said he sees enormous potential for VR with distance education, and in the not-too-distant future he predicts many professors will use VR programs like “Lecture Capture,” which allows the instructor to interact with students via a three-dimensional image in the classroom.

“That’s already available,” he said. “Not only that, but those lectures can then be recorded and played back in VR.”

A CWU student experiments with a virtual reality headset.

Additionally, CWU’s Active Learning Classrooms have begun using “Merge Cubes,” which are a form of AR. The technology works by synchronizing a computer-generated image to a hand-held cube. When viewed through a device, such as a tablet or cellphone, the image of the cube is replaced by the computer-generated image. Nickel demonstrated this by making it appear he was holding the solar system in his hand on a cellphone screen.
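The underlying trick, tracking a printed marker and drawing a computer-generated image over it, can be sketched in a few dozen lines. The example below is a generic marker-based AR demo, not Merge’s own software: it uses OpenCV’s ArUco markers (opencv-contrib-python 4.7 or newer) and assumes an “overlay.png” image that you supply.

# Illustrative only: replace a detected ArUco marker with an overlay image,
# the same idea that makes a Merge Cube "become" the solar system on screen.
import cv2
import numpy as np

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
overlay = cv2.imread("overlay.png")          # the computer-generated image
assert overlay is not None, "supply an overlay.png next to this script"
cap = cv2.VideoCapture(0)                    # phone or tablet camera stands in here

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        h, w = overlay.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        for quad in corners:
            # Warp the overlay onto the marker so it hides the marker itself.
            M = cv2.getPerspectiveTransform(src, quad.reshape(4, 2))
            warped = cv2.warpPerspective(overlay, M, (frame.shape[1], frame.shape[0]))
            mask = cv2.warpPerspective(np.full((h, w), 255, np.uint8), M,
                                       (frame.shape[1], frame.shape[0]))
            frame[mask > 0] = warped[mask > 0]
    cv2.imshow("AR demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):    # press q to quit
        break

cap.release()
cv2.destroyAllWindows()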

He also sees other new technology—such as special glasses with extra information overlaid on the real world (a more sophisticated version of “Google Glass” devices)—as being adopted in classrooms within the next five years.

That’s not to imply that Computer Science has a monopoly on the use of technology across campus. Dozens of other CWU departments have incorporated cutting-edge devices into their curricula. From Art + Design (which uses 3D printers and digital processes) to Health Sciences (which has a state-of-the-art human motion analysis lab to study biomechanics), classroom technology has moved far beyond chalkboards, rulers, and mimeographed handouts.

For example, Angela Halfpenny, Director of the Murdock Research Laboratory, housed in the Department of Geological Sciences, oversees seven separate lab facilities containing $3 million in state-of-the-art equipment, including a scanning electron microscope, a benchtop X-ray diffractometer (XRD), portable handheld X-ray fluorescence (XRF) instruments, and several sophisticated spectrometers.

“We are actively trying to encourage people to add more of the instruments into their classes and it is growing each year,” she said. “So far we’ve had classes from history, geology, chemistry, and environmental sciences use various instruments in their classes.”

To prove her point, Halfpenny pointed to a close-up image of a cat’s tooth, taken by a student using the $1 million scanning electron microscope. The detailed scan depicted a scratched, pitted surface that gave a clear indication this particular feline’s diet included bird and mouse bones.

“You see the scratches on the tooth? This tells them [the students] how this creature lived and what it was eating,” she said.

Other projects handled by the lab have included detecting wine fraud (does the wine in a bottle match what the label says?) and identifying fake paintings (is a Monet actually a Monet?) by measuring trace elements in the paint.
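Conceptually, those authentication projects come down to comparing a sample’s trace-element profile against a trusted reference and flagging elements that stray too far. The sketch below illustrates the comparison with invented numbers; real profiles would come from the lab’s XRF instruments and spectrometers.

# Illustrative only: flag elements whose measured concentration deviates
# sharply from the claimed provenance.
import numpy as np

elements = ["Fe", "Cu", "Zn", "Pb", "Sr"]
reference = np.array([120.0, 15.0, 8.0, 2.0, 30.0])   # ppm, claimed provenance
sample = np.array([118.0, 14.0, 9.0, 11.0, 29.0])     # ppm, measured sample

relative_diff = np.abs(sample - reference) / reference
for name, diff in zip(elements, relative_diff):
    flag = "  <-- suspicious" if diff > 0.5 else ""
    print(f"{name}: {diff:.0%} deviation{flag}")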

“We’re always producing research-level, publishable results,” Halfpenny said. “Even in our undergraduate and graduate classes, we are having them work at research-quality levels so they get a proper background and experience in what it would be like to get a job in an analytical facility.”

Robotics students test an OpenBCI helmet, which allows them to use their minds to remotely control a robot.

