Imagine
Mason, a new employee, is taking an online training for his company's time and attendance program. He is sitting at his desk in front of his computer with an attached webcam and microphone. On his wrist, Mason wears a watch that tracks his heart rate. Mason comes to a part of the training about requesting to work overtime that he doesn't understand. He frowns and sighs. As the lesson continues, his eyebrows rise because he is still confused about the process presented. After a total of thirty seconds of similar facial gestures, along with a rise in heart rate detected by the system, the lesson pauses automatically. A learning coach appears on the screen. "Mason," says the learning coach, "do you have a question, need additional explanation, or require other help?" A menu listing these options now appears on his computer screen. "Mason, please say or select one of the options on the menu to tell me how I can help you further." Mason moves his mouse and chooses "Need additional explanation." The learning coach reappears and says, "Mason, in a short statement, tell me what needs additional explanation." Mason says, "I am unsure of the process for requesting overtime." The learning coach responds, "Thank you, Mason. I will return to that section of the training and highlight the steps."
Possibilities
I am a training educator and work with adults. I really enjoy designing eLearning with authoring tools like Camtasia and Captivate. This is how I envision employee training could look in five years: personalized to the learner, keeping the student engaged and supported. Having taken many online trainings, I know they can be boring, and I am guilty of pushing the "Next" button when a training has lost my attention. The information presented was remembered just long enough to get past the knowledge check, or I retook the knowledge checks enough times to finally get a passing score. Having recently learned about the field of Affective Computing while working on my M.Ed. in Applied Technology in Education, I am now keeping up with how Affective Computing is being used, and will be used, in learning environments.
The above scenario is just one example of how Affective Computing can change the way we learn and teach. With the use of voice, facial, and gesture recognition combined with bio-rhythmic data, feedback can be provided to the instructor to improve the delivery of training and instruction. Likewise, students will have better outcomes when learning is personalized to their skill level and ability. I don't believe a virtual learning coach would replace the instructor; rather, it would act as an assistant, providing personalized attention when one student needs more help than the rest of the group. It's not fair to rush someone along who learns at a slower pace than others. Everyone deserves to learn without being made to feel bad because they don't comprehend as fast as another student. The goal of education is to learn, not to see how fast you can learn.
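To make the idea concrete, here is a minimal sketch in Python of the trigger logic behind the Mason scenario. Everything in it, the expression labels, the thirty-second window, and the heart-rate threshold, is an illustrative assumption on my part, not the behavior of any real product.

```python
import time
from dataclasses import dataclass

# A minimal sketch of the trigger logic in the Mason scenario.
# Expression labels, thresholds, and the 30-second window are all
# illustrative assumptions, not a real system's specification.

CONFUSED_EXPRESSIONS = {"frown", "sigh", "raised_eyebrows"}
TRIGGER_SECONDS = 30          # sustained confusion before intervening
RESTING_HEART_RATE = 70       # assumed per-learner baseline, in bpm
ELEVATED_FACTOR = 1.15        # 15% above baseline counts as a rise

@dataclass
class Reading:
    timestamp: float          # seconds since the lesson started
    expression: str           # label from a facial-expression classifier
    heart_rate: int           # bpm from the wearable

def should_pause(readings: list[Reading]) -> bool:
    """Return True once confusion cues plus an elevated heart rate
    have persisted for TRIGGER_SECONDS."""
    confused_since = None
    for r in readings:
        confused = (r.expression in CONFUSED_EXPRESSIONS
                    and r.heart_rate > RESTING_HEART_RATE * ELEVATED_FACTOR)
        if confused:
            if confused_since is None:
                confused_since = r.timestamp
            if r.timestamp - confused_since >= TRIGGER_SECONDS:
                return True
        else:
            confused_since = None   # cue broken; restart the window
    return False

# Simulated stream: 35 seconds of frowning with an elevated heart rate.
stream = [Reading(t, "frown", 85) for t in range(36)]
if should_pause(stream):
    print("Lesson paused. Learning coach: 'Do you have a question, "
          "need additional explanation, or require other help?'")
```

The key design point is that no single frown triggers anything; the system only intervenes when multiple signals agree over a sustained window, which keeps it from interrupting a learner who is merely concentrating.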
I believe this concept of personalized learning achieved through Affective Computing works for any age, whether the learner is an adult or a school-age student. So, if you are a teacher, read the paragraph about Mason below and see how this works just as well with school-age students.
Mason, a sixth grader, is completing an online math lesson. He is sitting at his desk in front of his computer with an attached webcam and microphone. On his wrist, Mason wears a watch that tracks his heart rate. Mason comes to a part of the lesson about dividing fractions that he doesn't understand. He frowns and sighs. As the lesson continues, his eyebrows rise because he is still confused about the process presented. After a total of thirty seconds of similar facial gestures, along with a rise in heart rate detected by the system, the lesson pauses automatically. A learning coach appears on the screen. "Mason," says the learning coach, "do you have a question, need additional explanation, or require other help?" A menu listing these options now appears on his computer screen. "Mason, please say or select one of the options on the menu to tell me how I can help you further." Mason moves his mouse and chooses "Need additional explanation." The learning coach reappears and says, "Mason, in a short statement, please tell me what needs additional explanation." Mason says, "I'm not sure that I understand how to divide fractions." The learning coach responds, "Thank you, Mason. I will return to that section of the lesson and highlight the steps."
Sound Too Futuristic?
Actually, the NMC Horizon Report: 2016 Higher Education Edition lists Affective Computing as an emerging trend within the next three to five years. Perhaps this is because Affective Computing has been around since the mid-1990s. The term Affective Computing was coined in 1994 by Rosalind W. Picard, Sc.D., FIEEE, currently the Director of Affective Computing Research at the MIT Media Lab and Faculty Chair for MIT Mind+Hand+Heart. Picard founded two affective computing technology companies, Affectiva and Empatica. Affectiva, which specializes in facial recognition software, works with companies to gauge how effective an ad will be by measuring viewers' reactions with the software before the ad is shown to the larger market. Empatica specializes in wearable technology that improves the lives of people with epilepsy through monitoring, tracking, and notifications.
Below is a Piktochart I created providing an overview of Affective Computing.
Are You Using Wearable Technology?
How can wearable technology improve my ability to learn? About four years ago my husband and I were really into running, so we bought each other the Nike+ runner's watch. It tracked the miles we ran by GPS and gave a breakdown of pace, total miles, and so on. It did not track heart rate, so we bought the Bluetooth chest strap that connected to the watch, and we both really liked having that heart rate information. All of this data could be tracked in the Nike+ app. My husband was in the military and ran all the time; I was pretty much a newbie and had a hard time putting in the miles like he did. After a run, you would download the data from the watch to the Nike+ app on the computer, and it would analyze where you ran fast and where you slowed down, and show your heart rate during the run, elevation, miles, pace, and more. Having the watch helped me learn to run more efficiently and to recognize the points where I would need more motivation, the points where I would have to push myself. So, yes, I can say that having my watch did help me learn to run better.
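For a sense of what that post-run breakdown looks like under the hood, here is a toy sketch in Python. The sample splits and the 10% slowdown threshold are my own assumptions for illustration, not Nike+ data or logic.

```python
# A rough sketch of the kind of post-run breakdown the Nike+ app gave:
# per-mile pace, the overall average, and the stretches where I slowed
# down. The numbers and the 10% threshold are made up for illustration.

splits = [
    # (mile, minutes for that mile, average heart rate over the mile)
    (1, 9.5, 150),
    (2, 9.8, 156),
    (3, 11.2, 166),
    (4, 10.0, 161),
]

avg_pace = sum(minutes for _, minutes, _ in splits) / len(splits)
print(f"Average pace: {avg_pace:.1f} min/mile")

for mile, minutes, hr in splits:
    note = ""
    if minutes > avg_pace * 1.10:  # more than 10% slower than average
        note = "  <- slowed down here: push harder next time"
    print(f"Mile {mile}: {minutes:.1f} min, {hr} bpm{note}")
```

Seeing mile 3 flagged with a higher heart rate is exactly the kind of feedback that taught me where I needed to push myself.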
I just recently got a new cell phone. My old phone, a Galaxy Note II bought in December 2012, couldn't keep up with the updates anymore. My new Galaxy S7 Edge came with the Gear S2 watch, and I LOVE it! Other than my runner's watch, I haven't worn a watch since I can't remember when. I am still learning about this wearable, but so far I've found that I can take calls through my watch, track my heart rate, track my sleep, and record when I snore (which is pretty rare and not very loud when I do, compared to the sleeping dragon, my husband).
You may already be using wearable technology yourself. If so, what has it taught you?
Facial Recognition Technology Isn't Just for Security Anymore
Facial recognition may have originally been developed to find criminals, but now billion-dollar corporations are using it to gauge the marketability of products, movies, and television shows, testing how the public reacts before taking the plunge on making something new. Does this technology really measure up? Can facial recognition really interpret your emotional response to watching an ad for a product or a show? If you want, you can try the software that companies are already using to see how people just like you react to their products or upcoming movies. All you need is a computer connected to the internet to try out Affectiva's demo software. Personally, I liked the demo that shows how you react to a YouTube video, and you even get to choose which one you watch. Afterward, you can replay the video and see your measured reactions alongside it. I'd say it was pretty accurate.
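If you're curious how that kind of demo works at a high level, here is a toy sketch in Python. To be clear, this is not Affectiva's SDK; it uses only OpenCV's bundled face detector, and classify_emotion() is a hypothetical placeholder where a trained expression model would go.

```python
# Toy sketch of a webcam reaction demo: detect a face in each frame
# and log an emotion label over time, to replay against the video.
# Not Affectiva's SDK; classify_emotion() is a hypothetical stub.

import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_pixels) -> str:
    """Placeholder: a real system would run a trained model here."""
    return "neutral"

cap = cv2.VideoCapture(0)          # default webcam
timeline = []                      # (frame index, emotion) samples

for frame_idx in range(300):       # roughly 10 seconds at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                           minNeighbors=5)
    for (x, y, w, h) in faces:
        emotion = classify_emotion(gray[y:y + h, x:x + w])
        timeline.append((frame_idx, emotion))

cap.release()
print(f"Sampled {len(timeline)} face readings to replay against the video.")
```

The replay feature I liked in the demo falls out of that timeline: because each emotion sample is stamped with a frame index, the reactions can be lined up with the video on a second viewing.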
So that's it for now. I hope you have learned something new, or at least that I have given you something new to think about. Leave a comment and let me know what you have learned from your wearable technology, and tell me which demo you liked if you decide to try one out.