Improved technology brings more fun and interactive methods of employee assessment
Looking for productivity improvements, training departments are turning to employee assessment tools to gain insights into how a person works and what they can do to work better.
Technological improvements in e-learning, online teaching and the related audio-visual components have had a dramatic impact on the style and type of assessment tools available.
“It used to be that e-learning was really limited to technology training … and the testing was limited to very technical questions,” says Alan Ray, vice-president of Montreal-based e-learning products provider Docworks CPTI. But that is changing.
“There are so many multimedia technologies out there for the Internet that it has opened up the possibilities for some very fun, exciting, interactive quizzes that go beyond standard examinations with true and false or multiple-choice questions. And that means that you can mix a variety of media to really communicate what you’re trying to get with the question,” he says.
Of late, there has been a great deal of interest in using employee assessments to gauge soft skills. For example, an employee can be presented with a scenario requiring her to deliver bad news to a colleague. She must choose from a number of possible approaches, and through audio-visual simulations she is shown what the results of her choices could be.
Putting employees through soft-skill assessments gives them a better understanding of different personality types — both their own and others — and varying approaches to communication, says Denise Hughes, director of Career/Lifeskills Resources based in Concord, Ont.
Developing a better understanding of one’s soft skills translates into better job performance.
“The net result is a more satisfied employee, there’s greater productivity within the organization, and the satisfaction is translated over to their clients,” says Hughes.
Hughes says assessments focusing on leadership development and coaching are particularly hot right now.
“We’re seeing this more and more, especially in the last five to six years. It’s incredible how people are doing it. And they’re looking more at self-discovery models, rather than the traditional pencil-and-paper approaches. We’ve seen growth particularly in instrument uses such as the Myers-Briggs Type Indicator, the True Colors model, the FIRO-B instrument, and professional reports for things like the Strong Interest Inventory.”
The Myers-Briggs instrument is believed to be the most widely used personality inventory in the world. Based on the work of Carl Jung, it has been in use for about 50 years; Hughes says some three million people took the test in the last year alone, in applications such as team building, career pathing, and organization and leadership development.
True Colors was developed in the mid-1980s and grew out of an instrument used in TV, film and advertising, created to ensure programs and commercial messages connected with the four principal temperaments, or personality types.
The ultimate objective of these tests is to spur personal growth within the members of an organization in order to communicate more effectively with each other and with clients, says Hughes.
This realization has produced a considerable increase in demand for soft skills assessment instruments.
“Sales of these types of instruments have increased substantially over the past four years in particular, an increase in some cases of five times what they were four years ago.”
Similarly, Docworks CPTI has seen notably higher demand for these types of e-learning and assessment tools in the last six months, says Ray.
Regan Legassie, a member of the board of directors of the Ontario Society for Training and Development (OSTD), says the growing emphasis on soft skills assessment reflects the trend in organizations to be leaner and more service-oriented.
“They’ve thinned everybody out, moved everybody together, and different people now have to interface with the public on a regular basis. So what you’re seeing is these soft skills tend to become more important,” says Legassie.
“The other reason is, with the change and downsizing and the re-organization, we’re seeing a lot more collaborative-type work going on… You’re seeing that more and more people have to be able to relate to other people, to work with other people, to get along with other people. So the soft skills have really come to the forefront,” he says.
“At OSTD, we’re seeing more and more interest in professional development workshops focussed on the value of self-discovery soft skills training.”
Regardless of the nature of the assessment — whether it analyzes soft skills or more tangible parameters such as proficiency in required skills — companies and organizations appear to view testing as a valuable business investment. The challenge, as is so often the case with HR initiatives, is capturing the value of that investment, though some organizations are doing just that.
“Training is treating itself as a business now, whereas before we were thought of as a service. And so, for us to try to prove our worth, it means we have to get to measuring business results, and then measuring return on investment,” says Robert Yee, manager of learning solutions for TD Bank Financial Group.
TD has created its own separate measurement department that reports on all data produced through training and testing.
In part, it assesses the company’s training results on the basis of business results: have specific business targets, such as quantifiable increases in sales or reductions in errors, been achieved after training was conducted? Similarly, from a return on investment standpoint: do the net benefits outweigh the original investment in training and assessment?
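The return-on-investment question above reduces to simple arithmetic, sketched below with purely illustrative figures — these are not TD's actual numbers or methodology:

```python
# Hedged sketch of the ROI comparison described above. The dollar amounts
# are hypothetical examples, not figures from the article.
def training_roi(net_benefits: float, investment: float) -> float:
    """Return ROI as a percentage: (benefits - cost) / cost * 100.

    A positive result means the net benefits outweighed the original
    investment in training and assessment.
    """
    return (net_benefits - investment) / investment * 100


# Example: a program costing $50,000 that yields $80,000 in measured benefits
roi = training_roi(net_benefits=80_000, investment=50_000)
print(f"ROI: {roi:.0f}%")  # ROI: 60%
```

In practice, as Yee notes later in the article, isolating the training's share of the benefit is the hard part; the formula itself is the easy part.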
“It actually causes the training — and testing on the training — to be very focused because you want, if you can get it, a positive return on investment in the first year. So that means you have to really focus yourself on ‘What exactly do we need to train them on?’ and not waste time, and not waste energy, and to deliver it in the best method for the learning and for the learner,” says Yee.
TD uses testing — “evaluation” is its preferred term — on a number of levels.
Mastery testing ensures employees achieve the requisite level of knowledge or proficiency in a given area. A recent example is the newly launched Managing @ TD program for all “people managers” within the organization. This blended learning program includes an online component encompassing four modules. TD managers must pass a mastery test — again, conducted online — for each module before they can attend the workshop portion of the program.
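The gating rule described above — pass every online mastery test before attending the workshop — can be sketched as a simple check. The module names and pass mark here are assumptions for illustration, not TD's actual values:

```python
# Hedged sketch of the prerequisite rule described above: workshop
# eligibility requires a passing mastery-test score on every module.
PASS_MARK = 0.8  # assumed threshold, not specified in the article

def eligible_for_workshop(scores: dict[str, float], modules: list[str]) -> bool:
    """True only if every required module has a passing mastery-test score.

    A missing module counts as a score of zero, so an untested module
    blocks eligibility just as a failed one does.
    """
    return all(scores.get(m, 0.0) >= PASS_MARK for m in modules)


modules = ["Module 1", "Module 2", "Module 3", "Module 4"]
scores = {"Module 1": 0.9, "Module 2": 0.85, "Module 3": 0.8, "Module 4": 0.75}
print(eligible_for_workshop(scores, modules))  # False: Module 4 is below the mark
```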
“We use mastery testing whenever there is a business requirement or a compliance requirement to ‘know that the people know,’” says Yee. “There is also, of course, a huge employee win when there is a mastery test, because they validate their own learning at the same time.”
Even where there is no official requirement for a mastery test, TD training programs incorporate “knowledge checks” designed to provide employees with an opportunity to track their own learning progress.
Once the training is completed and the employees are back on the job, another evaluation is conducted three to six months later. Yee says this is where the tale really gets told: whether new behaviours are actually being applied as a result of the training.
This leads to the next level of evaluation: business result evaluation. “Did those new behaviours result in the whole reason why the training was created in the first place, which was to reach some kind of business target — whether it’s increased sales, or reduced errors, or customer meetings or whatever it might be.”
The last evaluation assesses return on investment, and Yee says that’s often difficult to do with precision.
“Training is typically only one part of the whole initiative. There might be a new system put in place, there might be a new compensation scheme, there might be some new management decisions that have to be made, and training is just one of those elements.”
However, Yee says TD is continuing to push its assessment system in this direction, and the bank feels it’s making progress.
“We believe we can get reliable data to prove that there is a business impact of training, and we do believe that eventually we can prove that training will result in a positive return on investment,” says Yee.
Bob Reid writes about training and development issues for the Ontario Society for Training and Development.