Training That Delivers Results

Instructional Design That Aligns with Business Goals

Author: Dick Handshaw
Pub Date: May 2014
Print Edition: $34.95
Print ISBN: 9780814434031
Page Count: 224
Format: Paper or Softback
e-Book ISBN: 9780814434048




A New Model for Results

Training That Delivers Results aims to change learning organizations and their leaders by offering a strategic model that focuses on achieving desired business results. The strategic instructional design process described in this book produces observable, measurable, and repeatable training programs that deliver results. Observable means that you are able to clearly see the intended behaviors of and outcomes achieved by your performers. Measurable means that you are able to compare the results of your learner's performance against a predetermined standard. The system is efficient and predictable yet still offers room for flexibility and creativity.

The Handshaw Instructional Design Model applies principles of both performance improvement and instructional design to a variety of learning situations. Achieving business-focused outcomes begins by identifying both learning and nonlearning solutions to performance problems. Instructional design practiced this way doesn't cost time and money; it saves time and money.


Some people in our profession consider themselves instructional designers; others consider themselves to be performance consultants. An effective way to deliver value is by integrating the skills of both performance improvement and instructional design. Adding the steps of performance consulting from the Handshaw Instructional Design Model enables you to link learning goals to strategic business goals.

A New Model

The Handshaw Instructional Design Model integrates the principles of performance improvement with those of classic instructional design (see Figure 1.1). Although many of the parts of this model are not new, the concept of combining elements of performance improvement and instructional design into one straightforward, easy-to-use model is new. If your instructional design model is not saving you time and delivering business results, then this may be an approach to consider. By identifying both learning and nonlearning solutions up front, designers are better able to spend their time and resources delivering solutions that solve the right performance challenges. I have spent more than thirty years working with my team and our clients to refine our model and its application in a wide variety of situations. The following section describes the basics for applying the model.


Proactive Performance Consulting

You can establish a consulting relationship with your client through proactive performance consulting (see Figure 1.2). The purpose of establishing this relationship is to ensure that the training you develop aligns with your client's business goals. You can develop a "trusted partner" relationship with your client by having regular proactive consulting meetings. These meetings are informal, conversational, and simple to conduct. The eight well-tested principles of the successful proactive consulting meeting are detailed in Chapter 2.

Reactive Performance Consulting

Most instructional designers are accustomed to meeting with their clients to react to training requests. These meetings present an opportunity to reframe the training request in order to align the training need with the business need. The reactive consulting meetings position you to take responsibility for results and outcomes of a learning program and help you transition from training "order taker" to trusted partner. (See Figure 1.3.)

Needs/Gap Analysis

The gap analysis is a frequently used type of needs analysis. It may be used after a successful proactive or reactive meeting in which appropriate learning needs have been identified. The first gap you identify is the difference between the current and the expected business outcomes. Next, determine the gap between the current performance and the performance required to achieve the business result. The exact level of performance required to close the gap is defined during the Analysis and Design phases in the model. Armed with this information, you can identify both learning and nonlearning solutions to help you and your client close both the performance and the business gaps. (See Figure 1.4.)
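The two gaps described above reduce to simple arithmetic: the shortfall between current and expected business outcomes, then the shortfall between current and required performance. A minimal sketch follows; the percentages below are hypothetical illustrations (loosely echoing the case study later in the chapter), not figures from the book:

```python
# Minimal sketch of the two-step gap analysis described above.
# All figures are hypothetical examples.

def gap(current, expected):
    """Return the shortfall between where we are and where we need to be."""
    return expected - current

# Step 1: the business gap -- current vs. expected business outcome.
# e.g., 30% of auto loans carry the insurance product vs. a 60% target.
business_gap = gap(current=0.30, expected=0.60)

# Step 2: the performance gap -- current vs. required performance.
# e.g., officers mention the product on 10% of closings vs. a required 90%
# (the exact required level is defined later, in Analysis and Design).
performance_gap = gap(current=0.10, expected=0.90)

print(f"Business gap:    {business_gap:.0%}")
print(f"Performance gap: {performance_gap:.0%}")
```

Keeping the two gaps separate matters: the business gap tells you and your client what success looks like, while the performance gap tells you what the solution (learning or nonlearning) must actually change.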

Learning Analysis

Once you have identified the skills gap, the Learning Analysis phase begins (Figure 1.5). The first step, a task analysis, is essentially a snapshot of the perfect performer engaging in a work task in a way that achieves the desired business result. Developing a task analysis verified by subject matter experts (SMEs) and stakeholders before beginning training design is an essential step. It is cost effective and helps you avoid project delays and cost overruns.

There are three other types of analysis that help you make decisions later, during the Design phase. You can conduct an audience analysis to find out what your learners already know about the training program's content. Collecting helpful demographic information about the intended audience is also part of this analysis. The audience analysis doesn't require a lot of time and can be reused when conducting training for the same audience in the future.

Many organizations overlook the importance of culture. Culture can be one of the greatest enhancers or strongest barriers to a successful learning event. Conduct a learning culture analysis to leverage your organization's culture to impact the ultimate, lasting success of your learning design.

The last piece of analysis you should complete is called a delivery systems analysis. This may be more useful for outside service providers than for internal practitioners, but conducting this type of analysis might help you avoid the really embarrassing situation of specifying the use of an unpopular or poorly performing learning delivery system. A delivery systems analysis can also be revised and reused for future projects.


The Design phase (Figure 1.6) begins with the development of performance objectives (you can substitute learning objectives if you prefer that term). Successful selection and design of measurement instruments begins with well-written objectives. Although I am a proponent of flexibility, I don't recommend it here. You will reap the benefits of good objectives when measuring learning outcomes, for example, when you are required to measure a learner's mastery of performance objectives.

It also makes sense to select and design your testing instruments once you have agreed upon objectives. When designing for results, you should limit your test design to criterion-referenced tests (CRTs) only. The criteria you reference in this case are the performance objectives. The payback for following these steps will be apparent when you define your instructional strategy. You can eliminate misunderstandings by defining a measurement strategy before you try to select an instructional strategy.
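A criterion-referenced test judges each learner against the performance objectives themselves, using a predetermined standard, rather than ranking learners against one another. A rough illustration of that scoring logic is sketched below; the objective names, items, and 80 percent mastery threshold are invented for the example, not taken from the book:

```python
# Toy sketch of criterion-referenced scoring: every test item maps to a
# performance objective, and mastery is judged against a fixed standard,
# never against how other learners scored. Names and the threshold are
# hypothetical examples.

MASTERY_THRESHOLD = 0.80  # predetermined standard agreed with stakeholders

def crt_report(results):
    """results: {objective: [item passed? ...]} -> {objective: (score, mastered?)}."""
    report = {}
    for objective, items in results.items():
        score = sum(items) / len(items)  # True counts as 1, False as 0
        report[objective] = (score, score >= MASTERY_THRESHOLD)
    return report

one_learner = {
    "present product early in closing": [True, True, True, True],
    "ask qualifying questions":         [True, True, False, True],
    "handle price objections":          [True, False, False, True],
}

for objective, (score, mastered) in crt_report(one_learner).items():
    status = "mastered" if mastered else "needs coaching"
    print(f"{objective}: {score:.0%} -> {status}")
```

The design choice to fix the standard up front is the point: because the criteria are the agreed-upon performance objectives, a "pass" means the same thing for every learner and every delivery of the course.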


You won't find the blueprint meeting in any traditional instructional design model such as ADDIE (Analysis, Design, Development, Implementation, and Evaluation). This tool is essential to the development of a performance partnership with your clients. A blueprint meeting is a forum that allows you to present your measurement strategy and instructional strategy to your stakeholders, subject matter experts, and others on the design team. The meeting can be held in person or virtually and is ideal for answering questions, clarifying misunderstandings, and gaining consensus. If you really want to be a trusted partner, invest two or three hours in this meeting. (See Figure 1.7.)

Prototype and Learner Tryout

Another useful tool for preventing "do-overs" and gaining consensus as a trusted partner is the simple step of developing and testing a prototype. First, select a training module for your prototype that is a fair representation of your measurement and instructional strategies. Then, before you go too far in developing the rest of the course, test your prototype with a small group of sample learners in a learner tryout. Use this structured test to yield valuable data for verifying your chosen strategies. You may discover differences of opinion with your subject matter experts or even among designers on your own team. I've found that observing sample learners during a learner tryout always uncovers the correct approach. The feedback loops in the model allow you to go back and revise the analysis and subsequent design steps. (See Figure 1.8.)

Production and Field Test

The Production phase is the largest and costliest of all the phases in the model (see Figure 1.9). It involves development of content and testing instruments. Because this phase is so time consuming, it is important to ensure that the other phases are done correctly in order to avoid rework.

The evaluation carried out in this phase is called a field test (some of our clients prefer the term pilot test). Whatever label you use, you need to observe sample learners as they test the entire learning solution under the exact conditions they will experience during implementation. Testing your course with a controlled audience provides a measure of reassurance to you and your client that the rollout will go smoothly. This is a good feeling to have if your course is going to be released to a large audience in a global organization.


If you've carefully followed the recommended steps in the model, the Implementation phase should proceed according to plan (Figure 1.10). But sometimes well-designed and well-executed learning programs fail due to poor implementation, despite the use of a detailed implementation plan. Why would this happen? Look for answers in your audience and learning culture analyses. These documents guide you in designing your implementation plan. Your implementation plan also should include change management plans, timelines, resources, logistics, and measurement of business impact and return on investment (ROI).


The following case study illustrates how even a new learning practitioner can combine the practice of performance improvement and instructional design to solve a performance problem and achieve real business results. Ken, the instructional designer in this case study, is a recent graduate of a master's program in Instructional Systems Technology. The client request discussed in this narrative came about three months after Ken began work for a regional bank.

The Request

Ken was getting comfortable with his new job when he was contacted by Bill. Bill was responsible for an insurance product offered along with auto loans that would pay off the auto loan note in the case of the death or disability of the borrower. Revenue was down for this product, and Bill was concerned because fee income was becoming an increasingly important part of the bank's revenue stream. Bill approached Ken and asked him to produce some training on product knowledge. He handed Ken a PowerPoint file from another bank's training program and asked Ken to develop both instructor-led training and e-learning.

The Analysis

Having been schooled in instructional systems design, Ken asked to meet with Bill to develop task and audience analyses. Bill was polite and gracious, but he informed Ken that a two-day meeting of his entire team was scheduled to occur in about a month and the training had to be ready by then. Bill said he wanted to devote up to half of that time to training. Bill again insisted that his script be repurposed as an instructor-led event for the meeting. He also emphasized that he wanted an online version prepared for anyone who couldn't attend the meeting or anyone needing a refresher after the meeting.

Being new to the corporate world, Ken faced a dilemma. He had been taught the value of analysis, so he wasn't prepared for a client who didn't understand why analysis was important even with a tight deadline. Still, he did the only thing he could do. He took the script and promised to have everything ready on time.

Later that afternoon, Ken dropped by Bill's office with a few technical questions on the model script. While he was in Bill's office, Ken asked Bill if he could talk to some of the people who were having difficulty with the product so that he could better understand the problem. Bill told Ken there was no time to gather information, but he did offer to take Ken's questions with him on a trip he had previously scheduled to visit some of the bank's branches. Ken quickly crafted six questions about product knowledge based on what he knew from his review of the PowerPoint script and gave them to Bill.

When Bill returned from his trip a few days later, Ken asked Bill how the trip had gone. He was especially interested in the answers to the questions Bill had taken with him.

"Those questions you gave me were just questions about product knowledge," Bill commented.

"That's right," said Ken. "So, how did they do?"

"Well, they did pretty well. Almost everybody I asked knew the answers to all six of your questions."

Bill had to admit that everyone he questioned had adequate product knowledge. Ken saw this as an opening to make a point.

"So, Bill," Ken said, "do you still think spending all your training time on product knowledge is what you want me to do?"

For the first time, Bill looked a little perturbed, but he kept his patience and asked, "Well, what do you think?"

This was the turning point in the relationship. Ken had just crossed the line from being an order taker to becoming a trusted partner. He knew his next question was the most important one of his new career.

"Bill, can you tell me, from a business perspective, what you need to accomplish? What is your business goal?"

Bill's demeanor changed completely. He shared with Ken that for the last three months, only 30 percent of auto loans had any insurance products connected with them. His target was to have insurance products connected to 60 percent of all auto loans. Ken immediately recognized that this information was the basis for a learning and performance goal. He asked Bill if they could schedule a meeting the next day to discuss some other options. This time, Bill quickly agreed.

While answering questions about how the performers should go about presenting the insurance product to potential auto loan customers, Bill mentioned a key piece of information. He said that most lending officers admitted that they never or rarely mentioned the insurance product when closing an auto loan. Many lending officers said that they just didn't have time. Some said they didn't really know when the product was good for their customers, and some said they didn't like to mention the insurance product because many customers complained that it cost too much money. Some said they just liked being bankers and didn't want to feel like insurance salespeople. Now our designer had some data that he could use to solve Bill's problem.

With just a little bit of data, the solution became clear. Ken got permission to speak with some of the top performers who were achieving higher sales of the insurance products so he could produce a high--level task analysis of a sales process that was working. The gaps in performance were clear:

Successful performers mentioned the insurance product early in their overall loan closing process. Most of the loan officers never mentioned the insurance products at all.

Successful performers used a series of questions to qualify their loan customers to identify them as good candidates for the insurance products. Most of the loan officers never mentioned the insurance products at all.

Successful performers were able to relate stories of past customers who had benefitted from the purchase of the insurance products. Most of the loan officers never mentioned the insurance products at all.

Successful performers saw customer objections, especially to price, as buying signals and had a strategy for handling these objections. Most of the loan officers never mentioned the insurance products at all.

Finally, successful performers never mentioned lack of product knowledge as a barrier in selling the insurance products.

The Proposal

Based on the identified performance gaps and the additional analysis, Ken proposed the following solution:

1. Begin the instructor-led session with real stories of customers who benefitted from having purchased the insurance products. This information was easy to gather from product files.

2. Show video role plays of the successful loan closing process, highlighting the presentation, qualifying, and objection handling for the insurance products. Include some role plays of common mistakes.

3. Conduct sample role plays with as many participants as possible, inviting feedback from other participants.

4. Conduct additional role plays handling different types of customer objections.

5. Include some drill and practice on product knowledge (just to keep Bill happy). Design and provide a job aid for product knowledge.

The Solution

Bill accepted the entire proposal. Ironically, the only thing he actually questioned was the need for the last component, concerning product knowledge. Ken convinced Bill to keep this point mostly for the value of the job aid and to further increase the confidence level of the participants. Bill was convinced that increasing the confidence level of the participants was likely to have the most significant impact on achieving his business goal of a 60 percent close rate. The instructor-led component, including sample videos and other written materials, was completed in the remaining three and a half weeks before the meeting, and all of the training was completed in the first full day of the two-day meeting.

The first class was attended by thirty-five participants. By splitting participants into different rooms, almost everyone was able to participate in a live role-play situation. Everyone received feedback on the role play based on a checklist, which was based on the task analysis and the objectives. Bill was able to identify the successful performers in the group, and he was also able to single out those who needed additional coaching.

The Results

Three months after the meeting, 60 percent of all auto loans had insurance products connected to them. That rate was double the 30 percent rate for each of the three months prior to the meeting. Bill was so happy that he told everyone in the bank about his success and about his new strategic business partner.

And Now, the Rest of the Story . . .

The preceding case study narrative was mostly true, except the young designer's name was not Ken. His real name was Dick, and the story recalls my experience as a newly minted instructional designer in 1979. But, as they say, the more things change, the more they stay the same. The PowerPoint script was actually a script from a sound-slide presentation. As a historical note, a sound-slide presentation is a 1970s version of today's PowerPoint that contained words, graphics, and a narrated sound track.

How would the prospects for designing an effective learning program have been any different if you had received an email from a contemporary like Bill with a PowerPoint file attached? Learning professionals call this kind of training program "check-in-a-box" training, that is, performing a learning activity without an expectation of any measurable results.

In my early career example, the client achieved his business goal and I met the tight, four-week deadline without spending much money. I didn't go through a long, complicated process. I didn't employ a complicated instructional design model. I didn't even know there were such things as performance improvement models. What I did was reframe my client's request to focus on business outcomes rather than training activities. I got permission to conduct a little more analysis. My client identified the business gap for me, and together we identified the performance gap. With the help of his best performers, we identified the best practices, analyzed the task, and developed measurement and instructional strategies to achieve the performance goal. Achieving the performance goal in turn achieved the business goal.


Today's learning professionals are faced with more challenges than ever before. Not only are they expected to work with clients as expert instructional designers, they are also expected to be training facilitators, adept in the use of different e-learning authoring tools and learning management systems (LMSs). It's clearly unreasonable to expect these diverse talents to exist in one person.

Still, if you are part of a small learning team, it's inevitable that you will be expected to be adept at a variety of skills. For some people, this expectation offers a good opportunity to discover their strengths and weaknesses. Instructional design is definitely a team sport: it is a far better career strategy to improve as much as you can in your areas of strength and rely on your team members to make up for your weaknesses. How well you engage with your team determines your success as a learning professional.

If you are serious about this profession, a master's degree in Instructional Systems Technology or a related field will open many doors, but you won't find many undergraduate programs that prepare you to be a learning professional in the corporate world. A degree and experience in public education are an excellent background for entering the corporate learning field.

Ongoing professional development is also important for learning professionals. While many of the design principles remain the same, their application (especially with constantly evolving technology) is ever changing. You can keep up with the field by attending conferences or meetings that base their topics on research and evidence--based practices. Refer to the Appendix for a short list of the leading professional associations.

Finally, learning professionals today need to be responsible for business results. Training departments are often the first place organizations cut costs. The only way to change this dynamic is to partner with your organization to achieve business goals. Being a training order taker who accepts only the responsibility to complete a required number of training programs is not a partnering relationship that provides value to your client or organization.


Chapter 2 helps you build a trusted partner relationship with your clients. You will learn how to reframe some training requests to identify business goals and subsequently define training goals that help achieve the business goals. You will be able to identify other solutions to the results you seek besides just training solutions. You will also be able to take a proactive approach to help you keep close to your clients and establish your consulting relationship before your client comes to you with a request.
