
Standing on the Shoulders of Giants: Rediscovering the Original ADDIE Model

In a world obsessed with speed, hacks, and shiny tech, it's easy to forget that some of the best ideas in instructional design have been around for nearly 50 years. The ADDIE model, born out of the U.S. military's need for performance-based training, remains one of the most robust frameworks we've ever seen. This article is part of a five-part series revisiting the original ADDIE model, step by step, and showing how even solo developers and small teams can apply its principles today. It's time to get ...

What’s Old is New Again: Analyse the Job (ADDIE - Phase I, Step 1)

Output of this Step: A list of observable tasks performed on the job, each documented with conditions, tools, and standards. Some tasks will be marked for instruction based on frequency, criticality, and difficulty to learn.

Know the Job, Nail the Training

If you’ve ever jumped into designing a course without really understanding the job behind it, you’re not alone. Most of us have. But here’s the thing: the very first step in the original ADDIE model – and I’m talking about the real deal, the 1975 Interservice Procedures for Instructional Systems Development (IPISD) – wasn’t about learning objectives, digital tools, or delivery modes. It was about getting to know the job. Properly.

This step, called "Analyse the Job," isn't about designing instruction yet. Its only output? A well-organised, validated list of tasks that someone in the role needs to perform on the job. That's it. But that one list becomes the foundation for everything else you'll do later.

And here’s the kicker: you don’t get that task list by chatting with Subject Matter Experts. Not yet. SMEs come later. Right now, your job is to watch the people who actually do the work. Ideally, you're there beside them – asking questions, taking notes, observing the conditions, the tools, the pace. But if you can’t be there in person, a video call or even a phone-recorded walkthrough can work just as well.

This article is your guide to doing just that — building a sharp, clear, and practical task inventory by getting your boots on the ground, even when that ground is digital.

What Job Analysis Really Means (and Doesn’t)

The IPISD model didn’t ask us to make assumptions about what learners need to know. It said: go and find out what the actual job involves — by watching it, not by asking someone else to describe it. This stage is not about SMEs. It’s about seeing the work, unfiltered and unpolished.

Your goal isn’t to collect second-hand commentary from supervisors or instructional designers. Your goal is to observe the work being done. First-hand. Ideally in person, on site. If that’s not possible, then:

  • Join a video call and ask to see the job done live
  • Ask someone on the job to film a walkthrough using their phone
  • Request a recording of typical job activities

Whatever you do, resist the urge to bring in the SME too early. SMEs can over-explain, editorialise, and unintentionally skew the list toward theory or policy. You want to build your task list from raw observation — the real stuff, warts and all.

At this stage, you are not designing anything. You are simply building a task inventory based on what people actually do.

What Makes a Good Task List

In the original IPISD model, task documentation included fields like Skills and Knowledge, Attitude, and Environment. These categories were meant to help define the task's requirements and training context.

For modern instructional designers — especially those working in eLearning — we’ve found it can help to reframe these under more practical headings:

  • Conditions – What’s going on when the task is performed?
  • Tools – What physical or digital tools does the person use to complete the task?
  • Standards – How well must the task be done?

A good task list includes:

  • Tasks that are discrete and observable
  • Tasks that can actually be trained
  • Descriptions using conditions, tools, and standards

Validate your list with practitioners. Then filter with:

  • Frequency
  • Criticality
  • Difficulty to Learn
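To make this concrete, here is a minimal sketch in Python of what a task-inventory entry and the frequency/criticality/difficulty filter might look like. The field names, the 1–5 rating scales, and the threshold are illustrative choices of mine, not part of the original IPISD documentation — adapt them to whatever rating scheme your team actually uses.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One observable task from the job inventory (field names are illustrative)."""
    name: str
    conditions: str       # what's going on when the task is performed
    tools: list[str] = field(default_factory=list)  # physical or digital tools used
    standards: str = ""   # how well the task must be done
    frequency: int = 1    # 1 (rare) .. 5 (constant)
    criticality: int = 1  # 1 (low stakes) .. 5 (safety- or business-critical)
    difficulty: int = 1   # 1 (easy to pick up) .. 5 (hard to learn)

def shortlist_for_training(tasks: list[Task], threshold: int = 4) -> list[Task]:
    """Mark a task for instruction when any one criterion scores at or above
    the threshold -- i.e. it is frequent, critical, OR hard to learn."""
    return [t for t in tasks
            if max(t.frequency, t.criticality, t.difficulty) >= threshold]

inventory = [
    Task("Log a support ticket", "during a live customer call",
         ["ticketing system"], "all mandatory fields complete",
         frequency=5, criticality=2, difficulty=1),
    Task("Escalate a security incident", "under time pressure, out of hours",
         ["incident hotline", "runbook"], "escalated within 15 minutes",
         frequency=1, criticality=5, difficulty=4),
    Task("Tidy the workstation", "end of shift", [], "clear surface",
         frequency=3, criticality=1, difficulty=1),
]

shortlist = shortlist_for_training(inventory)
```

Note the deliberate use of OR logic in the filter: a task earns its place on the shortlist by scoring high on any single criterion, which matches the idea that rare-but-critical tasks (like incident escalation) deserve training just as much as everyday ones.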

Wrapping Up: What You Take With You

At the end of this step, you should have a reviewed task inventory. Mark those tasks that are frequent, critical, or hard to learn — that’s your shortlist for training.

No learning objective or piece of content earns its place unless it supports one of those tasks. Do this well, and you're not just making content — you're building capability.

And that’s something the giants of 1975 understood better than most of us today.