
Getting the Most out of Your Utility Data
Most energy managers have access to utility data. Few know how to use it. This project started as a simple request to build a training program and became something more rigorous: a capability-based learning architecture built on challenged assumptions, a stress-tested framework, and deliberate design trade-offs.
Overview
Status: Program design complete, Subject Matter Expert (SME) review in progress, Phase 1 development upcoming
Client: Goldfin Consulting Inc. (internal)
Audience: Entry-level energy managers across all sectors
My Responsibilities: Performance Consulting, Needs Analysis, Capability Framework Design, Learning Program Architecture, Instructional Design, SME Consultation Planning
The Problem
Energy managers are not using or are underusing the utility data they already have access to. The result is that energy insights and cost-saving opportunities go unrecognized, not because the data isn't there, but because the people responsible for it lack established practices for managing and analyzing it.
On the surface, this looks like a training problem. But I didn't treat it as one, at least not yet.
Challenging the Brief
The initial idea was to build a training program as a follow-up to an existing one. My first move was to question whether the problem and its root cause had actually been validated.
We shouldn't build based on assumed problems and assumed root causes. We need to verify whether this is actually what people are experiencing and why.
This matters because the same surface problem (not using utility data) can have very different root causes: some energy managers lack the capability (a training problem), some have the capability but face organizational or infrastructure constraints (not a training problem), and some have the capability but face time constraints (a different kind of design problem). Distinguishing between these early shapes everything downstream.
A second assumption surfaced early: the idea of converting a workshop into an online course. I challenged that too. Format is a design decision, not a starting point. Choosing a format before understanding the problem puts production before strategy.
Building the Framework
Once I had a validated performance problem to work with, I mapped what energy managers actually need to do with data from first access all the way to sustained practice.
This produced a six-stage capability chain:
- Access: Locate, retrieve, and prepare data so it can be used
- Understand: Know what the data represents and what questions it can answer
- Analyze: Apply methods to identify patterns, trends, and anomalies
- Interpret: Connect data patterns to real operational causes
- Act: Translate insights into decisions, investigations, or recommendations
- Sustain: Make data practice part of ongoing operations
The chain became the backbone of the design, not just an organizing concept, but an active tool I kept returning to. I stress-tested it by asking, for each proposed module: which capabilities does this address? What's covered strongly? What's missing?
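As a rough illustration of that stress test, the check amounts to a coverage audit: map each proposed module to the stages it addresses, then count coverage per stage and flag gaps. The sketch below assumes a small hypothetical module-to-stage mapping (the module names and their assignments are illustrative, not the actual program design):

```python
# Sketch of the module-to-capability coverage check described above.
# Stage names come from the six-stage chain; the module mapping is hypothetical.
from collections import Counter

STAGES = ["Access", "Understand", "Analyze", "Interpret", "Act", "Sustain"]

# Hypothetical mapping of proposed modules to the capabilities they address.
modules = {
    "Energy Benchmarking": ["Access", "Understand", "Analyze"],
    "Reading Utility Data": ["Access", "Understand"],
    "Anomaly Investigation": ["Analyze", "Interpret", "Act"],
}

def coverage_report(modules):
    """Count how many modules touch each capability stage, in chain order."""
    counts = Counter(stage for caps in modules.values() for stage in caps)
    return {stage: counts.get(stage, 0) for stage in STAGES}

report = coverage_report(modules)
gaps = [stage for stage, n in report.items() if n == 0]
print(report)
print("Missing:", gaps)  # with this hypothetical mapping, 'Sustain' is uncovered
```

Trivial as the mechanics are, running this kind of audit over each draft of the module list is what surfaces the question the paragraph above asks: what's covered strongly, and what's missing.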
Key Design Decisions
Topic-based modules, not capability-sequenced modules. Learners arrive at different points in their data journey. Organizing modules by practical topic (e.g., "Energy Benchmarking") rather than by capability stage means learners can identify where they need to start based on a recognizable gap, not by diagnosing their own capability level.
Modular and stand-alone, but coherent together. Each of the seven modules works independently. But the modules also fit together as a complete program; the capability chain ensures coherence across the whole. This shapes how the program can be deployed, sold, and scaled.
Phased development to scope up, not down. Phase 1 covers the three foundational modules first, because it's easier to add scope later than to cut it. Starting with what's most foundational limits the cost of being wrong and delivers a usable program faster.
Each module extends slightly into the next capability. A strict capability-by-module design would have each module cover exactly one stage. Real learning doesn't work that way, so each module reaches a short distance into the next capability: enough to make the learning feel complete without overextending the module's scope.
AI as enabler, not replacement. Every module ends with an AI-assisted workflow component, not because AI is trending, but to address a specific, identified barrier: time. Even energy managers who have the skill may not sustain a data practice if it takes too long. AI tools lower the barrier to actually using the capability, while keeping human judgment at the center.