Project Rationale

What was being accepted
- Adults not positioned as learners
- Real instruction replaced by a few hours a year of off-the-shelf curricula or guest speakers serving fast-casual inspiration, all netting zero ROI
- No programmatic measurement of impact, conveniently hiding that there isn’t any, while students with disabilities (17% of enrollment) still represent 50% of bullying victims
What I saw instead
Adults didn’t need to teach kids how to behave toward others. They needed to show them — by learning about others. Likewise, literacy instruction isn’t something you pause to make time for this learning. It’s what you use to make the learning work.
What I built
- Change management project proposal rationale that leverages a single, unresolved pain point — bias-based bullying
- Preempts rebuttals
- Drives from the district’s own stated values, existing state priorities (MTSS), national data, and approximately 30 peer-reviewed sources
Why I built it this way
Because disability is the universal experience of every human entering life — uncomprehending, immobile, mostly blind — it is the ideal entry point for every learner entering this work, including staff, families, and community members who might otherwise meet instruction about identity with knee-jerk resistance. Co-design positions the district to go first, inviting the community and student population in as collaborators. That divestment of power transforms an amygdala-triggering environment into a prefrontal-cortex-friendly one, where literacy can finally strengthen.
Scope
- All students PK-12, all instructional and support staff, all families, and community members with lived experience
- Addresses bullying, attendance, engagement, literacy, and community trust simultaneously
- Tier 1 MTSS — universal intervention, not targeted remediation
- Designed to scale to any district regardless of size
What guarantees sustainability
Once the rationale is read, the assumptions it exposes — that kids are the problem, that purchased curricula work, that no one needs to measure adult behavior — become visible and indefensible. Each preemptive rebuttal forecloses a specific exit route. The disability-first framing precludes the political attacks that have ended DEI initiatives nationwide by grounding the work in universality rather than ideology. The literacy integration makes passing on the program equivalent to passing on the district’s own primary academic goal.
Scope & Sequence
What was being accepted
- Initiatives imposed top-down without stakeholder voice in design
- No operational accountability structures beyond implement and hope
- No defined coaching relationships or feedback protocols
- Community members and students positioned as recipients, not as experts
What I saw instead
In every initiative I can remember, progress was arrested at one of four predictable points: at adoption, when intimidating scale hinders buy-in; before launch, when facilitation drifts from design intent; at launch, when momentum stalls before results are visible; and long-term, when the system depends on any single person — the designer who built it or a principal who championed it. Each was preventable with the right operational architecture.
What I built
- 30-month, 4-phase operational architecture (Discovery, Development, Implementation, Sustainability) with sequenced milestones
- SME recruitment protocols across three expertise types: administrative (organization), community (identity and lived experience), and instructional (learners)
- Coaching norms codifying hierarchy inversion — students as SMEs on their own learning environment, community members as SMEs on identity, instructors as SMEs on learners
- Feedback protocols operationalizing clear-is-kind with explicit norms for giving, receiving, and timing
- Iteration cycles built into every phase
Why I built it this way
Each phase maps to a vulnerability window. Discovery addresses adoption resistance by building stakeholder investment before the work begins. Development addresses facilitation drift by codifying design intent through co-authorship. Implementation addresses stalled momentum through scheduled measurement that makes progress visible. Sustainability addresses single-person dependency by training peer leaders from beta participants and embedding every norm in writing so the work outlasts any individual.
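The phase-to-vulnerability mapping above can be sketched as a small lookup. The four phase names and the 30-month window come from the design; the month boundaries and the exact vulnerability labels here are illustrative assumptions, not the actual milestone schedule:

```python
# Sketch of the 30-month, 4-phase sequence, each phase guarding one
# vulnerability window. Month boundaries are illustrative assumptions.

PHASES = [
    # (phase name, start month, end month, vulnerability window addressed)
    ("Discovery",      1,  6,  "adoption resistance"),
    ("Development",    7, 14,  "facilitation drift"),
    ("Implementation", 15, 24, "stalled momentum"),
    ("Sustainability", 25, 30, "single-person dependency"),
]

def phase_for_month(month: int) -> str:
    """Return the active phase for a 1-indexed implementation month."""
    for name, start, end, _vulnerability in PHASES:
        if start <= month <= end:
            return name
    raise ValueError(f"month {month} is outside the 30-month window")
```

A structure like this is what lets accountability checkpoints be scheduled per phase rather than improvised, so drift is caught inside the window where it emerges.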
Scope
- All buildings, all grade bands (K-2, 3-5, 6-8, 9-12), differentiated materials for each
- Community SMEs compensated
- Families positioned as co-designers, not recipients
What guarantees sustainability
Once coaching norms establish that lived experience equals or exceeds academic study in an SME role, the hierarchy inversion is structural, not aspirational. Once SMEs hold contractual roles with defined responsibilities, their expertise can’t be casually re-sidelined. Once milestones are sequenced with accountability at each vulnerability window, drift from design intent becomes correctable before it derails the initiative. Once the standard has been named out loud, compliance isn’t guaranteed, but discomfort when violating it is.
Learner Experience
What was being accepted
- Passive delivery — learners have little to no agency over pacing, sequence, or depth
- One-size-fits-all content with no differentiation by role and no adaptability to diverse community contexts
- Social-emotional learning disconnected from academic content, competing for instructional time rather than serving it
What I saw instead
Every cultural competence training I had attended required nothing of the learner — speakers presenting to passive audiences, slides advancing without interaction, sessions ending without visible evidence that comprehension had occurred, let alone application. It’s the same design failure that makes compliance training ineffective across every sector. What was different here was the subject matter — unlike generic compliance content, this material was inherently the most personally relevant content any learner in the room had ever encountered, because the subject was themselves. Interaction, collaboration, and demonstration of application weren’t features to add. They were what the content was already asking for.
What I built
- Recursive 4-component quarterly learning architecture: Orientation (5 Big Ideas), Focus-in (one identity domain per quarter), Tutorial, and Schoolyard Simulator (branching-scenario formative assessment)
- Identity sequence scaffolded across quarters: Disability → Gender → Race → Intersections
- Staff practice prevention and response in the Simulator; students practice bystander intervention — both witness outcomes of their choices
- Admin Demo in Articulate Storyline introducing PREfessor Cortex as guide, with interactive, self-paced, choice-driven content
- Literacy-integrated — every unit serves narrative perspective-taking and informational comprehension
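A branching scenario like those in the Schoolyard Simulator can be sketched as a small graph of choice nodes. This is a minimal illustration under assumptions: the node IDs, prompts, and scoring field are hypothetical, since the document specifies only that scenarios branch, reflect real skill demands, and generate application-level performance data:

```python
# Minimal sketch of a branching-scenario node graph for a simulator
# like the Schoolyard Simulator. All scenario content, IDs, and the
# applied_skill flag are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str            # what the learner elects to do
    next_node: str       # outcome node the choice branches to
    applied_skill: bool  # did the choice apply the trained response?

@dataclass
class ScenarioNode:
    node_id: str
    prompt: str
    choices: list = field(default_factory=list)

def record_path(nodes: dict, start: str, picks: list) -> dict:
    """Walk a scenario by choice indices, logging performance data."""
    current, applied, total = start, 0, 0
    for pick in picks:
        choice = nodes[current].choices[pick]
        applied += choice.applied_skill
        total += 1
        current = choice.next_node
    return {"end_node": current, "applied": applied, "of": total}
```

The point of the structure is that every branch taken is itself data: the learner witnesses the outcome, and the system records whether the trained response was applied.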
Why I built it this way
Orientation exists because the cognitive demand of this content is unusually high — every session asks learners to examine implicit bias not about abstract information, but about themselves and the people in the room. Adults and students re-entering mid-sequence need a scaffold that reactivates prior learning before they can absorb what comes next. Focus-in moves them from reorientation into new material that gets progressively less universal — and less personally familiar — with each quarter, so the pattern recognition built in earlier sessions can disarm potential resistance before new concepts trigger it. Tutorial operationalizes the mechanics: how this particular learning experience works. The Simulator closes the sequence by making stakes visible — every branching scenario reflects real skill demands, surfacing the distance between knowing and doing at exactly the moment learners have the most capacity to close it.
Scope
- Four identity domains across four quarters, differentiated for K-2, 3-5, 6-8, 9-12, and all staff
- Branching scenarios generating application-level performance data
- Recursive annual cycle — each cohort deepens understanding across their entire enrollment
- Co-designed with staff, student, and community SMEs
What guarantees sustainability
Once learners have vocabulary for identity, intersectionality, bias, and boundaries, they can name what they previously could only react to. Once staff have practiced intervention strategies in branching scenarios and witnessed the outcomes, passive bystanding requires actively overriding trained responses. The content regenerates with every cohort because the subject matter is the learners themselves — no videos starring people with hairstyles that look dated in ten years, no materials that age out. The subject matter walks into the room every morning.
Evaluation System

What was being accepted
- Completion rates logged as evidence of learning
- Satisfaction surveys measuring how participants felt about training, not what they learned or changed
- No disaggregation by role, building, or grade band — gaps in specific populations invisible in aggregate data
- No longitudinal measurement — no means to distinguish early adoption from sustained behavioral change
What I saw instead
The measurement gap wasn’t incidental — it was structural. Initiatives that don’t build evaluation infrastructure into the design have no reliable mechanism for detecting their own failure. Satisfaction data and completion logs answer “did it happen” — not “did it work, for whom, and what needs to change.” What was missing wasn’t willingness to measure; it was a measurement architecture designed to surface the right questions at the right intervals.
What I built
- Progress Tracker — milestone tool that sequences survey administration intervals across the 30-month implementation
- Culture & Climate Survey — 30-item instrument measuring knowledge acquisition and behavioral change; Q29 tracks bystander intervention progression in real-world contexts across eight quarterly administrations over two years
- KPI Gap Analysis Matrix — disaggregates CCS data by role, building, and grade band to surface where gaps persist and direct resources accordingly
Why I built it this way
The sequence is load-bearing. The Progress Tracker establishes the measurement cadence the entire implementation depends on. The CCS generates the raw signal; the KPI Matrix makes it actionable by disaggregating data into specific gaps. Without disaggregation, a district-wide average can mask a building where little is changing. Each tool answers a different question: Are we on schedule? What is actually changing? Where do we need to intervene?
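The disaggregation step can be illustrated with a short sketch. The field names, 1–4 scale, and target threshold below are assumptions for illustration, not the actual CCS instrument or KPI Matrix schema:

```python
# Sketch of the KPI Gap Analysis step: disaggregate Culture & Climate
# Survey scores by role, building, and grade band so a district-wide
# average can't mask a lagging building. Field names and the target
# threshold are illustrative assumptions.

from collections import defaultdict

TARGET = 3.0  # hypothetical proficiency target on a 1-4 scale

def gap_matrix(responses):
    """responses: iterable of dicts with role, building, band, score."""
    buckets = defaultdict(list)
    for r in responses:
        buckets[(r["role"], r["building"], r["band"])].append(r["score"])
    return {
        key: round(TARGET - sum(scores) / len(scores), 2)  # positive = gap
        for key, scores in buckets.items()
    }
```

Run against successive CCS administrations, the same function yields comparison data each cycle without rebuilding the framework — a positive value flags a population where resources should be directed.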
Scope
- Measurement cadence spans the full 30-month implementation across all four phases
- CCS administered eight times over two years — tracks individual and cohort progression, not single snapshots
- KPI Matrix disaggregates by role (instructional staff, support staff, administration), building, and grade band
- Data available to district leadership, building principals, and phase leads
What guarantees sustainability
Once behavioral change data is collected at defined intervals rather than at project end, the absence of progress becomes visible mid-implementation rather than after the investment is complete. Once the KPI Matrix has established a baseline, future administrations generate comparison data without rebuilding the framework each cycle. Once disaggregation is standard, aggregate reporting that obscures building-level gaps becomes insufficient by established precedent.

