Hi, I'm Clare Brown and I run Homeschool of 1. When my son was younger, "curriculum" usually started with whatever he was into that month. One week it was sharks, another it was flags, then suddenly it was space again. I'd grab one solid library book, print a couple pages, and plan just enough to get us going. If he was still interested, we kept going. If not, we moved on and I didn't force it. What worked best for us was having a simple rhythm (read something, make something, write/draw something), but letting the topic change fast. The minute it started feeling like a chore, it stopped being useful. Bio: Clare Brown, founder of Homeschool of 1 (homeschoolof1.com).
I've spent 30+ years building curriculum--not just for myself, but for 17,000+ people across our eight church campuses and through Momentum Ministry Partners. The key insight I've learned: structure matters, but *direction* matters more. Start with one clear question you want to answer or one skill you want to develop, then build backward from there. Here's what actually works. I use what I call the "Head, Heart, Hands" method from our youth ministry work. Head = What do I need to *know*? (Pick 2-3 core books or courses). Heart = Why does this *matter* to me personally? (Journal weekly on applications). Hands = How will I *use* this? (Create a project with a deadline). When I developed our OneStep Discipleship Journals, this exact framework helped thousands of people actually complete their learning goals instead of abandoning them. For concrete structure, I recommend 90-day learning sprints. Pick one topic, schedule 30 minutes daily (I do mine 6-7am before my day explodes), and create a tangible output by day 90--whether that's teaching someone else, writing a summary, or implementing a new system. Our staff of 150+ uses this model for professional development, and the completion rate is around 73% versus the typical 12% for self-directed learning. The mistake most people make is building curricula that look like degree programs--too broad, too long, no accountability. Instead, think like a youth pastor planning a semester: tight theme, clear outcomes, and built-in community checkpoints. Find one person to report progress to weekly, even if it's just a 5-minute text update.
I spent nearly a decade in aerospace engineering before buying A Better Fence Construction, and the biggest lesson from that transition was learning to treat skill acquisition like reverse-engineering a project. In aerospace, we'd start with the end product--say, a precision component for a defense system--and work backward through every specification, material requirement, and process step. I applied that exact framework when I needed to learn the fencing business: I identified what "expert-level fence contractor" looked like (quality standards, customer satisfaction metrics, material knowledge), then built a learning roadmap backward from there. My curriculum had three concrete pillars. First was technical mastery--I spent 6 months studying construction materials, soil conditions in Oklahoma, and installation techniques by shadowing the previous owner and our crew leads daily. Second was business operations--I took apart our financials every night for 90 days until I could predict cash flow and estimate jobs accurately. Third was customer psychology--I read every single one of our 115+ reviews multiple times and called past clients to understand what actually mattered to them (turns out: communication frequency beat price sensitivity). The key was treating each pillar like a mini-project with measurable checkpoints, just like we did at Kratos or Textron. For example, I couldn't move from "learning materials" to "pricing jobs" until I could identify commercial-grade vs residential-grade steel posts on sight and explain the 3-5 year lifespan difference. I gave myself pass/fail tests weekly. When I acquired the company, I gave myself 12 months to match the founder's installation quality--I hit it in 8 months because I tracked every mistake and ran "failure analysis" on wonky gates or sagging posts like I would've done on a failed aerospace part. 
The structure that worked: 70% hands-on application, 20% studying adjacent experts (I joined contractor forums and called suppliers with questions), 10% reflection and adjustment. Most people flip this and spend too much time reading. My engineering background taught me that you learn structural design by actually designing structures and breaking them--same applies to anything worth learning.
I approach personal curriculum building through the lens of clinical supervision--the same framework I use when developing registrars into independent practitioners. The critical starting point isn't "what should I learn?" but "what question am I trying to answer?" In my practice, trainees who begin with a genuine clinical question (like "why do my clients with trauma seem stuck despite using evidence-based protocols?") build far more coherent learning paths than those who chase credentials. Here's what works: I borrowed from Lacanian psychoanalysis and built learning around *lack* rather than completion. When I was developing our EMDR program, I didn't create a syllabus of "everything about EMDR." Instead, I tracked every moment I felt incompetent during sessions--couldn't explain why a client dissociated, didn't understand bilateral stimulation timing--and those gaps became my curriculum. I kept a running document of "I don't know" moments and spent 20 minutes after each clinical day researching just one. Within eight months, I'd built specialized knowledge that no textbook sequence would've given me. The structure that's proven effective across our supervision groups: commit to one *problem* for 90 days, not one *topic*. A registrar recently wanted to "learn about personality disorders"--too broad, went nowhere. We reframed it to "understand why three of my current clients sabotage therapy right when they improve." She read specific papers on rupture-repair cycles, tried interventions, tracked what happened, adjusted. She now runs our complex case consultations. The sustainability piece is counterintuitive--I schedule "unlearning" time. Every quarter, I deliberately identify one thing I believed that's probably wrong and spend a month trying to disprove it. Last year I challenged my assumptions about short-term CBT effectiveness for complex trauma. That discomfort keeps learning alive rather than turning into credential collecting.
I'm not an academic, but I've built my own curriculum over 42 years in the plumbing trade--and honestly, the best learning happens when you chase your actual problems, not someone else's syllabus. Here's what worked for me: I started by identifying one thing that frustrated me (contractors overcharging customers), then reverse-engineered what I needed to learn to fix it. That meant studying business licensing, pricing structures, and eventually chamber of commerce networking across seven Orange County cities. Each skill stacked on the last one because they all solved the same core problem I cared about. The key is picking a real-world project first, then building your curriculum backward from there. When I started the Huntington Beach Manor Haunted House charity event, I had to learn set design, fundraising logistics, and community outreach--none of which were "plumbing." But I learned them fast because I had a deadline and people counting on me. That's way more powerful than abstractly deciding "I should learn fundraising." Track what you actually use versus what you thought you'd need. Half the chamber memberships taught me nothing about running a better business--they were just networking. But the BBB listings? Those forced me to learn customer service documentation that changed how we operate. Let your results tell you what to study next, not some pre-made curriculum.
I've built educational frameworks for over 30 years, but not in classrooms--in crisis. When someone's just moved off the streets into housing, you can't hand them a traditional "syllabus for stability." What works is reverse-engineering from a concrete goal they actually care about, then identifying the 3-4 critical skills standing between them and that outcome. We had a veteran in our FSS program who wanted to buy a house but had zero financial literacy. Instead of enrolling him in a generic budgeting course, we mapped backward: homeownership requires credit score X, which needs Y months of payment history, which means opening specific accounts this week. He learned exactly what he needed, when he needed it, in sequence. That targeted approach got him keys to his own place in 18 months. The mistake I see in personal learning is collecting information instead of building capacity. Our 98.3% housing retention rate doesn't come from residents knowing everything about tenancy law--it comes from them mastering the five specific behaviors that prevent eviction in their situation. Track outcomes, not hours spent. If your "curriculum" isn't producing a measurable change in what you can do three months in, the structure's wrong. One practical method: pick your end goal, interview someone who's already achieved it, and ask them to list the exact sequence of capabilities they built. Not topics--capabilities. Then build your learning plan around acquiring those specific abilities in that order, nothing else until each one's functional.
I've built three completely different careers--psychology undergrad, MBA in finance, then law school--before spending 30 years in family law. What I learned is that personal curriculum building works best when you steal the structure from professional training but ditch the pressure. Here's what actually worked for me when I wanted to master surrogacy law (a niche almost nobody practiced in North Carolina in the early 2000s): I gave myself permission to learn "just enough" in 90-day sprints. First sprint was reading every surrogacy case I could find and taking terrible notes. Second sprint was calling five attorneys in California where this was more common and asking dumb questions. Third sprint was taking on one pro bono case to test everything. I wasn't trying to become the world expert--I was trying to help one client, then another, then another. That became my curriculum. The MBA taught me something crucial about self-directed learning: you need forcing functions. In business school, that's exams. For personal learning, I use real problems. When I wanted to get better at high-asset divorce cases, I didn't read textbooks--I took a forensic accounting workshop, then immediately applied it to a case where a spouse was hiding business income. The curriculum was "learn what I need this month to solve this actual problem." Next month, different problem, different mini-curriculum. Track your "confused to clear" ratio like you'd track billable hours. I keep a running list of concepts that made zero sense initially but clicked later. For collaborative divorce training, mediation certification, LGBTQ+ family law--each time I write down what finally made it stick (usually a real conversation or a mistake I made). That list becomes your personal syllabus for the next learner: you.
I've been practicing dentistry since 1984, and the biggest lesson I've learned about mastering complex skills is this: you can't rush the fundamentals, but you also can't stay there forever. When I graduated from Temple, I knew basic procedures--but learning to place dental implants or do full smile makeovers required me to build my own curriculum outside any formal program. Here's what worked: I identified one skill gap at a time (implants, then Zoom whitening, then veneers), found the best lab technicians and continuing education courses for that specific thing, and gave myself permission to practice it repeatedly on willing patients at reduced rates until my hands knew it cold. I tracked every case with photos and notes--what went right, what I'd adjust next time. That feedback loop was everything. The key was alternating between learning mode and teaching mode. After I got decent at implants, I'd explain the procedure to patients in detail, which forced me to truly understand it. I also called patients post-surgery to hear their experience--that real-world data shaped how I refined my approach far more than any textbook. What made it sustainable was keeping my practice broad while going deep on one thing at a time. I never stopped doing fillings and cleanings while learning cosmetic work--that baseline kept me grounded and funded the learning. Pick your anchor skill, add one branch at a time, and use actual projects (not just theory) to pressure-test what you're learning.
I built my personal curriculum at 60 when I left nonprofit financial management to start FZP Digital. Most people think you need a master plan, but I started by mapping what I already knew--accounting, drums, nonprofit leadership--and identified one missing piece: advanced WordPress development and SEO. Here's what actually worked: I treated learning like building a website, not writing a novel. I picked ONE skill per quarter (first was WordPress customization, then SEO fundamentals, then email marketing automation). Each morning before client work, I'd spend 45 minutes on tutorials or building a test site. The key was having a real project--I rebuilt my own site four times while learning, so every lesson had immediate application. The drumming background taught me something crucial about practice: repetition with variation beats cramming. I didn't try to learn "all of digital marketing." I learned enough WordPress to launch one client site, then reflected on what confused me during that project. That confusion became next quarter's curriculum. It's like learning a song--you don't master every technique first, you learn enough to play it badly, then improve through repetition. Track your energy patterns like you'd track business metrics. I found I retain technical information best between 6-8 AM, but creative concepts click around 2 PM. So I scheduled coding tutorials for morning, design theory for afternoon. That simple split doubled my retention without adding more hours.
I built my personal curriculum by stacking interdisciplinary experiences simultaneously rather than sequentially. While getting my Master's in Biotechnology at Hopkins, I deliberately worked as a research tech studying pancreatic cancer during the day and volunteered as a Firefighter/EMT at night. Most people told me I was spreading myself too thin, but those parallel tracks created unexpected connections--crisis decision-making from emergency medicine directly improved how I approached time-sensitive research protocols. The framework that worked: I identified the hardest problems I wanted to solve (building a mission-driven healthcare company), then deliberately put myself in environments where I'd be terrible at first. I did summer internships across six different institutions--Sloan Kettering, Columbia, Weill Cornell--specifically choosing rotations in units I knew nothing about. Being the dumbest person in the room repeatedly taught me how to learn fast under pressure, which matters more than any individual skill. Here's what made it sustainable: I only pursued learning that had immediate application within 90 days. When I studied novel drug development, I was simultaneously managing a research lab where I could test those frameworks. When I took business courses, I had real operational problems from clinical internships at Hopkins Hospital to apply them to that same week. Theory without a testing ground just evaporates. The metrics I tracked weren't courses completed--they were cross-pollination moments. How many times did knowledge from one domain solve a problem in another? That pancreatic cancer research background now informs how we approach patient treatment protocols at ProMD. That firefighter training shapes our crisis management systems today. If your learning isn't creating those bridges within three months, restructure immediately.
I left a 14-year engineering career at Intel because I realized structured learning had stopped making sense for me. Corporate training modules felt hollow--I was checking boxes but not actually growing. So I built my own curriculum around micro-soldering and circuit board repair by treating every broken device like a case study. Each iPhone or laptop that came through my door became a hands-on problem to solve, and I kept a repair log tracking which techniques worked and which failed spectacularly. Here's what made it stick: I only learned what I could immediately apply. No theory without practice. When I needed to understand BGA rework, I didn't watch 10 hours of YouTube--I bought a broken MacBook logic board for $40, attempted the repair, failed twice, then figured out why. That failure-to-application loop is faster than any classroom I ever sat in. The biggest shift was realizing a personal curriculum needs a forcing function. Mine was customer devices--if I didn't learn data recovery properly, someone lost their family photos forever. That's way more motivating than a grade. Find your version of that: a project with real stakes, even if it's just a friend counting on you. Accountability beats motivation every time. Track your confusion points in writing. I keep a simple notebook of "things that didn't make sense today" and revisit it weekly. Half the time, the answer clicks because I've now seen the problem three different ways through three different repairs. Your brain needs that repetition with variation--it's how pattern recognition actually builds.
I built my learning curriculum by rotating through every single role at Standard Plumbing Supply--from sweeping warehouses at eight years old to leading our Vendor Managed Inventory expansion to 60+ customer locations. Most people build expertise vertically in one function, but I learned horizontally across operations, sales, logistics, and customer service before taking on leadership. The key was treating each role as a 6-12 month intensive where I had to become competent enough to train someone else. When I worked our counter, I wasn't just taking orders--I was learning how contractors think under pressure at 6 AM when a job site is down. That ground-level knowledge now shapes every strategic decision I make as VP because I've actually lived the problems our customers face. What made it stick was keeping a "customer win journal" throughout each rotation. Every time I saw someone solve a problem creatively--whether a warehouse worker found a faster loading method or a driver knew a shortcut that saved a contractor's timeline--I documented it. Those 50+ documented patterns became my real curriculum, not any textbook. Now when we design programs, I'm pulling from actual field wisdom, not theory.
I learned to build personal curricula the hard way--after my wife Joni was killed by a drunk driver early in our marriage. I had to systematically teach myself victim advocacy, drunk-driving law, and nonprofit organizing from scratch because nobody else was going to do it. My framework: **reverse-engineer from a real problem you need to solve**. I didn't study DUI law abstractly--I needed to understand blood-alcohol evidence to help MADD families, so I tracked down toxicologists, read crash reports, and sat through trials. Within two years I was chairing Florida MADD and co-founding RID's Tampa chapter. The curriculum built itself because every new skill had immediate application. When I later taught trial practice at Stetson Law, I saw students memorize rules but freeze in mock trials. The ones who succeeded treated each courtroom skill (cross-examination, jury selection) like a case file: they'd watch one real trial per week, transcribe 10 minutes of questioning, then practice that specific technique on camera. They learned **vertically**--going deep on one micro-skill until it was muscle memory--not horizontally across a dozen topics. The metric that matters: can you use it this week? I've handled 40,000 injury cases because I spent my first five years mastering just intake interviews and medical record analysis before expanding. If your "curriculum" doesn't produce a tangible output within 7-10 days--a written brief, a working prototype, a real conversation in your target language--you're studying, not learning.
I built my practice from scratch in 1994, and here's what I learned about designing your own curriculum: **you need a mentor track alongside your content track**. Everyone focuses on what to learn, but nobody talks about who to learn *from*. When I wanted to expand beyond general dentistry into a multi-specialty facility, I didn't just read about it--I shadowed three different practices for six months, asked stupid questions, and took notes on what failed, not just what worked. The framework I used: pick one skill, find someone doing it at the level you want, and commit to 90 days of deliberate imitation before you innovate. When we added CEREC same-day crowns and guided implant surgery, I spent three months just copying the exact protocols from experienced users. No shortcuts, no "my way." Month four is when I started adapting. That sequence matters--you can't customize what you haven't mastered. Here's the practical part everyone misses: **build accountability through teaching**. I started mentoring dental assistants and offering EFDA training not just to help them, but because explaining procedures forced me to systematize my own learning. When you commit to teaching someone else what you're learning by month three, your retention jumps and the gaps in your knowledge become obvious fast. You learn twice--once for yourself, once for them.
Setting deadlines is key to successfully building a personal curriculum. Otherwise, Parkinson's Law rings true: "work expands to fill the time available." Because it's a personal endeavor, behavioral psychology is an important factor to consider--there's no external pressure keeping you on track. I can attest to this from building my own curriculum on credit building. I started out thinking it would be an easy topic to study, given my background in finance. My mistake was not setting a deadline from the get-go. The lack of urgency meant the curriculum took a backseat to everything else I was doing, and it ran far longer than I intended.
I got into building my own curricula when it became obvious that nobody was going to teach me how to run a business, especially the odd little mix of creative work and AI that my agency leans on now. So I started making these month-long learning sprints for myself: one on pricing psychology, another on agency ops, another on prompt engineering. Each one began with a clear goal -- something like "write a value-based pricing proposal that a client actually accepts" -- and I worked backward from there. It never looked like a formal syllabus; it was more of a loose mission plan cobbled together from Google Docs, podcasts, and the occasional late-night rabbit hole on YouTube. What kept me committed was having a clear endpoint and something tangible I had to produce. I wasn't learning for the sake of it; whatever I studied had to show up in a pitch, a project, or some piece of work I was already wrestling with. That's when everything stuck. I also kept the whole thing light on rules -- no guilt about unfinished books or skipping resources that weren't clicking. The point wasn't to mimic school. It was to create a way of learning that actually helped me grow past it.
When I think about building a personal curriculum, I approach it the same way we onboard new engineers at work. Start by carving the topic into a few big themes -- the kind you'd normally see tucked into a syllabus -- and then map out a trail of hands-on exercises, small projects, and targeted reading. I've learned far more by wrestling with a messy codebase or poking at an unexpected query plan than by staring at a tidy diagram. For me, personal learning only sticks when it requires me to actually build or break something. A good example is when I wanted to get a firmer grip on distributed systems. Instead of reading a stack of papers, I built a stripped-down job dispatch queue with .NET Core, RabbitMQ, and Redis. From there I pushed it into failure: forced timeouts, created lock contention, tweaked retry logic just to see where it would wobble. That little sandbox taught me more about system behavior than any formal overview could. The structure mattered -- having clear constraints, a real artifact to work on, and a moment at the end to look back and note what surprised me. It's essentially a course format, just built for one.
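That push-into-failure loop can be sketched in miniature. Here is a minimal Python illustration (not the contributor's actual .NET Core/RabbitMQ/Redis stack; `dispatch`, `flaky_worker`, and the retry parameters are all hypothetical): a dispatcher runs a job against a worker that simulates forced timeouts, so you can watch where the retry logic wobbles.

```python
import time

def dispatch(job, worker, max_retries=3, base_delay=0.01):
    """Run a job through a possibly-flaky worker, retrying with exponential backoff."""
    for attempt in range(1, max_retries + 1):
        try:
            return worker(job)
        except TimeoutError:
            if attempt == max_retries:
                raise  # give up after the final attempt
            # back off before retrying: 1x, 2x, 4x the base delay
            time.sleep(base_delay * 2 ** (attempt - 1))

def flaky_worker(job, fail_first_n=2, _state={"calls": 0}):
    """Simulated worker that times out on its first N calls, then succeeds."""
    _state["calls"] += 1
    if _state["calls"] <= fail_first_n:
        raise TimeoutError("simulated network timeout")
    return f"done:{job}"

print(dispatch("resize-image-42", flaky_worker))  # prints "done:resize-image-42" after two retries
```

Tweaking `max_retries` down or `fail_first_n` up is exactly the kind of deliberate breakage described above: you see the raised `TimeoutError` propagate and learn where the system needs dead-lettering or alerting.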
I always start by asking what problem I'm trying to solve or what I'm curious about. That usually sets the path. At AthenaHQ, we started using three-week blocks for new topics, mixing reading with actual building and a review. It got the team up to speed on AI changes way faster than just articles. Keeping it focused and time-bound is what stops you from getting overwhelmed and actually finishing.
The best trick I've found for remote teams is breaking goals into weekly chunks. We set up a simple doc where people drop links to tutorials or case studies and leave comments on what works. If you're trying to learn something new yourself, just pick a real problem at work, like onboarding, and build your studying around that. It's way more useful that way.
I handle my learning like a project. Start with the result you want, then break it into smaller pieces. When I was building Tutorbase, I had to get good at SaaS sales, so I broke it down into pricing, demos, and negotiation. I'd tackle one each week, spending extra time on whatever was hardest. The key is to stay flexible. Changing your plan keeps it interesting instead of feeling like a chore.