Prep pack for the Melbourne co-design session with Autism CRC
A starting scaffold so the room has something to react to rather than starting from a blank page. It is not a co-design framework, and it does not preempt Cheryl's process — it surfaces the service requirements from the RFT, the session and resource types that have to be designed, and where lived experience and student voice need to be embedded.
01 · Service requirements inventory
Nine services to design and deliver across 1 July 2026 – 30 June 2029
All nine services from Schedule 1 of the RFT, with their audiences, required delivery modes, embedded co-design expectations, and headline KPIs. Services 1–5 are the participant-facing offer; Services 6–8 are the platform, engagement and assurance layer; Service 9 is on-call.
Service 1 · Educator training
- Audience
- Teachers, principals, school leaders, paraprofessionals (teacher aides / learning assistants), inclusion / learning-support coordinators
- Delivery modes
- F2F multi-day · F2F short topic · virtual classroom · self-paced · mixed-mode · faculty / leadership team · early-career-tailored
- Sub-services
- 1A design & develop · 1B deliver · 1C educator Community of Practice
- Co-design with
- Educators, education authorities, Autistic people, people with disability, First Nations, CALD
- Anchored to
- DDA 1992 · DSE 2005 · NCCD · Australian Curriculum v9 · APST · APSP · National Autism Strategy 2025–31
- Critical hooks
- Must meet teacher regulatory authority PL recognition requirements; align with the Autism Microcredential
Service 2 · Parent and carer training
- Audience
- Parents and carers of Autistic school-age students, primary and secondary
- Delivery modes
- F2F multi-day · F2F short topic · virtual classroom · mixed-mode · weekend & after-hours
- Sub-services
- 2A design & develop · 2B deliver · 2C post-training networking / forum
- Co-design with
- Lived experience (Autistic, autism community), First Nations, CALD parents and carers
- Topic anchors
- DDA · DSE participant rights · advocacy mechanisms · home–school partnerships · transitions (pre-school → primary → secondary → post-school)
Service 3 · Whole-school community model
- Audience
- Whole school community: leadership, staff, parents and carers, students (where appropriate)
- Engagement phases
- 1) Pre-engagement & assessment · 2) Targeted PL + whole-school training · 3) Collaboration & student voice · 4) Ongoing support & champion role
- Co-design with
- School leadership, school champion, parents/carers, students where appropriate, Service 7 stakeholders
- Targets
- Min 15 schools yr 1; min 20 schools yr 2 onwards. Min 12 months of support per school.
- Outputs per school
- Action plan with milestones, school champion identified, formal follow-up session, links to school strategic plan
Service 4 · First Nations communities
- Audience
- Educators, parents and carers of Autistic First Nations school students; ACCO staff
- Co-design partners
- First Nations communities, ACCOs, First Peoples Disability Network
- Principles
- Cultural safety · co-design · community-led engagement · strengths-based · reciprocity & respect · alignment with Closing the Gap and ATSIEAP
- Targets
- Min 10 workshops per year. Min 2 ACCO partnerships per year, supported across the contract period.
- Translation
- Materials adaptable to local context; First Nations language translation where required
- Compliance
- IPP / MMR if any Remote Area delivery; capability uplift evidenced via ACCO-endorsed case report
Service 5 · CALD communities
- Audience
- CALD parents and carers of Autistic school-age students. CALD includes the Deaf community.
- Co-design partners
- CALD communities, National Ethnic Disability Alliance, advocacy groups, Autistic CALD voices
- Targets
- Min 10 workshops per year. Priority access for new migrants and refugees. Delivered nationally including regional, rural, remote.
- Translation
- Materials adaptable to community; translation into community languages and Auslan
- Facilitators
- Persons with experience working with CALD families; able to speak the community's primary language(s)
Service 6 · Program website, online learning and resources
- Sub-services
- 6A program website · 6B online learning system · 6C information & learning resources
- Resource formats
- Fact sheets · articles / blogs · videos & animations · webinars · interactive tools · podcasts
- Accessibility
- WCAG 2.1 AA · Easy Read · Auslan · 15+ community languages
- Co-design via
- User-Centred Design with diverse users incl. people with disability, First Nations, CALD, key stakeholders, the department
- Standards
- Australian Government Digital Experience Policy (Service / Access / Inclusion / Performance Standards) · 99.95% uptime · PSPF / ISM / Essential Eight
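The 99.95% uptime standard is easiest to reason about as a downtime budget. A quick sketch for planning conversations — illustrative only, since the RFT states the target but the measurement window and any maintenance-window carve-outs are assumptions to confirm with the department:

```python
def downtime_budget_hours(uptime_pct: float, period_hours: float) -> float:
    """Allowed downtime, in hours, for a given uptime target over a period."""
    return (1 - uptime_pct / 100) * period_hours

# 99.95% over a 365-day year leaves roughly 4.4 hours of total downtime;
# over a 30-day month, roughly 22 minutes.
yearly = downtime_budget_hours(99.95, 365 * 24)
monthly = downtime_budget_hours(99.95, 30 * 24)
```

The practical question this surfaces for the hosting arrangement: whether planned maintenance counts against the budget, and over what window the 99.95% is measured.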
Service 7 · Communications and stakeholder engagement
- Sub-services
- 7A communications & promotion plan · 7B stakeholder forums
- Forum cohorts
- 1) Autism & disability associations + allied health · 2) First Nations (FPDN, ACCOs, community members) · 3) CALD (NEDA, advocacy, Deaf community) · 4) Australian Government · 5) State / Territory + non-gov education authorities + parent & carer associations
- Purpose
- Forums inform Services 1–6; promote across jurisdictions; encourage participation from priority cohorts
- Note for Melbourne
- Autism CRC's role in co-design needs to be located clearly relative to these forums (see open questions, §6).
Service 8 · Data, performance and continuous improvement
- Sub-services
- 8A data collection & management · 8B performance monitoring & CI
- Plan
- Data, Performance and Continuous Improvement Plan, approved by the department
- Inputs
- Service metrics · participant feedback & surveys · website & social analytics · stakeholder forum feedback · delivery partner input
- Reporting
- Quarterly progress · annual reports · IPP QPR · AIP implementation · ad hoc · final report
Service 9 · On-call services
- Scope
- Research projects · new training resources · tailored training for identified cohorts. Earliest commencement: 1 July 2027.
- Triggers
- Emerging issues or new government priorities, agreed in writing with the department
- Costing basis
- Hourly & daily rates, plus indicative costs for: a research project + report; a new learning resource; a new training session
Cross-cutting requirements that shape co-design
- Requirement 5 — Co-design method. The RFT specifies that, where co-design is required, the supplier must adhere to the principles and follow the guidance of the People with Disability Australia (PWDA) Co-Design Programming Overview. This is the named methodology in the contract.
- Lived experience throughout. Content and delivery must incorporate lived experience and the perspectives of the autism community across all services, not only Services 4 and 5.
- Student voice. Explicitly named in Service 3 ("options for a student voice, where appropriate") within the whole-school community model. No equivalent named hook in Services 1, 2, 4, 5, 6 — worth deciding whether to embed it more widely.
- Department approval. All training materials, online resources, website content and promotional content must be approved by the department prior to publication or use.
- Phase 4 inheritance. Existing Phase 4 materials will be handed over and must be reviewed, refreshed, replaced or built on — co-design starts from a body of existing content, not a blank slate.
02 · Program / session catalogue
The actual things that need to be designed
Pulled out of the RFT and grouped by modality so the room can see, in one place, the surface area of what has to be co-designed. Durations are indicative — the RFT names some formats explicitly (multi-day, virtual classroom, short topic) but leaves length to the tenderer; 90-minute topic sessions are flagged here as a likely default for the online topic offer.
Face-to-face
| Format | Indicative duration | Audience | Service(s) |
|---|---|---|---|
| Comprehensive PL workshop | 1–2 days | Teachers, school leaders, paraprofessionals | 1, 3 |
| Short / targeted topic session | Half-day or 90 min | Teachers, school leaders | 1, 3 |
| Faculty / leadership team training | Half-day to full day | School leadership groups | 1, 3 |
| Comprehensive parent / carer workshop | 1 day | Parents and carers | 2, 3 |
| Parent / carer topic session | 2–3 hrs (incl. weekend / after-hours) | Parents and carers | 2 |
| Whole-school community workshop | Variable, scoped per school | Staff + parents + leadership | 3 |
| First Nations community workshop | Co-designed with community | Educators, parents, carers in First Nations communities | 4 |
| CALD community workshop | Co-designed with community, in language | CALD parents and carers | 5 |
Online — live
| Format | Indicative duration | Audience | Service(s) |
|---|---|---|---|
| Virtual classroom (interactive PL) | 2 hrs typical | Teachers, school staff | 1, 3 |
| Virtual parent / carer session | 90 min typical | Parents and carers | 2 |
| Online topic session | 90 min | Teachers, parents (varies by topic) | 1, 2, 3, 6B |
| Webinar (broadcast format) | 45–60 min | All audiences | 6C |
Online — self-paced and mixed
| Format | Indicative duration | Audience | Service(s) |
|---|---|---|---|
| Foundational online module (accredited) | 30–60 min per module | Teachers, school staff | 6B |
| Topic module (self-paced) | 20–45 min | All audiences | 6B |
| Mixed-mode (pre-work + live workshop) | Module + 1–2 hr live | Teachers, parents | 1, 2, 3 |
| Educator Community of Practice | Ongoing | Past PL participants | 1C, 6A |
| Parent / carer networking forum | Ongoing | Past participants | 2C, 6A |
Information & learning resources (Service 6C)
| Format | Audience focus | Notes |
|---|---|---|
| Fact sheet | All audiences (general & targeted) | Easy Read, Auslan, 15+ community languages where applicable |
| Article / blog | Parents, school staff | Topical / current-issue focus |
| Video / animation | All audiences | Strong fit for First Nations, CALD, Easy Read variants |
| Webinar (recording) | All audiences | Companion to live broadcasts |
| Interactive tool | Parents, school staff | Higher build cost — co-design priority |
| Podcast | Parents, school staff, community | Strong vehicle for lived-experience storytelling |
What the catalogue tells us about the co-design surface area
- Five distinct audience streams — educators, parents/carers, whole-school communities, First Nations, CALD — each with their own variant of most formats.
- Three layers of intensity — long-form comprehensive workshops, mid-form 90-minute sessions, short-form modules and resources.
- One platform spine — Service 6 — that has to integrate with all of the above, on a fixed deadline.
- Two ongoing community structures — educator CoP and parent / carer networking forum — that need their own co-design and moderation model.
03 · Co-design inputs needed
Whose voice we need, and where it lands
A first-pass map of the participation expectations the RFT places on Phase 5. The aim here is not to assign people to roles — that's part of what Cheryl's process will do — but to make visible the range of voices that need to be embedded, and where each one matters most.
Autistic people
Lived experience must shape content and delivery across all services. Includes Autistic adults, Autistic students (with appropriate consent and support), Autistic educators and Autistic parents.
People with disability (broader)
The RFT names "people with disability" alongside Autistic people. Co-design via PWDA's framework is the named methodology under Requirement 5.
Students (student voice)
Explicitly named only in Service 3 ("options for a student voice, where appropriate"). A genuine open question for Melbourne is whether it should also reach into Services 1, 2 and 6.
Parents and carers
Both as participants and as co-designers. A required perspective in resource development, parent / carer training, the whole-school community model, and resource formats.
First Nations communities
Co-design is mandated, not optional. Cultural safety, community-led engagement, ACCO partnerships, alignment with Closing the Gap and ATSIEAP. First Peoples Disability Network in Service 7B.
CALD communities
Co-design is mandated. NEDA in Service 7B. Includes new migrants, refugees and the Deaf community. Translation into 15+ community languages and Auslan.
Teachers, principals, school leaders
Co-design with educators and education authorities is required to ensure local context, policies and priorities are reflected. Must align with APST / APSP and teacher PL accreditation.
Education authorities
State / territory government, Catholic, independent. They sit in Service 7B, but they also gate the practical question of whether PL is recognised in each jurisdiction.
Allied health and peak bodies
Speech pathologists, occupational therapists, autism / disability advocacy peak bodies. They inform resources and identify communities and schools.
Department of Education
Approves all materials. Approves the co-design method (via the Quality Management Plan). Convenes Service 7 forums. A member of every governance loop.
Phase 4 evidence base
Existing materials, evaluation findings, and an 80,000+ participant body of feedback. Treat this as a "voice in the room" — co-design starts informed, not from zero.
Autism CRC
Named in Requirement 5 as a research source the supplier should "pay particular attention to". Their position in the Phase 5 co-design model — peer, advisor, deliverable contributor — is exactly the question Melbourne is meant to start answering.
Layers of input — a working distinction
- Direction — voices that shape what gets designed in the first place. Lived experience advisors, First Nations and CALD community representatives, education authorities. Most leverage when engaged earliest.
- Co-creation — voices in the room while content is being made. Autistic educators, parents, students (where appropriate), allied health.
- Validation — voices that test what's been made. UCD testing for Service 6, user acceptance testing, pilot delivery.
- Continuous — voices that keep informing change after launch. Service 7B forums, the educator CoP, Service 8 surveys, ad-hoc participant feedback.
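One way to make the four layers workable in the room is to hold them as a simple, editable map. A minimal sketch — the assignments below are only this scaffold's starting guesses, carried over from the bullets above, and the session is expected to re-sort them:

```python
# Working seed for the "whose voice, where" mapping.
# Layer assignments are illustrative starting points, not decided positions.
INPUT_LAYERS: dict[str, list[str]] = {
    "direction":   ["lived-experience advisors", "First Nations community reps",
                    "CALD community reps", "education authorities"],
    "co-creation": ["Autistic educators", "parents and carers",
                    "students (where appropriate)", "allied health"],
    "validation":  ["UCD testers (Service 6)", "UAT participants",
                    "pilot delivery cohorts"],
    "continuous":  ["Service 7B forums", "educator CoP",
                    "Service 8 surveys", "ad-hoc participant feedback"],
}

def layers_for(voice: str) -> list[str]:
    """Which layers a given voice currently sits in (a voice can appear in several)."""
    return [layer for layer, voices in INPUT_LAYERS.items() if voice in voices]
```

The point of the structure is the question it forces: a voice that appears in only one layer (e.g. allied health in co-creation only) is a prompt to ask whether that's deliberate.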
04 · Draft workshop flow
A possible shape for the Melbourne session
A lightweight running order for the in-room session, designed to defer to Cheryl's facilitation while making sure the working surface area gets covered. Times are indicative; the spine matters more than the clock.
- Open · ~15 min · Cheryl-led · Welcome, framing, and Cheryl's co-design approach. Cheryl sets the method for the session and the broader phase. The PP team listens before pitching anything.
- Orient · ~20 min · Walk through the service inventory and program catalogue. PP / Autism CRC together skim §1 and §2 of this scaffold, just to anchor shared understanding of the surface area. Not a debate.
- Map · ~30 min · Where does co-design land hardest? The group identifies the 3–4 services where co-design has the highest stakes (likely 3, 4, 5, 6 — but let the room decide). These get the deeper attention; the others are flagged for later workstreams.
- Voice · ~30 min · Whose voice, where, in what role. Walk through §3. Treat the four layers (direction / co-creation / validation / continuous) as a working frame — not a deliverable. The student-voice question is worth surfacing here explicitly.
- Frame · ~30 min · Cheryl-led · Sketch a working co-design framework. Cheryl drives — using whatever method she's bringing, informed by what's been mapped. Outcome: a shared rough framework, even if it's still a sketch.
- Position · ~20 min · Autism CRC's role in Phase 5. Where does Autism CRC sit relative to the Service 7B forums? Are they an advisor, a research partner, a delivery partner, or a hybrid? This needs an explicit answer or an explicit "we'll resolve it by date X".
- Close · ~15 min · Decisions, parking lot, owners, dates. Capture: what was decided, what's parked for the next session, who owns what, and the next 2–3 milestones. Send the page back out, updated, within 48 hrs.
05 · Slide outline
A small visual deck for projecting at the start
Five slides, designed to be shown only if the room wants something at the front.
06 · Open questions for Cheryl
Worth asking before the meeting, so the room starts ahead
Things to raise in the pre-meeting so the team isn't dependent on Cheryl bringing all the structure. None of these displace her process — they just make sure the day uses the room well.
- What co-design method are you bringing into the room? The RFT names the PWDA Co-Design Programming Overview as the contractual anchor under Requirement 5. Knowing whether Cheryl is using PWDA as-is, an adaptation, or a different framework that she'll map to PWDA tells us how the day's work translates into the QMP later.
- Where do you want lived-experience advisors in the room vs. consulted out of session? Sets expectations for how Melbourne itself models the co-design we're trying to design. Also clarifies whether the room is "PP + Autism CRC" planning, or "PP + Autism CRC + lived experience" co-creating.
- How do you see Autism CRC's role across Phase 5? Are they a research source we cite, an advisory partner that sits alongside Service 7B forums, or a delivery contributor on specific services? Different answers imply different governance, different IP arrangements, and different framing for this Melbourne meeting itself.
- How do you want student voice to operate? The RFT names it only in Service 3, but the spirit of co-design points wider. A position before Melbourne avoids the room re-litigating it under time pressure.
- What cadence of co-design milestones do you envisage across the three years? Phase 1 transition runs Jan–Jun 2026; service delivery starts 1 Jul 2026; full website by 31 Dec 2026. The co-design rhythm needs to fit those gates. A rough sketch from Cheryl saves the room from improvising one.
- Where do you want the PP team holding a pen, vs. holding the space? Avoids the trap where PP staff drift into co-designing on behalf of the people we should be co-designing with. Also tells the team how to behave in the room.
- What's a "good outcome" for the Melbourne session in your view? Gives the team a single shared definition of done so the day doesn't become open-ended. Pairs with §4's "decisions, parking lot, owners" close.
07 · Evaluation framework considerations
What we need to bake in now so ACER can evaluate it later
The independent evaluation of Phase 5 will be commissioned by the Department of Education and is likely to be run by an external research body such as ACER. Service 8 is the supplier's own data and continuous-improvement infrastructure; the independent evaluation sits over the top of it. The decisions we make in co-design — what we measure, who we ask, how we capture lived experience, what counts as success — directly determine whether the program is evaluable when the time comes.
Considerations the framework needs to cover
Theory of change / program logic
A clear logic model linking inputs → activities → outputs → short-term outcomes → long-term outcomes for each audience stream. Without this, evaluators are reverse-engineering intent. Should be co-designed early and approved through the Quality Management Plan.
Outcome dimensions
Multiple chains, multiple audiences. Educator change (knowledge, confidence, classroom practice). Parent / carer change (knowledge, advocacy, engagement). Student experience and inclusion. School-level culture and policy. Each needs its own measurement approach.
KPIs vs evaluation outcomes
Annual delivery targets and KPIs (Attachment A) measure that the program ran. Evaluation measures whether it worked. Both matter, but they're different instruments — an evaluation framework that only restates the KPIs has no independent yield.
Baseline and longitudinal capture
Pre-program / pre-session measurement so change can be attributed. Follow-up at 3, 6 and 12 months for the lasting question — does new educator practice persist; does parent advocacy stick. The whole-school community model (Service 3) has a 12-month cycle that lends itself naturally to this.
Phase 4 continuity
Phase 4 ran 2021–2026 with 80,000+ participants. Continuity of measures matters: the more Phase 5 instruments mirror Phase 4, the stronger the longitudinal story. Any divergence should be deliberate, documented, and defensible.
Comparison logic
A national open-access program is hard to run a true counterfactual on. Realistic options: pre / post within participants; matched comparisons across schools; staggered roll-out designs; comparison with non-participating school cohorts. Worth flagging early — it shapes the data architecture.
Equity disaggregation
Outcomes broken down by First Nations, CALD, remoteness, sector, school type and age band. A whole-of-program average can hide that the program works for one cohort and not another. The evaluation framework needs the data to support this from day one.
First Nations data sovereignty
Data about First Nations participants requires Indigenous Data Governance principles, community endorsement, and ACCO involvement in evaluation design — not only delivery design. ACCO-endorsed case reports under Service 4C are part of this picture but not all of it.
Culturally safe instruments
Surveys, interview protocols and outcome measures need cultural and linguistic adaptation, not just translation. This includes Auslan, Easy Read and plain-English variants, and instruments validated for use with Autistic respondents.
Lived experience in evaluation design
The same logic that drives co-design of services applies to evaluation: Autistic people, parents, students (where appropriate), First Nations and CALD voices should help shape what success looks like and how it's measured. Otherwise the evaluation reproduces external assumptions about what good outcomes are.
Student voice and ethics
Where students are part of evaluation (consistent with Service 3's "options for student voice"), it needs ethical review, age-appropriate consent, supported participation, and clear protocols on what's done with what they say. Worth a position before ACER asks.
Privacy, consent, data sharing
APP-compliant collection, clear privacy statements, consent for downstream sharing with third-party evaluators, and ISM / PSPF-compliant storage and access. Service 6's website needs to support this from MVP day one — adding consent retrospectively is painful.
Process vs outcome vs impact
Three distinct lenses the evaluator will likely run in parallel. Process: was the program delivered as designed (fidelity, reach, accessibility). Outcomes: did participants change. Impact: are Autistic students experiencing different educational outcomes. The framework needs data feeding all three.
Cost-effectiveness / value for money
Increasingly expected of Commonwealth program evaluations. Requires unit-cost data per participant, per workshop and per resource — which has to be tracked through delivery, not reconstructed afterwards.
Alignment with the National Autism Strategy
Phase 5 is named as an existing priority under the National Autism Strategy 2025–31. The evaluation will be read for its contribution to strategy outcomes, not only program-internal outcomes. The framework should make that contribution legible.
Independence safeguards
The evaluator needs unfiltered access to data, participants, personnel and materials. The supplier doesn't pre-screen findings. Worth being explicit early so the working culture supports it — independence is easier to design in than to retrofit.
What this means for the Melbourne conversation
- Co-design the logic model alongside the services. The conversation about what good looks like is the same conversation in both directions. If the room agrees on outcomes, the evaluation framework is half written.
- Decide measurement at the same time as design. Each service should leave Melbourne with at least a draft answer to "what would tell us this is working?" — even if the instruments come later.
- Bring lived experience into evaluation design, not just service design. Otherwise the people the program is for don't get a say in what success means for them.
- Map who holds what. Supplier owns delivery data and continuous improvement. Department commissions and owns the independent evaluation. ACER (or whoever) brings methodological independence. Autism CRC's role here is part of §6's open questions.
- Build for evaluability from MVP. Service 6's MVP on 1 July 2026 is the practical deadline for whether registration, consent capture, and data linkage are designed-in or bolted-on.
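To make "designed-in, not bolted-on" concrete, here is a minimal sketch of what a sign-up record could capture so that consent, data linkage and equity disaggregation exist from day one. Every field name here is an illustrative assumption, not the RFT's data specification:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ParticipantRecord:
    """Illustrative sign-up record (field names are assumptions, not spec).

    The equity fields support the disaggregation discussed in this section;
    the two consent flags keep Service 8 delivery data separate from any
    hand-off to the independent evaluator."""
    participant_id: str                      # pseudonymous key for data linkage
    audience: str                            # e.g. "educator", "parent_carer"
    state: str                               # jurisdiction
    remoteness: str                          # e.g. "metro", "regional", "remote"
    first_nations: Optional[bool] = None     # optional self-identification
    cald: Optional[bool] = None              # optional self-identification
    consent_program_data: bool = False       # Service 8 / continuous improvement
    consent_evaluation_share: bool = False   # downstream third-party evaluator
    consented_at: Optional[datetime] = None

    def evaluable(self) -> bool:
        """True only if this record may be passed to the independent evaluator."""
        return self.consent_evaluation_share and self.consented_at is not None
```

The design choice worth noting is the separate consent flag for evaluator sharing: retrofitting it after launch means re-contacting every registered participant.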
Working questions to flag for the department and ACER
- Will ACER (or another body) be commissioned for the full term, or for milestone-based evaluation moments? Determines whether evaluation is continuous and embedded, or summative and periodic. The data architecture is different in each case.
- Is there a Phase 4 evaluation report we should be designing continuity from? If Phase 4 outcomes were measured a certain way, repeating those measures gives a longitudinal story. Diverging deliberately is fine — diverging accidentally is wasted information.
- What outcome statements does the department most want to be able to make at the end of Phase 5? Working backwards from the desired statement defines the data you need to collect. Saves the team from over-collecting and from under-collecting in equal measure.
- How will First Nations data sovereignty be operationalised in the evaluation? Affects what data is collected, who holds it, who consents on whose behalf, and how findings about First Nations participants are reported. Best resolved with ACCO partners early.
- Where does the supplier's continuous improvement reporting end and the independent evaluation begin? Avoids duplication and avoids gaps. Service 8 is the supplier's instrument; the independent evaluation is the department's. The handshake between the two is the point worth defining.