Positive Partnerships Phase 5 · Working scaffold, not a method

Prep pack for the Melbourne co-design session with Autism CRC

A starting scaffold so the room has something to react to rather than starting from a blank page. It is not a co-design framework, and it does not preempt Cheryl's process — it surfaces the service requirements from the RFT, the session and resource types that have to be designed, and where lived experience and student voice need to be embedded.

How to use this. Share before the meeting if helpful, or pull it up at the start to anchor the conversation. The aim is to save the first hour of orientation so the room can spend more time on the harder co-design questions. Cheryl owns the method; this only sketches the surface area.

01 · Service requirements inventory

Nine services to design and deliver across 1 July 2026 – 30 June 2029

All nine services from Schedule 1 of the RFT, with their audiences, required delivery modes, embedded co-design expectations, and headline KPIs. Services 1–5 are the participant-facing offer; Services 6–8 are the platform, engagement and assurance layer; Service 9 is on-call.

Service 1 · Professional Learning for teachers, principals and other school staff · Participant-facing
Audience
Teachers, principals, school leaders, paraprofessionals (teacher aides / learning assistants), inclusion / learning-support coordinators
Delivery modes
F2F multi-day · F2F short topic · virtual classroom · self-paced · mixed-mode · faculty / leadership team · early-career-tailored
Sub-services
1A design & develop · 1B deliver · 1C educator Community of Practice
Co-design with
Educators, education authorities, Autistic people, people with disability, First Nations, CALD
Anchored to
DDA 1992 · DSE 2005 · NCCD · Australian Curriculum v9 · APST · APSP · National Autism Strategy 2025–31
Critical hooks
Must meet teacher regulatory authority PL recognition; align with Autism Microcredential
Service 2 · Parent and carer workshops and training · Participant-facing
Audience
Parents and carers of Autistic school-age students, primary and secondary
Delivery modes
F2F multi-day · F2F short topic · virtual classroom · mixed-mode · weekend & after-hours
Sub-services
2A design & develop · 2B deliver · 2C post-training networking / forum
Co-design with
Lived experience (Autistic, autism community), First Nations, CALD parents and carers
Topic anchors
DDA · DSE participant rights · advocacy mechanisms · home–school partnerships · transitions (pre-school → primary → secondary → post-school)
Service 3 · Training and support for whole school communities · Participant-facing · Student voice expected
Audience
Whole school community: leadership, staff, parents and carers, students (where appropriate)
Engagement phases
1) Pre-engagement & assessment · 2) Targeted PL + whole-school training · 3) Collaboration & student voice · 4) Ongoing support & champion role
Co-design with
School leadership, school champion, parents/carers, students where appropriate, Service 7 stakeholders
Targets
Min 15 schools yr 1; min 20 schools yr 2 onwards. Min 12 months of support per school.
Outputs per school
Action plan with milestones, school champion identified, formal follow-up session, links to school strategic plan
Service 4 · Outreach for First Nations communities · Participant-facing · Co-design mandated
Audience
Educators, parents and carers of Autistic First Nations school students; ACCO staff
Co-design partners
First Nations communities, ACCOs, First Peoples Disability Network
Principles
Cultural safety · co-design · community-led engagement · strengths-based · reciprocity & respect · alignment with Closing the Gap and ATSIEAP
Targets
Min 10 workshops per year. Min 2 ACCO partnerships per year, supported across the contract period.
Translation
Materials adaptable to local context; First Nations language translation where required
Compliance
IPP / MMR if any Remote Area delivery; capability uplift evidenced via ACCO-endorsed case report
Service 5 · Targeted workshops and training for CALD families · Participant-facing · Co-design mandated
Audience
CALD parents and carers of Autistic school-age students. CALD includes the Deaf community.
Co-design partners
CALD communities, National Ethnic Disability Alliance, advocacy groups, Autistic CALD voices
Targets
Min 10 workshops per year. Priority access for new migrants and refugees. Delivered nationally including regional, rural, remote.
Translation
Materials adaptable to community; translation into community languages and Auslan
Facilitators
Persons with experience working with CALD families; able to speak the community's primary language(s)
Service 6 · Online learning and information services · Platform · MVP 1 Jul 2026 · Full release 31 Dec 2026
Sub-services
6A program website · 6B online learning system · 6C information & learning resources
Resource formats
Fact sheets · articles / blogs · videos & animations · webinars · interactive tools · podcasts
Accessibility
WCAG 2.1 AA · Easy Read · Auslan · 15+ community languages
Co-design via
User-Centred Design with diverse users incl. people with disability, First Nations, CALD, key stakeholders, the department
Standards
Australian Government Digital Experience Policy (Service / Access / Inclusion / Performance Standards) · 99.95% uptime · PSPF / ISM / Essential Eight
Service 7 · Stakeholder engagement and promotion · Engagement
Sub-services
7A communications & promotion plan · 7B stakeholder forums
Forum cohorts
1) Autism & disability associations + allied health · 2) First Nations (FPDN, ACCOs, community members) · 3) CALD (NEDA, advocacy, Deaf community) · 4) Australian Government · 5) State / Territory + non-gov education authorities + parent & carer associations
Purpose
Forums inform Services 1–6; promote across jurisdictions; encourage participation from priority cohorts
Note for Melbourne
Autism CRC's role in co-design needs to be located clearly relative to these forums (see open questions, §6).
Service 8 · Program data, performance monitoring and continuous improvement · Assurance
Sub-services
8A data collection & management · 8B performance monitoring & CI
Plan
Data, Performance and Continuous Improvement Plan, approved by the department
Inputs
Service metrics · participant feedback & surveys · website & social analytics · stakeholder forum feedback · delivery partner input
Reporting
Quarterly progress · annual reports · IPP QPR · AIP implementation · ad hoc · final report
Service 9 · Targeted projects (if required) · On-call
Scope
Research projects · new training resources · tailored training for identified cohorts. Earliest commencement: 1 July 2027.
Triggers
Emerging issues or new government priorities, agreed in writing with the department
Costing basis
Hourly & daily rates, plus indicative costs for: a research project + report; a new learning resource; a new training session

Cross-cutting requirements that shape co-design

  • Requirement 5 — Co-design method. The RFT specifies that, where co-design is required, the supplier must adhere to the principles and follow the guidance of the People with Disability Australia (PWDA) Co-Design Programming Overview. This is the named methodology in the contract.
  • Lived experience throughout. Content and delivery must incorporate lived experience and the perspectives of the autism community across all services, not only Services 4 and 5.
  • Student voice. Explicitly named in Service 3 ("options for a student voice, where appropriate") within the whole-school community model. No equivalent named hook in Services 1, 2, 4, 5, 6 — worth deciding whether to embed it more widely.
  • Department approval. All training materials, online resources, website content and promotional content must be approved by the department prior to publication or use.
  • Phase 4 inheritance. Existing Phase 4 materials will be handed over and must be reviewed, refreshed, replaced or built on — co-design starts from a body of existing content, not a blank slate.

02 · Program / session catalogue

The actual things that need to be designed

Pulled out of the RFT and grouped by modality so the room can see, in one place, the surface area of what has to be co-designed. Durations are indicative — the RFT names some formats explicitly (multi-day, virtual classroom, short topic) but leaves length to the tenderer; 90-minute topic sessions are flagged here as a likely default for the online topic offer.

Face-to-face

Format | Indicative duration | Audience | Service(s)
Comprehensive PL workshop | 1–2 days | Teachers, school leaders, paraprofessionals | 1, 3
Short / targeted topic session | Half-day or 90 min | Teachers, school leaders | 1, 3
Faculty / leadership team training | Half-day to full day | School leadership groups | 1, 3
Comprehensive parent / carer workshop | 1 day | Parents and carers | 2, 3
Parent / carer topic session | 2–3 hrs (incl. weekend / after-hours) | Parents and carers | 2
Whole-school community workshop | Variable, scoped per school | Staff + parents + leadership | 3
First Nations community workshop | Co-designed with community | Educators, parents, carers in First Nations communities | 4
CALD community workshop | Co-designed with community, in language | CALD parents and carers | 5

Online — live

Format | Indicative duration | Audience | Service(s)
Virtual classroom (interactive PL) | 2 hrs typical | Teachers, school staff | 1, 3
Virtual parent / carer session | 90 min typical | Parents and carers | 2
Online topic session | 90 min | Teachers, parents (varies by topic) | 1, 2, 3, 6B
Webinar (broadcast format) | 45–60 min | All audiences | 6C

Online — self-paced and mixed

Format | Indicative duration | Audience | Service(s)
Foundational online module (accredited) | 30–60 min per module | Teachers, school staff | 6B
Topic module (self-paced) | 20–45 min | All audiences | 6B
Mixed-mode (pre-work + live workshop) | Module + 1–2 hr live | Teachers, parents | 1, 2, 3
Educator Community of Practice | Ongoing | Past PL participants | 1C, 6A
Parent / carer networking forum | Ongoing | Past participants | 2C, 6A

Information & learning resources (Service 6C)

Format | Audience focus | Notes
Fact sheet | All audiences (general & targeted) | Easy Read, Auslan, 15+ community languages where applicable
Article / blog | Parents, school staff | Topical / current-issue focus
Video / animation | All audiences | Strong fit for First Nations, CALD, Easy Read variants
Webinar (recording) | All audiences | Companion to live broadcasts
Interactive tool | Parents, school staff | Higher build cost — co-design priority
Podcast | Parents, school staff, community | Strong vehicle for lived-experience storytelling

What the catalogue tells us about the co-design surface area

  • Five distinct audience streams — educators, parents/carers, whole-school communities, First Nations, CALD — each with their own variant of most formats.
  • Three layers of intensity — long-form comprehensive workshops, mid-form 90-minute sessions, short-form modules and resources.
  • One platform spine — Service 6 — that has to integrate with all of the above, on a fixed deadline.
  • Two ongoing community structures — educator CoP and parent / carer networking forum — that need their own co-design and moderation model.

03 · Co-design inputs needed

Whose voice we need, and where it lands

A first-pass map of the participation expectations the RFT places on Phase 5. The aim here is not to assign people to roles — that's part of what Cheryl's process will do — but to make visible the range of voices that need to be embedded, and where each one matters most.

Autistic people

Lived experience must shape content and delivery across all services. Includes Autistic adults, Autistic students (with appropriate consent and support), Autistic educators and Autistic parents.
Embedded across · Services 1, 2, 3, 6 mandate · Services 4, 5 cross-cut

People with disability (broader)

The RFT names "people with disability" alongside Autistic people. Co-design via PWDA's framework is the named methodology under Requirement 5.
Method anchor · all services

Students (student voice)

Explicitly named only in Service 3 ("options for a student voice, where appropriate"). A genuine open question for Melbourne is whether it should also reach into Services 1, 2 and 6.
Named · Service 3 · candidate for · 1, 2, 6

Parents and carers

Both as participants and as co-designers. Required perspective in resource development, parent / carer training, whole-school community model, and resource formats.
Embedded · Services 2, 3, 6C

First Nations communities

Co-design is mandated, not optional. Cultural safety, community-led engagement, ACCO partnerships, alignment with Closing the Gap and ATSIEAP. First Peoples Disability Network in Service 7B.
Mandated · Service 4 · cross-cut · 1, 2, 6

CALD communities

Co-design is mandated. NEDA in Service 7B. Includes new migrants, refugees, the Deaf community. Translation into 15+ community languages and Auslan.
Mandated · Service 5 · cross-cut · 1, 2, 6

Teachers, principals, school leaders

Co-design with educators and education authorities is required to ensure local context, policies and priorities are reflected. Must align with APST / APSP and teacher PL accreditation.
Required · Services 1, 3, 6

Education authorities

State / territory government, Catholic, independent. They sit in Service 7B, but they also gate the practical question of whether PL is recognised in each jurisdiction.
Forum · Service 7B · gating · Service 1

Allied health and peak bodies

Speech pathologists, occupational therapists, autism / disability advocacy peak bodies. Inform resources and identify communities and schools.
Forum · Service 7B

Department of Education

Approves all materials. Approves the co-design method (via Quality Management Plan). Convenes Service 7 forums. Member of every governance loop.
Approval · all services

Phase 4 evidence base

Existing materials, evaluation findings, and a body of feedback from 80,000+ participants. Treat this as a "voice in the room" — co-design starts informed, not from zero.
Inheritance · all services

Autism CRC

Named in Requirement 5 as a research source the supplier should "pay particular attention to". Their position in the Phase 5 co-design model — peer, advisor, deliverable contributor — is exactly the question Melbourne is meant to start to answer.
RFT-named research source · role TBD

Layers of input — a working distinction

  • Direction — voices that shape what gets designed in the first place. Lived experience advisors, First Nations and CALD community representatives, education authorities. Most leverage when engaged earliest.
  • Co-creation — voices in the room while content is being made. Autistic educators, parents, students (where appropriate), allied health.
  • Validation — voices that test what's been made. UCD testing for Service 6, user acceptance testing, pilot delivery.
  • Continuous — voices that keep informing change after launch. Service 7B forums, the educator CoP, Service 8 surveys, ad-hoc participant feedback.

04 · Draft workshop flow

A possible shape for the Melbourne session

A lightweight running order for the in-room session, designed to defer to Cheryl's facilitation while making sure the working surface area gets covered. Times are indicative; the spine matters more than the clock.

  1. Open · ~15 min · Cheryl-led Welcome, framing, and Cheryl's co-design approach Cheryl sets the method for the session and the broader phase. The PP team listens before pitching anything.
  2. Orient · ~20 min Walk through the service inventory and program catalogue PP / Autism CRC together skim §1 and §2 of this scaffold, just to anchor shared understanding of the surface area. Not a debate.
  3. Map · ~30 min Where does co-design land hardest? Group identifies the 3–4 services where co-design has the highest stakes (likely 3, 4, 5, 6 — but let the room decide). These get the deeper attention; the others are flagged for later workstreams.
  4. Voice · ~30 min Whose voice, where, in what role Walk through §3. Treat the four layers (direction / co-creation / validation / continuous) as a working frame — not a deliverable. The student-voice question is worth surfacing here explicitly.
  5. Frame · ~30 min · Cheryl-led Sketch a working co-design framework Cheryl drives — using whatever method she's bringing, informed by what's been mapped. Outcome: a shared rough framework, even if it's still a sketch.
  6. Position · ~20 min Autism CRC's role in Phase 5 Where does Autism CRC sit relative to the Service 7B forums? Are they an advisor, a research partner, a delivery partner, or a hybrid? This needs an explicit answer or an explicit "we'll resolve it by date X".
  7. Close · ~15 min Decisions, parking lot, owners, dates Capture: what was decided, what's parked for next session, who owns what, and the next 2–3 milestones. Send the page back out updated within 48 hrs.

05 · Slide outline

A small visual deck for projecting at the start

Five slides, designed to be shown only if the room wants something at the front. Click Enter presenter mode in the sidebar to step through them full-screen; arrow keys / Esc to navigate and exit.

Slide 1 / 5

Phase 5 at a glance

1 July 2026 – 30 June 2029, with possible 24-month extension
9 services · 5 audience streams · 3 years · 15+ languages
  • Services 1–5 are participant-facing (PL, parents, whole school, First Nations, CALD)
  • Service 6 is the platform — website MVP by 1 July 2026, full release by 31 Dec 2026
  • Services 7–9 are engagement, assurance, and on-call projects
Slide 2 / 5

What needs to be designed

The surface area of Phase 5 co-design
Sessions & workshops
  • Comprehensive PL (multi-day)
  • Targeted / 90-min topic sessions
  • Whole-school community model
  • First Nations community workshops
  • CALD community workshops
Platform & resources
  • New website + online learning system
  • Self-paced modules (accredited)
  • Webinars, podcasts, videos, interactive tools
  • Easy Read, Auslan, 15+ languages
  • Educator Community of Practice + parent forum
Slide 3 / 5

Voices to embed

PWDA Co-Design Programming Overview is the RFT-named method
  • Always present — Autistic people, people with disability, lived experience of the autism community
  • Mandated co-design partners — First Nations communities & ACCOs, CALD communities incl. the Deaf community
  • System voices — teachers, principals, education authorities, parents and carers
  • Named explicitly in RFT — student voice in Service 3; open question whether it reaches further
  • Working distinction — direction · co-creation · validation · continuous
Slide 4 / 5

Where co-design lands hardest

Where the room's attention is most leveraged
  • Service 3 — whole school community. Action plan, school champion, student voice, 12-month support cycle.
  • Service 4 — First Nations. ACCO partnerships, cultural safety, capability uplift, IPP / MMR if remote.
  • Service 5 — CALD. Community-by-community design, in-language delivery, new migrant / refugee priority.
  • Service 6 — platform. UCD with diverse users; 1 July 2026 MVP deadline forces early co-design decisions.
  • Service 7B — forums. Standing structures that feed all of the above.
Slide 5 / 5

What we want from this session

A starting scaffold, not a finished method
  • A shared map of the co-design surface area we agree on
  • A working framework — Cheryl-led — that the team can take forward
  • A clear-enough answer to "how does Autism CRC sit in this?"
  • Three or four next steps with named owners and dates
  • An explicit list of what we're not deciding today

06 · Open questions for Cheryl

Worth asking before the meeting, so the room starts ahead

Things to raise in the pre-meeting so the team isn't dependent on Cheryl bringing all the structure. None of these displace her process — they just make sure the day uses the room well.

  1. What co-design method are you bringing into the room? The RFT names the PWDA Co-Design Programming Overview as the contractual anchor under Requirement 5. Knowing whether Cheryl is using PWDA as-is, an adaptation, or a different framework that she'll map to PWDA tells us how the day's work translates into the QMP later.
  2. Where do you want lived-experience advisors in the room vs. consulted out of session? Sets expectations for how Melbourne itself models the co-design we're trying to design. Also clarifies whether the room is "PP + Autism CRC" planning, or "PP + Autism CRC + lived experience" co-creating.
  3. How do you see Autism CRC's role across Phase 5? Are they a research source we cite, an advisory partner that sits alongside Service 7B forums, or a delivery contributor on specific services? Different answers imply different governance, different IP arrangements, and different framing for this Melbourne meeting itself.
  4. How do you want student voice to operate? The RFT names it only in Service 3, but the spirit of co-design points wider. A position before Melbourne avoids the room re-litigating it under time pressure.
  5. What cadence of co-design milestones do you envisage across the three years? Phase 1 transition runs Jan–Jun 2026; service delivery starts 1 Jul 2026; full website by 31 Dec 2026. The co-design rhythm needs to fit those gates. A rough sketch from Cheryl saves the room from improvising one.
  6. Where do you want the PP team holding a pen, vs. holding the space? Avoids the trap where PP staff drift into co-designing on behalf of the people we should be co-designing with. Also tells the team how to behave in the room.
  7. What's a "good outcome" for the Melbourne session in your view? Gives the team a single shared definition of done so the day doesn't become open-ended. Pairs with §4's "decisions, parking lot, owners" close.

07 · Evaluation framework considerations

What we need to bake in now so ACER can evaluate it later

The independent evaluation of Phase 5 will be commissioned by the Department of Education and is likely to be run by an external research body such as ACER. Service 8 is the supplier's own data and continuous-improvement infrastructure; the independent evaluation sits over the top of it. The decisions we make in co-design — what we measure, who we ask, how we capture lived experience, what counts as success — directly determine whether the program is evaluable when the time comes.

What the RFT actually says. Requirement 2 obliges the supplier to fully cooperate with any evaluation activities initiated by the department and undertaken by a third party — including provision of Contract Material, information and data, and access to personnel, subcontractors and participants. Service 8A requires a Data, Performance and Continuous Improvement Plan agreed with the department. The independent evaluation is the department's call; the supplier's job is to make sure the program is set up to support it.

Considerations the framework needs to cover

Theory of change / program logic

A clear logic model linking inputs → activities → outputs → short-term outcomes → long-term outcomes for each audience stream. Without this, evaluators are reverse-engineering intent. Should be co-designed early and approved through the Quality Management Plan.
Foundational · sets what counts as success

Outcome dimensions

Multiple chains, multiple audiences. Educator change (knowledge, confidence, classroom practice). Parent / carer change (knowledge, advocacy, engagement). Student experience and inclusion. School-level culture and policy. Each needs its own measurement approach.
Avoid conflating delivery volume with impact

KPIs vs evaluation outcomes

Annual delivery targets and KPIs (Attachment A) measure that the program ran. Evaluation measures whether it worked. Both matter, but they're different instruments — an evaluation framework that only restates the KPIs has no independent yield.
Distinct from contract performance reporting

Baseline and longitudinal capture

Pre-program / pre-session measurement so change can be attributed. Follow-up at 3, 6, 12 months for the lasting question — does new educator practice persist; does parent advocacy stick. Whole-school community model (Service 3) has a 12-month cycle that lends itself naturally to this.
Service 3 cycle · CoP follow-up · forum data

Phase 4 continuity

Phase 4 ran 2021–2026 with 80,000+ participants. Continuity of measures matters: the more Phase 5 instruments mirror Phase 4, the stronger the longitudinal story. Any divergence should be deliberate, documented, and defensible.
Inheritance · longitudinal claim space

Comparison logic

A national open-access program is hard to run a true counterfactual on. Realistic options: pre / post within participants; matched comparisons across schools; staggered roll-out designs; comparison with non-participating school cohorts. Worth flagging early — it shapes the data architecture.
Methodological choice with design consequences

Equity disaggregation

Outcomes broken down by First Nations, CALD, remoteness, sector, school type, age band. A whole-of-program average can hide that the program works for one cohort and not another. The evaluation framework needs the data to support this from day one.
All audience streams · priority cohorts

First Nations data sovereignty

Data about First Nations participants requires Indigenous Data Governance principles, community endorsement, and ACCO involvement in evaluation design — not only delivery design. ACCO-endorsed case reports under Service 4C are part of this picture but not all of it.
Service 4 · cross-cuts evaluation method

Culturally safe instruments

Surveys, interview protocols and outcome measures need cultural and linguistic adaptation, not just translation. Includes Auslan, Easy Read, plain-English variants, and instruments validated for use with Autistic respondents.
Services 4, 5, 6 · accessibility & cultural fit

Lived experience in evaluation design

The same logic that drives co-design of services applies to evaluation: Autistic people, parents, students (where appropriate), First Nations and CALD voices should help shape what success looks like and how it's measured. Otherwise the evaluation reproduces external assumptions about what good outcomes are.
Mirrors PWDA co-design framing

Student voice and ethics

Where students are part of evaluation (consistent with Service 3's "options for student voice"), it needs ethical review, age-appropriate consent, supported participation, and clear protocols on what's done with what they say. Worth a position before ACER asks.
Service 3 named · candidate for wider use

Privacy, consent, data sharing

APP-compliant collection, clear privacy statements, consent for downstream sharing with third-party evaluators, and ISM / PSPF-compliant storage and access. Service 6's website needs to support this from MVP day one — adding consent retrospectively is painful.
Requirement 2 · Requirement 3 · Service 6A

Process vs outcome vs impact

Three distinct lenses the evaluator will likely run in parallel. Process: was the program delivered as designed (fidelity, reach, accessibility). Outcomes: did participants change. Impact: are Autistic students experiencing different educational outcomes. The framework needs data feeding all three.
Determines what data is collected and when

Cost-effectiveness / value for money

Increasingly expected of Commonwealth program evaluations. Requires unit-cost data per participant, per workshop, per resource — which has to be tracked through delivery, not reconstructed afterwards.
Needs delivery-side cost capture from day one

Alignment with the National Autism Strategy

Phase 5 is named as an existing priority under the National Autism Strategy 2025–31. The evaluation will be read for its contribution to strategy outcomes, not only program-internal outcomes. The framework should make that contribution legible.
External alignment · whole-of-government read

Independence safeguards

Evaluator needs unfiltered access to data, participants, personnel and materials. Supplier doesn't pre-screen findings. Worth being explicit early so the working culture supports it — independence is easier to design in than to retrofit.
Requirement 2 · contractual obligation

What this means for the Melbourne conversation

  • Co-design the logic model alongside the services. The conversation about what good looks like is the same conversation in both directions. If the room agrees on outcomes, the evaluation framework is half written.
  • Decide measurement at the same time as design. Each service should leave Melbourne with at least a draft answer to "what would tell us this is working?" — even if the instruments come later.
  • Bring lived experience into evaluation design, not just service design. Otherwise the people the program is for don't get a say in what success means for them.
  • Map who holds what. Supplier owns delivery data and continuous improvement. Department commissions and owns the independent evaluation. ACER (or whoever) brings methodological independence. Autism CRC's role here is part of §6's open questions.
  • Build for evaluability from MVP. Service 6's MVP on 1 July 2026 is the practical deadline for whether registration, consent capture, and data linkage are designed in or bolted on.

Working questions to flag for the department and ACER

  1. Will ACER (or another body) be commissioned for the full term, or for milestone-based evaluation moments? Determines whether evaluation is continuous and embedded, or summative and periodic. The data architecture is different in each case.
  2. Is there a Phase 4 evaluation report we should be designing continuity from? If Phase 4 outcomes were measured a certain way, repeating those measures gives a longitudinal story. Diverging deliberately is fine — diverging accidentally is wasted information.
  3. What outcome statements does the department most want to be able to make at the end of Phase 5? Working backwards from the desired statement defines the data you need to collect. Saves the team from over-collecting and from under-collecting in equal measure.
  4. How will First Nations data sovereignty be operationalised in the evaluation? Affects what data is collected, who holds it, who consents on whose behalf, and how findings about First Nations participants are reported. Best resolved with ACCO partners early.
  5. Where does the supplier's continuous improvement reporting end and the independent evaluation begin? Avoids duplication and avoids gaps. Service 8 is the supplier's instrument; the independent evaluation is the department's. The handshake between the two is the point worth defining.