MI Write Feedback: Helping Students Understand and Improve Their Writing
MI Write provides students with detailed evaluation and feedback on their writing, helping them understand its quality and improve specific aspects of their work. Using an intelligent rule-based system, MI Write evaluates student essays against key linguistic features and established scoring rubrics. Feedback is designed to be actionable, guiding students through meaningful revisions.
Writing Analysis: Evaluation and Feedback
MI Write offers a comprehensive Writing Analysis that combines quantitative and qualitative feedback, helping students understand their current performance, set clear writing goals, and receive targeted guidance on how to improve.
Quantitative Feedback:
- Trait Scores: Student writing is assessed across six essential writing traits: development of ideas, organization, style, word choice, sentence fluency, and conventions. Each trait is scored on a 1–5 scale based on scoring rubrics.
- Holistic Score: A total score (6–30) is generated by summing trait scores, aligning with the Six Trait Scoring Model.
- Percentile Rank & Stanine Score: These provide a norm-referenced comparison of student writing performance.
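The holistic score described above is a simple sum of the six trait scores. As a minimal sketch (the trait names and function are illustrative, not MI Write's actual implementation), the arithmetic looks like this:

```python
# Illustrative sketch only: MI Write's real scoring is rubric- and
# model-driven; this shows just the holistic-score arithmetic.

TRAITS = (
    "development_of_ideas",
    "organization",
    "style",
    "word_choice",
    "sentence_fluency",
    "conventions",
)

def holistic_score(trait_scores: dict) -> int:
    """Sum six trait scores (each 1-5) into a holistic score (6-30)."""
    for trait in TRAITS:
        score = trait_scores[trait]
        if not 1 <= score <= 5:
            raise ValueError(f"{trait} score {score} is outside the 1-5 rubric range")
    return sum(trait_scores[t] for t in TRAITS)

# An essay scoring 4 on every trait earns a holistic score of 24.
print(holistic_score({t: 4 for t in TRAITS}))  # 24
```

Because each trait ranges from 1 to 5, the holistic score is bounded between 6 and 30, matching the Six Trait Scoring Model described above.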
Qualitative Feedback:
- Rubric-Based Trait Feedback: Each trait includes feedback statements aligned with each score level (1–5), offering concrete guidance for improvement.
- Evaluation Prompts: Students receive targeted questions to encourage self-assessment, such as:
  - Have you clearly stated your central idea and addressed all parts of the prompt?
  - Is your writing organized logically, with a strong introduction and conclusion?
  - Does your word choice enhance clarity and engagement?
  - Have you used a variety of sentence structures to improve fluency?
  - Are spelling, punctuation, and grammar correct?
- Linguistic Feature Analysis: The feedback considers key writing characteristics and linguistic patterns to provide insights on coherence, clarity, and fluency.
- Integrated Lessons: Trait feedback links students to relevant writing lessons, supporting improvement in areas such as sentence structure, word choice, and organization.
Introducing Writing Advisor: AI-Powered Writing Feedback for More Meaningful Revisions
MI Write now includes Writing Advisor, an AI-powered feedback tool that enhances the revision process with personalized, context-aware feedback. Writing Advisor supplements, rather than replaces, MI Write’s rule-based feedback: unlike those predefined statements, it uses generative AI to provide more precise and actionable suggestions tailored to each student’s writing.
Key Features of Writing Advisor:
- Aligned to Scoring Rubrics: Feedback is designed to complement MI Write’s existing trait-based scoring, ensuring students receive actionable guidance that aligns with writing standards.
- Context-Aware Feedback: Writing Advisor evaluates the unique context of each essay, offering feedback that helps students refine their writing with greater precision.
- Secure, Self-Hosted AI Engine: All generative AI feedback is processed within MI Write’s self-hosted environment, ensuring student data privacy and security.
- User Feedback for Continuous Improvement: Students can provide direct feedback on the Writing Advisor's suggestions using the “Was this advice helpful?” button. This feature allows MI Write to continuously refine and improve the quality of AI-generated feedback based on user input.
Example of Writing Advisor Feedback:
These personalized, generative-AI suggestions go beyond traditional rubric-based feedback, helping students make targeted revisions that improve the depth, clarity, and organization of their writing.
Development and Testing of Writing Advisor
The development of Writing Advisor followed a multi-stage process designed to ensure high-quality, context-aware feedback. First, large language models (LLMs) were used to generate a broad sample of writing feedback. Human reviewers then carefully curated this initial feedback for quality, appropriateness, and variety, addressing a common challenge with LLMs: the tendency to repeat similar advice. The curated dataset was subsequently used to fine-tune a smaller, more efficient LLM through a technique known as model distillation, allowing it to replicate the capabilities of the larger models while operating faster and more reliably in real time.
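The curation step above can be pictured as assembling reviewer-approved prompt–response pairs into a fine-tuning dataset. The sketch below uses a common JSONL layout; the field names, file name, and example text are assumptions for illustration, not MI Write's actual pipeline:

```python
import json

# Hypothetical curated examples: each pairs a feedback prompt (with an
# essay excerpt) and reviewer-approved advice from a larger LLM.
curated_examples = [
    {
        "prompt": "Give one actionable suggestion to improve organization:\n"
                  "My essay is about recycling. Recycling is good...",
        "response": "Try opening with a clear thesis that previews your "
                    "main points, then devote one paragraph to each point.",
    },
]

def to_finetune_jsonl(examples, path):
    """Write curated prompt/response pairs as JSONL (one JSON object
    per line), a typical input format for fine-tuning a smaller model."""
    with open(path, "w", encoding="utf-8") as f:
        for ex in examples:
            f.write(json.dumps(ex) + "\n")

to_finetune_jsonl(curated_examples, "writing_feedback_train.jsonl")
```

Filtering near-duplicate responses during curation is what counters the repetition tendency noted above, so the distilled model learns varied advice rather than one stock phrasing.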
Extensive testing has demonstrated that the fine-tuned LLM used in Writing Advisor is highly reliable in generating relevant and constructive feedback. The model is specifically optimized for a single, specialized task—providing writing feedback—reducing the likelihood of irrelevant or off-topic responses. Additionally, students never interact with the LLM directly. Instead, their writing is processed through predefined prompts that structure the AI’s response, ensuring consistency and focus.
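A minimal sketch of how such predefined prompts might work: the student's essay is slotted into a fixed template rather than sent to the LLM as a free-form conversation. The template wording and function names here are assumptions, not MI Write's actual prompts:

```python
# Hypothetical predefined prompt: students never write the prompt
# themselves; their essay is embedded in a fixed, task-focused template.
FEEDBACK_TEMPLATE = (
    "You are a writing tutor. Using the {trait} rubric, give one "
    "constructive, age-appropriate suggestion for the essay below. "
    "Respond only with writing feedback.\n\nESSAY:\n{essay}"
)

def build_feedback_prompt(essay: str, trait: str) -> str:
    """Wrap the student's essay in the predefined template for one trait."""
    return FEEDBACK_TEMPLATE.format(trait=trait, essay=essay.strip())

prompt = build_feedback_prompt("Dogs are grate pets...", "conventions")
```

Because the instruction text is fixed and the model is fine-tuned for this single task, responses stay focused on writing feedback regardless of what the essay contains.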
As with all generative AI systems, there is always a small possibility of unintended responses. However, rigorous testing, conducted using MI Write’s prepackaged prompts, found no instances of Writing Advisor generating off-topic or inappropriate feedback. Even in cases where student essays contained extreme or inappropriate content, the system consistently provided responses that were neutral, constructive, and appropriate.
Benefits of Writing Advisor Feedback
Writing Advisor enhances accessibility and feedback uptake by providing personalized, context-aware guidance that is easier for students to understand and apply. Unlike traditional rule-based feedback, which relies on predefined statements tied to rubric scores, Writing Advisor adapts to the specific content and structure of each student’s writing, allowing for more natural, conversational feedback that feels relevant and actionable. This approach also reduces cognitive load by presenting suggestions in a clear, intuitive format, making feedback more accessible to diverse learners, including those who struggle to interpret structured rubric-based feedback. By offering tailored advice in a more engaging, interactive manner, Writing Advisor increases the likelihood that students will read, understand, and implement the feedback, ultimately leading to stronger writing.
Best Practice in the Classroom: Getting the Most Out of MI Write
Using MI Write most effectively requires weaving its automated writing feedback into a structured process of assigning, drafting, revising, and reflecting on students’ work. Teachers who create or select prompts that align with their broader curriculum encourage meaningful writing practice beyond sporadic, stand-alone exercises. Once students draft their responses, they should examine the automated feedback and revise with the system’s suggestions in mind. While Writing Advisor makes feedback more accessible than ever before, teachers can strengthen this feedback loop by guiding students through interpreting the suggestions, prompting them to make deeper changes rather than superficial edits. In doing so, educators not only increase students’ writing practice opportunities, but also help them internalize self-regulatory strategies.
Additionally, MI Write generates classroom- and student-level data that teachers can use for instructional planning. Reviewing class reports can reveal persistent issues—like weak sentence structure or insufficient textual evidence—so teachers can plan small-group mini-lessons or conferencing sessions. Students benefit as well from learning to track their own improvement across multiple essays, motivating them to take greater ownership of the revision process. Combined with direct teacher support, repeated practice, and clear learning targets, the system’s automated scores and trait-level feedback can help elevate writing performance over time.