Form Overview

Title: “Quest Formative Feedback Sprint - Plagiarism Prevention Module Testing”
Description: Help improve our plagiarism prevention quest while learning user testing methods for your own e-learning projects!

Estimated Time: 25-30 minutes
Participants: ~100 students across 3 classes
Groups: Work in pairs or groups of 3, submit one form per group


Google Form Structure

Section 1: Team Information

[Short Answer Questions]

  1. Group Members (Required)
    • List names of all team members
    • Example: Sarah J, Mike K, Alex L
  2. Class Period (Required - Multiple Choice)
    • Period 1
    • Period 2
    • Period 3
    • Other: _______
  3. Testing Location (Multiple Choice)
    • Computer Lab
    • Classroom with laptops
    • Personal devices
    • Other: _______

Section 2: C1 - Plagiarism Case Studies

[Time limit guidance: < 5 minutes]

  1. Best UI element in C1 (Required - Multiple Choice)
    • Page layout/design
    • Audio content
    • Video content
    • Interactive elements
    • Navigation
    • Other: _______
  2. Why was this element best? (Required - Short Answer)
    • Be specific about what worked well
  3. Biggest plagiarism takeaway/surprise (Required - Paragraph)
    • What surprised you most about the consequences? Describe one detail you remember from the cases.
  4. Least favorite UI element in C1 (Required - Multiple Choice)
    • Page layout/design
    • Audio content
    • Video content
    • Interactive elements
    • Navigation
    • Other: _______
  5. Why didn’t this element work? (Required - Short Answer)
    • Be specific about the problem
  6. C1 Difficulty Rating (Required - Linear Scale: 0-5)
    • 0 = Very Hard, 5 = Very Easy
    • Scale: 0, 1, 2, 3, 4, 5

Section 3: C2 - APA Reference & Citation Training

[Time limit guidance: 5-7 minutes]

  1. Best UI element in C2 (Required - Multiple Choice)
    • Page layout/design
    • Quote Finder tool
    • Building your own reference/citation tool
    • Test Fill feature
    • Video tutorials
    • Other: _______
  2. Why was this C2 element most helpful? (Required - Short Answer)
    • Explain what made it effective
  3. Was coming up with quotes too hard? (Required - Multiple Choice)
    • Yes, way too hard
    • Yes, somewhat hard
    • No, just right
    • No, too easy
    • Not applicable
  4. Why? (Quote difficulty explanation) (Required - Short Answer)
    • Explain your quote difficulty rating
  5. Was building references/citations too hard? (Required - Multiple Choice)
    • Yes, way too hard
    • Yes, somewhat hard
    • No, just right
    • No, too easy
    • Didn’t try this feature
  6. Why? (Citation building explanation) (Required - Short Answer)
    • Explain your citation difficulty rating
  7. Quick Win: What will you actually use? (Required - Short Answer)
    • One thing you learned that you’ll apply to real assignments

Section 4: C3 - Error Correction Practice

[Time limit guidance: 5 minutes]

  1. Did you like the C3 error correction page? (Required - Multiple Choice)
    • Yes, loved it
    • Yes, liked it
    • Sort of/neutral
    • No, didn’t like it
    • No, hated it
  2. Best part of C3 (Required - Short Answer)
    • What worked well in the error correction exercises?
  3. Did it feel like something was missing? (Required - Multiple Choice)
    • Yes, definitely missing something important
    • Yes, missing something minor
    • No, felt complete
    • Not sure
  4. If missing, what? (Conditional - Short Answer)
    • Only appears if the previous question (“Did it feel like something was missing?”) was answered “Yes”
  5. C3 Difficulty Rating (Required - Linear Scale: 0-5)
    • 0 = Very Hard, 5 = Very Easy
    • Scale: 0, 1, 2, 3, 4, 5
  6. Would you do more exercises like this voluntarily? (Required - Multiple Choice)
    • Definitely yes
    • Probably yes
    • Maybe
    • Probably no
    • Definitely no

Section 5: C4 - Build Your Own Document

[Time limit guidance: 5 minutes]

  1. Favorite C4 feature (Required - Short Answer)
    • What did you like best about the document building tools?
  2. Why was this your favorite? (Required - Short Answer)
    • Explain what made it effective or enjoyable
  3. Hardest part of C4 (Required - Short Answer)
    • What was most challenging about building your document?
  4. How could the hard part be easier? (Required - Short Answer)
    • Suggest specific improvements
  5. Will you use tools like this for real assignments? (Required - Multiple Choice)
    • Definitely will use
    • Probably will use
    • Might use
    • Probably won’t use
    • Definitely won’t use

Section 6: Overall Quest Experience

[Time limit guidance: 5 minutes]

  1. Did the main page status help you understand progression? (Required - Multiple Choice)
    • Yes, very clear
    • Yes, somewhat clear
    • Neutral/unsure
    • No, somewhat confusing
    • No, very confusing
  2. What was clear about the main page? (Required - Short Answer)
    • Specific elements that helped navigation
  3. What was confusing about the main page? (Required - Short Answer)
    • Specific elements that hindered understanding
  4. Did this help you understand the idea of a quest? (Required - Multiple Choice)
    • Yes, totally got the quest concept
    • Yes, mostly understood quests
    • Somewhat understood
    • Not really
    • No, still confused about quests
  5. How is this different from regular lessons? (Required - Paragraph)
    • Describe the key differences you noticed

Section 7: C5 & C6 Preview Response

[Shown after C1-C4 completion]

  1. After seeing teacher assessment and student review, do you understand how data accumulates? (Required - Multiple Choice)
    • Yes, completely understand the data flow
    • Yes, mostly understand
    • Somewhat understand
    • Not really
    • No, very confused
  2. What’s smart about this assessment approach? (Required - Short Answer)
    • Identify strengths of the automated feedback system
  3. What would you change about C5/C6? (Required - Short Answer)
    • Suggest specific improvements

Section 8: Learning Transfer & Overall Feedback

  1. Will this testing experience help you build your own quest? (Required - Multiple Choice)
    • Yes, learned a lot about user testing
    • Yes, got some useful ideas
    • Maybe, need to think about it
    • Probably not
    • No, didn’t learn much
  2. Most valuable lesson for your own quest building (Required - Short Answer)
    • What will you apply to your own e-learning project?
  3. Overall quest rating (Required - Linear Scale: 1-10)
    • 1 = Terrible, 10 = Excellent
    • Scale: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
  4. Would you recommend this to other students? (Required - Multiple Choice)
    • Definitely recommend
    • Probably recommend
    • Might recommend
    • Probably wouldn’t recommend
    • Definitely wouldn’t recommend
  5. Final suggestions (Optional - Paragraph)
    • Any other feedback, ideas, or suggestions for improvement

Data Analysis Setup

Automatic Charts to Enable:

(Question numbers count continuously across the entire form, not per section.)

  • Questions 9, 21: Difficulty ratings (histograms)
  • Question 38: Overall rating (histogram)
  • Questions 4, 7, 10: UI element preferences (pie charts)
  • Question 2: Class period distribution (pie chart)
  • Questions 12, 14: Task difficulty assessments (stacked bar)

Key Metrics to Track:

  1. Completion Rate: % of groups who finish the entire form
  2. Difficulty Patterns: Average ratings per module
  3. UI Preferences: Most/least popular elements
  4. Learning Transfer: % who say experience helps their projects
  5. Recommendation Rate: % who would recommend to others
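
To make these metrics concrete, here is a minimal pandas sketch; the column names (`c1_difficulty`, `recommend`, etc.) are placeholders and must be matched to the headers in your actual Google Forms export:

```python
import pandas as pd

# Stand-in for the exported responses; column names are hypothetical.
df = pd.DataFrame({
    "c1_difficulty": [4, 2, 5, 3],            # 0-5 linear scale
    "c3_difficulty": [3, 3, 4, 2],
    "recommend": ["Definitely recommend", "Probably recommend",
                  "Might recommend", "Definitely recommend"],
    "final_suggestions": ["More videos", None, None, "Shorter C2"],
})

# Difficulty Patterns: average rating per module
difficulty = df[["c1_difficulty", "c3_difficulty"]].mean()

# Recommendation Rate: % choosing one of the two positive options
rec_rate = df["recommend"].isin(
    ["Definitely recommend", "Probably recommend"]).mean() * 100

# Completion Rate: % of responses with every required question answered
# (final_suggestions is the only optional question, so it is excluded)
completion = df.drop(columns="final_suggestions").notna().all(axis=1).mean() * 100
```

The same pattern extends to the other metrics: each is a filter or mean over one exported column.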

Cross-Analysis Opportunities:

  • Difficulty ratings vs. overall satisfaction
  • UI preferences by class period
  • Learning transfer by overall rating
  • Completion rate by testing location
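
Two of these cross-analyses might be sketched in pandas as follows, with placeholder column names standing in for the real export headers:

```python
import pandas as pd

# Hypothetical columns standing in for the real export.
df = pd.DataFrame({
    "class_period":   ["Period 1", "Period 1", "Period 2", "Period 3", "Period 2"],
    "best_ui_c1":     ["Video content", "Navigation", "Video content",
                       "Interactive elements", "Video content"],
    "c1_difficulty":  [4, 3, 2, 5, 3],
    "overall_rating": [8, 7, 5, 9, 6],
})

# Difficulty ratings vs. overall satisfaction: Pearson correlation
corr = df["c1_difficulty"].corr(df["overall_rating"])

# UI preferences by class period: counts per (period, element) pair
prefs = pd.crosstab(df["class_period"], df["best_ui_c1"])
```

`crosstab` gives a period-by-element table that can feed directly into a stacked bar chart.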

Implementation Tips

Before Testing:

  • Create form and test with 2-3 students first
  • Set up automatic email notifications for responses
  • Prepare sharing link and QR code for easy access
  • Brief students on time management (25-30 min total)

During Testing:

  • Share form link at start: “Complete C1-C4, then fill sections 1-6”
  • Announce time checks: “5 minutes for C1 feedback”
  • Show C5/C6 demo when groups reach section 7
  • Encourage honest feedback: “Help make this better!”

After Testing:

  • Download responses as Excel for deeper analysis
  • Create summary report within 48 hours
  • Share key findings with students to show impact
  • Use data to prioritize improvement sprints
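
The download-and-summarize step could look like the sketch below; the filename and column names are placeholders, and reading .xlsx files with pandas requires the openpyxl package:

```python
import pandas as pd

# In practice: export the linked Sheet (File > Download > Microsoft Excel)
# and load it -- placeholder filename:
# df = pd.read_excel("quest_feedback_responses.xlsx")

# Tiny stand-in frame so the summary step is runnable here:
df = pd.DataFrame({
    "overall_rating": [8, 6, 9, 7, 5],
    "c1_difficulty":  [4, 2, 5, 3, 3],
})

# One-page summary: response count plus descriptive stats per rated question.
summary = df.describe().round(2)
summary.to_csv("summary_report.csv")  # draft to share within 48 hours
```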

Form Settings:

  • Collect names (for follow-up if needed)
  • Allow response editing (in case groups want to revise)
  • Require sign-in (prevents duplicate responses)
  • Limit to 1 response per account (one member signs in and submits on behalf of the group)

This Google Form structure will give you rich, analyzable data from all three classes while teaching students practical user testing methods they can apply to their own quest projects!