Conversation

@HawKhiem HawKhiem commented Jan 26, 2026

Summary

Introduces an AI-powered Instructor Checklist panel in the programming exercise editor. This feature analyzes the problem statement to infer learning goals, assess difficulty consistency, and identify quality issues (clarity, coherence, completeness), helping instructors create higher-quality exercises.

Checklist

General

Server

  • Important: I implemented the changes with a very good performance and prevented too many (unnecessary) and too complex database calls.
  • I strictly followed the principle of data economy for all database calls.
  • I strictly followed the server coding and design guidelines and the REST API guidelines.
  • I added multiple integration tests (Spring) related to the features (with a high test coverage).
  • I added pre-authorization annotations according to the guidelines and checked the course groups for all new REST Calls (security).
  • I documented the Java code using JavaDoc style.

Client

  • Important: I implemented the changes with a very good performance, prevented too many (unnecessary) REST calls and made sure the UI is responsive, even with large data (e.g. using paging).
  • I strictly followed the principle of data economy for all client-server REST calls.
  • I strictly followed the client coding guidelines.
  • I strictly followed the AET UI-UX guidelines.
  • Following the theming guidelines, I specified colors only in the theming variable files and checked that the changes look consistent in both the light and the dark theme.
  • I added multiple integration tests (Jest) related to the features (with a high test coverage), while following the test guidelines.
  • I added authorities to all new routes and checked the course groups for displaying navigation elements (links, buttons).
  • I documented the TypeScript code using JSDoc style.
  • I added multiple screenshots/screencasts of my UI changes.
  • I translated all newly inserted strings into English and German.

Changes affecting Programming Exercises

  • High priority: I tested all changes and their related features with all corresponding user types on a test server configured with the integrated lifecycle setup (LocalVC and LocalCI).
  • I tested all changes and their related features with all corresponding user types on a test server configured with LocalVC and Jenkins.

Motivation and Context

Instructors often struggle to ensure their programming exercises explain concepts clearly, align with declared difficulty levels, and cover appropriate learning goals. Without immediate feedback, these issues may only be discovered when students struggle. This feature provides an automated, intelligent review of the problem statement before the exercise is published.

Description

  1. Client:

    • Added ChecklistPanelComponent (Angular standalone) integrated into the programming exercise update view.
    • Uses the generated OpenAPI client HyperionProblemStatementApiService to communicate with the backend.
    • Displays three analysis sections:
      • Inferred Learning Goals: Lists skills extracted from the text, confidence levels, and taxonomy classification.
      • Difficulty Assessment: Compares the AI-suggested difficulty with the instructor's declared difficulty.
      • Quality Issues: Highlights potential clarity, coherence, or completeness issues.
    • Styled to support both Light and Dark modes using standard Artemis/Bootstrap components.
  2. Server:

    • Implemented HyperionChecklistService to orchestrate AI calls.
    • Added DTOs ChecklistAnalysisRequestDTO, ChecklistAnalysisResponseDTO for structured communication.
    • Added new Prompt Templates checklist_learning_goals.st, checklist_difficulty.st, checklist_quality.st for targeted AI analysis.
    • Exposed a new endpoint in HyperionProblemStatementResource.
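The difficulty check described above comes down to comparing the instructor's declared difficulty with the AI-suggested one. A minimal sketch of the client-side shapes and the comparison, assuming field names derived from the DTO names in this PR (the real generated OpenAPI models may differ):

```typescript
// Hypothetical client-side shapes mirroring the server DTOs named in this PR.
// Field names are assumptions for illustration, not the actual generated models.
interface LearningGoalItem {
    title: string;
    taxonomyLevel: string; // e.g. a Bloom taxonomy level such as "APPLY"
    confidence: number;    // 0..1
}

type Difficulty = 'EASY' | 'MEDIUM' | 'HARD';

interface DifficultyAssessment {
    declaredDifficulty: Difficulty;  // what the instructor selected
    suggestedDifficulty: Difficulty; // what the AI inferred from the problem statement
}

// A pure helper the panel could use to decide whether to flag a mismatch.
function isDifficultyConsistent(assessment: DifficultyAssessment): boolean {
    return assessment.declaredDifficulty === assessment.suggestedDifficulty;
}

const example: DifficultyAssessment = { declaredDifficulty: 'EASY', suggestedDifficulty: 'HARD' };
console.log(isDifficultyConsistent(example)); // false -> the panel would surface a warning
```

Keeping the comparison in a pure helper like this makes it trivial to cover with a Jest unit test, independent of the panel's template.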

Steps for Testing

Prerequisites:

  • 1 Instructor
  1. Log in to Artemis as an Instructor.
  2. Navigate to Course Management > Exercises > Programming Exercises.
  3. Create a new exercise or edit an existing one.
  4. Scroll down to the Problem Statement section.
  5. You should see the Instructor Checklist panel below the editor.
  6. Click Analyze Problem Statement.
  7. Wait for the analysis to complete (loading spinner).
  8. Verify that the results (Learning Goals, Difficulty, Quality Issues) are displayed correctly.
  9. Dark Mode Test: Switch your theme to Dark Mode and verify the panel remains legible and clearly styled.

Exam Mode Testing

Prerequisites:

  • 1 Instructor
  • 1 Exam with a Programming Exercise
  1. Log in to Artemis.
  2. Edit a programming exercise within an Exam.
  3. Verify that the Instructor Checklist panel also appears and functions correctly in the Exam exercise editor mode.

Testserver States

You can manage test servers using Helios. Check environment statuses in the environment list. To deploy to a test server, go to the CI/CD page, find your PR or branch, and trigger the deployment.

Review Progress

Test Coverage

Warning: Client tests failed. Coverage could not be fully measured. Please check the workflow logs.

Server

Class/File                              Line Coverage   Lines
ConsistencyIssueCategory.java           100.00%             8
ArtifactLocationDTO.java                100.00%            13
ChecklistAnalysisRequestDTO.java        100.00%             7
ChecklistAnalysisResponseDTO.java       100.00%             6
ConsistencyIssueDTO.java                100.00%             8
DifficultyAssessmentDTO.java            100.00%             6
LearningGoalItemDTO.java                100.00%             5
HyperionChecklistService.java            78.72%           163
HyperionProblemStatementResource.java   100.00%            86

Last updated: 2026-01-27 02:00:44 UTC

Screenshots

@github-actions github-actions bot added server Pull requests that update Java code. (Added Automatically!) client Pull requests that update TypeScript code. (Added Automatically!) programming Pull requests that affect the corresponding module hyperion labels Jan 26, 2026
@github-actions

@HawKhiem Test coverage could not be fully measured because some tests failed. Please check the workflow logs for details.

@github-actions

End-to-End (E2E) Test Results Summary

End-to-End (E2E) Test Report: 223 ran, 222 passed ✅, 1 skipped ⚠️, 0 failed, in 1h 28m 25s 450ms.
No test annotations available.

@github-actions github-actions bot added the tests label Jan 27, 2026

@github-actions

End-to-End (E2E) Test Results Summary

End-to-End (E2E) Test Report: 1 ran, 1 passed ✅, 0 skipped, 0 failed, in 2s 217ms.
No test annotations available.

@github-actions

End-to-End (E2E) Test Results Summary

End-to-End (E2E) Test Report: 223 ran, 222 passed ✅, 1 skipped ⚠️, 0 failed, in 1h 25m 34s 509ms.
No test annotations available.


Projects

Status: Work In Progress
