The Exam Sheets Evaluator is a Python-based system designed to automate the evaluation of student exam answers. It supports two types of questions:
- One-word answers (1 mark each)
- Short-answer questions (2 marks each)
The system uses natural language processing (NLP) techniques to assess student responses, providing both numeric scores and detailed feedback.
- Automated Grading: evaluates student answers against correct answers using:
  - Cosine similarity for short answers
  - Levenshtein distance for one-word answers
  - Lemmatization for word normalization
- Question Types:
  - One-word questions (exact matching with spelling tolerance)
  - Short-answer questions (semantic similarity evaluation)
- Professor Mode:
  - Load/create question banks
  - Set evaluation thresholds
  - View question banks
- Student Mode:
  - Take randomized exams
  - View results with detailed feedback
  - Track performance over time
  - Practice with question bank
- Performance analytics and progress tracking
- Weak area identification
- Practice mode with immediate feedback
- Exam history storage
- Visualization of performance trends
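As a rough sketch of how these grading techniques fit together (the notebook itself uses scikit-learn and NLTK; the pure-Python helpers, the edit-distance tolerance of 1, and the 0.7 threshold below are illustrative assumptions, not the actual implementation):

```python
import math
import re
from collections import Counter

def levenshtein(a: str, b: str) -> int:
    """Edit distance via the classic dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,               # delete ca
                curr[j - 1] + 1,           # insert cb
                prev[j - 1] + (ca != cb),  # substitute ca with cb
            ))
        prev = curr
    return prev[-1]

def cosine_sim(a: str, b: str) -> float:
    """Bag-of-words cosine similarity (plain counts, no TF-IDF, for brevity)."""
    tokens = lambda s: re.findall(r"[a-z']+", s.lower())
    va, vb = Counter(tokens(a)), Counter(tokens(b))
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

def grade_one_word(student: str, correct: str) -> int:
    """1 mark if the answer matches within an edit distance of 1 (spelling tolerance)."""
    s, c = student.strip().lower(), correct.strip().lower()
    return 1 if levenshtein(s, c) <= 1 else 0

def grade_short_answer(student: str, correct: str, threshold: float = 0.7) -> int:
    """2 marks at or above the similarity threshold, 1 mark above half of it, else 0."""
    sim = cosine_sim(student, correct)
    return 2 if sim >= threshold else (1 if sim >= threshold / 2 else 0)
```

The partial-credit split at `threshold / 2` is one plausible scoring scale; the real notebook exposes its own via the professor settings.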
1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/exam-sheets-evaluator.git
   cd exam-sheets-evaluator
   ```

2. Install the required dependencies:

   ```bash
   pip install pandas numpy scikit-learn nltk matplotlib
   ```

3. Download the required NLTK data:

   ```python
   import nltk
   nltk.download('punkt')
   nltk.download('stopwords')
   nltk.download('wordnet')
   ```

4. Launch the notebook (`.ipynb` files are run through Jupyter, not the `python` interpreter):

   ```bash
   jupyter notebook Exam_Sheets_Evaluator.ipynb
   ```
Choose between Professor or Student mode at startup.

Professor mode:
- Load question bank from CSV
- Create sample question bank
- Adjust evaluation thresholds
- View current question bank

Student mode:
- Take a new exam (randomized questions)
- View previous results
- Check performance analytics
- Practice with questions
- Review question bank
- `Exam_Sheets_Evaluator.ipynb`: main Jupyter notebook containing the system
- `exam_results/`: directory where student results are stored (created automatically)
- Sample question bank CSV files can be created through the professor interface
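Per-student result storage in `exam_results/` could follow a pattern like this minimal sketch (the file layout, `save_result` name, and record fields are assumptions, not the notebook's actual format):

```python
import json
import os
from datetime import datetime

def save_result(student_id: str, score: int, total: int,
                out_dir: str = "exam_results") -> str:
    """Append one exam attempt to a per-student JSON history file."""
    os.makedirs(out_dir, exist_ok=True)  # created automatically on first use
    path = os.path.join(out_dir, f"{student_id}.json")
    history = []
    if os.path.exists(path):
        with open(path) as f:
            history = json.load(f)
    history.append({
        "timestamp": datetime.now().isoformat(timespec="seconds"),
        "score": score,
        "total": total,
    })
    with open(path, "w") as f:
        json.dump(history, f, indent=2)
    return path

path = save_result("student01", 7, 10)
```

Keeping a growing list of attempts per student is what makes the progress tracking and trend visualization possible.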
- Python 3.9+
- pandas
- numpy
- scikit-learn
- nltk
- matplotlib
Professors can:
- Adjust the evaluation threshold (`short_answer_threshold`)
- Modify the scoring scale for short answers
- Create custom question banks in CSV format with columns:
- Question
- Type ("one-word" or "short-answer")
- Correct Answer
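Assuming that three-column schema, a tiny illustrative question bank (the rows here are invented for the example) can be parsed with the standard `csv` module:

```python
import csv
import io

# Two sample rows following the Question / Type / Correct Answer schema.
sample = """Question,Type,Correct Answer
What organelle produces ATP?,one-word,mitochondria
Explain the role of chlorophyll in photosynthesis.,short-answer,Chlorophyll absorbs light energy that drives photosynthesis.
"""

# Load the bank as a list of dicts keyed by the header row.
reader = csv.DictReader(io.StringIO(sample))
bank = list(reader)
for row in bank:
    print(row["Type"], "->", row["Question"])
```

A real bank would be read from a file with `open("questions.csv")` in place of the `io.StringIO` wrapper; pandas' `read_csv` would work equally well given the listed dependencies.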
This project is open-source and available under the MIT License.
Contributions are welcome! Please fork the repository and submit a pull request with your improvements.