Thank you for your interest in contributing to Torch-RecHub! We welcome all types of contributions, including but not limited to:
- 🐛 Bug reports
- 💡 Feature suggestions
- 📝 Documentation improvements
- 🔧 Code contributions
- 🧪 Test cases
- 📖 Tutorials and examples
```bash
# 1. Fork and clone the repository
git clone https://github.com/YOUR_USERNAME/torch-rechub.git
cd torch-rechub

# 2. Install dependencies and set up the environment
uv sync

# 3. Install the package in development mode
uv pip install -e .
```

- Fork the repository: Click the "Fork" button in the upper right corner.
- Make your changes: Implement new features or fix bugs.
- Format code: Run code formatting before committing to ensure consistent code style:

  ```bash
  python config/format_code.py
  ```
- Commit changes:

  ```bash
  git commit -m "feat: add new feature"  # or "fix: fix some issue"
  ```

  Following Conventional Commits is preferred.

- Push to branch:

  ```bash
  git push origin your-branch-name
  ```

- Create Pull Request: Go back to the original repository page, click "New pull request", compare your branch with the `main` branch of the main repository, and submit the PR.

Use one of the following branch naming patterns:

- `feature/feature-name` - for new features
- `fix/bug-description` - for bug fixes
- `docs/documentation-update` - for documentation changes
- `test/test-description` - for test additions
We follow the Conventional Commits specification:

- `feat: add new recommendation model`
- `fix: resolve memory leak in training loop`
- `docs: update installation guide`
- `test: add unit tests for DeepFM model`
- `refactor: optimize data loading pipeline`
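For reference, the commit-header format can be checked with a few lines of Python. This is an illustrative sketch only, not an official project hook; the type list simply mirrors the examples above:

```python
import re

# Illustrative sketch: torch-rechub does not necessarily ship this check.
COMMIT_RE = re.compile(
    r"^(feat|fix|docs|test|refactor|style|chore|perf)"  # commit type
    r"(\([\w\-]+\))?"                                   # optional scope, e.g. feat(trainer)
    r": .+"                                             # colon, space, short description
)

def is_conventional(message: str) -> bool:
    """Check whether the first line of a commit message follows the format."""
    return bool(COMMIT_RE.match(message.splitlines()[0]))

print(is_conventional("feat: add new recommendation model"))  # True
print(is_conventional("update stuff"))                        # False
```

A script like this could be wired into a `commit-msg` git hook, but that setup is not part of the repository.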
-
Push your branch
git push origin your-branch-name
-
Create Pull Request
- Visit the GitHub repository page
- Click "New pull request"
- Select your branch
- Fill out the PR template
-
PR Requirements
- Clear description of changes
- Explanation of why changes are needed
- List related issues (if any)
- Include testing instructions
- Add screenshots (if applicable)
- Unit Tests: Test individual functions or classes
- Integration Tests: Test interactions between modules
- End-to-End Tests: Test complete workflows
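One common way to keep these levels separately runnable is pytest markers. The sketch below is illustrative: the marker names and the `scale` helper are assumptions, not part of the project's configuration:

```python
import pytest

# Stand-in helper, not a real torch-rechub API.
def scale(x, factor=2):
    return x * factor

@pytest.mark.unit
def test_scale_unit():
    """Unit test: exercises a single function in isolation."""
    assert scale(3) == 6

@pytest.mark.integration
def test_scaling_pipeline():
    """Integration test: several pieces working together."""
    batch = [1, 2, 3]
    assert [scale(v) for v in batch] == [2, 4, 6]
```

With markers declared in the pytest configuration, one level can be run at a time, e.g. `uv run pytest -m unit`.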
```
tests/
├── test_models/
│   ├── test_ranking.py
│   ├── test_matching.py
│   └── test_multi_task.py
├── test_trainers/
├── test_utils/
└── test_e2e/
```
```bash
# Run all tests
uv run pytest

# Run specific test file
uv run pytest tests/test_models/test_ranking.py

# Run with coverage
uv run pytest --cov=torch_rechub
```

```python
import pytest
from torch_rechub.models.ranking import DeepFM


def test_deepfm_forward():
    """Test DeepFM forward pass"""
    # deep_features, fm_features, sample_input, and expected_shape
    # would be provided by fixtures or test setup
    model = DeepFM(
        deep_features=deep_features,
        fm_features=fm_features,
        mlp_params={"dims": [128, 64]}
    )
    output = model(sample_input)
    assert output.shape == expected_shape
```

- API Documentation: Docstrings in code
- User Guides: Files in the `docs/` directory
- Tutorials: Jupyter notebooks in the `tutorials/` directory
- README: Project introduction and quick start
- Use Markdown format
- Include code examples
- Provide clear step-by-step instructions
- Keep both English and Chinese versions synchronized
- Follow scikit-learn style docstrings (NumPy/SciPy convention) for Python code
```python
def train_model(model, data_loader, optimizer):
    """Train a recommendation model.

    Parameters
    ----------
    model : torch.nn.Module
        Model to train.
    data_loader : DataLoader
        Training data loader.
    optimizer : torch.optim.Optimizer
        Optimizer for training.

    Returns
    -------
    float
        Training loss.

    Examples
    --------
    >>> model = DeepFM(features, mlp_params)
    >>> loss = train_model(model, train_loader, optimizer)
    """
    # Implementation here
```

- 📖 Improve documentation and comments
- 🧪 Add test cases
- 🐛 Fix simple bugs
- 📝 Translate documentation
- 💡 Add example code
- 🔧 Code formatting and style improvements
- 🚀 Implement new recommendation algorithms
- ⚡ Performance optimizations
- 🏗️ Architecture improvements
- 📊 Add new evaluation metrics
- 🛠️ Development tools and scripts
- 🔬 Research paper implementations
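As an example of what a new evaluation metric might look like, here is a pure-Python AUC sketch using the rank-sum (Mann-Whitney) formula. It is for illustration only; contributed code would normally build on `sklearn.metrics.roc_auc_score` or a tensor-based implementation:

```python
def auc_score(labels, scores):
    """Area under the ROC curve via the rank-sum formula.

    Pure-Python sketch for illustration; assumes no tied scores
    (ties would need average ranks).
    """
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = {i: r + 1 for r, i in enumerate(order)}  # 1-based ranks by score
    pos = [i for i, y in enumerate(labels) if y == 1]
    n_pos, n_neg = len(pos), len(labels) - len(pos)
    if not n_pos or not n_neg:
        raise ValueError("need both positive and negative labels")
    rank_sum = sum(ranks[i] for i in pos)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(auc_score([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```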
When implementing new models:
- Follow existing patterns: Look at existing models for structure
- Add comprehensive tests: Include unit tests and integration tests
- Provide examples: Add usage examples in the `examples/` directory
- Document thoroughly: Include docstrings and README updates
- Benchmark performance: Compare with existing implementations
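As a starting point, a new model generally subclasses `torch.nn.Module`. The skeleton below is a hedged, self-contained sketch: the class name, dimensions, and plain-tensor input are illustrative and do not follow torch-rechub's feature-definition API, which existing models should be consulted for:

```python
import torch
import torch.nn as nn

class MyRankingModel(nn.Module):
    """Illustrative skeleton only; real torch-rechub models build
    embeddings from feature definitions rather than taking raw tensors."""

    def __init__(self, input_dim: int, hidden_dims=(128, 64)):
        super().__init__()
        layers, dim = [], input_dim
        for h in hidden_dims:
            layers += [nn.Linear(dim, h), nn.ReLU()]
            dim = h
        layers.append(nn.Linear(dim, 1))  # single score per sample
        self.mlp = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sigmoid maps the score to a click-probability-like value in (0, 1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)

model = MyRankingModel(input_dim=16)
scores = model(torch.randn(4, 16))
print(scores.shape)  # torch.Size([4])
```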
If you encounter issues during contribution:
- Check existing Issues: There might be related discussions
- Create new Issue: Describe your problem clearly
- Join discussions: Ask questions in relevant Issues or PRs
- Contact maintainers: Through GitHub or email
- Check documentation: Review SUPPORT.md for detailed help
We value every contribution! All contributors will be recognized in:
- Contributors list in README
- Release notes acknowledgments
- Project documentation contributor pages
- Special mentions for significant contributions
Please follow our Code of Conduct to ensure a friendly and inclusive community environment.
Project maintainers regularly release new versions:
- Version Planning: Discussed in Issues
- Feature Freeze: Stop adding new features
- Testing: Comprehensive testing phase
- Release Preparation: Update documentation and version numbers
- Official Release: Publish to PyPI
All contributions go through a review process:
- Automated Checks: CI/CD pipeline runs tests
- Code Review: Maintainers review code quality
- Testing: Verify functionality works as expected
- Documentation: Ensure documentation is updated
- Approval: At least one maintainer approval required
Thank you again for your contribution! Every contribution makes Torch-RecHub better. 🎉