Backend Developer | Systems Programming in Go
type Engineer struct {
    Name     string
    Focus    []string
    Building string
}

me := Engineer{
    Name:     "Uthman",
    Focus:    []string{"Systems Programming", "Network Protocols", "Compilers"},
    Building: "HTTP servers, version control, and CLI tools from first principles",
}

Built Git's core in Go without using any Git libraries. Content-addressable storage, tree objects, staging area, and commit history.
What works:
- SHA-256 based object storage with automatic deduplication
- Three-tree architecture (working dir → staging → repository)
- Tree objects built bottom-up with proper hash dependencies
- zlib compression for all objects
- Persistent commit history
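
Below is a minimal sketch of the content-addressable storage idea: hash the object, compress it with zlib, and write it to a path derived from the hash. The function name storeBlob and the .mygit/objects directory are illustrative assumptions, not the project's actual API.

package main

import (
    "bytes"
    "compress/zlib"
    "crypto/sha256"
    "encoding/hex"
    "fmt"
    "os"
    "path/filepath"
)

// storeBlob writes content as a content-addressed, zlib-compressed object:
// the key is the SHA-256 of a Git-style "blob <size>\0" header plus the raw
// bytes, so identical content always lands at the same path (deduplication).
func storeBlob(objectsDir string, content []byte) (string, error) {
    header := fmt.Sprintf("blob %d\x00", len(content))
    obj := append([]byte(header), content...)

    sum := sha256.Sum256(obj)
    hash := hex.EncodeToString(sum[:])

    // Fan out into a two-character prefix directory, as Git does.
    dir := filepath.Join(objectsDir, hash[:2])
    if err := os.MkdirAll(dir, 0o755); err != nil {
        return "", err
    }
    path := filepath.Join(dir, hash[2:])
    if _, err := os.Stat(path); err == nil {
        return hash, nil // object already exists: dedup for free
    }

    var buf bytes.Buffer
    w := zlib.NewWriter(&buf)
    if _, err := w.Write(obj); err != nil {
        return "", err
    }
    if err := w.Close(); err != nil {
        return "", err
    }
    return hash, os.WriteFile(path, buf.Bytes(), 0o644)
}

func main() {
    hash, err := storeBlob(".mygit/objects", []byte("hello, object store\n"))
    if err != nil {
        panic(err)
    }
    fmt.Println(hash)
}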
View Project → | Read Article →
HTTP/HTTPS server built directly on TCP. No frameworks, just sockets and the HTTP spec.
Performance:
- Started at 250 RPS (buggy implementation)
- Fixed connection handling: 1,389 RPS
- Added keep-alive optimization: 1,710 RPS
- Peak: 4,000 RPS
- 16x improvement from initial version
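
As a rough illustration of the approach (not the project's code), here is a minimal HTTP/1.1 responder on raw TCP sockets with keep-alive: one goroutine per connection, and the loop keeps reading requests off the same connection instead of closing after every response. Handler names and the port are placeholders.

package main

import (
    "bufio"
    "fmt"
    "net"
    "strings"
)

func main() {
    ln, err := net.Listen("tcp", ":8080")
    if err != nil {
        panic(err)
    }
    for {
        conn, err := ln.Accept()
        if err != nil {
            continue
        }
        go serve(conn) // one goroutine per connection
    }
}

func serve(conn net.Conn) {
    defer conn.Close()
    r := bufio.NewReader(conn)
    for {
        // Request line, e.g. "GET / HTTP/1.1".
        reqLine, err := r.ReadString('\n')
        if err != nil {
            return // client closed the connection
        }
        // Drain headers until the blank line that ends them, remembering
        // whether the client asked us to close the connection.
        wantClose := false
        for {
            h, err := r.ReadString('\n')
            if err != nil {
                return
            }
            if strings.HasPrefix(strings.ToLower(h), "connection: close") {
                wantClose = true
            }
            if h == "\r\n" || h == "\n" {
                break
            }
        }
        fields := strings.Fields(reqLine)
        if len(fields) < 3 {
            return // malformed request line
        }
        body := "hello from " + fields[1] + "\n"
        // Content-Length plus keep-alive lets the client reuse the socket.
        fmt.Fprintf(conn, "HTTP/1.1 200 OK\r\nContent-Length: %d\r\nConnection: keep-alive\r\n\r\n%s",
            len(body), body)
        if wantClose {
            return
        }
    }
}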
View Project → | Read Article →
Real-time messaging system handling 100+ concurrent connections. TLS encryption, AI integration, rate limiting.
Technical implementation:
- Concurrent connection management with goroutines
- Real-time message broadcasting across lobbies
- AI assistant with conversation context
- Rate limiting to prevent abuse
Performance: 100+ simultaneous users, ~50MB baseline memory, minimal CPU under load.
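
A simplified sketch of the concurrency model, assuming a hub-style design (the Hub and Message names are illustrative, not the project's types): a single goroutine owns the client set and fans messages out, while per-connection goroutines only read from their socket and push onto channels, so no mutex is needed.

package main

import "net"

// Message pairs a sender with a payload so the hub can avoid echoing a
// message back to its author.
type Message struct {
    From net.Conn
    Data []byte
}

// Hub serializes all membership changes and broadcasts through one goroutine.
type Hub struct {
    join      chan net.Conn
    leave     chan net.Conn
    broadcast chan Message
}

func NewHub() *Hub {
    h := &Hub{
        join:      make(chan net.Conn),
        leave:     make(chan net.Conn),
        broadcast: make(chan Message),
    }
    go h.run()
    return h
}

func (h *Hub) run() {
    clients := make(map[net.Conn]bool)
    for {
        select {
        case c := <-h.join:
            clients[c] = true
        case c := <-h.leave:
            delete(clients, c)
            c.Close()
        case m := <-h.broadcast:
            for c := range clients {
                if c != m.From {
                    c.Write(m.Data) // best-effort fan-out
                }
            }
        }
    }
}

func main() {
    hub := NewHub()
    ln, err := net.Listen("tcp", ":9000")
    if err != nil {
        panic(err)
    }
    for {
        conn, err := ln.Accept()
        if err != nil {
            continue
        }
        hub.join <- conn
        // One goroutine per connection: read raw bytes and hand them to the hub.
        go func(c net.Conn) {
            buf := make([]byte, 4096)
            for {
                n, err := c.Read(buf)
                if err != nil {
                    hub.leave <- c
                    return
                }
                hub.broadcast <- Message{From: c, Data: append([]byte(nil), buf[:n]...)}
            }
        }(conn)
    }
}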
Mathematical computing environment with AST parser, comparison operators, logical operations, and unit conversions.
Features:
- Recursive descent parser with proper precedence
- Comparison operators (>, <, >=, <=, ==, !=) returning boolean values
- Logical operators (&&, ||) with correct precedence
- Scientific functions (trig, log, stats)
- Unit conversion system (length, weight, time)
- Persistent variable storage and calculation history
Test coverage: 95% on core modules (tokenizer, parser, evaluator, units)
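
The sketch below shows the recursive descent idea with a cut-down grammar (numbers, + - * /, parentheses, and < >): each precedence level gets its own function, so multiplication binds tighter than addition and comparisons sit above both. It illustrates the technique only; it is not Axion's actual parser, and it tokenizes naively on whitespace.

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// Grammar (simplified):
//   comparison -> additive ( ("<" | ">") additive )*
//   additive   -> term ( ("+" | "-") term )*
//   term       -> factor ( ("*" | "/") factor )*
//   factor     -> NUMBER | "(" comparison ")"
type parser struct {
    toks []string
    pos  int
}

func (p *parser) peek() string {
    if p.pos < len(p.toks) {
        return p.toks[p.pos]
    }
    return ""
}

func (p *parser) next() string { t := p.peek(); p.pos++; return t }

func (p *parser) comparison() float64 {
    left := p.additive()
    for p.peek() == "<" || p.peek() == ">" {
        op := p.next()
        right := p.additive()
        // Booleans surface as 1/0 in this sketch.
        if (op == "<" && left < right) || (op == ">" && left > right) {
            left = 1
        } else {
            left = 0
        }
    }
    return left
}

func (p *parser) additive() float64 {
    v := p.term()
    for p.peek() == "+" || p.peek() == "-" {
        if p.next() == "+" {
            v += p.term()
        } else {
            v -= p.term()
        }
    }
    return v
}

func (p *parser) term() float64 {
    v := p.factor()
    for p.peek() == "*" || p.peek() == "/" {
        if p.next() == "*" {
            v *= p.factor()
        } else {
            v /= p.factor()
        }
    }
    return v
}

func (p *parser) factor() float64 {
    if p.peek() == "(" {
        p.next()
        v := p.comparison()
        p.next() // consume ")"
        return v
    }
    n, _ := strconv.ParseFloat(p.next(), 64)
    return n
}

func main() {
    p := &parser{toks: strings.Fields("1 + 2 * 3 > 4")}
    fmt.Println(p.comparison()) // prints 1: 1 + 2*3 = 7, and 7 > 4
}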
View Project → | Read Article →
Tokenizer library for building compilers, interpreters, and DSLs.
Implementation:
- 50+ token types with Unicode support
- Single-pass tokenization, low memory allocation
- Error recovery (continues after detecting errors)
- Validated against 1700+ test tokens
Use cases: Compiler frontends, configuration parsers, code analysis tools.
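
For illustration, a single-pass tokenizer over runes along these lines (the Token and TokenKind names are placeholders, not the library's API): whitespace is skipped, and letter, digit, and symbol runs are grouped in one left-to-right scan.

package main

import (
    "fmt"
    "unicode"
)

type TokenKind int

const (
    Ident TokenKind = iota
    Number
    Symbol
)

type Token struct {
    Kind  TokenKind
    Value string
}

// Tokenize makes a single pass over the input runes; unicode.IsLetter and
// unicode.IsDigit give basic Unicode support for identifiers and numbers.
func Tokenize(src string) []Token {
    runes := []rune(src)
    var toks []Token
    for i := 0; i < len(runes); {
        r := runes[i]
        switch {
        case unicode.IsSpace(r):
            i++
        case unicode.IsLetter(r) || r == '_':
            j := i
            for j < len(runes) && (unicode.IsLetter(runes[j]) || unicode.IsDigit(runes[j]) || runes[j] == '_') {
                j++
            }
            toks = append(toks, Token{Ident, string(runes[i:j])})
            i = j
        case unicode.IsDigit(r):
            j := i
            for j < len(runes) && (unicode.IsDigit(runes[j]) || runes[j] == '.') {
                j++
            }
            toks = append(toks, Token{Number, string(runes[i:j])})
            i = j
        default:
            toks = append(toks, Token{Symbol, string(r)})
            i++
        }
    }
    return toks
}

func main() {
    for _, t := range Tokenize("τ := 6.28 * radius") {
        fmt.Printf("%d %q\n", t.Kind, t.Value)
    }
}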
AI tool for summarizing PDFs with multiple output styles. Built for exam prep, used by students for CBT preparation.
Technical challenges:
- UTF-8 encoding normalization
- Character corruption fixes
- PDF text extraction
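
A small sketch of the UTF-8 cleanup step using Go's standard library (the function name and exact rules are illustrative assumptions): replace invalid byte sequences with the Unicode replacement character, then drop control characters that tend to survive PDF extraction.

package main

import (
    "fmt"
    "strings"
    "unicode"
    "unicode/utf8"
)

// cleanExtractedText normalizes text pulled out of PDFs: invalid UTF-8 byte
// sequences become U+FFFD, and non-printable control characters are dropped.
func cleanExtractedText(s string) string {
    if !utf8.ValidString(s) {
        s = strings.ToValidUTF8(s, "\uFFFD")
    }
    return strings.Map(func(r rune) rune {
        if unicode.IsControl(r) && r != '\n' && r != '\t' {
            return -1 // drop the rune
        }
        return r
    }, s)
}

func main() {
    raw := "Exam topics:\x00 algebra \xff geometry"
    fmt.Println(cleanExtractedText(raw))
}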
PHP library for 2FA with authenticator apps, email, and SMS verification.
Security:
- TOTP-based code generation
- Secret key encryption before storage
- QR code generation
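
The library itself is PHP; purely for illustration, here is the standard RFC 6238 TOTP construction sketched in Go. The helper name totp, the 30-second step, and the 6-digit length are the common defaults, not necessarily the library's exact configuration.

package main

import (
    "crypto/hmac"
    "crypto/sha1"
    "encoding/base32"
    "encoding/binary"
    "fmt"
    "time"
)

// totp derives a 6-digit code from a base32 shared secret and a timestamp:
// HMAC-SHA1 over the 30-second time step, then dynamic truncation.
func totp(secretBase32 string, t time.Time) (string, error) {
    key, err := base32.StdEncoding.WithPadding(base32.NoPadding).DecodeString(secretBase32)
    if err != nil {
        return "", err
    }
    var msg [8]byte
    binary.BigEndian.PutUint64(msg[:], uint64(t.Unix()/30))

    mac := hmac.New(sha1.New, key)
    mac.Write(msg[:])
    sum := mac.Sum(nil)

    // Dynamic truncation: the low 4 bits of the last byte pick an offset.
    off := sum[len(sum)-1] & 0x0f
    code := binary.BigEndian.Uint32(sum[off:off+4]) & 0x7fffffff
    return fmt.Sprintf("%06d", code%1_000_000), nil
}

func main() {
    code, err := totp("JBSWY3DPEHPK3PXP", time.Now())
    if err != nil {
        panic(err)
    }
    fmt.Println(code)
}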
- Building Git from Scratch in Go
- Building an HTTP Server from TCP Sockets: 250-4000 RPS
- Building a Terminal Calculator That Actually Does Logic - Axion
- More articles on DEV.to

