Syed Muhammad Taha Imam
NUST · 2026
411155
Email
simam.bscs22seecs@seecs.edu.pk
Phone
+923181272233
GitHub
—
Academic
Program
—
CGPA
3.44
Year
2026
Education
BSCS
School of Electrical Engineering and Computer Science, Islamabad, 3.44 (2026)
Address
House #1683, Street 26, Precinct 8, Bahria Town, Karachi, Pakistan
DOB
—
Career
Current role
—
Target role
—
Skills
PROFESSIONAL PROFILE
Deep Learning–focused Computer Science undergraduate with hands-on research and industry experience in Transformer
architectures, model refinement, and distributed training systems. Proven ability to improve model efficiency through
architectural optimization, parameter reduction, and scalable inference pipelines, with experience spanning LLMs, Vision-
Language Models, RAG systems, and production-grade MLOps. Actively engaged in research-driven development with a strong
inclination toward core DL over black-box model usage.
EDUCATION
BSCS
School of Electrical Engineering and Computer Science, Islamabad, 3.44 (2026)
INTERNSHIP EXPERIENCE
Epistemy UK
01-Sep-2025 - 01-Dec-2025
• Led the end-to-end development of an AI tutoring platform, architecting a Nest.js backend (20+ endpoints, Supabase) and a Next.js frontend.
• Architected an event-driven task queue (Redis/BullMQ) to orchestrate multi-agent workflows, ensuring fault tolerance across distributed analysis pipelines.
• Enforced strict SWE standards with CI pipelines and pre-commit hooks, achieving and maintaining 90%+ unit test coverage.
RapidsAI
01-Sep-2024 - 01-Dec-2024
• Integrated a RAG-based chatbot into a website using Streamlit, enhancing user engagement by 35%.
• Optimized chatbot responses through Chain-of-Thought prompting, reducing incorrect responses by 50%.
• Cut OpenAI API costs by 50% with a multi-model query routing system.
• Designed a FastAPI endpoint for chatbot responses, enabling user session management and contextual interactions.
• Built a Twitter scraper that processed 500+ tweets daily, enabling large-scale sentiment analysis.
CogniMind AI
01-Feb-2025 - 01-May-2025
• Deployed a dockerized Apache Airflow instance on a virtual machine, improving workflow automation efficiency by 40%.
• Designed 5+ scalable DAGs in Airflow to orchestrate data pipelines, reducing manual intervention by 60%.
• Developed CI/CD pipelines with GitHub Actions and Docker, cutting deployment time by 30%.
• Improved VLM document extraction accuracy by 20% using prompt engineering techniques.
• Optimized inference pipelines by 10% using parallel and batch processing techniques.
• Improved retrieval speed and accuracy by 10% using quantization and HNSW parameter tuning.
Bradbury Lab
01-Apr-2025 - 20-Jan-2026
• Conducted a comprehensive literature review on Transformer topology and PEFT, identifying specific inefficiencies in existing weight-sharing methods.
• Proposed a novel layer-merging strategy based on Tucker Decomposition, aiming to reduce parameter count without retraining.
• Analyzed the mathematical properties of Self-Attention blocks to demonstrate the feasibility of aligning Query (Q) and Key (K) projections for future efficient-by-design architectures.
FINAL YEAR PROJECT
LiteDoc: Distilling Large Document Models into Efficient Task-Specific Encoders.