Building scalable systems, intelligent pipelines, and data-driven solutions across fintech & enterprise environments.
I'm a Software & Data Engineer with over 5 years of experience building scalable backend systems, data pipelines, and intelligent solutions across fintech, payments, and enterprise environments.
My work spans the full data lifecycle — from designing ETL/ELT pipelines and optimizing SQL-heavy workloads, to developing REST APIs, integrating AI/ML components, and building interactive BI dashboards that drive business decisions.
I've worked with technologies like Python, Java, SQL, Azure, AWS, Databricks, and Power BI, as well as modern AI tooling such as LangChain and large language models (LLMs). I'm passionate about turning complex data into clear, actionable insights and building systems that are reliable, performant, and maintainable.
Currently based in Dublin, Ireland, I hold a Master's in Artificial Intelligence from National College of Ireland and I'm always looking for challenging problems at the intersection of software engineering, data, and AI.
Built a Neo4j GDS link-prediction pipeline leveraging graph algorithms (common neighbors, preferential attachment, triangle counts), achieving an 82.2% speed-up. Trained a Random Forest model natively (AUCPR 0.826 training, 0.801 testing). A hybrid Neo4j-Python workflow delivered an 87.3% cache hit rate and a 3.2x throughput improvement.
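The three graph features named above can be sketched in a few lines of plain Python. This is an illustrative toy (not the original GDS pipeline): the adjacency-set graph and function names are hypothetical.

```python
# Toy link-prediction features over an undirected graph stored as
# adjacency sets. In the real project these were computed by Neo4j GDS;
# this sketch just shows what each feature measures.

def common_neighbors(adj, u, v):
    """Number of nodes adjacent to both u and v."""
    return len(adj[u] & adj[v])

def preferential_attachment(adj, u, v):
    """Product of the degrees of u and v."""
    return len(adj[u]) * len(adj[v])

def triangle_count(adj, u):
    """Number of triangles passing through node u."""
    neighbors = adj[u]
    # Each triangle through u is counted twice (once per other vertex).
    return sum(len(adj[n] & neighbors) for n in neighbors) // 2

# Hypothetical toy graph
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c"},
}

print(common_neighbors(adj, "b", "d"))         # → 2
print(preferential_attachment(adj, "b", "d"))  # → 4
print(triangle_count(adj, "a"))                # → 2
```

Features like these, computed per candidate node pair, become the input columns for the Random Forest classifier.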
Developed a hybrid ML pipeline combining real and StyleGAN2-generated synthetic skin disease images. Fine-tuned EfficientNet models on enriched datasets, boosting accuracy and robustness. Evaluated using FID, LPIPS, ROC-AUC, Precision, Recall, and F1-Score metrics.
Built a real-time voice-interactive chatbot using OpenAI Whisper (STT) and Coqui TTS, with end-to-end latency under 2 seconds. FastAPI microservices cut latency by 45%. A RAG pipeline combining FAISS and PostgreSQL improved answer relevance by 30%. Fine-tuned a Llama model for the domain. Deployed via Django + Docker.
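The retrieval step of a RAG pipeline can be sketched as: embed the query, rank stored passage vectors by cosine similarity, and feed the top-k passages into the LLM prompt. This is a minimal pure-Python illustration, not the production FAISS + PostgreSQL setup; the passages, vectors, and function names are hypothetical.

```python
# Minimal RAG retrieval sketch: rank toy passage embeddings by cosine
# similarity to a query embedding and return the best k passages.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, k=2):
    """Return the k passages whose vectors are most similar to the query."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Hypothetical "vector index": (passage, embedding) pairs
index = [
    ("Refund policy: refunds within 30 days.", [0.9, 0.1, 0.0]),
    ("Shipping takes 3-5 business days.",      [0.1, 0.9, 0.0]),
    ("Support hours are 9am-5pm GMT.",         [0.0, 0.2, 0.9]),
]

query_vec = [0.8, 0.2, 0.1]  # pretend embedding of "how do refunds work?"
print(retrieve(query_vec, index, k=1))
# → ['Refund policy: refunds within 30 days.']
```

In production, FAISS replaces the linear scan with an approximate-nearest-neighbor index, and PostgreSQL stores the passage text and metadata keyed by vector ID.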
Fine-tuned a Llama model on college-specific datasets for education-domain queries, achieving 85% accuracy and a 20% improvement over baseline models. Built a custom dataset using augmentation techniques. Deployed on GCP GPU instances supporting 100 concurrent users.
Created database objects (tables, stored procedures, views, triggers, functions) with performance tuning. Developed T-SQL pivot/unpivot queries for efficient report analysis. Built SSIS packages for data load and TruBI dashboards.
Created stored procedures with business logic. Built SSIS packages for base and survey data load. Implemented TruBI reports. Conducted unit testing and UAT for data integration validation across source systems.
Extensive SSIS work importing, exporting, and transforming data between linked servers. Created database objects using T-SQL in development and production environments. Survey data load utilities and analytics reports via TruBI and SSAS.
Developed a user-friendly online auction platform in .NET with product authentication, a secure bidding server, and a highly scalable architecture supporting large numbers of concurrent bidders in active auctions.
I'm currently open to new opportunities and always happy to connect. Whether you have a role in mind, a project to collaborate on, or just want to say hello — feel free to reach out!