bGuzzo/README.md

# Hi, I'm Bruno Guzzo 👋

I am a Google Cloud Certified Professional Developer and Software Engineer at Go Reply, where I build scalable cloud-native solutions and implement MLOps pipelines.

I recently completed my MSc in AI & ML Engineering. My professional role has expanded to include deploying Generative AI models (Gemini) and building RAG (Retrieval-Augmented Generation) systems on Google Cloud (Vertex AI, Agent Builder).
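As a minimal illustration of the retrieval step at the heart of such RAG systems (a toy sketch, not the actual Vertex AI / FAISS pipeline), top-k retrieval by cosine similarity over embedding vectors looks like this; the 4-dimensional "embeddings" below are placeholders for vectors produced by a real embedding model:

```python
import numpy as np

def top_k_retrieve(query_vec, doc_vecs, k=2):
    """Return indices of the k document vectors most similar to the
    query under cosine similarity -- the retrieval step of a RAG pipeline."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                      # cosine similarity per document
    return np.argsort(scores)[::-1][:k]  # best-scoring indices first

# Toy corpus of 4-dim "embeddings" (a real system would use model outputs).
docs = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0, 0.0],
])
query = np.array([1.0, 0.05, 0.0, 0.0])
print(top_k_retrieve(query, docs, k=2))  # → [0 2]
```

The retrieved documents are then stuffed into the LLM prompt as grounding context; libraries like FAISS replace the brute-force `argsort` with an approximate nearest-neighbour index at scale.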

## 🚀 Key AI & ML Projects

My academic and personal projects demonstrate my hands-on experience in Deep Learning, NLP, and Anomaly Detection:

  • Italian-LLaMA-Project: Fine-tuned the 8B LLaMAntino-3 model for Italian proficiency using 4-bit quantization and LoRA. This project also included the implementation of a complete RAG chatbot system using LangChain and FAISS.
  • MSc Thesis: Semi-Supervised Anomaly Detection: A rigorous experimental analysis of under-sampling techniques (like ENN, NCR, and Tomek Links) to improve the performance and efficiency of Auto-Encoders for semi-supervised anomaly detection.
  • Transformer-GNN for Link Prediction: Designed and built a custom, deep Graph Neural Network (GNN) inspired by the Transformer architecture. The model was trained from scratch on a custom-built 7-million-node dataset (crawled from Wikipedia) for semantic link prediction.
  • Anomaly-Transformer-Analysis: A deep-dive implementation and comparative analysis of the "Anomaly Transformer" paper, benchmarking its novel Anomaly-Attention mechanism against standard self-attention for time series anomaly detection.
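The LoRA idea behind the Italian-LLaMA fine-tuning can be sketched in a few lines of NumPy (dimensions and variable names are illustrative, not taken from the project): the pretrained weight `W` stays frozen, only the low-rank factors `A` and `B` are trained, and their product is added to the forward pass scaled by `alpha / rank`:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank, alpha = 16, 16, 4, 8.0

# Frozen pretrained weight (stands in for one projection matrix of the LLM).
W = rng.standard_normal((d_out, d_in))

# Trainable low-rank factors. B starts at zero, so at step 0 the
# adapted model is exactly the pretrained model.
A = rng.standard_normal((rank, d_in)) * 0.01
B = np.zeros((d_out, rank))

def lora_forward(x):
    """y = W x + (alpha / rank) * B A x; only A and B receive gradients."""
    return W @ x + (alpha / rank) * (B @ (A @ x))

x = rng.standard_normal(d_in)
assert np.allclose(lora_forward(x), W @ x)  # B = 0 → adapter is a no-op
print("trainable params:", A.size + B.size, "vs full matrix:", W.size)
```

The parameter saving is the point: here 128 trainable values stand in for a 256-entry matrix, and at 8B scale the same trick (combined with 4-bit quantization of the frozen weights, as in QLoRA) is what makes fine-tuning feasible on consumer hardware.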

๐Ÿ› ๏ธ Core Competencies

| Category | Technologies |
| --- | --- |
| AI & ML | PyTorch, LangChain, LLM Fine-Tuning (LoRA), RAG, NLP, Pandas, Vertex AI (Gemini) |
| Cloud & DevOps | Google Cloud (GCP), Terraform, Docker, CI/CD (GitHub, GitLab), Cloud Run, BigQuery |
| Software Engineering | Python (Flask), Java (Spring Boot), JavaScript (Angular), SQL & NoSQL |

## 📫 Get In Touch

## Pinned Repositories

  1. **MSc-AI-ML-thesis-anomaly-detection** (TeX)

     An experimental analysis of under-sampling in deep semi-supervised anomaly detection. This MSc thesis evaluates how different sampling strategies affect the performance and efficiency of the AE-SAD…

  2. **Anomaly-Transformer-Analysis** (Python)

     An in-depth analysis of the Anomaly Transformer, with experiments on hyperparameter tuning, optimizers, and a head-to-head comparison with a standard Transformer Encoder.

  3. **transformer-gnn-link-prediction** (Python)

     A deep, transformer-inspired Graph Attention Network (GAT) for link prediction on semantic graphs. This project features a custom GNN architecture and a large-scale dataset built from Wikipedia.

  4. **Italian-LLaMA-Project** (TeX)

     An MSc NLP project on fine-tuning LLaMAntino-3-8B for Italian. Explores efficient training on consumer hardware using PEFT/LoRA and includes extensive evaluation (Perplexity, INVALSI) and chatbot/R…