```
Jules Belveze
├── Open Source
│   ├── tsa - Dual-attention autoencoder
│   ├── bert-squeeze - Speed up Transformer models
│   ├── bundler - Learn from your data
│   ├── nhelper - Behavioral testing
│   └── time-series-dataset - Dataset utilities
├── Contributions
│   ├── Hugging Face Ecosystem
│   │   ├── t5-small-headline-generation - t5 for headline generation
│   │   └── tldr_news - Summarization dataset
│   ├── John Snow Labs Ecosystem
│   │   └── langtest - Deliver safe & effective NLP models
│   ├── Dust
│   │   └── Dust - Customizable and secure AI assistants
│   ├── SpaCy Ecosystem
│   │   └── concepCy - SpaCy wrapper for ConceptNet
│   ├── bulk - contributed the color feature
│   └── FastBERT - contributed the batching inference
└── Blogs & Papers
    ├── Atlastic Reputation AI: Four Years of Advancing and Applying a SOTA NLP Classifier
    ├── Real-World MLOps Examples: Model Development in Hypefactors
    ├── LangTest: Unveiling & Fixing Biases with End-to-End NLP Pipelines
    ├── Case Study: MLOps for NLP-powered Media Intelligence using Metaflow
    ├── Scaling Machine Learning Experiments With neptune.ai and Kubernetes
    └── Scaling-up PyTorch inference: Serving billions of daily NLP inferences with ONNX Runtime
```
I currently work as a Software Engineer at @Dust.
Previously, I led AI development and built the AI infrastructure from the ground up at Ava, spearheaded MLOps and NLP projects at John Snow Labs, engineered multilingual NLP solutions at Hypefactors, and conducted deep learning research at Microsoft.
I believe that automating model development and deployment with MLOps enables faster feature releases. To that end, I have worked with tools such as PyTorch Lightning, FastAPI, Hugging Face, Kubernetes, ONNX Runtime, and more.
Apart from this, I have worked extensively with deep learning and time series, completing my Master's thesis on anomaly detection in high-dimensional time series. I am also keenly interested in state-of-the-art techniques for speeding up the inference of deep learning models, especially Transformer-based ones.
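To give a flavour of that interest, here is a minimal, purely illustrative sketch of serving a Transformer through ONNX Runtime via the Hugging Face Optimum library; the checkpoint name is an arbitrary public example, not one of my models:

```python
# Illustrative only: export a public Hugging Face checkpoint to ONNX and
# run it with ONNX Runtime instead of eager PyTorch.
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)

# export=True converts the PyTorch weights to an ONNX graph on the fly,
# so subsequent calls go through an ONNX Runtime inference session.
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)

inputs = tokenizer("ONNX Runtime keeps inference latency low.", return_tensors="pt")
prediction = model(**inputs).logits.argmax(dim=-1)
print(prediction)
```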
I am an avid open source contributor and advocate for ethical AI practices.