I have hands-on experience with Databricks Unity Catalog and Delta Lake, enabling secure, efficient, and organized data management. My work also encompasses AWS cloud services, data ingestion pipelines, domain-specific projects, and optimizing data workflows using SQL Server and modern data tools.
I'm passionate about solving complex data challenges and utilizing cutting-edge cloud technologies to drive impactful solutions.
⚡Portfolio 🔎LinkedIn 🌐Website 💬Twitter 📰Dev.to
- Programming Languages: Python, Java, SQL
- Cloud Platforms: AWS (Cloud Practitioner, Developer, DevOps Engineer, Solutions Architect Associate, Solutions Architect Professional), GCP (Associate Cloud Engineer)
- Tools & Technologies: Git, GitHub, Jenkins, Databricks, Kafka, Airflow, PySpark, Spark, Terraform, GenAI, Snowflake, Pandas, NumPy
✔️Certificate verification links: Cloud Practitioner · Developer · DevOps Engineer · Solutions Architect Associate · Solutions Architect Professional · Associate Cloud Engineer
My Projects as a Developer & Data Engineer at Deloitte:
- Healthcare Project
Played a pivotal role in a healthcare project, building ETL workflows with Python and SQL. Improved data integration and analysis to support data-driven decision-making while maintaining efficient data management practices.
- AWS Development Project – Diverse Cloud Services
Worked concurrently on this AWS project alongside the healthcare initiative to deepen my knowledge of cloud technologies. Showcased versatility by gaining hands-on experience with various AWS services, learning cloud architecture, deployment, and management. During this project, I achieved certifications as an AWS Certified Cloud Practitioner (CCP), Developer Associate, and Solutions Architect Associate, demonstrating my proficiency in the AWS ecosystem.
- Business Insurance Project – Migrating On-Premises Data to the Cloud
Currently contributing to this transformative project as a Data Engineer and Developer. My responsibilities include migrating on-premises data from multiple source systems to the cloud using Python, SQL, ETL processes, AWS services, Databricks Unity Catalog, Delta Lake, Kafka, and Spark. I build and manage robust, end-to-end data pipelines, using Jenkins for automation and Git for version control. Leveraging Terraform for Infrastructure as Code (IaC), I streamline cloud infrastructure deployment and management. The project integrates big data tools and technologies to enhance data processing capabilities, with Jira used for agile project management.
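The core of such a migration pipeline is the merge (upsert) step that Delta Lake's `MERGE INTO` performs: rows whose keys already exist in the target are updated, and new rows are inserted. The idea can be sketched in plain Python (illustrative only; the `id` key and record shape are hypothetical, not the project's actual schema):

```python
# Conceptual sketch of a MERGE-style upsert: update rows whose key already
# exists in the target table, insert rows whose key is new.
# The "id" key and record fields are hypothetical examples.

def merge_upsert(target, updates, key="id"):
    """Return a new table with `updates` merged into `target` by `key`."""
    merged = {row[key]: row for row in target}  # index existing rows by key
    for row in updates:
        # Update the matching row if the key exists, otherwise insert it
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

# Example: one updated insurance policy and one newly arrived policy
policies = [{"id": 1, "premium": 500}, {"id": 2, "premium": 750}]
changes = [{"id": 2, "premium": 800}, {"id": 3, "premium": 300}]
print(merge_upsert(policies, changes))
```

In the real pipeline this logic runs as a distributed Spark/Delta operation with ACID guarantees; the sketch only shows the key-matching semantics.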
Recognitions: Spot Award, Applause Award, and PACE Award (×2).
Feel free to connect with me on LinkedIn or Twitter if you have any questions, suggestions, or just want to chat.
Let's collaborate and create amazing things together!