Akhil Bonthinayanivari

About

I am a passionate Data Engineer with practical experience in building scalable data pipelines, cloud data platforms, and big data processing. My expertise includes Python, SQL, Apache Spark, ETL, and cloud solutions across AWS, Azure, and GCP. I'm eager to work on challenging data problems and transform raw data into meaningful insights.

Akhil Bonthinayanivari

Data Engineer

Strong experience in building data pipelines, cloud architecture, and data analytics platforms.

  • Degree: Master's in Data Engineering
  • City: Dallas, Texas, USA
  • Opportunities: Available

Skills

I specialize in building data pipelines, big data processing systems, cloud-based data solutions, and machine learning workflows. My expertise spans multiple cloud platforms and data engineering frameworks.

Python 95%
SQL 95%
Apache Spark 90%
AWS / Azure / GCP 90%
Airflow & dbt 85%
Snowflake / Redshift / BigQuery 85%
TensorFlow & Machine Learning 80%
Django 85%
JavaScript 90%

Resume

I am a Data Engineer with hands-on experience in building scalable data pipelines, cloud data platforms, and big data processing systems. Skilled in SQL, Python, Apache Spark, and cloud platforms including AWS, Azure, and GCP.

Professional Summary

Akhil Bonthinayanivari

Aspiring Data Engineer with hands-on experience gained through an Accenture-sponsored internship, focused on designing scalable data pipelines and converting raw data into valuable insights. Proficient in SQL, Python, and ETL processes, with practical exposure to cloud platforms like AWS and Azure. Experienced with big data technologies like Apache Spark and data warehousing solutions such as Snowflake and Redshift. Solid background in data modeling, quality assurance, and workflow automation using tools like Apache Airflow and dbt.

  • Dallas, Texas, USA
  • +1 470-940-8044
  • akhilme008@gmail.com

Education

Master's in Data Engineering

2023 - 2025

University of North Texas, Denton, TX

Bachelor's in Computer Science

2020 - 2023

Madanapalle Institute of Technology and Science

Diploma in Computer Science

2017 - 2020

SV Government Polytechnic

Professional Experience

Data Engineer Intern

Dec 2022 - Aug 2023

Accenture

  • Developed and deployed machine learning models for predictive analytics using Spark and TensorFlow.
  • Documented data architecture designs and changes to ensure knowledge transfer and system maintainability.
  • Configured and maintained cloud-based data infrastructure on platforms like AWS, Azure, and Google Cloud to enhance data storage and computation capabilities.
  • Optimized ETL pipelines and worked with large-scale datasets for enhanced analytics and reporting.

Skills

  • SQL, Python, ETL, Apache Spark, AWS, Azure, GCP, Snowflake, Redshift, BigQuery, Apache Airflow, TensorFlow, Data Modeling, Data Warehousing

Projects

  • Real time voting data engineering project - A real-time election voting system built with Python, Kafka, Spark Streaming, Postgres, and Streamlit. Docker Compose is used to spin up the required services in containers.
  • RealtimeStreamingEngineering - Data Pipeline with Reddit, Airflow, Celery, Postgres, S3, AWS Glue, Athena, and Redshift
  • Sparkify ETL Pipeline with Airflow and Redshift - This project builds a complete ETL pipeline to ingest Sparkify’s music data into an AWS Redshift data warehouse. It uses Apache Airflow for orchestration and scheduling. Data is extracted from AWS S3, transformed using custom Python scripts, and loaded into a star schema in Redshift to support analytical queries about user activity and song preferences.
  • Digital CheckIn System - A complete employee management and attendance system built with Django. The application allows employee registration, face recognition-based attendance marking, leave request management, department management, and admin dashboards for full control. Employees can update their profile, while admins can monitor attendance records, approve leave requests, and manage employee data efficiently.
  • Full-Stack Django Personal Portfolio with AWS Deployment - Designed and developed a full-stack personal portfolio website using Django as backend and Bootstrap 5, AOS, and custom JavaScript libraries for interactive frontend. Implemented complete CRUD operations for dynamic project management using Django Admin interface. Integrated media upload functionality for project images and GitHub repository linking for each project entry. Enabled real-time category-based filtering of projects using Django ORM, slugify filters, and Isotope-style filtering logic. Configured secure contact form with CSRF protection, form validation, and Django form processing. Optimized static file handling using Django Whitenoise and Nginx for efficient production-ready deployment. Fully deployed on AWS EC2 instance with Gunicorn, Nginx, systemd, and PostgreSQL database integration. Managed source code and deployment lifecycle using Git, GitHub, and SSH remote synchronization.
  • Django Tailwind Blog - Developer Portfolio Site - Created a complete developer portfolio website with Django backend and TailwindCSS frontend. Created dynamic blog, project showcase, categories, and contact form and implemented custom error handling. Built Django models, templates, and PostgreSQL database integrations. Deployed responsive web application with static/media file management.
Akhil Bonthinayanivari

Cover Letter

Date:

To:

Dear ,

I am excited to apply for the Data Engineer position at . With over one year of hands-on experience in building scalable data pipelines, transforming raw data into actionable insights, and managing cloud-based infrastructures, I am confident in my ability to contribute significantly to your team's success.

Throughout my internship at Accenture, I gained valuable experience in working with AWS, Azure, GCP, and big data technologies like Apache Spark. I was responsible for developing machine learning models, optimizing ETL pipelines, and designing cloud data architectures. Additionally, my knowledge of SQL, Python, and tools like Apache Airflow and dbt has allowed me to build robust data engineering solutions, which I believe would be an asset to your team.

Key Skills

  • Extensive expertise in AWS, Azure, and GCP platforms
  • Proven proficiency with Python, Apache Spark, and SQL
  • Strong experience with data warehousing solutions such as Snowflake and BigQuery
  • Experience with Django web development and PostgreSQL integration

Highlighted Projects

  • Real time voting data engineering project

    A real-time election voting system built with Python, Kafka, Spark Streaming, Postgres, and Streamlit. Docker Compose is used to spin up the required services in containers.

  • RealtimeStreamingEngineering

    Data Pipeline with Reddit, Airflow, Celery, Postgres, S3, AWS Glue, Athena, and Redshift

  • Sparkify ETL Pipeline with Airflow and Redshift

    This project builds a complete ETL pipeline to ingest Sparkify’s music data into an AWS Redshift data warehouse. It uses Apache Airflow for orchestration and scheduling. Data is extracted from AWS S3, transformed using custom Python scripts, and loaded into a star schema in Redshift to support analytical queries about user activity and song preferences.

  • Digital CheckIn System

    A complete employee management and attendance system built with Django. The application allows employee registration, face recognition-based attendance marking, leave request management, department management, and admin dashboards for full control. Employees can update their profile, while admins can monitor attendance records, approve leave requests, and manage employee data efficiently.

  • Full-Stack Django Personal Portfolio with AWS Deployment

    Designed and developed a full-stack personal portfolio website using Django as backend and Bootstrap 5, AOS, and custom JavaScript libraries for interactive frontend. Implemented complete CRUD operations for dynamic project management using Django Admin interface. Integrated media upload functionality for project images and GitHub repository linking for each project entry. Enabled real-time category-based filtering of projects using Django ORM, slugify filters, and Isotope-style filtering logic. Configured secure contact form with CSRF protection, form validation, and Django form processing. Optimized static file handling using Django Whitenoise and Nginx for efficient production-ready deployment. Fully deployed on AWS EC2 instance with Gunicorn, Nginx, systemd, and PostgreSQL database integration. Managed source code and deployment lifecycle using Git, GitHub, and SSH remote synchronization.

  • Django Tailwind Blog - Developer Portfolio Site

    Created a complete developer portfolio website with Django backend and TailwindCSS frontend. Created dynamic blog, project showcase, categories, and contact form and implemented custom error handling. Built Django models, templates, and PostgreSQL database integrations. Deployed responsive web application with static/media file management.

Thank you for considering my application. I look forward to the possibility of discussing how my skills and background can contribute to your team's success.

Sincerely,
Akhil Bonthinayanivari

Projects

Explore some of the key projects I've worked on, showcasing my expertise in various technologies.

Real time voting data engineering project

Python, JavaScript, Data Engineering

A real-time election voting system built with Python, Kafka, Spark Streaming, Postgres, and Streamlit. Docker Compose is used to spin up the required services in containers.
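
As a rough illustration of the streaming side of this design, the sketch below reads vote events from a Kafka topic with Spark Structured Streaming and keeps a running tally per candidate. The topic name, event fields, and broker address are placeholders rather than values from the actual repository; the full system writes results to Postgres and visualizes them in Streamlit.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StructField, StringType

    # Requires the spark-sql-kafka connector package on the Spark classpath.
    spark = SparkSession.builder.appName("realtime-voting").getOrCreate()

    # Illustrative schema for a vote event published to a "votes" topic.
    vote_schema = StructType([
        StructField("candidate", StringType()),
        StructField("voter_id", StringType()),
    ])

    votes = (spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "localhost:9092")
             .option("subscribe", "votes")
             .load()
             .select(from_json(col("value").cast("string"), vote_schema).alias("v"))
             .select("v.*"))

    # Running count of votes per candidate; a Postgres or Streamlit sink
    # would consume this aggregate in the full system.
    tally = votes.groupBy("candidate").count()

    query = (tally.writeStream
             .outputMode("complete")
             .format("console")
             .start())
    query.awaitTermination()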

RealtimeStreamingEngineering

Python, Data Engineering

Data Pipeline with Reddit, Airflow, Celery, Postgres, S3, AWS Glue, Athena, and Redshift
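
A minimal Airflow DAG for the extract-and-land step could look like the sketch below: pull posts with PRAW and write them to S3, where Glue, Athena, and Redshift pick them up downstream. The subreddit, bucket name, credentials, and Airflow 2.4+ style schedule argument are assumptions for illustration, not the project's actual configuration; Celery serves as the task executor in the full setup.

    import json
    from datetime import datetime

    import boto3
    import praw
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_reddit_to_s3(**_):
        # Credentials and subreddit are placeholders for illustration only.
        reddit = praw.Reddit(client_id="...", client_secret="...",
                             user_agent="reddit-pipeline")
        posts = [{"id": p.id, "title": p.title, "score": p.score}
                 for p in reddit.subreddit("dataengineering").hot(limit=100)]
        boto3.client("s3").put_object(
            Bucket="reddit-raw-zone",  # hypothetical bucket name
            Key=f"reddit/{datetime.utcnow():%Y-%m-%d}.json",
            Body=json.dumps(posts),
        )


    with DAG(
        dag_id="reddit_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # "schedule" assumes Airflow 2.4 or newer
        catchup=False,
    ) as dag:
        PythonOperator(task_id="extract_to_s3",
                       python_callable=extract_reddit_to_s3)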

Sparkify ETL Pipeline with Airflow and Redshift

Python, Data Engineering

This project builds a complete ETL pipeline to ingest Sparkify’s music data into an AWS Redshift data warehouse. It uses Apache Airflow for orchestration and scheduling. Data is extracted from AWS S3, transformed using custom Python scripts, and loaded into a star schema in Redshift to support analytical queries about user activity and song preferences.
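
The load step can be pictured roughly as below: stage JSON logs from S3 into Redshift with COPY, then populate the star-schema fact table from the staging tables. The bucket, IAM role, column list, and use of plain psycopg2 here are illustrative; the actual project wraps steps like these in Airflow operators.

    import psycopg2

    # Placeholder bucket and IAM role; the real pipeline parameterizes these.
    COPY_STAGING_EVENTS = """
        COPY staging_events
        FROM 's3://sparkify-logs/log_data'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS JSON 'auto';
    """

    # Populate the fact table of the star schema from the staging tables.
    INSERT_SONGPLAYS = """
        INSERT INTO songplays (start_time, user_id, song_id, artist_id, level)
        SELECT e.ts, e.user_id, s.song_id, s.artist_id, e.level
        FROM staging_events e
        JOIN staging_songs s
          ON e.song = s.title AND e.artist = s.artist_name;
    """

    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                            port=5439, dbname="sparkify",
                            user="etl_user", password="...")
    with conn, conn.cursor() as cur:
        cur.execute(COPY_STAGING_EVENTS)
        cur.execute(INSERT_SONGPLAYS)
    conn.close()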

Digital CheckIn System

Python, Django, JavaScript, Web Development, Machine Learning

A complete employee management and attendance system built with Django. The application allows employee registration, face recognition-based attendance marking, leave request management, department management, and admin dashboards for full control. Employees can update their profile, while admins can monitor attendance records, approve leave requests, and manage employee data efficiently.
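
The face-matching step behind attendance marking could be sketched as follows, assuming the face_recognition library and one stored reference photo per employee; the function and paths are hypothetical rather than taken from the app. On a successful match, a view would then record the day's attendance for that employee.

    import face_recognition


    def matches_employee(reference_photo_path, checkin_photo_path, tolerance=0.6):
        """Return True if the check-in photo matches the stored reference photo."""
        ref_image = face_recognition.load_image_file(reference_photo_path)
        new_image = face_recognition.load_image_file(checkin_photo_path)

        ref_encodings = face_recognition.face_encodings(ref_image)
        new_encodings = face_recognition.face_encodings(new_image)
        if not ref_encodings or not new_encodings:
            return False  # no face detected in one of the images

        return bool(face_recognition.compare_faces(
            [ref_encodings[0]], new_encodings[0], tolerance=tolerance
        )[0])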

Full-Stack Django Personal Portfolio with AWS Deployment

Python, Django, JavaScript, Web Development

Designed and developed a full-stack personal portfolio website using Django as backend and Bootstrap 5, AOS, and custom JavaScript libraries for interactive frontend. Implemented complete CRUD operations for dynamic project management using Django Admin interface. Integrated media upload functionality for project images and GitHub repository linking for each project entry. Enabled real-time category-based filtering of projects using Django ORM, slugify filters, and Isotope-style filtering logic. Configured secure contact form with CSRF protection, form validation, and Django form processing. Optimized static file handling using Django Whitenoise and Nginx for efficient production-ready deployment. Fully deployed on AWS EC2 instance with Gunicorn, Nginx, systemd, and PostgreSQL database integration. Managed source code and deployment lifecycle using Git, GitHub, and SSH remote synchronization.
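
The category-based filtering mentioned above can be sketched as a single Django view, assuming hypothetical Project and Category models with slug fields; the template path and model names are illustrative only.

    from django.shortcuts import render, get_object_or_404

    from .models import Category, Project  # hypothetical app models


    def project_list(request, category_slug=None):
        """List all projects, optionally narrowed to one category by slug."""
        projects = Project.objects.select_related("category").all()
        if category_slug:
            category = get_object_or_404(Category, slug=category_slug)
            projects = projects.filter(category=category)
        return render(request, "portfolio/project_list.html", {
            "projects": projects,
            "categories": Category.objects.all(),
        })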

Django Tailwind Blog - Developer Portfolio Site

Python, Django, JavaScript, Web Development

Created a complete developer portfolio website with Django backend and TailwindCSS frontend. Created dynamic blog, project showcase, categories, and contact form and implemented custom error handling. Built Django models, templates, and PostgreSQL database integrations. Deployed responsive web application with static/media file management.
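
A blog model of the kind this site revolves around might look like the sketch below; the field names are illustrative rather than copied from the project, which pairs such models with TailwindCSS templates and a PostgreSQL backend.

    from django.db import models
    from django.utils.text import slugify


    class Post(models.Model):
        """Illustrative blog post model with an auto-generated slug."""
        title = models.CharField(max_length=200)
        slug = models.SlugField(max_length=200, unique=True, blank=True)
        category = models.CharField(max_length=100, blank=True)
        body = models.TextField()
        published = models.DateTimeField(auto_now_add=True)

        def save(self, *args, **kwargs):
            if not self.slug:
                self.slug = slugify(self.title)
            super().save(*args, **kwargs)

        def __str__(self):
            return self.title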

Contact

I am always open to new opportunities, collaborations, and challenging projects. If you would like to connect, please feel free to reach out via any of the contact options below.

Location:

Dallas, Texas, USA

Call:

+1 470-940-8044

LinkedIn:

linkedin