Jaya Sai Adithya Lankela
Aspiring Data Engineer | SnowPro Core Certified | AWS Enthusiast
Let's go!
Summary
• From South India; M.S. in Computer Science at FAU, May 2025
• Passionate about data engineering, cloud integration, and automation
• Enjoy solving real-world problems with scalable data solutions
My core skills
- Languages and Querying: SQL, Python
- Cloud and Platforms:
  - AWS (S3, Lambda, Glue, EC2, DataSync, CloudWatch)
  - Snowflake (RBAC, Snowpipe, Snowpark, Streamlit, Tasks, Stored Procedures)
- Data Engineering: ETL/ELT pipelines (S3 ↔ Snowflake), incremental loads, on-prem (Isilon) integrations
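As a minimal sketch of what an S3 → Snowflake bulk load looks like, the helper below builds a Snowflake COPY INTO statement for an external stage. The table and stage names are illustrative, not taken from a real environment:

```python
def copy_into_sql(table: str, stage: str, file_format: str = "parquet") -> str:
    """Build a Snowflake COPY INTO statement that loads staged S3 files
    into a target table. Table/stage names here are hypothetical."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_format.upper()}) "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

# Example: generate the load statement for a hypothetical game-events table.
sql = copy_into_sql("raw.game_events", "s3_game_stage")
```

In practice the stage would be created once with CREATE STAGE pointing at the S3 bucket, and the generated statement run via a Snowflake connector or a Task.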
Project I'm working on
Title: Metadata-Driven ETL Pipeline from PostgreSQL to S3, supporting historical and incremental loads. Ensures a fresh, consistent, analytics-ready data lake.
Architecture Overview
- Source: PostgreSQL table resource_agent hosted on RDS.
- Target: S3 buckets (uexpertly-data-lake-dev, uexpertly-data-lake-prod).
- Processing: AWS Glue (PySpark ETL scripts).
- Orchestration: AWS Step Functions.
- Scheduling: EventBridge cron rules.
- Monitoring: CloudWatch alarms.
- CI/CD: Bitbucket Pipelines.
- Validation: Unit tests (PyTest with coverage enforcement).
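The metadata-driven historical-vs-incremental split above can be sketched as a small query builder: on the first run (no stored watermark) it emits a full-history SELECT, afterwards an incremental one. The watermark column and its handling are assumptions for illustration; the source table name comes from the overview:

```python
from typing import Optional

def build_extract_query(table: str, watermark_col: str,
                        last_value: Optional[str]) -> str:
    """Return a full-history SELECT on the first run (no watermark yet),
    or an incremental SELECT filtered on the watermark column.
    NOTE: plain string interpolation is for illustration only; a real
    Glue job should use parameterized queries."""
    base = f"SELECT * FROM {table}"
    if last_value is None:
        return base  # historical (initial) load
    return f"{base} WHERE {watermark_col} > '{last_value}'"  # incremental load

# Example against the resource_agent source table; 'updated_at' is assumed.
full_q = build_extract_query("resource_agent", "updated_at", None)
incr_q = build_extract_query("resource_agent", "updated_at", "2025-01-01")
```

In the pipeline, the stored watermark would live in a metadata table or S3 object updated by the Glue job after each successful run.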
My recent Project
• Extract: Game data stored in AWS S3
• Load: Ingested into Snowflake for processing
• Transform: Cleaned, filtered, and reshaped using SQL in Snowflake
• Incremental Load: Transformed updates pushed back to S3 in an efficient format
Outcome
• Enabled automated refresh of analytics-ready data
• Reduced data duplication with incremental logic
• Created a scalable pipeline for future ingestion
Built a Snowpipe to process game telemetry data for downstream analytics.
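The incremental-load logic that reduces duplication can be sketched in plain Python: keep only records newer than the last stored watermark, then advance it. Field names (`updated_at`) are assumptions for illustration:

```python
from datetime import datetime

def incremental_batch(records, last_loaded_at):
    """Filter out already-loaded records using a timestamp watermark,
    and return the fresh batch plus the new watermark value."""
    fresh = [r for r in records if r["updated_at"] > last_loaded_at]
    new_watermark = max((r["updated_at"] for r in fresh),
                        default=last_loaded_at)
    return fresh, new_watermark

# Example: only the record newer than the watermark survives.
records = [
    {"id": 1, "updated_at": datetime(2025, 1, 1)},
    {"id": 2, "updated_at": datetime(2025, 1, 3)},
]
fresh, watermark = incremental_batch(records, datetime(2025, 1, 2))
```

The same filter expressed in Snowflake SQL is a WHERE clause on the watermark column, with the result unloaded back to S3 via COPY INTO a stage.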
Tools I’m Learning Hands-On
- Git versioning, branching strategies, PRs, code reviews
- AWS Step Functions for workflow orchestration
What I'm Learning Right Now
My Learning Roadmap (Next Steps)
Databricks
- Non-primary: Informatica; local DE stack:
  - SSIS (Integration)
  - SSRS (Reporting)
  - SSAS (Analytics Cubes)
Deeper dive into Snowflake automation tools (Tasks, Streams, Snowpipe)
CI/CD
- GitHub Actions
- Terraform for IaC
- Automated unit testing in pipelines
- dbt (for analytics engineering, version-controlled SQL transformations)
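Automated unit testing in pipelines, as listed above, can be as simple as a PyTest file that exercises a transform function and runs under coverage enforcement in CI. The transform and field names here are made up for illustration:

```python
# test_transform.py -- run with: pytest --cov
def clean_scores(rows):
    """Drop rows with a missing or empty score and cast the rest to int."""
    return [int(r["score"]) for r in rows if r.get("score") not in (None, "")]

def test_clean_scores_drops_missing():
    rows = [{"score": "10"}, {"score": None}, {"score": ""}, {"score": "7"}]
    assert clean_scores(rows) == [10, 7]
```

In Bitbucket Pipelines or GitHub Actions, the same `pytest --cov` invocation becomes a build step that fails the pipeline when tests or the coverage threshold fail.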
Thank you
Created on October 14, 2025