n8n, the fair-code workflow automation platform, has transformed how developers and businesses automate their processes. While self-hosting n8n typically requires dedicated infrastruc...
Terraform for Data Engineers: Automating Your Data Infrastructure
As data engineers, we’re all too familiar with the pain of manually provisioning data processing resources, dealing with inconsistent environments, and the nightmare of trying to recreate a failed...
Building a Production-Ready Data Pipeline with Airflow and dbt
In today’s data-driven business landscape, transforming raw operational data into actionable insights requires robust, scalable data pipelines. I recently designed and implemented a comprehensive d...
Building a Scalable ETL Pipeline for AdTech Analytics
In the world of digital advertising, data is everything. Transforming raw operational data into actionable insights requires robust analytics pipelines. Recently, I implemented a comprehensive ETL ...
Takeaway: Unit Testing of Bash Scripts
Bash/Shell is a powerful scripting language that lets us interact with our computer’s operating system. In many of my everyday tasks, I rely on Bash to carry out Linux commands and create certain ...
Learning and Takeaways from Kubesimplify Workshop
This post captures my learning process and takeaways from the Kubesimplify workshops held by Saiyam Pathak. My motivation for catching up on this DevOps topic is to systematically learn by...
- Terraform for Data Engineers: Automating Your Data Infrastructure
- How to train a customized Named Entity Recognition (NER) model based on a spaCy pre-trained model
- Run your own Apache Spark jobs in AWS EMR and S3
- Implementing Fivetran Data Source Connector with AWS Lambda
- Learning and Takeaways from Kubesimplify Workshop