In 2010, I revisited the bustling Sunday Sabji Bazaar in Sarni, Betul, Madhya Pradesh, with my father, reliving cherished childhood memories of our weekend outings. The market was a lively scene of colorful vegetables, fresh greens, and the rich aroma of spices, showcasing the region’s agricultural charm. More than a shopping destination, it was a communal…
Most of my childhood was spent at the foothills of Mathar Dev, in the MPEB colony in Sarni, Madhya Pradesh. In 2009, during a journey of self-exploration, I had the opportunity to once again trek the Mathar Dev Satpura hills on a rainy day—an experience that remains one of my most cherished memories. During my trek, I met Satish…
Automating repetitive tasks is key to modern software development. Continuous Integration and Continuous Deployment/Delivery (CI/CD) pipelines streamline workflows, ensure code quality, and accelerate deployments. Python, known for its versatility and extensive library support, is an excellent choice for integrating text processing tasks into CI/CD pipelines. In my two decades in the tech world, I haven’t…
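As a taste of the kind of text-processing step that post describes, here is a minimal, hypothetical sketch of a Python check a CI job could run; the function names and the trailing-whitespace rule are illustrative assumptions, not the post’s actual code.

```python
def find_trailing_whitespace(text: str) -> list[int]:
    """Return 1-based line numbers that end in stray spaces or tabs."""
    return [i for i, line in enumerate(text.splitlines(), start=1)
            if line != line.rstrip()]

def check_files(paths: list[str]) -> int:
    """Hypothetical CI entry point: print offenders, return an exit code."""
    failed = False
    for path in paths:
        with open(path, encoding="utf-8") as fh:
            for lineno in find_trailing_whitespace(fh.read()):
                print(f"{path}:{lineno}: trailing whitespace")
                failed = True
    return 1 if failed else 0  # a non-zero exit fails the pipeline stage
```

A pipeline step could then call something like `sys.exit(check_files(sys.argv[1:]))` so that any flagged line fails the build.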
Automating workflows is essential for modern software development. Continuous Integration and Continuous Deployment/Delivery (CI/CD) pipelines enable teams to integrate, test, and deploy code efficiently. While PHP is a popular language for web development, it can play a vital role in automating CI/CD processes, including linting, testing, deployment, and database migrations. For over 20 years, I’ve…
Legacy datasets often bring unique challenges, especially when dealing with mixed or unknown encodings. Encoding errors can corrupt text, create unreadable characters, or cause application crashes. Detecting and fixing these issues is crucial for maintaining data integrity and usability. In my 20-year tech career, I’ve been a catalyst for innovation, architecting scalable solutions that lead…
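To illustrate the detection problem that post tackles, here is a stdlib-only sketch that tries candidate encodings in order; the candidate list is an assumption for illustration (real pipelines might use a detection library instead), and `latin-1` decodes any byte sequence, so it acts as a last-resort fallback.

```python
# Most specific first; latin-1 never raises, so it catches everything else.
CANDIDATES = ["utf-8", "cp1252", "latin-1"]

def sniff_and_decode(raw: bytes) -> tuple[str, str]:
    """Return (decoded text, encoding that worked) for a legacy record."""
    for enc in CANDIDATES:
        try:
            return raw.decode(enc), enc
        except UnicodeDecodeError:
            continue
    # Unreachable with latin-1 in the list, but kept as a safety net.
    return raw.decode("utf-8", errors="replace"), "utf-8 (lossy)"
```

Trying strict encodings before permissive ones matters: a permissive codec will happily produce mojibake from bytes it was never meant to decode.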
Automating file tasks such as text replacement, backups, and file processing is essential for improving efficiency and reducing errors. Python and shell scripts are two popular tools for file automation, but choosing the right one depends on the complexity of your task, the environment, and your familiarity with the tool. For over two decades, I’ve…
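As a small illustration of the Python-versus-shell trade-off that post compares, here is a hedged sketch of a Python equivalent of the common `sed -i.bak 's/old/new/g' file` idiom; the function names are hypothetical.

```python
import shutil
from pathlib import Path

def replace_text(text: str, old: str, new: str) -> tuple[str, int]:
    """Replace every occurrence and report how many were changed."""
    return text.replace(old, new), text.count(old)

def replace_in_file(path: str, old: str, new: str) -> int:
    """Roughly what `sed -i.bak 's/old/new/g' file` does in a shell
    script: keep a .bak backup, then rewrite the file in place."""
    p = Path(path)
    shutil.copy2(p, p.with_suffix(p.suffix + ".bak"))  # keep a backup
    updated, count = replace_text(p.read_text(encoding="utf-8"), old, new)
    p.write_text(updated, encoding="utf-8")
    return count
```

The shell one-liner is shorter; the Python version buys you a return count, explicit encodings, and easier unit testing of the pure `replace_text` part.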
In 2013, I embarked on an unforgettable bike journey to the Maha Kumbh Mela at Prayagraj, a once-in-12-years spiritual gathering at the confluence of the Ganga, Yamuna, and Saraswati rivers. Riding my trusted Bajaj Avenger, I covered 700+ km from Gurgaon, stopping briefly in Noida and Kanpur before reaching Prayagraj. The sight of illuminated pandals,…
In 2013, I embarked on a once-in-a-lifetime journey to the Maha Kumbh Mela, an extraordinary spiritual event held every 12 years at the Sangam—the confluence of the Ganga, Yamuna, and the mythical Saraswati rivers at Prayagraj. My nomadic yearly ritual of taking 1-2 weeks off for bike adventures had already taken me to various corners…
Managing encoded data in files is a frequent challenge, especially when dealing with XML, JSON, or other structured file types. URL-encoded characters like %20 (for spaces) or %3F (for question marks) can make data unreadable and difficult to process. Python provides a seamless way to handle these issues by decoding URL-encoded characters and replacing specific text efficiently. Two decades…
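The decoding workflow that excerpt describes can be sketched with Python’s standard `urllib.parse.unquote`; the recursive helper below is an illustrative assumption, not the post’s actual code.

```python
import json
from urllib.parse import unquote

def decode_values(obj):
    """Recursively decode URL-encoded strings inside parsed JSON data."""
    if isinstance(obj, str):
        return unquote(obj)          # %20 -> space, %3F -> ?, etc.
    if isinstance(obj, list):
        return [decode_values(v) for v in obj]
    if isinstance(obj, dict):
        return {k: decode_values(v) for k, v in obj.items()}
    return obj                       # numbers, booleans, None pass through

raw = '{"query": "what%3F", "path": "my%20file.txt"}'
print(decode_values(json.loads(raw)))
# → {'query': 'what?', 'path': 'my file.txt'}
```

The same helper works on any nesting depth, which is what makes it handy for structured files rather than one-off string fixes.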
Replacing data within files on Linux platforms like Ubuntu is a common task for system administrators, developers, and anyone who frequently works with large files. Whether you’re cleaning up unwanted data, fixing typos, modifying configuration files, or handling encoded data, knowing how to replace data in files efficiently is crucial on servers. For…
In today’s data-driven world, machine learning (ML) plays a crucial role in extracting valuable insights from massive datasets. Often, this data resides in Hadoop Distributed File System (HDFS) and is queried and processed using Apache Hive. I’ve spent ~20 years in the tech industry, working alongside organisations to navigate the complexities of technological change. I…
In the era of big data, machine learning (ML) drives innovation. Vast data volumes demand robust processing frameworks. Hadoop, with its distributed computing and storage capabilities, empowers ML workflows on massive datasets. For over two decades, I’ve been igniting change and delivering scalable tech solutions that elevate organizations to new heights. My expertise transforms challenges into…
Databases are at the core of modern applications, powering everything from small blogs to large-scale enterprise systems. Two primary database types dominate the landscape: SQL (Structured Query Language) and NoSQL (Not Only SQL). Each has its strengths, weaknesses, and ideal use cases. For over two decades, I’ve been at the forefront of the tech industry, championing innovation, delivering…
In the world of data processing and analytics, schemas define the structure, relationships, and constraints of the data. Two paradigms dominate this landscape: Schema-on-read and Schema-on-write. These approaches are critical to how data is ingested, stored, and queried, and their application can significantly affect performance, flexibility, and usability in various scenarios. Over two decades in the tech corporate…
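The two paradigms that excerpt contrasts can be sketched in a few lines of Python; the toy schema and function names below are hypothetical illustrations of the idea, not production code.

```python
def write_with_schema(store: list, record: dict) -> None:
    """Schema-on-write: validate and shape the record before storing it."""
    required = {"id": int, "name": str}  # hypothetical fixed schema
    shaped = {}
    for field, ftype in required.items():
        if not isinstance(record.get(field), ftype):
            raise ValueError(f"bad or missing field: {field!r}")
        shaped[field] = record[field]
    store.append(shaped)  # extra fields are dropped at write time

def read_with_schema(store: list, field: str) -> list:
    """Schema-on-read: store anything, interpret only while querying."""
    return [rec.get(field) for rec in store]
```

The trade-off shows up even in this toy: the write path rejects malformed data early but is rigid, while the read path accepts anything and pushes the cost (missing fields, type surprises) onto every query.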
As businesses collect increasing amounts of data, the challenge of storing and managing it efficiently grows. Data lakes and data warehouses have become essential for modern data strategies, providing organizations with robust solutions to process and analyze their data. While both serve critical roles, their design, functionality, and use cases differ. For over two decades,…
As data continues to grow at an exponential rate, businesses face the challenge of efficiently storing and analyzing diverse datasets. Data lakes and data warehouses have become essential components of modern data architectures, and technologies like Hadoop and NoSQL play a pivotal role in their implementation. Over two decades in the tech world, I have spearheaded groundbreaking innovations, engineered scalable…
Real-time data streaming is transforming how businesses process and analyze information. With technologies like Apache Kafka, Hadoop, and NoSQL databases, you can build powerful, scalable systems to handle real-time data streams. With 20 years of experience driving tech excellence, I’ve redefined what’s possible for organizations, unlocking innovation and building solutions that scale effortlessly. My guidance…
Web crawling frameworks have revolutionized how we collect data from websites, making the process faster and more efficient. However, choosing the right framework depends on your specific needs, including website complexity, data format, and interactivity. For over two decades, I’ve been igniting change and delivering scalable tech solutions that elevate organizations to new heights. My…