India-based logistics technology platform Shiprocket has selected event streaming platform Confluent to help it overcome communication challenges between services within the organisation, simplify service development and management, and handle the high data throughput the company requires.
Shiprocket operates in the post-order space for direct-to-consumer e-commerce transactions, with multiple services involved in the workflow.
Shiprocket’s chief technology officer, Sunil Kumar, told iTnews Asia that some services are vertically self-sufficient, meaning they have their own source code and business logic, while some services depend on events produced by other services.
For instance, the tracking service, which monitors the progress of shipments, interacts with courier partner systems via webhooks or APIs to fetch shipment statuses, Kumar said.
He added that the tracking service needs information from the “core system”, where the shipment is manifested, to track the AWB (airway bill) and other details.
Once the status changes, the tracking service must relay this information back to the parent service.
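The flow Kumar describes – consume status updates from courier partners, then relay only genuine changes back to the parent service – can be sketched as a small event-driven pipeline. This is a minimal illustration, not Shiprocket's implementation: the event schema, the use of in-memory queues in place of Kafka topics, and the helper names are all assumptions for the sketch.

```python
from dataclasses import dataclass
from queue import SimpleQueue

# Hypothetical event schema: an AWB (airway bill) number plus its latest status.
@dataclass
class StatusEvent:
    awb: str
    status: str

# In-memory queues stand in for Kafka topics in this sketch.
courier_updates: SimpleQueue = SimpleQueue()    # fed by courier webhooks/APIs
core_notifications: SimpleQueue = SimpleQueue() # consumed by the parent "core" service

def tracking_service(known_statuses: dict) -> None:
    """Consume courier updates; relay only genuine status changes upstream."""
    while not courier_updates.empty():
        event = courier_updates.get()
        if known_statuses.get(event.awb) != event.status:
            known_statuses[event.awb] = event.status
            core_notifications.put(event)  # relay the change to the parent service

# Simulated courier webhook payloads for one shipment.
courier_updates.put(StatusEvent("AWB123", "picked_up"))
courier_updates.put(StatusEvent("AWB123", "picked_up"))   # duplicate, no change
courier_updates.put(StatusEvent("AWB123", "in_transit"))

state: dict = {}
tracking_service(state)
print(core_notifications.qsize())  # 2 genuine status changes relayed
```

The design point is that the tracking service never calls the parent service directly; it only publishes changes to a shared channel, which is what makes swapping the channel for a Kafka topic straightforward.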
Kumar said with the expansion of the company’s offerings and addition of new services, the system became more complex, and communication between services became a challenge.
The team deals with millions of AWBs and frequent status changes, and it faced latency issues when using APIs and RabbitMQ (message-queueing software) to manage communication between services.
Kumar noted that the system was working but showing signs of stress.
The team started using Kafka and, in the process, learned about ksqlDB from Confluent – a database to help developers create stream processing applications on top of Apache Kafka.
He said this has helped Shiprocket reduce dependencies on the core application and API frameworks, making the development process and management of services easier.
Now, introducing new services or breaking down larger components into smaller ones is less complicated, as the focus is on business logic segregation and event-based design, Kumar said.
Decoupling of services
Despite dealing with a huge volume of transactional data – roughly 500 GB of data points ingested daily – the solution was working well, with the tracking service handling millions of shipments every five minutes, he added.
Decoupling services in the company’s event-driven architecture allows it to scale components individually, providing a flexible system.
Kumar said each service can focus on its specialised business logic.
Also, while the company deals with a large amount of event data, ksqlDB has addressed the challenge of effectively identifying and managing relevant events, said Kumar.
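The filtering Kumar refers to – picking the relevant events out of a large stream – is the kind of operation ksqlDB expresses declaratively over Kafka topics. As a rough sketch, with hypothetical stream and column names rather than Shiprocket's actual statements, the idea looks like this:

```python
# In ksqlDB, this filter would be written declaratively over a stream, e.g.
# (hypothetical stream/column names):
#
#   CREATE STREAM delivered_shipments AS
#     SELECT awb, status FROM shipment_events WHERE status = 'delivered';
#
# The equivalent idea over a plain Python batch of events:

events = [
    {"awb": "AWB001", "status": "in_transit"},
    {"awb": "AWB002", "status": "delivered"},
    {"awb": "AWB003", "status": "delivered"},
]

# Keep only the events a downstream service actually cares about.
relevant = [e for e in events if e["status"] == "delivered"]
print([e["awb"] for e in relevant])  # ['AWB002', 'AWB003']
```

The difference in practice is that ksqlDB applies such a filter continuously to events as they arrive, rather than to a finished batch.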
Another advantage is the almost “non-existent lag between multiple services,” he added.
The high throughput Shiprocket achieves ensures that communication and notification endpoints receive data instantly, which is necessary for the business to operate efficiently.
Kumar said the company had built its own data lake, a central repository that consolidates and stores diverse datasets for data utilisation.
He added the team has also implemented solutions from Snowflake to improve the efficiency of the company’s data processing capabilities.