Equinix API Kafka Client Feature
The internal AOS portal is a centralized interface that interacts with backend services, configuration services, databases, and monitoring systems to manage and observe service workflows.
This feature in the internal portal enables authorized users to publish customer service provisioning messages directly to Kafka.
Confidentiality Note: Due to organizational data privacy and security requirements, actual screenshots are not publicly shareable. The visuals presented here are illustrative samples reflecting the implemented functionality.


The Problem: Higher Time To Resolve
Resolution takes longer because users and developers must perform manual steps to publish customer service provisioning messages to Kafka.

If the Equinix platform layer fails to publish customer service provisioning messages to the Kafka bus (for AOS consumption), users in the AOS layer cannot simply retry those requests. In order to replay those messages into the Kafka bus, users must perform manual steps.
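To make the manual path concrete, the sketch below shows roughly what a hand-rolled replay involves. It is illustrative only and assumes the kafka-python client; the broker address, topic name, payload fields, and header names are hypothetical stand-ins, not actual Equinix values.

# Illustrative sketch of the old manual replay path. All names here
# (broker, topic, payload fields, header names) are hypothetical.
import json
from kafka import KafkaProducer  # kafka-python client

# The developer had to know the right broker and topic for the environment.
producer = KafkaProducer(
    bootstrap_servers="kafka-broker.example.internal:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# The payload, key, and headers had to be hand-assembled; a missing field
# or a misspelled header name silently broke downstream AOS consumption.
message = {"orderId": "ORD-12345", "serviceType": "NETWORK_EDGE", "action": "PROVISION"}
headers = [("eventType", b"SERVICE_PROVISIONING"), ("source", b"manual-replay")]

producer.send(
    "customer-service-provisioning",  # hypothetical topic name
    key=b"ORD-12345",
    value=message,
    headers=headers,
)
producer.flush()

Every one of those hand-typed details is a chance to get the replay wrong, which is exactly what the challenges below describe.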
Challenge
- Time-Consuming: resolving a single failed request takes an average of 30 minutes of developer time.
- Error-Prone: the manual process is easy to get wrong:
  - Logging into the wrong broker/topic
  - Permission or security issues
  - Missing provisioning details
  - Incorrect header names
My Role
- Identified a critical operational gap where failed customer service provisioning requests could not be retried without manual Kafka publishing.
- Collaborated with users and developers to understand their pain points and operational challenges, and defined the core requirements for the feature.
- Analyzed the existing microservices and event-driven system flow end-to-end to identify the correct integration points and constraints within the architecture.
- Designed and developed an end-to-end UI-driven workflow that enables authorized users to publish provisioning messages directly to Kafka, eliminating the need to manually identify brokers, access Kafka hosts, or use CLI tools.
- Implemented the supporting backend Kafka producer API to validate and transmit provisioning messages in a format compatible with downstream AOS consumers (a simplified sketch follows this list).
- Added structured logging across the workflow to improve operational visibility and support efficient troubleshooting.
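The sketch below shows, at a high level, the kind of validated publish endpoint described in the list above. It is a minimal illustration assuming FastAPI and kafka-python; the route, request fields, topic, and header names are assumptions made for this example, not the production implementation.

# Minimal sketch of a validated Kafka publish endpoint. The route,
# request fields, topic, and header names are illustrative assumptions.
import json
import logging

from fastapi import FastAPI, HTTPException
from kafka import KafkaProducer
from pydantic import BaseModel, Field

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("provisioning-replay")
app = FastAPI()

producer = KafkaProducer(
    bootstrap_servers="kafka-broker.example.internal:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

class ProvisioningMessage(BaseModel):
    # Required fields are validated up front, so incomplete payloads
    # are rejected before anything reaches the Kafka bus.
    order_id: str = Field(min_length=1)
    service_type: str = Field(min_length=1)
    action: str = Field(min_length=1)

@app.post("/replay/provisioning")
def replay_provisioning(msg: ProvisioningMessage):
    try:
        # Topic, key, and header names are fixed by the service, removing
        # the wrong-topic / wrong-header failure modes of the manual path.
        future = producer.send(
            "customer-service-provisioning",
            key=msg.order_id.encode("utf-8"),
            value=msg.dict(),
            headers=[("eventType", b"SERVICE_PROVISIONING")],
        )
        metadata = future.get(timeout=10)  # wait for the broker to acknowledge
    except Exception:
        log.exception("replay publish failed for order_id=%s", msg.order_id)
        raise HTTPException(status_code=502, detail="Kafka publish failed")
    log.info("replay published order_id=%s partition=%s offset=%s",
             msg.order_id, metadata.partition, metadata.offset)
    return {"status": "published", "partition": metadata.partition, "offset": metadata.offset}

Keeping validation, topic selection, and header construction inside the service is what lets the UI remain a simple guided form.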
Impact & Metrics
- Higher Developer Productivity: time per replay cut from an average of 30 min to about 5 min, an ~83% reduction.
- No SSH/CLI Steps: messages are produced directly from the AOS-UI portal.
- Quicker Troubleshooting: lower time-to-resolve, improving the customer experience.
- Fewer Errors & Data Discrepancies: the guided form prevents publishing to the wrong broker/topic, permission issues, missing provisioning fields, and incorrect header names.
Key Learnings
- User-Driven System Design: Learned how to translate real operational pain points into a safe, self-service internal tool that reduces manual intervention and improves recovery time.
- Event-Driven Architecture: Gained hands-on experience designing and producing Kafka messages with correct keys, headers, and metadata to ensure reliable downstream processing.
- API & Microservices Development: Built backend APIs as a dedicated microservice to validate, construct, and publish Kafka messages in a controlled and scalable way.
- Operational Readiness: Learned to design for observability and debuggability from day one, enabling faster issue resolution without ad-hoc debugging (a small sketch of such a structured log event follows this list).
- Infrastructure Fundamentals: Strengthened practical understanding of Docker, Kafka, APIs, and microservices through end-to-end implementation.
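As a rough illustration of the observability point above, the snippet below shows one way to emit a structured, searchable log event for each publish attempt. The field names and logger setup are assumptions for this sketch, not the actual implementation.

# Illustrative structured-logging sketch; field names are assumptions.
import json
import logging

class JsonFormatter(logging.Formatter):
    # Render each record as a single JSON object so logs can be filtered
    # by stable fields (order_id, topic, status) during troubleshooting.
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            "order_id": getattr(record, "order_id", None),
            "topic": getattr(record, "topic", None),
            "status": getattr(record, "status", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("provisioning-replay")
log.addHandler(handler)
log.setLevel(logging.INFO)

# One structured event per publish attempt, e.g.:
# {"level": "INFO", "message": "replay published", "order_id": "ORD-12345", ...}
log.info("replay published", extra={
    "order_id": "ORD-12345",
    "topic": "customer-service-provisioning",
    "status": "success",
})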