Data Streaming Architect

We are seeking a highly skilled Data Streaming Architect with expertise in Google Cloud Platform (GCP) to design and implement real-time data streaming solutions. The ideal candidate will have a deep understanding of event-driven architectures, streaming technologies, and cloud-based data platforms, and will ensure the performance, scalability, and reliability of enterprise data streaming solutions.

Key Responsibilities:

  • Design and implement real-time data streaming architectures using GCP services and complementary technologies.
  • Develop solutions leveraging Apache Kafka, Cassandra, GCP, and other real-time data streaming technologies.
  • Build and maintain real-time streaming data pipelines that support network performance monitoring and customer insights.
  • Design and implement cloud data warehouse (DWH) architectures for large-scale data ingestion, preferably on GCP BigQuery or comparable cloud data warehousing solutions.
  • Architect, deploy, and optimize event-driven and microservices-based systems.
  • Define and enforce best practices for data ingestion, transformation, and streaming pipeline optimization.
  • Ensure high availability, fault tolerance, and security of data streaming solutions.
  • Collaborate with data engineers, cloud architects, DevOps teams, and business stakeholders to align technical strategies with business needs.
  • Implement monitoring and alerting mechanisms to proactively detect and address issues in streaming pipelines.
  • Evaluate and recommend emerging technologies to enhance data streaming capabilities.


Required Qualifications:

  • 12+ years of experience in data architecture, cloud computing, and real-time data processing.
  • Hands-on experience with Apache Kafka (including Confluent Platform), Cassandra, and related streaming technologies.
  • Strong expertise in GCP.
  • Experience building real-time services with GCP products such as Pub/Sub, Cloud Functions, Datastore, and Cloud Spanner.
  • Experience with message queues (e.g., RabbitMQ) and event-driven patterns.
  • Hands-on experience with data serialization formats (e.g., Avro, Parquet, JSON) and schema registries.
  • Strong understanding of DevOps and CI/CD pipelines for data streaming solutions.
  • Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
  • Excellent communication and leadership skills, with experience collaborating across technical and business teams.
  • Good knowledge of programming languages such as Python, Java, or Scala for real-time data processing.