Principal Data Platform Infrastructure Engineer

ZoomInfo – Ra'anana, Israel

Job description

ZoomInfo is where careers accelerate. We move fast, think boldly, and empower you to do the best work of your life. You’ll be surrounded by teammates who care deeply, challenge each other, and celebrate wins. With tools that amplify your impact and a culture that backs your ambition, you won’t just contribute; you’ll make things happen, fast.

About the role:
ZoomInfo is looking for a Principal Data Platform Infrastructure Engineer to design and scale our next-generation, self-service, AI-ready enterprise data platform.

This role is at the intersection of data infrastructure and AI, powering LLM chatbots, intelligent agents, and advanced analytics at enterprise scale.

You’ll drive architecture across the data platform, semantic layer, Model Context Protocol (MCP) services, and governance automation, ensuring consistent, reliable, and high-quality outcomes.


What you’ll do:

  • Lead the architecture of an AI-first enterprise data platform
  • Build and scale MCP-driven tooling, context fabric, and semantic contracts
  • Automate governance, data quality, lineage, and policy-as-code frameworks
  • Implement PII classification/redaction and multi-tenant context scoping
  • Ensure reliability, observability, performance, and cost efficiency across the platform
  • Partner with Data Engineering, Analytics, and ML/AI teams on access patterns and evaluation standards
  • Drive proofs of concept (POCs) and platform innovation initiatives
  • Establish best practices in Infrastructure-as-Code, CI/CD, and automated policy enforcement

Basic qualifications:

  • 12+ years designing and building large-scale data platforms and distributed cloud systems
  • 1+ years of experience with LLMs, RAG, search, or information retrieval systems
  • Bachelor’s degree in Computer Science, Engineering, or equivalent experience
  • Advanced SQL skills and data warehouse expertise (Snowflake, BigQuery, Redshift)
  • Strong programming skills in Python; experience with microservices and data services
  • Experience setting up and abstracting workflow orchestration (Airflow) and data modeling (dbt) systems
  • Expertise designing batch/streaming systems: Kafka, Spark, Beam, Dataflow, EMR
  • Proven background in metadata systems & governance: data contracts, catalogs, OpenLineage
  • Infrastructure experience: Kubernetes, serverless (Lambda/Cloud Functions), Terraform, CI/CD (GitHub Actions, Jenkins)
  • Strong foundation in data security & privacy: IAM, secrets management, encryption, tenant isolation
  • Expertise in MCP tool servers/clients, capability registries, and observability patterns
  • Knowledge of lakehouse formats (Delta Lake, Iceberg, Hudi)
  • Familiarity with FinOps for AI: cost controls, token budgeting, caching strategies

Required skill profession: Computer Occupations