Senior Staff Software Engineer, Data Pipeline - HBO Max

Seattle, WA
Full Time
Requisition ID: 173713BR

Click HERE to Quick Apply.

Build the Future of Entertainment

HBO Max is WarnerMedia’s direct-to-consumer (DTC) offering debuting in May 2020. With over 10,000 hours of curated premium content anticipated at launch, HBO Max will offer powerhouse programming for everyone in the home, bringing together HBO, a robust slate of new original series, key third-party licensed programs and movies, and fan favorites from WarnerMedia’s rich library, including Warner Bros., New Line, DC, CNN, TNT, TBS, truTV, Turner Classic Movies, Cartoon Network, Adult Swim, Crunchyroll, Rooster Teeth, Looney Tunes, and more.

Our DTC Technology team is laying the foundation for HBO Max, and we need highly motivated, uniquely skilled, and technically agile experts across a variety of disciplines to accomplish things that have never been done before.

The Job

We are the team that provides the platform enabling near-real-time targeted customer experiences and dynamic business decisions. Our current data processing infrastructure is ready for an upgrade, and we are looking for data-minded developers to design and implement the second version of our event processing pipeline, supporting both batch and stream processing scenarios.

We enable a data-driven culture by building systems that process datasets describing customer behavior for our OTT streaming video applications, including the upcoming HBO Max service. We collect, transform, and assemble data from access logs, services, web and native applications, and authoritative stores to create a unified view of the customer experience for WarnerMedia streaming video applications across web, mobile, gaming consoles, and connected screens. You will work as part of a cross-functional team of software developers and data engineers to produce high-quality data that truthfully represents the online behavior of WarnerMedia's customers.

The Daily
  • Build a large, distributed, self-service data platform that scales seamlessly to handle billions of events per minute
  • Design and develop abstraction layers that make real-time stream processing available to broader sets of users
  • Use your experience to make smart engineering decisions and own critical areas of a product used by millions of customers
  • Work in multiple areas of the product, both on the client-side and server-side
  • Lead the design, implementation, and delivery of large engineering projects that interact with multiple teams
  • Influence technology, process, standards, and best practices, both on this team and across the organization
  • Define architectures and designs that enable working through others
  • Ideate and innovate on complex software solutions
  • Be a team resource, growing other engineers and defining team software standards and engineering best practices
  • Hold yourself and your team to high standards while maintaining friendly, respectful relationships
The Essentials
  • Have a deep understanding of software data structures and algorithms
  • Have the ability to design and deliver complex architectures and distributed systems, including working through other engineers to deliver the end solution
  • Have the ability to describe highly technical concepts to non-technical audiences in a way they can understand
  • Be passionate about software engineering with a strong sense of ownership for the code that you and your team write
  • Seek self-improvement and adapt readily to suggestions and feedback
  • BS degree in Computer Science, or a related field with 7+ years of relevant experience
  • 5+ years of professional development experience using languages such as TypeScript, Java, or C#
The Nice to Haves
  • Experience building telemetry systems and data pipelines to stream and process datasets at low latency (Apache Spark, Apache Kafka, Amazon Kinesis, Apache Beam)
  • Experience building workflow orchestration and automation systems (Amazon Simple Workflow, Kubernetes-native workflows, Airflow) to drive insights, programmatic recommendations, and marketing automation
  • Experience with continuous ETL sourced from databases (Cassandra, Postgres, Redis) and third-party system APIs
  • Experience with data vault integration and visualization tools such as Tableau, Periscope, Amplitude, and Snowflake
  • Success in delivering and operating reliable, low-latency services in a 24x7 environment
  • Full-stack industry experience, or hands-on familiarity with both client and server technology
The Perks
  • HBO exclusive events
  • Paid time off to volunteer
  • Access to well-being tools, resources, and freebies
  • Access to in-house learning and development resources
  • Part of the WarnerMedia family of powerhouse brands