Lead Software Engineer, Data Platforms
Posted 2025-04-06
Job Posting Title:
Lead Software Engineer, Data Platforms
Req ID:
10092611
Job Description:
Disney Entertainment & ESPN Technology
On any given day at Disney Entertainment & ESPN Technology, we're reimagining ways to create magical viewing experiences for the world's most beloved stories while also transforming Disney's media business for the future. Whether that's evolving our streaming and digital products in new and immersive ways, powering worldwide advertising and distribution to maximize flexibility and efficiency, or delivering Disney's unmatched entertainment and sports content, every day is a moment to make a difference to partners and to hundreds of millions of people around the world.
A few reasons why we think you'd love working for Disney Entertainment & ESPN Technology:
 Building the future of Disney's media business: DE&E Technologists are designing and building the infrastructure that will power Disney's media, advertising, and distribution businesses for years to come.
 Reach & Scale: The products and platforms this group builds and operates delight millions of consumers every minute of every day, from Disney+ and Hulu, to ABC News and Entertainment, to ESPN and ESPN+, and much more.
 Innovation: We develop and execute groundbreaking products and techniques that shape industry norms and enhance how audiences experience sports, entertainment & news.
The Product & Data Engineering team is responsible for end-to-end development of Disney's world-class consumer-facing products, including the streaming platforms Disney+, Hulu, and ESPN+, and digital products & experiences across ESPN, Marvel, Disney Studios, NatGeo, and ABC News. The team drives innovation at scale for millions of consumers around the world across Apple, Android, Smart TVs, game consoles, and the web, with our platforms powering core experiences like personalization, search, messaging, and data.
About The Role
The Data Platforms Team, a segment of the Disney Entertainment & ESPN Technology (DEET) organization, is looking for a Lead Software Engineer to join our Big Data Applications team. Big Data technology is critical to improving products in the DEET portfolio because it allows us to quantify content performance, measure advertising effectiveness, help users discover new content, detect fraud, and track user journeys. Overall, this data enables us to better understand the user experience so we can continue improving our service for our users, advertisers, and content partners.
Our Big Data Applications Engineering team is seeking a highly motivated Software Engineer with a strong technical background who is passionate about designing and building systems to process data at scale, solving challenging problems in both batch and real-time data processing, and working across software and data disciplines to engineer solutions. Our tech stack includes AWS, Databricks, Airflow, and Spark, and our primary languages are Scala and Java.
Responsibilities
 Contribute to maintaining, updating, and expanding the existing Data Capture platform, including its Spark data pipelines, while maintaining strict uptime SLAs
 Extend functionality of current Data Capture platform offerings, including metadata parsing, extending the metastore API, and building new integrations with APIs both internal and external to the Data organization
 Implement the Lakehouse architecture, working with customers, partners, and stakeholders to shift towards a Lakehouse centric data platform
 Architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across the Data organization
 Collaborate with product managers, architects, and other engineers to drive the success of the Data Capture platform
 Lead the development and documentation of internal and external standards and best practices for pipeline configurations, naming conventions, partitioning strategies, and more
 Ensure high operational efficiency and quality of the Data Capture platform datasets so that our solutions meet SLAs and demonstrate reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
 Be an active participant in and advocate for agile/scrum ceremonies, collaborating to improve processes for our team
 Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
 Maintain detailed documentation of your work and changes to support data quality and data governance requirements
 Provide mentorship and guidance for team members; evangelize the platform, best practices, and data-driven decisions; identify new use cases and features and drive adoption
 Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Scala, Python
Basic Qualifications
 7+ years of software engineering experience developing backend applications
 2+ years of data engineering experience developing large data pipelines
 Strong algorithmic problem-solving expertise
 Strong fundamental Scala and Python programming skills
 Basic understanding of AWS or other cloud provider resources (e.g., S3)
 Strong SQL skills and ability to create queries to analyze complex datasets
 Hands-on production environment experience with distributed processing systems such as Spark
 Hands-on production experience with orchestration systems such as Airflow
 Some scripting language experience
 Willingness and ability to learn and pick up new skillsets
 Self-starting problem solver with an eye for detail and excellent analytical and communication skills
Preferred Qualifications
 Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Redshift, BigQuery)
 Experience in developing APIs with GraphQL
 Deep understanding of AWS or other cloud providers, as well as infrastructure as code
 Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
 Familiar with Scrum and Agile methodologies
 Master's Degree a plus
Required Education
 Bachelor's Degree in Computer Science, Information Systems, or a related field, or equivalent industry experience
The hiring range for this position in Seattle is $156,300 - $209,600 per year and in Santa Monica is $149,300 - $200,200 per year. The base pay actually offered will take into account internal equity and also may vary depending on the candidate's geographic region, job-related knowledge, skills, and experience among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.
Job Posting Segment:
Product & Data Engineering
Job Posting Primary Business:
PDE - Data Platform Engineering
Primary Job Posting Category:
Software Engineer
Employment Type:
Full time
Primary City, State, Region, Postal Code:
Seattle, WA, USA
Alternate City, State, Region, Postal Code:
Date Posted:
2024-07-22