Job Posting Title:
Lead Software Engineer, Big Data Infrastructure
Req ID:
10115534
Job Description:
On any given day at Disney Entertainment & ESPN Product & Technology (DEEP&T), we’re reimagining ways to create magical viewing experiences for the world’s most beloved stories while also transforming Disney’s media business for the future. Whether that’s evolving our streaming and digital products in new and immersive ways, powering worldwide advertising and distribution to maximize flexibility and efficiency, or delivering Disney’s unmatched entertainment and sports content, every day is a moment to make a difference to partners and to hundreds of millions of people around the world.
A few reasons why we think you’d love working for Disney Entertainment & ESPN Product & Technology:
Building the future of Disney’s media business: DE&E Technologists are designing and building the infrastructure that will power Disney’s media, advertising, and distribution businesses for years to come.
Reach & Scale: The products and platforms this group builds and operates delight millions of consumers every minute of every day – from Disney+ and Hulu, to ABC News and Entertainment, to ESPN and ESPN+, and much more.
Innovation: We develop and execute groundbreaking products and techniques that shape industry norms and enhance how audiences experience sports, entertainment & news.
The Big Data Infrastructure team manages big data services such as Hadoop, Spark, Flink, Presto, and Hive. Our services are distributed across data centers and the cloud, supporting massive data volumes on thousands of physical resources. We focus on the virtualization of big data environments, cost efficiency, resiliency, and performance.
The right person for this role has proven experience working with mission-critical infrastructure and enjoys the challenge of building and maintaining large-scale data systems with varied requirements and large storage capabilities. If you are someone who enjoys building large-scale big data infrastructure, then this is a great role for you.
Responsibilities:
Develop, scale, and improve in-house, cloud, and open-source Hadoop-related systems (e.g., Spark, Flink, Presto/Trino).
Investigate new big data technologies and apply them to the Disney Streaming production environment.
Build next-gen cloud-based big data infrastructure for batch and streaming data applications, and continuously improve performance, scalability, and availability.
Handle architectural and design considerations such as performance, scalability, reusability, and flexibility.
Advocate engineering best practices, including the use of design patterns, code review, and automated unit/functional testing.
Work with other engineering teams to influence big data system design and optimization.
Define and lead the adoption of best practices and processes.
Collaborate efficiently with Product Managers and other developers to build datastores as a service.
Collaborate with senior internal team members and external stakeholders to gather requirements and drive implementation.
Qualifications:
At least 7 years of professional programming and design experience
Experience with big data-related components (e.g., HDFS, HBase, YARN, Hive, Spark, Flink, Presto, Impala, Terraform, EKS, Spinnaker, IAM, EMR)
Experience in building in-house big data infrastructure.
Experience in developing and optimizing ETL and ad-hoc query engines (e.g., Spark, Flink, Hive, Presto/Trino, Greenplum)
Experience with CI/CD, fine-tuned metrics, and security and compliance enhancements on compute engines
Experience with the latest data formats (Iceberg, Delta, Hudi)
Experience in catalog and metadata management would be a plus
Experience in developing and optimizing Hadoop-related and containerized technologies would be a plus (e.g., HDFS, HBase, YARN, Kubernetes, Docker, RocksDB)
Demonstrated ability with cloud infrastructure technologies, including Terraform, K8S, IAM, ELB, Ranger, KMS, S3, Glue, etc.
Experience in managing a big data cluster with over 1000 nodes.
Bachelor’s degree in Computer Science, Information Systems, Software, Electrical or Electronics Engineering, or a comparable field of study, and/or equivalent work experience
#DISNEYTECH
The hiring range for this position in Santa Monica, California is $152,200 to $204,100 per year, in Seattle, Washington is $159,500 to $213,900 per year, and in San Francisco, California is $166,800 to $223,600 per year. The base pay actually offered will take into account internal equity and also may vary depending on the candidate’s geographic region, job-related knowledge, skills, and experience among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.
Job Posting Segment:
Commerce, Data & Identity
Job Posting Primary Business:
PDE – Data Platform Engineering
Primary Job Posting Category:
Software Engineer
Employment Type:
Full time
Primary City, State, Region, Postal Code:
Santa Monica, CA, USA
Alternate City, State, Region, Postal Code:
USA – CA – Market St, USA – WA – 925 4th Ave
Date Posted:
2025-03-10