Responsibilities:
- Assist in the definition of release scope and objectives, involving all relevant stakeholders and ensuring technical feasibility.
- Drive integration with popular tools and services in the broader data orchestration ecosystem.
- Ensure that releases are delivered on time, within scope, and within budget.
- Measure and track project performance using appropriate tools and techniques.
- Manage the relationship with the client and all stakeholders.
Experience:
- 8-10 years of experience managing product strategies and roadmaps.
- Must have technical expertise in Databricks.
- In-depth knowledge of data processing techniques and handling large data sets.
- Proficient in PySpark, Spark SQL, and Spark Streaming for processing data across various use cases, building an optimized storage layer, and using scheduling features for job automation.
- Solid understanding of cloud infrastructure (AWS, Azure).
- Working knowledge of Apache Spark's execution model, query planning, and execution, plus familiarity with Big Data technologies.
- Strong data analysis and operationalization skills (SQL, rollups, building operational dashboards).
- Ownership of end-to-end product management processes.
- Ability to meet with client teams (business, technical, and quality) to take detailed briefs and clarify specific requirements.
- Knowledge of Agile, SAFe, and Waterfall software development methodologies.
- Highly proficient in product roadmap planning in alignment with business goals and objectives; able to maintain comprehensive product-related documentation.
- Track project deliveries, reporting and escalating to management as needed.
- Delegate project tasks based on individual strengths, skill sets and experience levels.
- Certified Scrum Product Owner (CSPO) or Certified Scrum Master (CSM) certification is a plus.