Roles & Responsibilities:
- Work closely with the client to understand the business use case and build the data architecture and data pipelines using scalable, highly available, and fault-tolerant big data services
- Provide thought leadership in all phases of a project from discovery and planning through implementation and delivery
- Responsible for planning, organizing, and overseeing the completion of the data workstream project tasks while ensuring the project is on time, on budget, and within scope
- Accountable for technical leadership: ensuring a sound, future-proof architecture is planned and the implementation meets technical quality standards
- Lead design, creation, deployment, and review, and obtain final sign-off from the client, following SDLC / Agile best practices
- Analyze the latest big data technologies and their innovative applications, and bring these insights and best practices to the team
- Should have excellent communication and presentation skills
- Experience in leading and managing teams for customer implementations
- Provide thought leadership and mentoring to the data engineering team on how to store and process data more efficiently and quickly at scale
- Good understanding of data engineering challenges and proven experience with data platform engineering (batch and streaming, ingestion, storage, processing, data management, integration, consumption, scheduling, automation, quality control, migration, deployment) using various cloud technologies in AWS/Azure/GCP
- Work with Functional Analysts to understand the core functionalities of the solution and build data models and design documents
- Ensure adherence with Security and Compliance policies for the products
- Stay up to date with evolving cloud technologies and development best practices, including open-source
- Proven problem-solving skills with the ability to anticipate roadblocks, diagnose problems and generate effective solutions
- Identify and manage risks / issues related to deliverables and arrive at mitigation plans to resolve the issues and risks
- Distribute workload across sub-teams per business priority, ensuring an even and well-rounded distribution of work for all analysts
- Work in an Agile Environment and provide optimized solutions to the customers using JIRA or similar tools for project management
- Strong experience in client-facing roles, along with expertise in implementation planning, resource utilization tracking, project plan tracking, periodic project status reports, and client presentations
Technical Skills:
- Excellent communication and problem-solving skills
- Experience managing data and analytics projects
- Proficiency in team management and project management
- Strong experience in client-facing roles
- Highly proficient in Project Management principles, methods, techniques, and tools
- Should have a strong understanding of big data solutions and analytical techniques
- Hands-on experience with ETL processes and performance optimization techniques is a must
- Candidate should have participated in architecture design and discussions
- Ability to set clear goals and objectives and write performance reviews for team members
- Minimum of 7 years of experience working with batch-processing / real-time systems
- Minimum of 7 years of experience working on Data Warehouse or Data Lake projects in a role beyond just Data
- Minimum of 4 years of hands-on experience building scalable solutions on AWS; an equivalent level of experience on Azure or Google Cloud is also acceptable
- Minimum of 4 years of experience in programming languages (preferably Python)
- Experience in the Pharma domain will be a big plus
- Familiar with tools like Git, CodeCommit, Jenkins, and CodePipeline
- Familiar with Unix/Linux and Shell Scripting