About the position
Exusia, a cutting-edge digital transformation consultancy, seeks a Google Cloud Platform Architect/Developer to join the Data Engineering & Analytics practice of our global delivery team in India.

What's the role? The GCP Senior Data Engineer will be part of our AI, Data & Cloud practice, focused on delivering best-of-breed AI, data, and analytics solutions built on Google Cloud Platform for Exusia's clients. The candidate will also contribute to competency development and drive innovation through internal initiatives spanning the three major cloud platforms: GCP, AWS, and Azure. Candidates should have more than 5 years of cloud experience, including at least 3 years on GCP architecting, designing, and building data lakes, data warehouses, data pipelines, and AI & ML solutions using GCP services.
Responsibilities
• Work on client projects to architect, build, and deploy robust, scalable ETL processes, data pipelines, and data analytics solutions
• Provide hands-on technical leadership across multiple projects as required
• Take complete ownership of the SDLC and all technical deliverables
• Design data pipelines to ingest & integrate structured/unstructured data from heterogeneous sources
• Collaborate with delivery leadership to deliver projects on time while adhering to quality standards
• Contribute to the growth of the Cloud practice by helping with solutioning for prospects
• Provide mentorship to junior team members
Requirements
• Should have 5+ years of experience working with various Cloud platforms
• Minimum 3 years of experience with GCP general architecture, including at least 2 end-to-end projects
• Knowledge of infrastructure and application migrations from on-premises environments to GCP
• Hands-on experience on GCP projects in data engineering, business intelligence, or machine learning is mandatory
• Ability to lead technical solution discussions on cloud infrastructure and architecture
• Experience designing and setting up new GCP cloud environments, including sizing and managing deployments
• Must have delivered projects using ETL / data pipeline and orchestration tools: Cloud Composer, Cloud Data Fusion, Dataflow, and Dataproc
• Must have exposure to data analytics and querying services such as BigQuery and Datalab
• Must have hands-on experience with Bigtable, Datastore, Firestore, and Memorystore
• Should have exposure to the following general GCP services: Storage – Cloud Storage, Persistent Disk, and Filestore; Compute – App Engine, Compute Engine, and VMware Engine; Developer & productivity tools – Artifact Registry, Container Registry, and Cloud Source Repositories
• Strong problem-solving skills along with good interpersonal and communication skills
Nice-to-haves
• Exposure to GCP AI & ML-related services is an advantage
• Exposure to network services such as Cloud CDN, Cloud DNS, Cloud IDS, Cloud NAT, Cloud Interconnect, Cloud VPN, and VPC
• Understanding of comparative offerings from AWS and Azure will be a big plus