GCP Data Engineer | Digisourced

Duration: 6 months + extensions

Location: Remote within Spain

Pay Rate: Negotiable

Language Requirement: Spanish speaking

Role Overview

As a GCP Data Engineer, you will design, build, and manage the end-to-end data architecture on Google Cloud Platform (GCP). You’ll collaborate with data scientists, analysts, and other engineers to ensure data is consistently accessible, clean, and ready for real-time or batch processing. Your responsibilities will span data ingestion, storage, and transformation, as well as the scalability of our GCP infrastructure.

Key Responsibilities

  • Design and implement ETL/ELT pipelines using GCP services such as BigQuery, Cloud Storage (GCS), Dataflow, and Cloud Functions.
  • Develop and manage data pipelines to ingest, process, and store structured and unstructured data from various sources.
  • Optimize and scale pipelines to handle large volumes of data efficiently, ensuring low-latency, high-performance data processing.
  • Collaborate with cross-functional teams to define data models and ensure data readiness for analytics and reporting.
  • Monitor, troubleshoot, and improve data infrastructure using GCP tools such as Cloud Monitoring and Cloud Logging (formerly Stackdriver).
  • Implement data governance, ensuring data quality, integrity, and compliance with industry standards.
  • Integrate GCP services with other third-party or on-premises systems to create seamless data workflows.
  • Participate in cloud migration projects, assisting in moving existing data infrastructures to GCP.
  • Ensure security best practices for managing and accessing data on GCP.

Qualifications

  • 3+ years of experience as a Data Engineer, with a focus on Google Cloud Platform (GCP).
  • Strong experience with GCP services such as BigQuery, Cloud Storage, Cloud Functions, Dataflow, Pub/Sub, and Cloud Composer (managed Apache Airflow).
  • Proficiency in SQL for querying and managing large datasets in BigQuery or similar data warehouses.
  • Strong programming skills in Python or Java, with experience automating data pipelines.
  • Familiarity with data lakes and data warehousing concepts, including schema design and optimization.
  • Understanding of cloud security best practices and data governance on GCP.
  • Experience with ETL tools and frameworks for building scalable data pipelines.
  • Proven ability to work with CI/CD pipelines and version control systems like Git.

Consultant: Michael Whitbread

Ready to work with us? Let’s talk!