Data Engineer

Pikermi, East Attica, Greece | DEPK · IT Systems Administrator

About the role

 

Architect of Our Data-Driven Future

As our founding Data Engineer, you will do more than just build data pipelines; you will design and construct the central nervous system of our company. You are a systems thinker who sees data as the connective tissue that links our commercial, operational, and scientific functions. Your mission is to build a trusted, scalable, and secure data warehouse that will empower every department to move from intuition to insight.

You will be responsible for the entire data lifecycle—from ingestion and transformation to modeling and delivery—ensuring that our data is not just available, but is also clean, reliable, and structured to answer the most critical business questions. You will be the technical cornerstone of our data transformation.

 

Main Responsibilities

 

Architect & Build

  • Lead the design, development, and deployment of our new enterprise data warehouse on a modern cloud platform
  • Develop and implement robust ETL/ELT processes to ingest data from diverse source systems across the company
  • Create logical and physical data models that are optimized for performance and ease of use for analytics

 

Govern & Maintain

  • Establish and enforce data quality standards and frameworks to ensure the accuracy and integrity of our data assets
  • Monitor, troubleshoot, and optimize data pipelines and warehouse performance
  • Implement data security and access controls in partnership with the platform team

 

Collaborate & Enable

  • Work with business stakeholders and analysts to understand their data needs and translate them into technical specifications
  • Partner with Application Owners to ensure you have a clear understanding of source system data structures and APIs
  • Create clear documentation for data models, data dictionaries, and pipeline logic to enable self-service and team understanding

 

Your Core Expertise

 

Data Engineering Craftsmanship

This is your deep domain. You are a hands-on builder with demonstrable mastery in:

  • Data Warehousing: Architecting and implementing modern cloud data warehouses (e.g., Azure Synapse, Databricks) from the ground up
  • Data Modeling: Designing and implementing robust, scalable data models (e.g., star schemas, Data Vault) that serve analytics and reporting needs
  • ETL/ELT Pipeline Development: Building, orchestrating, and maintaining resilient data pipelines to extract data from a variety of source systems (ERPs, CRMs, LIMS, flat files)
  • Advanced SQL: Possessing deep, fluent expertise in SQL for complex querying, data manipulation, and performance tuning
  • Programming for Data: Strong proficiency in a language like Python and its data ecosystem (Pandas, PySpark, etc.) for data transformation and automation
  • Cloud Data Services: Hands-on experience with cloud data tools (e.g., Azure Data Factory, AWS Glue)

 

Your Broad Capabilities

 

The Data Synthesizer

This is what makes you a strategic partner, not just a technician:

  • Synthesis & Systems Thinking: You naturally see how data from different parts of the business (Sales, Supply Chain, Manufacturing) fits together to tell a larger story. You design data models that enable this holistic view, allowing the business to synthesize information, not just analyze it in silos
  • Business Acumen: You are deeply curious about the "why" behind the data. You seek to understand the business processes that generate the data and the business objectives that analytics will support
  • Collaboration: You are a key partner to the entire Digital & Technology team, working closely with Application Owners to understand source systems, Business Analysts to define data requirements, and BI Analysts who are the consumers of your work
  • Data Governance & Quality: You are a champion for data quality. You build frameworks for data cleansing, validation, and monitoring because you know that analytics is worthless if the underlying data isn't trusted
  • Compliance & Data Integrity: You have a strong appreciation for the principles of data integrity (ALCOA+) and the requirements of working in a GxP-regulated environment. You know how to build data systems that are secure, auditable, and compliant

 

Key Requirements

  • More than 3 years of experience as a Data Engineer, BI Developer, or in a similar data-focused role
  • Proven, hands-on experience building and managing data warehouses and complex ETL/ELT pipelines
  • Expert-level proficiency in SQL
  • Strong programming skills in Python for data manipulation
  • Experience with at least one major cloud platform (Azure, AWS, or GCP) and its data services
  • Bachelor's degree in Computer Science, Engineering, Mathematics, or a related quantitative field
  • Fluency in both Greek and English

 

Additional skills considered an advantage

  • Experience working in the pharmaceutical, life sciences, or another GxP-regulated industry is a significant advantage
  • Hands-on experience with modern data stack tools (e.g., Spark, Airflow)
  • Experience with data visualization tools (e.g., Power BI, Tableau) to understand the needs of your end-users
  • Cloud data engineering certifications (e.g., Azure Data Engineer Associate, AWS Certified Data Analytics – Specialty)

 

All applications will be treated with strict confidentiality and in accordance with applicable data protection regulations (GDPR).
