Veri Bilimci / Senior Data Engineer (GCP)

  • Teknokent Arı-3
  • 4 months ago
  • -
  • Full Time
  • Urgent

Job Description

Get ready to take your place at n11, an open marketplace platform that has made valuable contributions to the e-commerce sector since its establishment by bringing more than 330 thousand registered business partners to customers. We are looking for a "Senior Data Engineer" to join our team in the Data & Insights Department. Data is at the very heart of n11, and as such, data-related positions play key roles in our strategic initiatives. The n11 Data & Insights team is looking for individuals with a background in full-lifecycle, complex data-implementation projects, such as growing our Data Lake on GCP or building and improving data processing systems and machine learning / deep learning models. The ideal candidates should have experience in setting up and managing cloud (i.e. GCP) and on-premise infrastructures, and should be able to translate business needs into data architecture solutions and then drive implementation of those solutions in production environments. The Data Engineer will support our teams on data initiatives, will ensure an optimal data delivery architecture is applied consistently across ongoing projects, and will help shape the technical vision and next-generation direction for the Data & Insights team's products.
What you'll do

  • Analyze, design, implement, test, and document all new or modified AI/ML/DL systems, models, and applications
  • Integrate data management technologies and software engineering tools into existing structures
  • Build and manage data jobs across various data platforms (on-prem RDBMS to cloud and vice versa)
  • Implement data orchestration pipelines and data sourcing, cleansing, augmentation, and quality control processes
  • Deploy data pipelines on GCP
  • Apply DevOps and DataOps practices, including "infrastructure as code"
  • Leverage the latest technologies to deliver better insights more quickly and cost-effectively

Who you are

  • Academic degree in Data Engineering, Computer Engineering, Computer Science, Software Engineering, Applied Mathematics, or a similar field
  • Minimum of 5 years of work experience
  • 2+ years of professional development experience with the GCP data stack
  • Expertise in architecting, developing, and managing real-time data pipelines
  • Expertise in deploying machine learning models in production
  • Experience with stream data pipeline frameworks or solutions
  • Experience with message queuing, stream processing, and highly scalable "big data" data stores
  • Experience with Kafka, Pub/Sub, or other event-based systems
  • Experience working in cloud environments and with containerization frameworks, tools, and platforms (e.g. Docker, Kubernetes, GKE)
  • Expertise in building data products incrementally and in integrating and managing datasets on GCP from multiple sources
  • Expertise in developing ETL/ELT workflows with Python or Scala against on-premise and cloud data sources and external systems
  • Expertise in ML/DL frameworks (e.g. PyTorch, TensorFlow, Keras)
  • Expertise in database scripting languages
  • Expertise in columnar, distributed, and row-based databases
  • Intellectual curiosity and the ability to handle multiple projects and challenging deadlines
  • Strong analytical, interpersonal, and communication skills
  • Ability to communicate in English
At n11, we care about the security of your personal data. Please find the Personal Data Protection Information Notice at the link below.