MTI TECHNOLOGY
Overall rating: 3.7
- Salary & benefits: 3.6
- Job satisfaction: 3.6
- Company culture: 3.6
- Skill-development environment: 4.0
- Work-life balance: 3.6
- Promotion & evaluation: 3.4
- Job stability: 4.1
Data Engineer jobs at MTI TECHNOLOGY in Vietnam - 1 job
Job title: [HCM] Data Engineer
Location: HCM

Requirements

Data Pipeline Development:
- Design, develop, and maintain ETL (Extract, Transform, Load) pipelines to collect, clean, and transform raw data from various sources into structured, usable formats.
- Implement and optimize data integration processes to ensure data quality and accuracy.
Data Warehousing:
- Manage and optimize data storage solutions, such as data warehouses, data lakes, and NoSQL databases.
- Ensure data is organized efficiently for easy access and retrieval by end-users.
Data Transformation and Modeling:
- Create data models and schemas that support business requirements and data analytics.
- Transform and aggregate data to meet specific reporting and analysis needs.
Data Integration:
- Collaborate with cross-functional teams to integrate data sources from various departments and systems.
- Establish data governance and quality standards to maintain data consistency.
Performance Optimization:
- Monitor and fine-tune data pipelines and systems to enhance performance, scalability, and reliability.
- Identify and resolve data-related issues and bottlenecks.
Data Security and Compliance:
- Implement data security measures to protect sensitive information and ensure compliance with relevant regulations (e.g., GDPR, HIPAA).
- Develop and enforce data access controls.
Documentation:
- Maintain comprehensive documentation for data pipelines, data models, and data architecture.
- Train and educate team members on data engineering best practices.
Collaboration:
- Work closely with data analysts, data scientists, and business stakeholders to understand their data needs and provide data solutions.
- Collaborate with IT and DevOps teams to ensure seamless data operations.
Who we are looking for:
- Fluent in English.
- Proven experience in data engineering, ETL development, and data pipeline management.
- Proficiency in programming languages such as Python.
- Strong SQL and database management skills.
- Familiarity with data warehousing and big data technologies (e.g., Google BigQuery, Google DataFusion, Airflow).
- Understanding of cloud computing platforms (e.g., GCP) and containerization (e.g., Docker, Kubernetes).
- Understanding of legal frameworks concerning data protection.
- A careful mindset and an understanding of how to maintain the quality of large amounts of data.
Nice-to-Have Skills:
- Understanding what data is valuable to the company, specifically to the different products and projects
- Familiarity with AI
- Familiarity with the logistics domain
We offer you:
- 05 working days/week (Monday to Friday), with flexible working hours
- 2 days of remote work (WFH) per week (based on the team's decision)
- Lunch + gasoline + coffee allowance
- Health, Social, and Unemployment Insurance (based on gross salary, according to the Labor Code) and PVI Health Insurance
- 13th-month salary and performance bonus
- Annual salary review
- 12 days of annual leave plus an extra 02 days of company leave
- Company trips, sponsored team building, monthly Happy Hour, sports clubs (soccer, badminton, ping-pong, yoga), and other joyful events
- A culture of relentless learning with free courses in specialized skills, soft skills, and English
- Yearly health checkup
- Seniority benefits: allowance & PVI Health Insurance for family members
- Technical-certificate bonus
- Japanese-certificate bonus
- Employee referral incentive