- Candidate must possess at least a Bachelor’s Degree in Engineering (Computer/Telecommunication) or equivalent.
- At least 2 years of working experience in the related field is required for this position.
- Preferably a Manager/Assistant Manager specializing in IT/Computer – Software or equivalent.
- Statistical Analysis, Machine Learning, Data Mining, Tableau, Hadoop, R, BigQuery, TensorFlow
- Minimum 2 years of experience delivering Hadoop solutions using the Hadoop ecosystem
- Minimum 4 years working in data analytics, data warehousing, or similar
- Experience with enterprise data modeling concepts, ETL, and advanced SQL
- Experience with programming languages such as Scala/Java in Spark, using DataFrames and RDDs
- Proven experience in designing, deploying, and monitoring Spark applications
- Demonstrated experience in the design, development, and implementation of large-scale data systems based on Hadoop (Cloudera preferred)
- Excellent pattern recognition and predictive modeling skills
- Experience with cloud data and machine learning services such as Azure, Amazon Web Services (AWS), and/or Google Cloud is a plus
- Experience with data visualization tools (e.g. Tableau, QlikView, Alteryx)
- Familiarity with Agile software development (Scrum is a plus)
- Excellent verbal and written communication skills
- Ability to collaborate on an agile team to solve problems
- Data modeling in RDBMSs (transactional, data warehousing) on MSSQL, MySQL, and Oracle, as well as Hadoop data modeling and design
- Create and manage frameworks for data management, processing, and analytics
- Develop and manage applications and frameworks within the Hadoop ecosystem (YARN, HDFS, Spark, Impala, Kafka, Kudu, etc.)
- HDFS file system design, including file formats and layout techniques for optimal performance
- Develop applications in Spark using Scala, Python, or Java to support analytics, predictive modeling, and ETL/ELT
- Develop, deploy, and monitor application and server performance in developed applications (YARN, MapReduce, Spark)
- Performance tuning of MapReduce, Spark, and other applications within Hadoop
- Data analytics and data visualization (Tableau, Power BI, Qlik, etc.)
- Design and development of applications utilizing data streaming (Spark, Kafka, Flume)
- Machine learning and AI skills in Scala and Python on Spark
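The Hadoop/Spark responsibilities above center on the map–shuffle–reduce pattern. As a rough illustration of what that pattern involves (a minimal pure-Python word-count sketch with no Hadoop or Spark dependency, not representative of the actual production stack):

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group all values by key, as Hadoop does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["spark and hadoop", "spark streaming with kafka"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
# counts["spark"] == 2
```

In Spark, the same pipeline collapses to a few RDD or DataFrame operations, with the framework handling the shuffle and distribution across the cluster.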
Please submit your comprehensive CV and resume to hrd(at)conexus.co.id within two (2) weeks of this posting. Only qualified candidates are invited to apply, and only successful candidates will be notified.
- Industry: Computer / IT
- Career Level: Staff
- Offered Salary: 10Jt - 15Jt
- Experience: Min 3 Years