Job Description
Purpose and Mission
- The aim of the IT organization is to grow with the emerging and promising technology capabilities being enabled in the company, to stay innovative by keeping one step ahead of current trends, and to use internal skills to introduce new technologies that give a competitive advantage. The Enterprise Cloud Architect sits in Global IT Solution; the focus of this role is to provide consultancy, best practices, and implementation support, and to participate in Roche enterprise cloud projects.
Dimensions
- Delivering IT solution builds and managing projects and solutions on the enterprise cloud platform
- Working with multiple affiliates on architecture for cloud technology
Key competencies
- Strong background in cloud technology, especially AWS and the data analytics area.
- Project management exposure (e.g. PMI, PRINCE2) and agile methodologies (e.g. Scrum, Kanban, Lean).
- Ability to cooperate and communicate with people at different organizational levels and within multidisciplinary teams
Key responsibilities
- Hands-on experience implementing an organization-wide enterprise cloud environment, especially with AWS technology.
- Experience building enterprise-scale data warehouse and data lake solutions end-to-end.
- Knowledgeable about a variety of strategies for ingesting, modeling, processing, and persisting data.
- Experience with native AWS technologies for data and analytics such as Redshift Spectrum, Athena, S3, Lambda, Glue, EMR, Kinesis, SNS, CloudWatch, Logstash etc.
- AWS Solutions Architect certification is a big plus
- Be a matrix leader, trainer, and lead architect for a team to:
  - Implement and support reporting and analytics infrastructure for internal business customers using AWS services such as Athena, Redshift, S3, RDS, Redshift Spectrum, EMR, and QuickSight.
  - Develop and maintain data security and permissions solutions for enterprise-scale data warehouse and data lake implementations, including data encryption, database user access controls, and logging (a security configuration sketch follows this list).
  - Develop and optimize data warehouse and data lake tables using best practices for DDL, physical and logical tables, data partitioning, compression, and parallelization (a table DDL sketch also follows this list).
  - Develop and maintain data warehouse and data lake metadata, data catalogs, and user documentation for internal business customers.
  - Work with internal business customers and development teams to gather and document requirements for data publishing and data consumption via data warehouse, data lake, and analytics solutions.
- Be able to learn and assess other new technologies such as Snowflake and Alibaba Cloud.
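To give a flavor of the security and logging responsibility above, the sketch below shows one way default encryption and access logging could be applied to a data lake bucket with boto3. The bucket names and the KMS key alias are assumptions for illustration only, not part of any actual Roche environment.

```python
# Minimal sketch: enforce default KMS encryption and access logging on a data
# lake bucket. Bucket names and the KMS key alias are hypothetical placeholders.
import boto3

s3 = boto3.client("s3", region_name="ap-southeast-1")

# Default server-side encryption with a customer-managed KMS key.
s3.put_bucket_encryption(
    Bucket="example-data-lake",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-data-lake-key",
                }
            }
        ]
    },
)

# Server access logging to a separate audit bucket for user-access review.
s3.put_bucket_logging(
    Bucket="example-data-lake",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "example-audit-logs",
            "TargetPrefix": "data-lake/",
        }
    },
)
```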
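Similarly, as an illustration of the table-design responsibility, here is a minimal sketch of the kind of partitioned, compressed data lake DDL an architect in this role might submit through Athena. The database, table, bucket, and result-location names are hypothetical placeholders.

```python
# Minimal sketch: create a partitioned, Snappy-compressed Parquet table in the
# data lake via Athena. All names (sales_lake, orders, example-data-lake) are
# hypothetical placeholders, not real Roche resources.
import boto3

athena = boto3.client("athena", region_name="ap-southeast-1")

ddl = """
CREATE EXTERNAL TABLE IF NOT EXISTS sales_lake.orders (
    order_id    string,
    customer_id string,
    amount      double
)
PARTITIONED BY (order_date string)   -- partition pruning limits data scanned
STORED AS PARQUET                    -- columnar format, processed in parallel
LOCATION 's3://example-data-lake/orders/'
TBLPROPERTIES ('parquet.compression' = 'SNAPPY')
"""

# Submit the DDL; Athena writes query metadata/results to the given S3 location.
response = athena.start_query_execution(
    QueryString=ddl,
    QueryExecutionContext={"Database": "sales_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/ddl/"},
)
print("Athena query execution id:", response["QueryExecutionId"])
```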
Mandatory skills
- Bachelor’s degree in Computer Science, Info Systems, Business, or a related field.
- 3-5 years of experience with AWS cloud technology, with hands-on experience implementing enterprise cloud architectures and solutions.
- 3+ years of experience with distributed system concepts from a data storage and compute perspective (e.g. data lake architectures) in a cloud ecosystem
- 3+ years of experience with data warehouses, data lakes, and big data processing systems
Job Category: IT - Computer Software
Job Type: Full Time
Job Location: Kuala Lumpur
Salary: Negotiable
Career Level: Senior Executive
Years of Experience: Above 5 years
Qualification: Bachelor's Degree, Diploma, Master's Degree