Offered Salary: 1000
Experience: 8+ years
Qualifications: Bachelor's degree
Join a team recognized for leadership, innovation and diversity
As an Equal Opportunities Employer, Honeywell is committed to a diverse workforce environment. Applicants with relevant qualifications will be considered without regard to age, race, creed, color, national origin, ancestry, marital status, affectional or sexual orientation, gender identity or expression, disability, nationality, sex, religion, or veteran status.
Description/Brief Summary of what role entails (Job Purpose)
The Sr. Advanced Data Engineer is responsible for data infrastructure and architecture. Works as an individual contributor, facilitates solution architecture design, uses data and business principles to automate data flows, detects business exceptions, builds diagnostic capabilities, and improves both the business and data knowledge base. Acts as a mentor to other team members.
This role requires hands-on experience working with Big Data infrastructure, and drives the design, building, and launch of new data models and data pipelines in production.
The role works closely with ISC Global Analytics team members as well as the Enterprise IT team to align practices and priorities and to support the transition of analytics to FORGE, the Big Data platform.
Key Tasks and Responsibilities
- Connect with business users and identify opportunities to drive business value through analytics solutions.
- As a Big Data Architect, design the solutions required to automate administration, data capture, processing, and data modelling to meet the business's analytics needs.
- Own Big Data storage, processing, and analytics use-case development, and design analytics models against non-functional requirements (performance, availability, scalability, and integrity). The resulting design may run on multiple platforms and be composed of multiple software packages and custom components.
- Define standard methodologies for the critical evaluation, selection, and/or development of the software components and hardware requirements of the applications and data, including the evaluation and selection of development methods, processes, and tools.
- Drive continuous improvement and innovation.
- Work with multi-functional teams effectively.
- Coach and mentor Data Engineers / Specialists, leading by example as an individual contributor.
- Work with Honeywell businesses and Enterprise IT teams to clearly outline how enterprise platforms can enable business growth & efficiency by seamlessly combining structured and unstructured data into a single, self-service analytical environment.
- Use the know-how and domain knowledge of Honeywell specialists to define, craft, and develop intelligent solutions that operate on large data sets related to the Supply Chain.
Qualifications and Experience
- Bachelor's degree in Engineering / Technology or equivalent
- Specialization in Computer Science, Software Engineering / Information Technology, or a related field
- Certifications such as Microsoft Certified Solutions Associate (MCSA), Azure Data Solution, AWS Solution Architect, or Snowflake SnowPro® are desirable
- Python / R-Programming / Visualization exposure is helpful
- 10+ years of software development, with hands-on experience crafting technical solutions in the cloud (Snowflake EDW / Informatica DEI / HVR)
- 6+ years of proven experience in design and development on SQL Server
- 4+ years of proven experience in architecture definition and requirement development.
- 2+ years of experience with visualization software (Tableau, Power BI, Spotfire, Qlikview, d3.js) is helpful
- Experience in working on Data Ingestion, Transformation and Modelling tools like Informatica DEI, Airflow, Paxata
- Very good understanding of data integration from various systems into the EDW platform (ADLS & Snowflake landing schema) using HVR and Informatica DEI
- Working knowledge of cloud / Big Data platforms such as Azure or AWS
- Proven hands-on experience with database governance, including but not limited to SQL, Oracle, MySQL, and open-source technologies
- Strong release-management skills, including Continuous Integration / Continuous Delivery (CI/CD)
- Expertise in the design, development, deployment, and operation of business intelligence (BI) solutions
- Strong written and verbal communications skills. Ability to persuade, inform, and influence others based on technology solutions.
- Ability to quickly adapt to new technologies, tools, and techniques
- Strong business acumen, with ability to draw clear connections between data modeling activities and business challenges/opportunities
- Standout colleague, open to feedback, and flexible and adaptable to changing needs
- Mentors and develops the technical skills of Analysts and Specialists
- Experience with Agile and DevOps methodologies
- Experience with columnar databases such as Enterprise HANA, and with predictive modeling using R or Python
- Experience migrating from on-premises to cloud, e.g. moving from on-premises Oracle to AWS Oracle RDS with little or no downtime
- Developing IoT connectivity solutions using Azure Event Hubs, Apache Kafka, and Node.js
- Master Data Management (MDM)