Data Engineer - 12m Contract
AbbVie
- Mascot, NSW
- Fixed-term contract (12 months)
- Full-time
- Build advanced data pipelines using Apache Spark, Python, and SQL (preparing, mapping, joining, and stitching ingested datasets).
- Build KPIs and connect them to dashboards and other front-end applications using TypeScript.
- Develop advanced data visualisations and applications using open-source libraries and languages such as CSS, HTML, JavaScript, and SQL.
- Define and ensure the data quality of ingested datasets by implementing health and content checks.
- Convert and optimise existing SQL queries and R applications.
- Ensure appropriate security and compliance policies are followed for information access and dissemination.
- Define and apply information quality and consistency throughout the data processing lifecycle.
- Enforce and expand the use of AbbVie Common Data Model and industry standard information descriptions (ontologies, taxonomies, dictionaries, glossaries, etc.).
- Manage the information portal and its customer-facing resources (data catalogue, etc.).
- Experience in several data processing roles such as database developer/administrator, ETL developer, data analyst, BI analytics developer, and/or solution developer of contextual search applications.
- Understanding of distributed data processing frameworks such as Hadoop/Spark.
- Knowledge of programming languages such as Python, JavaScript, and SQL to build data pipelines and visualisations.
- Previous experience putting ML models into production is preferred.
- Understanding of the pharmaceutical industry is preferred.
- Ability to think and plan strategically (work plans, activities, timetables, targeting) and operate with execution excellence.
- Must be at ease with technology (the use of various tools/systems to perform day-to-day tasks).
- Previous experience with ETL/ELT/orchestration tools (SSIS, Azure Data Factory, Talend, Airflow, Informatica, etc.).
- Previous experience with Data Visualisation tools (Microsoft Power BI, Tableau, IBM Cognos etc.).
- Comprehension of Palantir Foundry platform or other big data management systems (Databricks, Snowflake, etc.).
- Experience with cloud computing services (AWS, Azure, or GCP).
- Skilled with Microsoft Office Suite (Outlook, PowerPoint, Excel, Word, etc.).
- English language proficiency (oral and written).