Cardinal Health
Senior Consultant, Advanced Analytics (Sr. Cloud Data Engineer- AAA)
At a glance
Location: US-OH-Dublin
Posted: 11/08/2019
Closing: 12/07/2019
Degree: Not Specified
Type: Full-Time
Experience: Not Specified
Job description

What Advanced Analytics contributes to Cardinal Health


Advanced Analytics is responsible for applying quantitative methodologies, techniques, and tools to develop best-in-class analytic solutions that solve complex business problems.

  • Works with stakeholders to identify business opportunities, goals, or objectives
  • Determines whether objectives can be met using an analytical approach and develops hypotheses
  • Identifies key data requirements and acquires data
  • Harmonizes, rescales, and cleans data for statistical techniques
  • Applies data visualization techniques to evaluate data for model specification
  • Builds custom quantitative models by applying advanced statistical methods such as causal and predictive modeling, forecasting, data mining, simulation, and/or optimization
  • Demonstrates the ability to validate and test models to ensure adequacy and to determine the need for reformulation (a minimal validation sketch follows this list)
  • Interprets results of quantitative models, identifies trends and issues, and develops alternatives to support business objectives
  • Demonstrates the ability to communicate complex information clearly and concisely to a variety of audiences through a variety of media
  • Partners with stakeholders and technologists to implement, automate, and operationalize models into day-to-day business decision-making
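
The following is a rough, purely illustrative sketch of the build-and-validate loop described above; the synthetic data, scikit-learn library, logistic regression model, and AUC metric are illustrative assumptions rather than details taken from this posting:

```python
# Illustrative only: a hypothetical predictive-modeling workflow with a
# hold-out validation step, using synthetic data and scikit-learn.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for harmonized, cleaned business data.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)

# Hold out a test set so model adequacy can be checked before operationalizing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)

# Validate: if performance is inadequate, the model is reformulated.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```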

Accountabilities


Cardinal Health EIT has identified “Intelligent Automation” as a key strategic initiative and is investing in a broader, enterprise-wide transformational program (AAA) to align action with the company’s vision and objectives. We are seeking experienced Cloud Data Engineers to join a newly formed AAA team. High-level team responsibilities will include:

  • Process analysis / opportunity assessment, and creation of a roadmap for execution
  • Build the opportunity pipeline and manage the opportunity lifecycle
  • Idea incubation, design thinking, and conducting POCs/POVs
  • Planning and delivery execution of enterprise-grade intelligent and ML-based solutions
  • Enabling advanced analytics, automation, and AI capabilities for enterprise-wide reuse (playbooks, reference architectures, products, BotStore, etc.)
  • Developing and fostering a culture of “Create”, “Consult”, and “Cultivate”
  • Support engineering requirements of legacy/on-prem enterprise applications and develop cloud migration solutions.
  • Troubleshoot and problem-solve to remediate issues blocking migration or uncovered during migration.
  • Support the full project lifecycle - discovery, analysis, architecture, design, documentation, building, migration, automation, and production-readiness.
  • Be collaborative: share information, best practices, and experiences with others, and be willing to embrace new and innovative ideas.
  • Proven hands-on software development experience with open-source technologies: Java, MySQL, Maven, Git, Jenkins, JUnit, Tomcat.
  • Ability to architect a highly available, distributed, and secure system on a cloud platform
  • Analyze, design and develop tests and test-automation suites.
  • Design and develop a processing platform using various configuration management technologies.
  • Provide ongoing maintenance, support and enhancements in existing systems and platforms.
  • Collaborate cross-functionally with data scientists, business users, project managers and other engineers to achieve elegant solutions.
  • Proficient with SQL
  • Develops large-scale data structures and pipelines to organize, collect, and standardize data that helps generate insights and addresses reporting needs (a minimal pipeline sketch follows this list).
  • Collaborates with data science team to transform data and integrate algorithms and models into automated processes.
  • Builds data marts and data models to support Data Science and other internal customers.
  • Integrates data from a variety of sources, assuring that they adhere to data quality and accessibility standards.
  • Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions.
  • Experiments with available tools and advises on new tools in order to determine optimal solution given the requirements dictated by the model/use case.
  • Defines and approves data engineering design patterns for general reuse across multiple implementations.
  • Help design data models and perform associated data engineering activities to meet business needs.
  • Proactively manage technical debt incurred during software implementations by identifying opportunities for enhancement (debt repayment), even under tight deadlines.
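
The following is a minimal, hypothetical sketch of the kind of pipeline step described above; the file names, column names, and the pandas-based approach are assumptions for illustration only:

```python
# Illustrative only: a hypothetical step that collects data from two sources,
# standardizes column names, removes duplicates, and writes a curated output
# that downstream data marts or data-science work could consume.
import pandas as pd

# Hypothetical source extracts; real pipelines would read from databases,
# object storage, or APIs rather than local CSV files.
orders = pd.read_csv("orders_extract.csv")
customers = pd.read_csv("customers_extract.csv")

# Standardize column names so the two sources can be joined consistently.
orders = orders.rename(columns={"cust_id": "customer_id"})

# Integrate the sources and enforce a simple data-quality rule (no duplicates).
curated = (
    orders.merge(customers, on="customer_id", how="left")
          .drop_duplicates()
)

# Persist in a columnar format for analytics and reporting consumers.
curated.to_parquet("curated_orders.parquet", index=False)
```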


Qualifications

  • Experience developing standards in partnership with Engineering, Infrastructure Service, and Application Development to select appropriate technical solutions.
  • Experience within healthcare industry and healthcare data
  • Experience with multi-threaded, Big Data, and distributed cloud architectures and frameworks, including using Hadoop, MapReduce, Cloudera, Hive, Spark, and Elasticsearch to conduct Big Data analytics (a minimal Spark sketch follows this list)
  • Experience with Extract, Transform, and Load (ETL) processes, including document parsing techniques and managing large data sets, such as multi-TB scale deployed environments while adhering to service-level agreements
  • Experience writing well-crafted, high-quality, self-documented, pragmatic code
  • 3+ years of work experience with ETL, data modeling, and business intelligence big data architectures preferred
  • Experience developing and managing data warehouses on a terabyte or petabyte scale.
  • Strong experience with massively parallel processing (MPP) and columnar databases.
  • Experience with Python and shell scripting.
  • Deep understanding of advanced data warehousing concepts and track record of applying these concepts on the job.
  • 3+ years of experience with API development in various open-source technologies (Spark, Hadoop, R, Apache Beam, Kafka) preferred
  • 3+ years of experience with an open-source development toolchain (SVN, Git, Jenkins, etc.)
  • Experience with production database management and optimization at scale
  • Experience with user access, authentication, user permission management and security, LDAP, AD, Kerberos
  • Experience with tools such as Jenkins, Artifactory, etc. to build automation, CI/CD, and self-service pipelines.
  • Experience with RESTful services, service-oriented architecture, distributed systems, cloud systems (AWS), and microservices.
  • Experience working with enterprise or cloud messaging systems such as Kafka, Pub/Sub, and AWS Kinesis
  • Experience with many GCP technologies: Compute Engine, autoscaling, load balancing, container services, object storage, VPC, data pipeline development, and BigQuery (BQ).
  • Experience with Immutable Infrastructure and Infrastructure as Code patterns and technologies: Docker, Kubernetes, Terraform, Ansible, Packer, Vagrant.
  • Cloud/Data Engineering Certifications (preferred)
  • Experience with Machine Learning (preferred)
  • Broad experience in developing cloud migration solutions in either AWS or GCP (GCP preferred)
  • Experience developing ARM templates, CloudFormation templates or Terraform (preferred)
  • Experience working in a DevOps environment (preferred)
  • Expert in Agile development techniques
  • Extensive Data Analytics experience
  • Extensive experience building data and/or analytics solutions in AWS or GCP
  • Experience in Intelligent Automation Technologies (RPA, ChatBots, AI, ML) is a plus
  • Able to co-locate with the development team or business partners/SMEs
  • May require domestic travel
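
The following is a minimal, hypothetical sketch of the kind of Spark-based processing referenced above; the paths, schema, and aggregation are assumptions for illustration only:

```python
# Illustrative only: a hypothetical PySpark job that reads raw event data,
# aggregates it, and writes a partitioned columnar output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Hypothetical input location; a production job would read from a data lake
# (e.g., GCS or S3) rather than a local path.
events = spark.read.json("raw_events/")

# Aggregate events per customer per day.
daily_counts = (
    events.groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.count("*").alias("event_count"))
)

# Write partitioned Parquet for downstream analytics and BI tools.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet("curated/daily_counts/")

spark.stop()
```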

Track/Level: P4 (individual contributor role)

  • Develops technical solutions to a wide range of difficult problems.
  • Will assist in testing and logging/tracking defects as needed
  • Will assist in supporting and documenting requested enhancements to applications as needed

What is expected of you and others at this level

  • Applies advanced knowledge and understanding of concepts, principles, and technical capabilities to manage a wide variety of projects
  • Participates in the development of policies and procedures to achieve specific goals
  • Recommends new practices, processes, metrics, or models
  • Works on or may lead complex projects of large scope
  • Projects may have significant and long-term impact
  • Provides solutions which may set precedent
  • Independently determines method for completion of new projects
  • Receives guidance on overall project objectives
  • Acts as a mentor to less experienced colleagues

Cardinal Health is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
