Lead Data and Integration Engineer (REMOTE opportunity)

Arizona, North Carolina, Massachusetts, Texas, Florida, Colorado, Missouri

Remote position available

Job Description

Location: 100% Remote / Telecommute

Essential Functions


- Monitors database storage allocation and usage, as well as other resource usage, daily.
- Assists with backup standards, schedules, and recovery procedures.
- Develops, tests, and maintains a security plan intended to provide complete accountability for any and all use of the databases.
- Provides performance tuning related to indexing, stored procedures, triggers, and database/server configuration.
- Designs and builds data processing pipelines for structured and unstructured data using tools and frameworks in the Hadoop ecosystem.
- Implements and configures tools for Hadoop-based data lake implementations and proofs of concept.
- Applies solid software engineering practices with excellent analytical and troubleshooting skills.

Job Responsibilities
- Work closely with Analysts to develop and implement data transformations within data lake systems, both on-prem and in the AWS environment.
- Develop and/or consume web services for data integration and ingestion of source data.
- Create and maintain workflows with an emphasis on reusability, scalability, optimization and parameterization in a variety of platforms.
- Understand RDBMS and big data concepts and manage connectivity to various data sources and platforms.
- Develop detailed flow charts detailing data lineage across ingestion and transformation of data as needed.
- Analyze and document data from different platforms/products and determine appropriate transformations to standardize and potentially combine into a single destination.
- Document and develop pipelines that preserve a data dictionary and maintain/save appropriate metadata.
- Analyze data as needed in various data environments, including but not limited to Oracle, SQL Server, Hadoop / Hive, RedShift and more.

Target Skill Set:
Design and develop highly efficient and reliable data pipelines to move terabytes of data into the Data Lake and other landing zones. Use expert coding skills in Hive, HiveQL, T-SQL, PL/SQL, Spark, Python, and Lambda. Develop and implement data auditing strategies and processes to ensure data accuracy and integrity. Mentor and teach others. Solid Linux skills. At least 2-3 years of hands-on experience with AWS. Experience with a wide variety of tools, including Attunity products, DMS, and SnapLogic, is a plus.


This position acts as a liaison between IT Architects and IT Analysts to break a complex system down into smaller components, and coaches and leads a team of Data and Integration Engineers to design and develop those components. This role functions as the primary practitioner coach on the team, growing the capabilities of the other engineers. Drives new initiatives, conducts POCs, and evaluates other products for seamless integration. Is an expert at data integration with RDBMS, Big Data/Hadoop, Data Warehouse, and Data Lake concepts, and has relevant experience with various OS, network, and storage concepts. Performs data modeling and creates data architecture footprints that have operational integration capabilities. Must understand how to build advanced jobs using map-reduce technology. Able to identify and track key metrics produced by the application. Contributes to 24x7x365 on-call coverage.
  • Develops, tests, and maintains code using software development methodology and appropriate technologies for the system being used.
  • Works closely with Business Analysts to develop detailed systems designs and written test plans for online and report application programs.
  • Performs analysis on projects and provides a project plan that shows the tasks needing to be completed and a time estimate for each task.
  • Participates in design walkthroughs with appropriate focus groups and related users to verify accuracy of design in meeting business needs.
  • Prepares installation instructions and coordinates installation procedures.
  • Supports and troubleshoots application code problems.
  • Provides status reports that give a detailed description of the current project's progress and indicate time devoted to each task of the project.
  • Coordinates, guides and mentors programming efforts performed by in-house programmers or outside consultants to ensure that all programming is completed according to the project plan.

General Job Information

Title

Lead Data and Integration Engineer (REMOTE opportunity)

Grade

27

Job Family

Info Technology Group

Country

United States of America

FLSA Status

United States of America (Exempt)

Recruiting Start Date

4/2/2019

Date Requisition Created

3/21/2019

Minimum Qualifications

Education

Bachelors: Computer and Information Science

License and Certifications - Required

License and Certifications - Preferred

Other Job Requirements

Responsibilities

  • 5+ years related experience including a minimum of 2+ years designing, building and deploying software in Cloud.
  • Agile and Design Thinking (Coursera).
  • Critical thinker.
  • Demonstrated problem solving techniques.
  • Strong verbal and written communication skills.
  • AWS Certified.
  • Big Data/Hadoop Certified.
  • Some ETL/data movement certifications.
  • ServiceNow training.

Magellan Health Services is proud to be an Equal Opportunity Employer and a Tobacco-free workplace. EOE/M/F/Vet/Disabled. Every employee must understand, comply and attest to the security responsibilities and security controls unique to their position.
