Randstad ETL Developer in Tempe, Arizona

ETL Developer

job details:

  • location: Tempe, AZ

  • salary: $56.85 - $60.41 per hour

  • date posted: Friday, May 18, 2018

  • job type: Contract

  • industry: Finance and Insurance

  • reference: 621875

job description

Summary:

Client is seeking an ETL Developer to be responsible for managing data availability, integration, transformation, and delivery to the data mart.

Develops and implements comprehensive data integration processes, standardizes the presentation of information derived from multiple subledger transaction systems, and develops data sources necessary to support financial reporting needs.

Job Responsibilities:

  • Design, implement, automate and maintain large scale enterprise data ETL processes.

  • Extracts structured and unstructured data from source systems, transforms it to fit business performance management needs, and loads it into a data mart.

  • Ensures the ETL process is developed in a modular fashion, using shared folders and reusable objects, to enforce consistency and timely development.

  • Reviews ETL production design for performance optimization.

  • Collaborates with business functional leaders to analyze and evaluate business requirements; diagnoses design issues and proposes design alternatives and recommendations.

  • Plans and coordinates the implementation of new data requirements.

  • Oversees the data load production process and the implementation of new data load files in accordance with the department's change management process.

  • Resolves issues through troubleshooting and quick diagnoses.

  • Ensures data quality by designing and implementing adequate internal controls and monitoring processes.

  • Includes controls to identify individual ETL flows that have performance issues.

  • Ensures controls are maintained when designing new structures by utilizing object naming conventions to encourage consistent development and ease of migration from development to test to production.

  • Works closely with Information Technology staff through the phases of coding and implementation of external data source files providing clarifications as necessary.

  • Maintains effective ongoing communications with the business and IT community and operational management regarding system plans, projects, budgets, expenditures and policies.

  • Participates in the research and resolution of enterprise data governance issues.

Qualifications:

  • Bachelor's degree in a technical field such as computer science, computer engineering or related field required

  • 10+ years of experience required

  • Experience in providing technical support covering database administration and operations, including monitoring, maintenance, capacity and performance monitoring/planning, and incident/problem troubleshooting and resolution

  • Strong knowledge of SQL.

  • Expert knowledge of relational databases, flat files and non-relational database structures.

  • Expert knowledge of Help Desk tools and trouble ticket tracking processes.

  • Expert knowledge of support agreements and their implied service levels.

  • Proven ability to write excellent technical documentation.

  • Perform full software development life cycle (SDLC) work, including analysis, design, development, enhancement, testing, maintenance and technical support for this project.

  • Evaluate the integration of various data sources such as Oracle, SQL Server, VSAM and sequential files into the staging area; recommend and/or implement efficiencies.

  • Perform technical walk-throughs as needed to communicate the designed/coded solution and to seek input from team members.

  • Develop ETL processes using Informatica, including ETL control tables, error logging, auditing and data quality.

  • Implement data cleanup procedures, transformations, scripts and stored procedures, and execute test plans for loading the data successfully into the targets.

  • Perform enterprise-level ETL development and ETL architecture using Informatica, including understanding and using COBOL file structures.

  • Work on Oracle Database using SQL and PL/SQL.

  • Work with the Informatica Designer components: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.

  • Perform data analysis, data mapping, data loading and data validation, with attention to reusability, parameterization and workflow design.

  • Support other ETL developers by providing mentoring, technical assistance, troubleshooting and alternative development solutions.

  • Perform performance tuning and work the end-to-end lifecycle of an enterprise-wide data warehouse.

Requirements and Responsibilities:

  • Experience with designing, developing, testing and supporting applications.

  • Detailed understanding of Informatica PowerCenter 9.x components.

  • Expertise working with different kinds of sources: XML, flat files, web services, etc.

  • Good experience in UNIX scripting.

  • Excellent understanding of Oracle and Microsoft SQL Server; able to write complex queries with ease and to tune query performance.

  • Excellent data analysis and debugging skills.

  • Exposure to IDQ preferred.

  • Good knowledge of Informatica Data Services (IDS), Informatica Analyst and Informatica Developer.

  • Data mapping experience - mapping data sources to and from multiple applications.

  • Design, develop and implement data feeds and transformations with minimal to no guidance.

  • Team player with excellent collaborative skills.

Additional Qualifications:

  • Minimum 10 years of overall IT experience.

  • Minimum 7-8 years of overall Informatica, data warehousing and business intelligence experience.

  • Experience in IDQ (Informatica Data Quality) desired.

  • Strong knowledge of data warehousing concepts.

  • Understanding of logical and physical data modeling best practices and techniques.

  • Experience in developing slowly changing dimensions, error handling and performance tuning.

  • Strong data analysis skills.

  • Strong Oracle SQL/PL/SQL skills.

  • Strong UNIX/shell scripting skills; scripting languages such as Python are desired.