*We are unable to offer sponsorship for this 6+ month contract role.*
A prestigious financial institution is currently seeking a Metadata Data Lineage Analyst. The candidate will develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies.
Responsibilities:
Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams and data mappings.
Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows.
Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources.
Develop programs that automate metadata extraction and the generation of data flow, data lineage, and source-to-target mapping documents for complex applications, systems, and BI tools.
Handle metadata management, administration, and support; ingest data management assets using extension mappings, custom data assets, connectors, and third-party metadata bridges, ensuring complete data lineage and source-to-target data mapping.
Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with Data Governance standards, policies, and procedures.
Design and build data capabilities such as data quality, metadata, data catalog, and data dictionary.
Qualifications:
6 or more years of data analysis experience with a strong understanding of metadata, data flows, and mappings.
Ability to understand a Java code base and to read and/or write code in a programming language (e.g., Java, Python).
Proficiency with SQL, experience working with Git, and experience with data analysis using Python/PySpark.
Hands-on experience with Java 8 onwards, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams.
Experience working with various types of databases, such as relational, NoSQL, and object-based.
Ability to review application development code to ensure it meets functional requirements as well as architectural and data standards.
Proficiency in writing technical documentation for Java-based applications that process data in real time and in batch.
Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, and Kafka.
Experience working with Protobuf, APIs, and Kafka as data sources is preferred.
Experience working with draw.io or other tools to create architecture or data flow diagrams.
Ability to multitask and meet aggressive deadlines efficiently and effectively.
Experience in object-oriented design and software design patterns.
13 Nov 2024
Chicago, Illinois
Contractor
Information Technology, Telecommunications