Cyber Security Analyst
Hybrid working - 3 days in London office

The main purpose of this role is to strengthen the protection of the company's IT assets through the implementation and execution of the Group's Cyber Security Reference Framework and methodology. You will demonstrate an overall understanding of the Group's security requirements and support the business lines as their systems mature, ensuring they follow standard security practices and comply with the corresponding security requirements. You will act as a subject matter expert and trusted advisor, providing authoritative cyber security advice and guidance to internal IT teams and ensuring secure-by-design principles are met. Working as part of a wider cyber security team, you will be dedicated to the cyber security environment of the UK business.

Essential experience
Proven experience in IT risk and cyber security
Strong working knowledge and thorough understanding of data security, network and infrastructure security, application security, vulnerability monitoring, cyber threats, and security operation control mechanisms and solutions (such as firewalls, SIEM, WAF, malware defences and IAM)
Good understanding of cyber security management and IT risk management processes
Broad knowledge of IT processes, methodology, IT infrastructure and application development, as well as the latest technologies (e.g. Cloud, AI)
Experience in assessing and supporting compliance with security standards - such as PCI-DSS, Cyber Essentials, ISO 27001, NIST and those published by the NCSC

Key skills/competencies
Excellent communication skills, including written and spoken English
Experience of, and ability to, liaise with senior stakeholders
Risk anticipation, risk articulation and constructive opinion
Understanding of corporate governance and compliance procedures
Motivated and driven

Desirable qualifications
Formal IT/cyber security certification - CISSP, SSCP, CISM, CSIRC

Due to the volume of applications received for positions, it will not be possible to respond to all applications; only applicants who are considered suitable for interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website.
08/04/2025
Full time
Endeavour Recruitment have a long-term contract opportunity for a Data Engineering Specialist to join a leading organisation based in Geneva.

Daily rates: Onsite: 910 CHF; Nearshore: 580 CHF
Contract duration: ASAP to the end of December 2025, with a strong possibility of 12-month extensions.

Required experience:
5+ years' experience
Expert knowledge and experience developing and implementing ETL jobs for data warehouses using SQL and Python.
Good knowledge of software development languages and tools such as SQL, Python and Spark.
Good knowledge and experience with AWS or Azure big data tools: Glue, Athena, Redshift, Kinesis, Databricks, Azure Analytics, Data Explorer.
Good knowledge and experience with cloud-based storage and functions (S3, Blob, Lambda, Azure Functions).
Expert knowledge and experience of data engineering tools and methodologies (data warehouse, data lake, star schema).
Good knowledge and experience with AWS CloudFormation or Terraform.
Knowledge of CI/CD concepts in general and AWS CodePipeline/CodeBuild/CodeDeploy in particular.
Knowledge of provisioning of data APIs.
Knowledge of information security concepts and terminology.
Excellent written and verbal communication skills that are compelling, convincing, and reassuring, with the ability to articulate complex technical ideas to non-technical stakeholders.
Confident communicator and team player, with an advanced level of written and spoken English.
Good organizational and interpersonal skills to influence others towards a shared vision and positive results, with or without the line of command.
Personal drive, ownership, and accountability to meet deadlines and achieve agreed-upon results.

Required skills:
Proficient user of Git.
Familiarity with Jira and Bitbucket.
Experience in the implementation of, or demonstrable familiarity with, the Gartner ODM framework would be an advantage.

Tasks:
The Data Engineering Specialist provides the required expertise to enhance and elaborate the different security metrics components as part of the IP Analytics Platform, using an iterative/agile approach.
Work with the SIAD Information Security team to define and implement data quality assurance processes.
Identify the appropriate data visualization tool based on the existing capabilities within the client.
Create and implement detailed visualization and presentation layers within the selected data visualization tool, in line with defined requirements.
Identify and document data structures for additional data sources.
Ensure continuing data drift monitoring.
Assist Information Security experts in the definition and extraction of relevant features for data analysis.
Collaborate with business areas to improve the performance of central collation of big datasets.
Add new information security raw data on which metrics and visualizations will be created.
Develop broad and deep visualized drill-down capability for detailed analysis.
Perform other related duties as required.

Deliverables:
Data products such as pipelines, jobs and transformations, as required for new data sources.
Use case diagrams, data and process flow diagrams.
User guides, analyses, and project documentation.
Security metrics presentation/visualization dashboards and granular drill-down capability for non-technical business analysts, aligned with the Gartner Outcome-Driven Metrics (ODM) framework.
Solution evolution as raw data sources change and new technologies or approaches emerge (such as AWS Security Lake).

Education and certifications:
Required: Applicants must have a first-level university degree in information security, computer science, engineering, mathematics, business or a related discipline.
Desirable: Additional certifications such as AWS Certified Solutions Architect and AWS Certified Data Analytics - Specialty are highly desirable.

Please apply, we look forward to receiving your CV!
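The role's core duty is ETL jobs for data warehouses in SQL and Python. A minimal sketch of that extract-transform-load pattern is below; the table and column names are hypothetical, and sqlite3 stands in for the real warehouse (e.g. Redshift behind Glue):

```python
import sqlite3

def run_etl(conn):
    """Extract raw rows, clean them, and upsert into a curated table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the staging table
    rows = cur.execute("SELECT event_id, severity FROM raw_events").fetchall()
    # Transform: trim/normalise severity, drop rows with missing values
    cleaned = [(eid, sev.strip().upper())
               for eid, sev in rows if sev and sev.strip()]
    # Load: idempotent upsert keyed on event_id
    cur.executemany(
        "INSERT OR REPLACE INTO curated_events (event_id, severity) VALUES (?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

# Demo run against an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (event_id INTEGER, severity TEXT)")
conn.execute("CREATE TABLE curated_events (event_id INTEGER PRIMARY KEY, severity TEXT)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)",
                 [(1, " high "), (2, None), (3, "low")])
loaded = run_etl(conn)  # 2 rows survive cleaning
```

In a production job the same shape holds: the extract and load steps become warehouse-specific SQL, while the transform stays testable Python.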
08/04/2025
Contractor
Prestigious opportunity with a market-leading global retail organisation for a Senior Data Engineer to join our success story in Blackburn. Following a period of significant growth, we are expanding our team to further drive the business forward.

You will be responsible for:
Designing, building and maintaining scalable, robust data pipelines using Python and MS SQL Server to transform and store data.
Managing the process of developing and implementing data models, schemas and architecture solutions.
Working with stakeholders, data scientists and analysts to understand data requirements.
Providing insights into data trends and opportunities.
Working on processes to improve efficiency, reduce latency and ensure high data availability.
Monitoring, logging and alerting mechanisms to identify and resolve data-related issues.

If you possess a combination of some of the following skills, then LET'S TALK!
Strong technical background in data engineering and a deep understanding of modern data platforms.
Highly experienced in Python, MS SQL Server and Power BI.
Extensive experience with big data technologies, cloud platforms, and database systems.
Ability to ensure the efficient, reliable, and secure flow of data across the organisation.

In return, you will be rewarded with flexible working hours, a bonus scheme, retail discount cards, free parking, an on-site restaurant, career development and training.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found on our website.
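One of the responsibilities above is monitoring and alerting on data-related issues. A hedged sketch of one common form of this, a data-freshness check that flags pipelines whose last successful load breaches an SLA; the pipeline names and the six-hour threshold are illustrative assumptions, not from the advert:

```python
from datetime import datetime, timedelta, timezone

# Illustrative freshness SLA: data older than this triggers an alert
FRESHNESS_SLA = timedelta(hours=6)

def check_freshness(last_loaded, now):
    """Return names of pipelines whose latest load is staler than the SLA."""
    return [name for name, ts in last_loaded.items()
            if now - ts > FRESHNESS_SLA]

now = datetime(2025, 4, 8, 12, 0, tzinfo=timezone.utc)
stale = check_freshness(
    {"sales_daily": now - timedelta(hours=2),    # within SLA
     "stock_levels": now - timedelta(hours=9)},  # breaches SLA
    now,
)
# stale -> ["stock_levels"], which would then be routed to an alerting channel
```

In practice the timestamps would come from a pipeline metadata table rather than a literal dict, and the stale list would feed a logging or paging integration.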
08/04/2025
Full time
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible*

Prestigious financial company is currently seeking a Quality Assurance Tableau Reporting Analyst. The candidate will be responsible for designing, developing, and maintaining reporting and monitoring solutions that support QA processes. The role will collaborate with cross-functional teams to identify automation and reporting needs, monitor and support governance processes, analyze data, and generate reports and dashboards using Tableau, Alteryx, and Excel.

Responsibilities:
Collaborate with QA team members, developers, and other stakeholders to understand monitoring and reporting requirements and deliver solutions that meet business needs.
Execute, maintain, and enhance Python automation scripts as part of quality assurance monitoring.
Develop and maintain QA data reports, dashboards, and visualizations to monitor key performance indicators and quality metrics using Tableau.
Collect, analyze, and interpret data from various sources to identify trends, patterns, and anomalies.
Connect to various data sources such as Jira and SpiraTest and prepare data for analysis.
Monitor the performance of Tableau reports and Alteryx workflows, identifying and resolving issues; ensure workflows are efficient and can handle large datasets.
Ensure data accuracy, consistency, and integrity in all reports and dashboards.
Apply best practices in data visualization to enhance user experience and report usability in Tableau.
Optimize performance to ensure timely and accurate delivery of Tableau reports.
Conduct regular audits of QA data to ensure compliance with industry standards and company policies.
Generate ad hoc reports and perform data analysis to support decision-making processes.
Establish and successfully manage relationships with clients as assigned.
Report and escalate issues to management as needed.
Document processes, best practices, and user guides as needed.
Stay updated with the industry's best practices and advancements in data reporting and analysis tools.
Perform other duties as assigned.

Supervisory responsibilities: None

Qualifications:
Bachelor's degree (or equivalent experience) in computer science, data analytics, or a related field.
Proven working experience in data analysis, reporting, and visualization in Tableau and Alteryx within a professional setting.
Strong proficiency in SQL and experience with data querying and manipulation.
Experience in data blending, preparation, and transformation.
Strong skills in data visualization tools such as Tableau, Power BI, Alteryx, or similar platforms.
Familiarity with statistical analysis tools and techniques.
Experience with QA processes, software testing, change control management, test planning management, quality control processes, methodologies, and management systems.
Excellent analytical and problem-solving skills.
Exemplary verbal and written communication skills.
Excellent client-facing and internal communication skills.
Solid organizational skills, including attention to detail and multi-tasking.
Strong interpersonal skills and core values, including a positive attitude, balance, creativity, determination, and teamwork.
Self-starter with the ability to identify the need for, and develop, processes and materials.

Technical skills:
Tableau: report and dashboard development, data visualization, best practices, performance optimization.
SQL: query writing, data manipulation; strong experience in SQL and Python.
Alteryx: workflow development, data preparation, automation, and optimization.
Data analysis, problem-solving, and critical thinking.
Experience with tools like Jira, Zephyr, Confluence, SpiraTest, Selenium, or HP ALM.
ETL tools such as Alteryx and reporting tools such as Tableau.
Strong understanding of CI/CD pipelines and tools such as Jenkins, GitHub, GitLab CI, or Azure DevOps.
Familiarity with performance testing tools such as JMeter, LoadRunner, or Gatling.
Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Strong experience working with Linux OS.
Experience in database management systems.

Education and/or experience:
Bachelor's or Master's degree in computer science or a related discipline.
Experience in Agile methodology.
Experience in enterprise report development or similar roles.
Experience working with large datasets and complex data models.
Proven track record of delivering high-quality reports and dashboards.
Strong communication skills, written and oral; analytical abilities; problem-solving skills; sound judgment and time management skills.
Experience in the Options and Futures industry is preferred.
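The role involves Python automation scripts that turn test-run data (from tools such as Jira or SpiraTest) into the KPIs a Tableau dashboard consumes. A minimal sketch of that aggregation step; the record layout and field names are illustrative assumptions, not any particular tool's export format:

```python
from collections import Counter

def qa_metrics(results):
    """Summarise a list of test-run records into dashboard-ready KPIs."""
    counts = Counter(r["status"] for r in results)
    total = sum(counts.values())
    passed = counts.get("passed", 0)
    return {
        "total": total,
        "passed": passed,
        "failed": counts.get("failed", 0),
        # Guard against division by zero on an empty run
        "pass_rate": round(passed / total, 3) if total else 0.0,
    }

# Hypothetical export of one regression run
runs = [
    {"case": "TC-1", "status": "passed"},
    {"case": "TC-2", "status": "failed"},
    {"case": "TC-3", "status": "passed"},
    {"case": "TC-4", "status": "passed"},
]
metrics = qa_metrics(runs)  # pass_rate works out to 0.75 here
```

A script like this would typically write `metrics` to a table or extract that Tableau then visualises, rather than computing KPIs inside the dashboard itself.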
08/04/2025
Full time
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Quality Assurance Tableau Reporting Analyst. Candidate will be responsible for designing, developing, and maintaining reporting and monitoring solutions that support QA processes. The role will collaborate with cross-functional teams to identify automation needs and reporting needs to monitor and support governance processes, analyze data, and generate reports, and dashboards using Tableau, Alteryx, and Excel. Responsibilities: Collaborate with QA team members, developers, and other stakeholders to understand monitoring reporting requirements and deliver solutions that meet business needs. Execute, Maintain, and enhance python automation scripts as part of quality assurance monitoring. Develop and maintain QA data reports, dashboards, and visualizations to monitor key performance indicators and quality metrics using Tableau. Collect, analyze, and interpret data from various sources to identify trends, patterns, and anomalies. Connect to various data sources such as Jira and SpriaTest and prepare data for analysis. Monitor the performance of Tableau reports and Alteryx workflows, identifying and resolving issues. Ensure workflows are efficient and can handle large datasets. Ensure data accuracy, consistency, and integrity in all reports and dashboards. Apply best practices in data visualization to enhance user experience and report usability in Tableau. Optimize performance to ensure timely and accurate delivery of Tableau reports. Conduct regular audits of QA data to ensure compliance with industry standards and company policies. Generate ad-hoc reports and perform data analysis to support decision making processes. Establish and successfully manage relationships with clients as assigned. Report and escalate issues to management as needed. Document processes, best practices, and user guides as needed. 
Stay updated with the industry's best practices and advancements in data reporting and analysis tools. Perform other duties as assigned. Supervisory Responsibilities: None Qualifications: Bachelor's Degree (or equal experience) in Computer science, Data Analytics, or related field. Proven working experience in data analysis, reporting, and visualization in Tableau and Alteryx within a professional setting. Strong proficiency in SQL and experience with data querying and manipulation. Experience in data blending, preparation, and transformation. Strong skills in data visualization tools such as Tableau, Power BI, Alteryx, or similar platforms. Familiarity with statistical analysis tools and techniques. Experience with QA processes, software testing, change control management, test planning management, quality control process, methodologies. And management systems. Excellent analytical and problem-solving skills. Exemplary verbal and written communication skills. Excellent client-facing and internal communication skills. Solid organizational skills include attention to detail and multi-tasking skills. Strong interpersonal skills and core values include a positive attitude, balance, creativity, determination, and teamwork. Self-starter with the ability to identify need for and develop processes and materials. Technical Skills: Tableau: Report and dashboard development, data visualization, best practices, performance optimization. SQL: Query writing, data manipulation. Strong Experience in SQL and Python. Alteryx: Workflow development, data preparation, automation, and optimization. Data analysis, problem-solving, and critical thinking. Experience with tools like Jira, Zephyr, Confluence, SpiraTest, Selenium, or HP ALM. ETL tools such as Alteryx and reporting tools such as Tableau. Strong understanding of CI/CD pipelines and tools such as Jenkins, GitHub, GitLab CI, or Azure DevOps. Familiarity with performance testing tools such as JMeter, LoadRunner, or Gatling. 
Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Platform (Google Cloud Platform). Strong experience working with Linux OS. Experience in database management systems. Education and/or Experience: Bachelor's degree or Master's degree in computer science or a related discipline. Experience in Agile methodology. Experience in Enterprise Report development, or similar roles. Experience in working with large datasets and complex data models. Proven track record of delivering high quality reports & dashboards. Experience in database management systems. Strong communication skills, written and oral, analytical abilities, problem solving skills, sound judgment and time management skills. Experience in the Options and Futures industry is preferred.
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Quality Assurance Tableau Reporting Analyst. Candidate will be responsible for designing, developing, and maintaining reporting and monitoring solutions that support QA processes. The role will collaborate with cross-functional teams to identify automation needs and reporting needs to monitor and support governance processes, analyze data, and generate reports, and dashboards using Tableau, Alteryx, and Excel. Responsibilities: Collaborate with QA team members, developers, and other stakeholders to understand monitoring reporting requirements and deliver solutions that meet business needs. Execute, Maintain, and enhance python automation scripts as part of quality assurance monitoring. Develop and maintain QA data reports, dashboards, and visualizations to monitor key performance indicators and quality metrics using Tableau. Collect, analyze, and interpret data from various sources to identify trends, patterns, and anomalies. Connect to various data sources such as Jira and SpriaTest and prepare data for analysis. Monitor the performance of Tableau reports and Alteryx workflows, identifying and resolving issues. Ensure workflows are efficient and can handle large datasets. Ensure data accuracy, consistency, and integrity in all reports and dashboards. Apply best practices in data visualization to enhance user experience and report usability in Tableau. Optimize performance to ensure timely and accurate delivery of Tableau reports. Conduct regular audits of QA data to ensure compliance with industry standards and company policies. Generate ad-hoc reports and perform data analysis to support decision making processes. Establish and successfully manage relationships with clients as assigned. Report and escalate issues to management as needed. Document processes, best practices, and user guides as needed. 
Stay updated with the industry's best practices and advancements in data reporting and analysis tools. Perform other duties as assigned. Supervisory Responsibilities: None Qualifications: Bachelor's Degree (or equal experience) in Computer science, Data Analytics, or related field. Proven working experience in data analysis, reporting, and visualization in Tableau and Alteryx within a professional setting. Strong proficiency in SQL and experience with data querying and manipulation. Experience in data blending, preparation, and transformation. Strong skills in data visualization tools such as Tableau, Power BI, Alteryx, or similar platforms. Familiarity with statistical analysis tools and techniques. Experience with QA processes, software testing, change control management, test planning management, quality control process, methodologies. And management systems. Excellent analytical and problem-solving skills. Exemplary verbal and written communication skills. Excellent client-facing and internal communication skills. Solid organizational skills include attention to detail and multi-tasking skills. Strong interpersonal skills and core values include a positive attitude, balance, creativity, determination, and teamwork. Self-starter with the ability to identify need for and develop processes and materials. Technical Skills: Tableau: Report and dashboard development, data visualization, best practices, performance optimization. SQL: Query writing, data manipulation. Strong Experience in SQL and Python. Alteryx: Workflow development, data preparation, automation, and optimization. Data analysis, problem-solving, and critical thinking. Experience with tools like Jira, Zephyr, Confluence, SpiraTest, Selenium, or HP ALM. ETL tools such as Alteryx and reporting tools such as Tableau. Strong understanding of CI/CD pipelines and tools such as Jenkins, GitHub, GitLab CI, or Azure DevOps. Familiarity with performance testing tools such as JMeter, LoadRunner, or Gatling. 
Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Platform (Google Cloud Platform). Strong experience working with Linux OS. Experience in database management systems. Education and/or Experience: Bachelor's degree or Master's degree in computer science or a related discipline. Experience in Agile methodology. Experience in Enterprise Report development, or similar roles. Experience in working with large datasets and complex data models. Proven track record of delivering high quality reports & dashboards. Experience in database management systems. Strong communication skills, written and oral, analytical abilities, problem solving skills, sound judgment and time management skills. Experience in the Options and Futures industry is preferred.
07/04/2025
Full time
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Quality Assurance Tableau Reporting Analyst. Candidate will be responsible for designing, developing, and maintaining reporting and monitoring solutions that support QA processes. The role will collaborate with cross-functional teams to identify automation needs and reporting needs to monitor and support governance processes, analyze data, and generate reports, and dashboards using Tableau, Alteryx, and Excel. Responsibilities: Collaborate with QA team members, developers, and other stakeholders to understand monitoring reporting requirements and deliver solutions that meet business needs. Execute, Maintain, and enhance python automation scripts as part of quality assurance monitoring. Develop and maintain QA data reports, dashboards, and visualizations to monitor key performance indicators and quality metrics using Tableau. Collect, analyze, and interpret data from various sources to identify trends, patterns, and anomalies. Connect to various data sources such as Jira and SpriaTest and prepare data for analysis. Monitor the performance of Tableau reports and Alteryx workflows, identifying and resolving issues. Ensure workflows are efficient and can handle large datasets. Ensure data accuracy, consistency, and integrity in all reports and dashboards. Apply best practices in data visualization to enhance user experience and report usability in Tableau. Optimize performance to ensure timely and accurate delivery of Tableau reports. Conduct regular audits of QA data to ensure compliance with industry standards and company policies. Generate ad-hoc reports and perform data analysis to support decision making processes. Establish and successfully manage relationships with clients as assigned. Report and escalate issues to management as needed. Document processes, best practices, and user guides as needed. 
Stay updated with the industry's best practices and advancements in data reporting and analysis tools. Perform other duties as assigned. Supervisory Responsibilities: None.

Qualifications: Bachelor's degree (or equivalent experience) in Computer Science, Data Analytics, or a related field. Proven working experience in data analysis, reporting, and visualization in Tableau and Alteryx within a professional setting. Strong proficiency in SQL and experience with data querying and manipulation. Experience in data blending, preparation, and transformation. Strong skills in data visualization tools such as Tableau, Power BI, Alteryx, or similar platforms. Familiarity with statistical analysis tools and techniques. Experience with QA processes, software testing, change control management, test planning management, quality control processes, methodologies, and management systems. Excellent analytical and problem-solving skills. Exemplary verbal and written communication skills. Excellent client-facing and internal communication skills. Solid organizational skills, including attention to detail and multi-tasking. Strong interpersonal skills and core values, including a positive attitude, balance, creativity, determination, and teamwork. Self-starter with the ability to identify the need for, and develop, processes and materials.

Technical Skills: Tableau: report and dashboard development, data visualization, best practices, performance optimization. SQL: query writing, data manipulation; strong experience in SQL and Python. Alteryx: workflow development, data preparation, automation, and optimization. Data analysis, problem-solving, and critical thinking. Experience with tools like Jira, Zephyr, Confluence, SpiraTest, Selenium, or HP ALM. ETL tools such as Alteryx and reporting tools such as Tableau. Strong understanding of CI/CD pipelines and tools such as Jenkins, GitHub, GitLab CI, or Azure DevOps. Familiarity with performance testing tools such as JMeter, LoadRunner, or Gatling.
Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP). Strong experience working with Linux OS. Experience in database management systems.

Education and/or Experience: Bachelor's or Master's degree in Computer Science or a related discipline. Experience in Agile methodology. Experience in enterprise report development or similar roles. Experience working with large datasets and complex data models. Proven track record of delivering high-quality reports and dashboards. Strong communication skills, written and oral; analytical abilities; problem-solving skills; sound judgment; and time management skills. Experience in the Options and Futures industry is preferred.
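As a rough illustration of the QA-metrics reporting this role describes, the sketch below computes per-suite pass rates of the kind a Tableau dashboard would visualize. The function name and test-run records are invented for the example; in practice the data would come from Jira or SpiraTest exports.

```python
# Hypothetical sketch: computing pass-rate KPIs from QA test-run records.
# The records here are fabricated; real data would be exported from a
# test-management tool such as Jira or SpiraTest before loading into Tableau.
from collections import Counter

def pass_rate_by_suite(runs):
    """Return {suite: pass rate} from (suite, status) test-run records."""
    totals, passes = Counter(), Counter()
    for suite, status in runs:
        totals[suite] += 1
        if status == "passed":
            passes[suite] += 1
    return {suite: passes[suite] / totals[suite] for suite in totals}

runs = [
    ("regression", "passed"), ("regression", "failed"),
    ("smoke", "passed"), ("smoke", "passed"),
]
rates = pass_rate_by_suite(runs)  # e.g. {"regression": 0.5, "smoke": 1.0}
```

A real pipeline would write these aggregates to a table or extract that Tableau connects to, rather than computing them inline.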
Application Support Analyst - SaaS, APIs, OAuth, XML, JSON. Company and Opportunity: An Essex-based FinTech provider requires an Application Support Analyst with commercial experience supporting SaaS/software providers' applications, with a focus on SaaS platforms and API-driven services. You will be the primary point of contact for 1st/2nd line customer support and troubleshooting, ensuring all service level agreements (SLAs) and customer service requirements are consistently met. You will also assist customers with API integrations, help them understand documentation, and ensure a high level of customer satisfaction. As the main point of contact for users, you will log detailed diagnostic information, track support tickets, and work to resolve issues in a timely and efficient manner, as well as collaborating with internal teams, including the Development Team and Customer Success. This is an office-based role, with the potential to work one day from home after appropriate training, and there may be occasional travel to client sites.

Core skills for the Application Support Analyst role: Minimum 1+ years' commercial experience working within a SaaS or software/application support environment. Experience of SOAP/REST APIs, OAuth authentication, and JSON/XML, and experience with log monitoring tools (Splunk, Graylog, Elastic Stack, etc.). Experience with cloud platforms - AWS, Azure, or Google Cloud. Understanding/experience of SQL or NoSQL databases is a bonus. Knowledge of API security best practices (OAuth, SSL/TLS). Jira ticket creation and tracking. Any experience/knowledge of payment systems/processing will be a bonus, along with associated professional certifications (API, Security+, CompTIA A+, etc.).

Responsibilities: First point of contact for customer interaction and enhancing the customer experience. Technical support and troubleshooting - for API integrations, HTTP responses, etc. SLAs and ticket management.
Analysis, collaboration and reporting - interacting with customers, engineering teams, and the Product Owner. Strong documentation and communication, including performance metrics, ticket resolution performance, and keeping documentation up to date. This is an office-based role paying between £27K and £33K, dependent on experience, plus a benefits package. Please apply now to discuss the Application Support Analyst role in more detail.
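The first-line API troubleshooting described above often starts with classifying an HTTP response before escalating. The sketch below is illustrative only (it is not this employer's tooling): a minimal triage helper that maps status codes and a JSON error body to a support action, following the standard HTTP status-class semantics.

```python
# Illustrative sketch: first-line triage of an API error response.
# The function and response bodies are hypothetical; status-class handling
# follows standard HTTP semantics (2xx success, 4xx client, 5xx server).
import json

def triage(status_code, body="{}"):
    """Classify an API response for a support ticket."""
    detail = json.loads(body).get("error", "no detail provided")
    if 200 <= status_code < 300:
        return "OK"
    if status_code in (401, 403):
        # Common with OAuth-protected endpoints: expired or mis-scoped tokens.
        return f"auth issue (check OAuth token/scopes): {detail}"
    if 400 <= status_code < 500:
        return f"client-side request problem: {detail}"
    return f"escalate to engineering (server error): {detail}"

print(triage(401, '{"error": "token expired"}'))
```

A real support workflow would also capture request IDs and timestamps from logs (e.g. in Splunk or Graylog) before raising a ticket.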
07/04/2025
Full time
Nicholas Bernard seeks a skilled MuleSoft Developer to design, develop, and maintain integrations and APIs using the MuleSoft Anypoint Platform. The ideal candidate will have strong experience in building robust, scalable integrations between cloud and on-premise systems and a solid understanding of enterprise integration patterns and API life cycle management.

Key Responsibilities: Design and implement MuleSoft APIs and integrations to connect various systems and services (REST/SOAP, databases, SaaS apps, etc.). Develop and deploy solutions using the Anypoint Platform (Mule 4 preferred). Use DataWeave for data transformation across formats like JSON, XML, CSV, etc. Collaborate with cross-functional teams including business analysts, QA, and DevOps. Participate in the full API life cycle - from design and implementation to deployment and monitoring. Create and maintain technical documentation, including integration flows, design specs, and test cases. Ensure high availability, performance, and scalability of integrations. Troubleshoot and resolve issues in existing APIs and integrations.

Required Skills & Qualifications: 2+ years of hands-on experience with the MuleSoft Anypoint Platform. Proficiency in Mule 4, API design (RAML/OpenAPI), and DataWeave. Strong knowledge of RESTful web services, SOAP, and integration patterns. Experience with CI/CD tools, Git, Maven, and deployment pipelines. Familiarity with cloud platforms (e.g. AWS, Azure, Salesforce). Excellent problem-solving and communication skills. Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). Apply now for more details and immediate interviews.
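To give a flavour of the JSON-to-CSV mapping a DataWeave transformation typically performs, here is the equivalent logic sketched in Python (DataWeave itself is MuleSoft's own language). The payload shape and field names are invented for illustration.

```python
# Sketch of a JSON -> CSV transformation of the kind DataWeave scripts
# perform inside a Mule flow, rendered in Python for illustration only.
# The "orders" payload and its fields are hypothetical.
import csv
import io
import json

def orders_json_to_csv(payload):
    """Flatten a nested JSON orders payload into CSV rows."""
    rows = json.loads(payload)["orders"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["id", "customer", "total"])
    writer.writeheader()
    for row in rows:
        writer.writerow({
            "id": row["id"],
            "customer": row["customer"]["name"],  # flatten nested object
            "total": row["total"],
        })
    return out.getvalue()

payload = '{"orders": [{"id": 1, "customer": {"name": "Acme"}, "total": 9.5}]}'
print(orders_json_to_csv(payload))
```

In an actual Mule 4 application this mapping would live in a DataWeave script (`output application/csv`) rather than custom code.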
07/04/2025
Contractor
Request Technology - Craig Johnson
Chicago, Illinois
*We are unable to sponsor for this permanent Full time role* *Position is bonus eligible* Prestigious Financial Company is currently seeking a Data Governance MDM Analyst. The candidate will act as a liaison and translation layer between business and technical teams and operate at system and detailed technical level for analysis purposes, implementing and supporting Metadata Management, Data Lineage, Data Quality, and other essential Data Governance functions.

Responsibilities: Work closely with Data Domain Owners and SMEs to identify CDEs (Critical Data Elements), define data elements for the Business Glossary, and define business rules. Identify data sources and build out the business glossary, collaborating with data owners/stewards; collaborate with data modelers to review definitions of business terms vs technical terms. Communicate effectively with business and technical SMEs, architects, analysts, developers, and other IT and business teams. Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources, working with business and technical SMEs/developers to understand the application's system/technical design, and create data flow diagrams/data mappings. Work on metadata management: ingesting metadata using metadata bridges/connectors, and creating and ingesting custom source-to-target mappings and custom metadata assets. Develop Metadata, Data Lineage, and Data Quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies. Design, build, and execute Data Quality rules over the identified CDEs according to business needs to ensure clean and healthy data. Manage Data Quality exceptions and help the team perform root cause analysis. Implement data governance policies, procedures, controls, and standards.
Hands-on experience working with data quality and data governance technology and tools. Utilize data profiling and data quality tools to analyze and determine causes of data quality issues. Design and implement Data Quality dashboards for monitoring and reporting. Perform other duties as assigned.

Qualifications: Ability to work independently and as part of a team to successfully execute projects. Ability to multitask and meet aggressive deadlines efficiently and effectively. Experience with data governance tools such as Collibra, IBM InfoSphere Information Server Suite, or Informatica. Proficient with SQL. Strong data analysis capabilities. Proficient with Microsoft Office desktop tools (Word, Excel, etc.). Experience with databases (e.g. Oracle, SQL Server, DB2, Amazon Redshift, NoSQL, object-based) and ETL tools. Previous experience designing and building data capabilities like data quality, metadata, data lineage, data catalog, and data dictionary. Experience with Business Intelligence/reporting tools such as Tableau/Cognos. Strong written and oral communication skills, with the ability to work with users, peers, and management. Capital markets or banking domain experience is preferred. Prior development/coding experience is preferred. Experience working with Protobuf, APIs, and Kafka as data sources is preferred. Experience working with draw.io or other tools for creating architecture or data flow diagrams. Structured Query Language (SQL). Data governance tools, e.g. Collibra, IBM ISEE, Informatica. Bachelor's or Master's degree in Data Analytics, Computer Science, or a related field. 7+ years of experience in data governance disciplines: metadata management, data quality analysis, data quality remediation, data profiling, and data lineage. DAMA certification preferred.
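The "Data Quality rules over CDEs" responsibility above can be pictured as running a set of validity checks over records and collecting exceptions for root-cause analysis. The sketch below is a minimal, hypothetical version: the rule names, record fields, and exception shape are invented, and a real implementation would live in a governance platform such as Collibra or Informatica.

```python
# Hedged sketch: executing simple data-quality rules over Critical Data
# Elements (CDEs) and collecting exceptions. All names here are invented;
# real rules would be defined and scheduled in a governance tool.
def run_dq_rules(records, rules):
    """Return (rule_name, record) pairs for every record failing a rule."""
    return [
        (name, record)
        for name, check in rules.items()
        for record in records
        if not check(record)
    ]

rules = {
    "isin_present": lambda r: bool(r.get("isin")),
    "price_non_negative": lambda r: r.get("price", 0) >= 0,
}
records = [
    {"isin": "US0378331005", "price": 10.0},  # passes both rules
    {"isin": "", "price": -1},                # fails both rules
]
exceptions = run_dq_rules(records, rules)
```

Each exception would then feed a Data Quality dashboard and a remediation queue, which is the monitoring loop the posting describes.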
01/04/2025
Full time
*Hybrid, 3 days onsite, 2 days remote* *We are unable to sponsor as this is a permanent Full time role* A prestigious company is looking for an Associate Principal, Technical Data Governance. This person will be the liaison between business and technical teams and will be focused on supporting and implementing data quality, metadata management, data lineage, data mapping, data governance, etc. This person will need to be proficient with SQL data structures and will work within the Collibra platform. This person must come from a financial company.

Responsibilities: Work closely with Data Domain Owners and SMEs to identify CDEs (Critical Data Elements), define data elements for the Business Glossary, and define business rules. Identify data sources and build out the business glossary, collaborating with data owners/stewards; collaborate with data modelers to review definitions of business terms vs technical terms. Communicate effectively with business and technical SMEs, architects, analysts, developers, and other IT and business teams. Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources, working with business and technical SMEs/developers to understand the application's system/technical design, and create data flow diagrams/data mappings. Work on metadata management: ingesting metadata using metadata bridges/connectors, and creating and ingesting custom source-to-target mappings and custom metadata assets. Develop Metadata, Data Lineage, and Data Quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, Redis, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies. Design, build, and execute Data Quality rules over the identified CDEs according to business needs to ensure clean and healthy data. Manage Data Quality exceptions and help the team perform root cause analysis.
Implement data governance policies, procedures, controls, and standards. Hands-on experience working with data quality and data governance technology and tools. Utilize data profiling and data quality tools to analyze and determine causes of data quality issues. Design and implement Data Quality dashboards for monitoring and reporting.

Qualifications: Bachelor's or Master's degree in Data Analytics, Computer Science, or a related field. 5-7 years of experience in data governance disciplines: metadata management, data quality analysis, data quality remediation, data profiling, and data lineage. DAMA certification preferred. Experience with data governance tools such as Collibra, IBM InfoSphere Information Server Suite, or Informatica. Proficient with SQL. Strong data analysis capabilities. Proficient with Microsoft Office desktop tools (Word, Excel, etc.). Experience with databases (e.g. Oracle, SQL Server, DB2, Amazon Redshift, NoSQL, object-based) and ETL tools. Previous experience designing and building data capabilities like data quality, metadata, data lineage, data catalog, and data dictionary. Experience with Business Intelligence/reporting tools such as Tableau/Cognos. Prior development/coding experience is preferred. Experience working with Protobuf, APIs, and Kafka as data sources is preferred. Experience working with draw.io or other tools for creating architecture or data flow diagrams. Data governance tools, e.g. Collibra, IBM ISEE, Informatica.
01/04/2025
Full time