<p>Our client is looking for an experienced Data Governance Analyst to join their growing team. They need someone who can:<br><br>• Lead the development and implementation of data governance frameworks to support academic, administrative, and research data needs across the university system.<br>• Establish data stewardship roles and clarify data ownership for key institutional domains such as student information, financial aid, HR, research compliance, and finance.<br>• Create and enforce data policies, standards, and procedures to improve data quality, accuracy, accessibility, and security across campuses and departments.<br>• Ensure compliance with higher-ed regulatory and reporting requirements (e.g., FERPA, IPEDS, NCAA, state reporting), and coordinate with Legal, IT Security, and Institutional Compliance teams.<br>• Implement and optimize governance technology (data catalog, lineage, and quality tools) to support system-wide reporting, analytics, and decision support.<br>• Promote data literacy and provide training to faculty, staff, and administrators to enhance responsible and effective data use.<br>• Facilitate collaboration across academic units, administrative offices, and central IT to align governance efforts with institutional priorities and operational needs.<br>• Monitor data quality and governance KPIs, report progress to leadership, and drive continuous improvement to support strategic planning, accreditation, and institutional research initiatives.<br><br>Experience as a Data Governance Analyst is required. The client has a fragmented data governance framework in place, and the goal is for this person to unify it across the enterprise. The ideal candidate will be a Data Governance Analyst looking for a more challenging opportunity to lead the implementation of Purview and advance the organization's data governance practices. Administration experience with Microsoft Purview or a similar tool (such as Collibra, Informatica, or Databricks) is required. This role will assist in connecting Microsoft Fabric to Purview.<br><br>Experience with Microsoft Purview is preferred. The client has the Data Security layer of Purview implemented; this role will work with the Microsoft partner to implement the Data Governance layer (Unified Data Catalog, Data Quality, Data Lineage, and Data Health Management). See attached overview. Excellent communication skills are essential: the ideal candidate will lead change, help advance the data governance practice, and secure buy-in from stakeholders.</p>
We are looking for a skilled Data Engineer to support our organization's data initiatives in Savannah, Georgia. This Contract to permanent role focuses on managing, optimizing, and securing data systems to drive strategic decision-making and improve overall performance. The ideal candidate will work closely with technology teams, analytics departments, and business stakeholders to ensure seamless data integration, accuracy, and scalability.<br><br>Responsibilities:<br>• Design and implement robust data lake and warehouse architectures to support organizational needs.<br>• Develop efficient ETL pipelines to process and integrate data from multiple sources.<br>• Collaborate with analytics teams to create and refine data models for reporting and visualization.<br>• Monitor and maintain data systems to ensure quality, security, and availability.<br>• Troubleshoot data-related issues and perform in-depth analyses to identify solutions.<br>• Define and manage organizational data assets, including SaaS tools and platforms.<br>• Partner with IT and security teams to meet compliance and governance standards.<br>• Document workflows, pipelines, and architecture for knowledge sharing and long-term use.<br>• Translate business requirements into technical solutions that meet reporting and analytics needs.<br>• Provide guidance and mentorship to team members on data usage and best practices.
We are looking for a skilled Data Analyst to join our team in Delray Beach, Florida, on a contract with the potential for long-term employment. In this role, you will bridge the gap between management and technology teams, ensuring data is organized, analyzed, and leveraged effectively to meet business needs. This position offers an opportunity to take ownership of data reporting processes and deliver solutions that drive organizational success.<br><br>Responsibilities:<br>• Retrieve, organize, and analyze data to support strategic business decisions.<br>• Serve as a subject matter expert on organizational data systems and promote their adoption across teams.<br>• Develop and implement data reporting solutions for internal and external stakeholders.<br>• Conduct daily, monthly, and quarterly data uploads and submissions as required.<br>• Maintain the security of organizational data by enforcing appropriate access controls.<br>• Provide assistance and support to end-users on data systems and tools.<br>• Lead onboarding training sessions and ongoing education initiatives related to data systems.<br>• Utilize Microsoft Access and other reporting tools to create insightful reports.<br>• Manage ad hoc reporting requests to address immediate business needs.<br>• Participate in meetings, training sessions, and other tasks as assigned by the supervisor.
We are looking for an experienced Senior Data Engineer to join our team in Atlanta, Georgia. This role is ideal for someone with a strong background in data architecture, cloud platforms, and analytics tools. You will play a key role in designing, building, and optimizing data systems to support business operations and decision-making.<br><br>Responsibilities:<br>• Develop and maintain scalable data models and database designs to support business needs.<br>• Implement and manage data integration workflows using ETL processes and tools.<br>• Build and optimize data lakes and LakeHouse architectures on Azure platforms.<br>• Utilize Microsoft Fabric and Azure Databricks to create advanced data solutions.<br>• Design and develop dashboards and reports using Power BI to provide actionable insights.<br>• Ensure data governance by establishing policies, procedures, and standards for data use.<br>• Collaborate with cross-functional teams to align data strategies with organizational goals.<br>• Leverage Python and SQL for data analysis, transformation, and automation.<br>• Work with middleware solutions like MuleSoft for efficient data communication and integration.<br>• Stay updated on emerging technologies to continuously improve data engineering practices.
We are looking for an experienced Lead Data Engineer to oversee the design, implementation, and management of advanced data infrastructure in Houston, Texas. This role requires expertise in architecting scalable solutions, optimizing data pipelines, and ensuring data quality to support analytics, machine learning, and real-time processing. The ideal candidate will have a deep understanding of Lakehouse architecture and Medallion design principles to deliver robust and governed data solutions.<br><br>Responsibilities:<br>• Develop and implement scalable data pipelines to ingest, process, and store large datasets using tools such as Apache Spark, Hadoop, and Kafka.<br>• Utilize cloud platforms like AWS or Azure to manage data storage and processing, leveraging services such as S3, Lambda, and Azure Data Lake.<br>• Design and operationalize data architecture following Medallion patterns to ensure data usability and quality across Bronze, Silver, and Gold layers.<br>• Build and optimize data models and storage solutions, including Databricks Lakehouses, to support analytical and operational needs.<br>• Automate data workflows using tools like Apache Airflow and Fivetran to streamline integration and improve efficiency.<br>• Lead initiatives to establish best practices in data management, facilitating knowledge sharing and collaboration across technical and business teams.<br>• Collaborate with data scientists to provide infrastructure and tools for complex analytical models, using programming languages like Python or R.<br>• Implement and enforce data governance policies, including encryption, masking, and access controls, within cloud environments.<br>• Monitor and troubleshoot data pipelines for performance issues, applying tuning techniques to enhance throughput and reliability.<br>• Stay updated with emerging technologies in data engineering and advocate for improvements to the organization's data systems.
We are looking for a skilled Data Engineer to join our logistics team in Lithonia, Georgia. In this role, you will design, construct, and maintain data pipelines and infrastructure to support analytics and operational systems. You will play a key role in enabling data visualization tools, optimizing data processes, and ensuring the accuracy and availability of critical information.<br><br>Responsibilities:<br>• Design and implement data pipelines to efficiently extract, transform, and load data from multiple sources.<br>• Develop and maintain data models and storage solutions to support analytics and reporting needs.<br>• Collaborate with stakeholders to troubleshoot data inconsistencies and resolve technical issues.<br>• Utilize Tableau or Power BI to create meaningful data visualizations that drive business insights.<br>• Write and optimize database procedures, triggers, and other SQL-based functionalities.<br>• Manage and monitor databases to ensure their performance and reliability.<br>• Provide technical guidance to analysts on best practices in data governance and performance optimization.<br>• Participate in cross-functional projects to enhance data accessibility and quality across departments.<br>• Explore and integrate Python-based solutions to enhance data engineering processes.<br>• Assist in training and development related to data availability and analytics tools.
We are looking for a skilled Data Analyst / Engineer to join our team in Marysville, Ohio. In this long-term contract position, you will play a vital role in analyzing complex data sets, developing solutions, and ensuring data security and quality. This onsite role requires working four days a week on location, collaborating with cross-functional teams to drive insightful decision-making.<br><br>Responsibilities:<br>• Analyze and interpret complex data sets to uncover meaningful insights and trends.<br>• Design and maintain efficient data pipelines utilizing AWS services such as S3, Redshift, and Lambda.<br>• Collaborate with developers, designers, and business stakeholders to gather requirements and deliver tailored data solutions.<br>• Present findings and recommendations effectively to both technical and non-technical audiences.<br>• Lead data-related projects, ensuring accurate and timely delivery of tasks.<br>• Implement data governance standards and best practices to maintain data integrity and security.<br>• Create and manage data models, dashboards, and reports to visualize performance metrics.<br>• Troubleshoot data issues and provide technical support to resolve challenges.<br>• Stay informed on emerging data technologies to improve processes and outcomes.
<p>IMMEDIATE HIRE NEEDED. Interviews to begin the first week of February. </p><p><br></p><p>We are looking for a skilled Snowflake Marketing Data Engineer to join our team in Tampa, Florida on a hybrid in-office schedule (2 to 3 days remote per week). Hybrid is preferred, but fully remote candidates may be considered depending on the strength of the match.</p><p><br></p><p>In this role, you will be responsible for designing, implementing, and maintaining data solutions that support critical business operations. Your expertise will play a key part in driving data-driven decisions and optimizing performance across various platforms.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain ETL processes to efficiently extract, transform, and load data from multiple sources.</p><p>• Analyze marketing data to uncover insights and support strategic decision-making.</p><p>• Create and manage dashboards and reports using Power BI to visualize data effectively.</p><p>• Integrate and leverage tools like Braze and Google Analytics to enhance data tracking and reporting capabilities.</p><p>• Collaborate with cross-functional teams to ensure the accuracy and reliability of data systems.</p><p>• Optimize database performance and troubleshoot any issues related to data pipelines.</p><p>• Document data workflows and provide training to stakeholders on best practices.</p><p>• Work with cloud-based platforms, such as Snowflake, to store and manage large datasets.</p><p>• Ensure data security and compliance with company policies and standards.</p>
We are looking for a skilled Data Reporting Analyst to join our team in Cincinnati, Ohio. This long-term contract position offers an exciting opportunity to contribute to data-driven decision-making processes within the Personal Loans Marketing team. The ideal candidate will leverage their expertise in data analysis, reporting, and visualization to provide actionable insights and optimize marketing performance.<br><br>Responsibilities:<br>• Develop and maintain efficient data workflows to support marketing dashboards and reporting systems.<br>• Document data sources, definitions, and reporting logic to ensure consistency and governance.<br>• Optimize database queries and backend data processes to enhance scalability and performance.<br>• Conduct quality assurance checks to maintain data integrity and accuracy across reports.<br>• Automate recurring reports to monitor campaign metrics such as performance, conversions, and cost-per-acquisition.<br>• Collaborate with engineering and data science teams to streamline data pipelines and improve data quality.<br>• Design and implement interactive dashboards using tools like Tableau to visualize key performance indicators.<br>• Provide ad hoc data analysis and reporting to support marketing campaign decision-making.<br>• Ensure data models and workflows are scalable and align with business needs.
We are looking for a skilled Data Engineer to join our team in Philadelphia, Pennsylvania. In this long-term contract position, you will play a key role in managing and optimizing large-scale data pipelines and systems within the healthcare industry. Your expertise will contribute to the development of robust solutions for data processing, analysis, and integration.<br><br>Responsibilities:<br>• Design, develop, and maintain large-scale data pipelines to support business needs.<br>• Optimize data workflows using tools such as Apache Spark and Python.<br>• Implement and manage ETL processes for seamless data transformation and integration.<br>• Collaborate with cross-functional teams to ensure data solutions align with organizational goals.<br>• Monitor and troubleshoot data systems to ensure consistent performance and reliability.<br>• Work with Apache Hadoop and Apache Kafka to enhance data storage and streaming capabilities.<br>• Ensure compliance with data security and privacy standards.<br>• Analyze and interpret complex datasets to provide actionable insights.<br>• Document processes and solutions to support future scalability and maintenance.
We are looking for a skilled Data Engineer to join our team in Houston, Texas. As part of the Manufacturing industry, you will play a pivotal role in developing and maintaining data infrastructure critical to our operations. This is a long-term contract position that offers the opportunity to work on innovative projects and collaborate with a dynamic team.<br><br>Responsibilities:<br>• Design and implement scalable data pipelines to support business operations and analytics.<br>• Develop, test, and maintain ETL processes for efficient data extraction, transformation, and loading.<br>• Utilize tools such as Apache Spark and Hadoop to manage and process large datasets.<br>• Integrate and optimize data streaming platforms like Apache Kafka.<br>• Collaborate with cross-functional teams to ensure data solutions align with organizational goals.<br>• Monitor and troubleshoot data systems to ensure optimal performance and reliability.<br>• Create and maintain documentation for data processes and systems.<br>• Stay updated on emerging technologies and recommend improvements to enhance data engineering practices.<br>• Ensure data security and compliance with industry standards and regulations.
We are seeking a detail-oriented and analytical Data Analyst to join our team in the Metro Atlanta area. This role is ideal for a detail-oriented professional who enjoys transforming complex data into actionable insights that drive business decisions. You will work cross-functionally with stakeholders across operations, finance, marketing, and technology to support strategic initiatives through data-driven analysis.<br><br>Key Responsibilities:<br>• Collect, clean, and analyze large datasets to identify trends and insights<br>• Develop dashboards and reports using BI tools (Power BI, Tableau, or similar)<br>• Write and optimize SQL queries to extract and manipulate data<br>• Perform ad hoc analysis to support business initiatives and leadership requests<br>• Translate business requirements into technical data solutions<br>• Monitor KPIs and provide actionable recommendations<br>• Support data validation, integrity checks, and quality assurance processes<br>• Collaborate with engineering and IT teams to improve data pipelines
<p>We are looking for an experienced Data Engineer to join a dynamic team in Oklahoma City, Oklahoma. In this role, you will be a key contributor in developing, optimizing, and maintaining the data infrastructure that supports analytics, business intelligence, and data-driven decision-making using Snowflake, Matillion, and other tools. The position is in-office to allow close collaboration with the team. No 3rd parties please.</p><p><br></p><p>Responsibilities:</p><p> </p><p> • Design, develop, and maintain scalable data pipelines to support data integration and real-time processing.</p><p> • Implement and manage data warehouse solutions, with a strong focus on Snowflake architecture and optimization.</p><p> • Write efficient and effective scripts and tools using Python to automate workflows and enhance data processing capabilities.</p><p> • Work with SQL Server to design, query, and optimize relational databases in support of analytics and reporting needs.</p><p> • Monitor and troubleshoot data pipelines, resolving any performance or reliability issues.</p><p> • Ensure data quality, governance, and integrity by implementing and enforcing best practices.</p><p> </p><p><br></p>
<p>We are looking for an experienced Senior Data Engineer to join our team in Denver, Colorado. In this role, you will design and implement data solutions that drive business insights and operational efficiency. You will collaborate with cross-functional teams to manage data pipelines, optimize workflows, and ensure the integrity and security of data systems.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain robust data pipelines to process and transform large datasets effectively.</p><p>• Advise on tools / technologies to implement. </p><p>• Collaborate with stakeholders to understand data requirements and translate them into technical solutions.</p><p>• Design and implement ETL processes to facilitate seamless data integration.</p><p>• Optimize data workflows and ensure system performance meets organizational needs.</p><p>• Work with Apache Spark, Hadoop, and Kafka to build scalable data systems.</p><p>• Create and maintain SQL queries for data extraction and analysis.</p><p>• Ensure data security and integrity by adhering to best practices.</p><p>• Troubleshoot and resolve issues in data systems to minimize downtime.</p><p>• Provide technical guidance and mentorship to less experienced team members.</p><p>• Stay updated on emerging technologies to enhance data engineering practices.</p>
<p>We are looking for a skilled and innovative Data Engineer to join our team in Grove City, Ohio. In this role, you will be responsible for designing and implementing advanced data pipelines, ensuring the seamless integration and accessibility of data across various systems. As a key player in our analytics and data infrastructure efforts, you will contribute to building a robust and scalable data ecosystem to support AI and machine learning initiatives.</p><p><br></p><p>Responsibilities:</p><p>• Design and develop scalable data pipelines to ingest, process, and transform data from multiple sources.</p><p>• Optimize data models to support analytics, forecasting, and AI/ML applications.</p><p>• Collaborate with internal teams and external partners to enhance data engineering capabilities.</p><p>• Implement and enforce data governance, security, and quality standards across hybrid cloud environments.</p><p>• Work closely with analytics and data science teams to ensure seamless data accessibility and integration.</p><p>• Develop and maintain data products and services to enable actionable insights.</p><p>• Troubleshoot and improve the performance of data workflows and storage systems.</p><p>• Align data systems across departments to create a unified and reliable data infrastructure.</p><p>• Support innovation by leveraging big data tools and frameworks such as Databricks and Spark.</p>
<p>Robert Half is seeking an experienced Data Architect to design and lead scalable, secure, and high-performing enterprise data solutions. This role will focus on building next-generation cloud data platforms, driving adoption of modern analytics technologies, and ensuring alignment with governance and security standards.</p><p><br></p><p>You’ll serve as a hands-on technical leader, partnering closely with engineering, analytics, and business teams to architect data platforms that enable advanced analytics and AI/ML initiatives. This position blends deep technical expertise with strategic thinking to help unlock the value of data across the organization.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and implement end-to-end data architecture for big data and advanced analytics platforms.</li><li>Architect and build Delta Lake–based lakehouse environments from the ground up, including DLT pipelines, PySpark jobs, workflows, Unity Catalog, and Medallion architecture.</li><li>Develop scalable data models that meet performance, security, and governance requirements.</li><li>Configure and optimize clusters, notebooks, and workflows to support ETL/ELT pipelines.</li><li>Integrate cloud data platforms with supporting services such as data storage, orchestration, secrets management, and analytics tools.</li><li>Establish and enforce best practices for data governance, security, and cost optimization.</li><li>Collaborate with data engineers, analysts, and stakeholders to translate business requirements into technical solutions.</li><li>Provide technical leadership and mentorship to team members.</li><li>Monitor, troubleshoot, and optimize data pipelines to ensure reliability and efficiency.</li><li>Ensure compliance with organizational and regulatory standards related to data privacy and security.</li><li>Create and maintain documentation for architecture, processes, and governance standards.</li></ul>
<p>As a Business Data Analyst or Specialist, you will be responsible for managing data retrieval and analysis within SCMHC.</p><p>The position includes organizing data points, communicating between upper management and the IT department, and analyzing data to determine business needs.</p><p>Become the subject matter expert for the organization's EMR/EHR (Credible Behavioral Health) and champion its adoption.</p><p>Develop data solutions from multiple data sources and applications.</p><p>Develop and manage data reporting for internal and external consumers.</p><p>Perform daily, monthly, and quarterly uploads and submit data as required.</p><p>Ensure data is secure via proper access controls.</p><p>Provide EMR/EHR assistance and support to end-users.</p><p>Organize, implement, and manage new-hire and ongoing EMR/EHR training.</p>
<p><strong>Data Pipeline Development</strong></p><ul><li>Design, build, and optimize scalable ETL/ELT pipelines to support analytics and operational workflows.</li><li>Ingest structured, semi-structured, and unstructured data from multiple internal and external sources.</li><li>Automate and orchestrate data workflows using tools like Airflow, Azure Data Factory, AWS Glue, or similar.</li></ul><p><strong>Data Architecture & Modeling</strong></p><ul><li>Develop and maintain data models, data marts, and data warehouses (relational, dimensional, and/or cloud-native).</li><li>Implement best practices for data partitioning, performance optimization, and storage management.</li><li>Work with BI developers, data scientists, and analysts to ensure datasets are structured to meet business needs.</li></ul><p><strong>Cloud Engineering & Storage</strong></p><ul><li>Build and maintain cloud data environments (Azure, AWS, GCP), including storage, compute, and security components.</li><li>Deploy and manage scalable data systems such as Snowflake, Databricks, BigQuery, Redshift, or Synapse.</li><li>Optimize cloud data cost, performance, and governance.</li></ul><p><strong>Data Quality & Reliability</strong></p><ul><li>Implement data validation, error handling, and monitoring to ensure accuracy, completeness, and reliability.</li><li>Troubleshoot pipeline failures, performance issues, and data discrepancies.</li><li>Maintain documentation and data lineage for transparency and auditability.</li></ul><p><strong>Collaboration & Cross‑Functional Support</strong></p><ul><li>Partner with product, engineering, and analytics teams to translate business requirements into technical solutions.</li><li>Support self-service analytics initiatives by preparing high-quality datasets and data products.</li><li>Provide technical guidance on data best practices and engineering standards.</li></ul><p><br></p><p><br></p>
<p>The Data Architect is responsible for defining, governing, and evolving the organization's enterprise data architecture across ERP, operational systems, and analytics platforms. This role ensures data consistency, scalability, and integrity as the organization executes ERP implementations and OpCo rollouts.</p><p>This role <strong>does not</strong> perform day-to-day reporting or operational data fixes. It defines the structure, standards, and guardrails that others operate within.</p><p><strong>Key responsibilities</strong></p><p><strong>Data architecture and design</strong></p><ul><li>Define and maintain enterprise data models across ERP, operational, and analytics platforms.</li><li>Design canonical data models for core domains such as customers, vendors, jobs, projects, financials, and assets.</li><li>Define data relationships and ownership across BuildOps, Procore, ERP finance, and downstream analytics systems.</li><li>Establish standards for master data, reference data, and transactional data.</li></ul><p><strong>Data governance and quality</strong></p><ul><li>Define data ownership, stewardship, and accountability by domain.</li><li>Establish data quality rules, validation standards, and reconciliation frameworks.</li><li>Partner with Applications Management and Operations Success Managers to align system configuration with data standards.</li><li>Define auditability and traceability standards for financial and operational data.</li></ul><p><strong>Integration and analytics enablement</strong></p><ul><li>Partner with integration engineers to define data contracts, schemas, and transformation rules.</li><li>Ensure data models support reporting, BI, and downstream analytics use cases.</li><li>Review and approve data design decisions for new integrations and ERP modules.</li></ul><p><strong>ERP and implementation support</strong></p><ul><li>Support ERP implementations by validating data design, mappings, and cutover readiness.</li><li>Review data migration 
strategies to ensure alignment with target-state architecture.</li><li>Provide architectural guidance during fit-gap, design, and testing phases.</li></ul><p><br></p>
<p><strong>Lead Data Architect / Senior Data Modeler</strong></p><p><strong>Location:</strong> Onsite in Burbank, CA (4 days/week, Mon–Thu)</p><p><strong>Team:</strong> Studio Technology – Data Services</p><p><br></p><p><strong>About the Role</strong></p><p>The Studio Technology group at Walt Disney Studios empowers filmmakers, analysts, and business partners with scalable, secure, and innovative data solutions. As part of the Data Services team, you will shape how the Studio builds, organizes, and delivers data — from financial systems to marketing analytics to consumer engagement products.</p><p>We are looking for a Lead Data Architect whose primary passion is data modeling and data architecture, not software architecture. This person lives and breathes semantic models, dimensional design, medallion-layer structuring, and data-as-a-product principles. The ideal candidate is doing this work hands-on today and has done it consistently in recent roles — this is their core craft, not an adjacent skill.</p><p><br></p><p><strong>What You’ll Do (Day-to-Day)</strong></p><p>● Partner with business stakeholders and product managers to understand data requirements and translate them into scalable data strategies and architecture.</p><p>● Design and own semantic models, dimensional models, entity definitions, and medallion-layer data structures (Bronze/Silver/Gold).</p><p>● Determine what gets built where and why, ensuring consistent modeling patterns across projects and data domains.</p><p>● Define and implement data-as-a-product approaches, including domain boundaries, ownership models, SLAs, and documentation standards.</p><p>● Work closely with data engineers to guide how assets should be built, structured, and optimized.</p><p>● Establish and maintain data standards, governance practices, naming conventions, and documentation for modeling and architecture.</p><p>● Provide technical leadership on data modeling best practices across teams, including coaching junior 
members.</p><p>● Evaluate modern data technologies and recommend tools that improve modeling consistency, performance, and quality.</p><p>● Collaborate with analytics, BI, and data science teams to design models that support reporting, ML/AI workloads, and cross-domain analytics.</p>
<p>We are looking for a skilled Data Analyst to join our team on a long-term contract basis in Southern California. In this role, you will take charge of analyzing complex datasets to derive actionable insights and support critical business decisions within the automotive industry. You will collaborate across teams to ensure data accuracy and streamline workflows while leveraging your expertise in advanced analytics and cloud-based platforms.</p><p><strong>What will this person be working on?</strong></p><p>As a Data Analyst / CDP Developer, the candidate will support the customer data platform, improve data quality, implement scalable data solutions, and address end‑to‑end data challenges. This role requires advanced SQL development, CDP integration, pipeline operations, and troubleshooting across large‑scale datasets. The ideal candidate is highly analytical, detail‑oriented, and thrives in a fast‑paced, cross‑functional environment.</p><p><strong>Responsibilities:</strong></p><p>• Analyze large-scale datasets to uncover trends, validate data quality, and provide insights that drive business strategies.</p><p>• Develop and optimize advanced queries using tools like Presto, Hive, and NoSQL to support data operations.</p><p>• Utilize Python for data processing, automation, and workflow development to enhance system functionality.</p><p>• Apply cloud data architecture principles within platforms such as AWS to improve data management.</p><p>• Conduct exploratory data analysis to identify anomalies, validate assumptions, and meet stakeholder needs.</p><p>• Document data processes, workflows, and business rules in platforms such as Confluence to ensure clarity and consistency.</p><p>• Collaborate with cross-functional teams, including engineering, marketing, and subject matter experts, to define and refine data requirements.</p><p>• Operate workflow orchestration tools like Digdag and TD Workflows to manage data pipelines and scheduled tasks.</p><p>• Implement and 
monitor data quality frameworks to ensure reliable data validation and system observability.</p><p>• Adhere to compliance and privacy standards while maintaining best practices for version control and CI/CD processes.</p>
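The data-quality monitoring duty described above could look something like this in practice. This is a minimal, hedged sketch, not the employer's actual framework; the rule names, dataset shape, and thresholds are all illustrative.

```python
# Illustrative data-quality check harness: each rule is a named predicate
# run against a dataset, producing a pass/fail report for observability.
# Rule names and the sample rows below are hypothetical examples.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[list[dict]], bool]

def run_quality_checks(rows: list[dict], rules: list[QualityRule]) -> dict[str, bool]:
    """Run each rule against the dataset and report pass/fail per rule."""
    return {rule.name: rule.check(rows) for rule in rules}

rows = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": "b@example.com"},
]

rules = [
    QualityRule("no_null_ids", lambda rs: all(r["customer_id"] is not None for r in rs)),
    QualityRule("unique_ids", lambda rs: len({r["customer_id"] for r in rs}) == len(rs)),
]

results = run_quality_checks(rows, rules)
```

In a production CDP setting, the report produced by `run_quality_checks` would typically feed an alerting or observability layer rather than being inspected by hand.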
Key Responsibilities:<br><br>· Design, develop, and maintain scalable backend systems to support data warehousing and data lake initiatives.<br><br>· Build and optimize ETL/ELT processes to extract, transform, and load data from various sources into centralized data repositories.<br><br>· Develop and implement integration solutions for seamless data exchange between systems, applications, and platforms.<br><br>· Collaborate with data architects, analysts, and other stakeholders to define and implement data models, schemas, and storage solutions.<br><br>· Ensure data quality, consistency, and security by implementing best practices and monitoring frameworks.<br><br>· Monitor and troubleshoot data pipelines and systems to ensure high availability and performance.<br><br>· Stay up-to-date with emerging technologies and trends in data engineering and integration to recommend improvements and innovations.<br><br>· Document technical designs, processes, and standards for the team and stakeholders.<br><br><br><br>Qualifications:<br><br>· Bachelor’s degree in Computer Science, Engineering, or a related field; equivalent experience considered.<br><br>· Five or more years of proven experience as a Data Engineer or in a similar backend development role.<br><br>· Strong proficiency in programming languages such as Python, Java, or Scala.<br><br>· Hands-on experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Informatica, etc.).<br><br>· Extensive knowledge of relational and non-relational databases (e.g., SQL, NoSQL, PostgreSQL, MongoDB).<br><br>· Expertise in building and managing enterprise data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake).<br><br>· Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services.<br><br>· Experience with API integrations and data exchange protocols (e.g., REST, SOAP, JSON, XML).<br><br>· Solid understanding of data 
governance, security, and compliance standards.<br><br>· Strong analytical and problem-solving skills with attention to detail.<br><br>· Excellent communication and collaboration abilities.<br><br><br><br>Preferred Qualifications:<br><br>· Certifications in cloud platforms (AWS Certified Data Analytics, Azure Data Engineer, etc.)<br><br>· Experience with big data technologies (e.g., Apache Hadoop, Spark, Kafka).<br><br>· Knowledge of data visualization tools (e.g., Tableau, Power BI) for supporting downstream analytics.<br><br>· Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Jenkins).
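The ETL/ELT responsibility listed above can be reduced to three stages: extract from a source, transform for consistency, and load into a target store. Below is a minimal, self-contained sketch under assumed inputs; the record shape and the dictionary "warehouse" are stand-ins for a real source system and warehouse table, not any specific employer's pipeline.

```python
# Minimal extract-transform-load sketch. The source rows, column names,
# and dict-backed "warehouse" are hypothetical placeholders.
def extract() -> list[dict]:
    # Stand-in for reading from a relational source or API.
    return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]

def transform(rows: list[dict]) -> list[dict]:
    # Normalize types so downstream analytics can aggregate reliably.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows: list[dict], store: dict) -> None:
    # Idempotent upsert keyed by id, mimicking a warehouse merge.
    for r in rows:
        store[r["id"]] = r

warehouse: dict = {}
load(transform(extract()), warehouse)
```

In practice each stage would be a task in an orchestrator such as Apache Airflow (named in the qualifications above), so failures can be retried per stage rather than rerunning the whole pipeline.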
<p>The Data Engineer role focuses on designing, building, and optimizing scalable data solutions that support diverse business needs. This position requires the ability to work independently while collaborating effectively in a fast-paced, agile environment. The individual in this role partners with cross-functional teams to gather data requirements, recommend enhancements to existing data pipelines and architectures, and ensure the reliability, performance, and efficiency of data processes.</p><p>Responsibilities</p><ul><li>Support the team’s adoption and continued evolution of the Databricks platform, leveraging features such as Delta Live Tables, workflows, and related tooling</li><li>Design, develop, and maintain data pipelines that extract data from relational sources, load it into a data lake, transform it as needed, and publish it to a Databricks-based lakehouse environment</li><li>Optimize data pipelines and processing workflows to improve performance, scalability, and overall efficiency</li><li>Implement data quality checks and validation logic to ensure data accuracy, consistency, and completeness</li><li>Create and maintain documentation including data mappings, data definitions, architectural diagrams, and data flow diagrams</li><li>Develop proof-of-concepts to evaluate and validate new technologies, tools, or data processes</li><li>Deploy, manage, and support code across non-production and production environments</li><li>Investigate, troubleshoot, and resolve data-related issues, including identifying root causes and implementing fixes</li><li>Identify performance bottlenecks and recommend optimization strategies, including database tuning and query performance improvements</li></ul>
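The pipeline pattern described above (extract from a relational source, land raw data, validate, then publish) is often organized as medallion layers. The sketch below uses sqlite3 as a stand-in relational source to show the raw-to-validated step; the `orders` table, its columns, and the completeness rule are hypothetical, and a real implementation would use Databricks Delta Live Tables expectations rather than plain Python.

```python
# Illustrative medallion-style flow (raw "bronze" layer -> validated
# "silver" layer) using sqlite3 as a stand-in relational source.
# Table name, columns, and the validation rule are assumed examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "shipped"), (2, None), (3, "pending")],
)

# Bronze: land the raw rows exactly as extracted.
bronze = conn.execute("SELECT id, status FROM orders").fetchall()

# Silver: apply a completeness check and keep only valid records.
silver = [(oid, status) for oid, status in bronze if status is not None]
```

Keeping the unfiltered bronze copy alongside the validated silver output makes root-cause investigation (another responsibility listed above) much easier, since rejected records remain inspectable.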
We are looking for a motivated Data Analyst to join our team in Pittsburgh, Pennsylvania. This contract-to-permanent position offers an exciting opportunity to work closely with finance teams, gaining hands-on experience with data systems and processes. The ideal candidate will be eager to learn and contribute to maintaining and improving our data operations.<br><br>Responsibilities:<br>• Collaborate with finance teams to understand and address data-related queries and issues.<br>• Learn and master the organization's data warehouse systems to provide effective support.<br>• Perform regular maintenance routines to ensure data systems are functioning optimally.<br>• Automate existing processes to enhance efficiency and accuracy.<br>• Gain familiarity with Oracle's EPM platform and its application in financial data analysis.<br>• Develop an understanding of hierarchy and dimensions within data structures.<br>• Participate in training sessions to expand knowledge of Oracle platforms and related tools.<br>• Implement solutions to streamline data workflows and operations.<br>• Work onsite for a majority of the workweek, adhering to the team's schedule.
We are looking for a skilled Data Reporting Analyst to join our team in Philadelphia, Pennsylvania. This long-term contract position offers an exciting opportunity to support healthcare operations by transforming data into actionable insights. As part of this role, you will leverage your expertise in business intelligence and reporting tools to contribute to the success of our organization.<br><br>Responsibilities:<br>• Develop and maintain data reports using tools such as BusinessObjects and Microsoft SQL to meet business needs.<br>• Analyze complex datasets to uncover trends and provide actionable insights to stakeholders.<br>• Collaborate with teams across the organization to understand reporting requirements and deliver accurate solutions.<br>• Utilize Python and other programming languages to automate data processes and enhance reporting capabilities.<br>• Work with Erwin and other data modeling tools to design and optimize data structures.<br>• Ensure the integrity and accuracy of data used for reporting and analysis.<br>• Support Epic Hospital Billing systems by generating reports and resolving data-related issues.<br>• Implement business intelligence strategies to improve decision-making processes.<br>• Troubleshoot and resolve issues related to reporting tools and data systems.<br>• Document reporting processes and maintain up-to-date technical specifications.