    Sr. Data Engineer

    We are seeking a Senior Data Engineer to join our team in the higher education industry, based in Philadelphia, Pennsylvania. The role involves enhancing data engineering practices within a collaborative setting, automating data quality and validation, and working closely with business users and stakeholders to define and implement data solutions that align with strategic business priorities.

    Responsibilities:
    • Enhancing data engineering practices within a team setting
    • Designing, developing, and deploying data ingestion processes and data products
    • Automating data quality and validation for efficient data management (a minimal sketch follows the requirements list below)
    • Conducting technical peer reviews and contributing to architectural improvements
    • Working closely with business users and stakeholders to define and implement data solutions that align with strategic business priorities
    • Implementing secure data pipelines and data products
    • Troubleshooting and optimizing data processes and queries
    • Ensuring robust data governance, security, and data privacy practices
    • Translating business requirements into precise technical specifications
    • Participating in project planning, identifying milestones and resource needs
    • Collaborating with others as part of a cross-functional team
    • Evaluating business needs and priorities with the team
    • Ensuring the operation and support of production systems

    Requirements:
    • Proficiency in Apache Kafka, Apache Pig, and Apache Spark is essential.
    • Demonstrable experience with Cloud Technologies, including but not limited to AWS Technologies.
    • Expertise in Data Visualization techniques and tools is required.
    • Proven track record of Algorithm Implementation in previous roles.
    • Strong background in Analytics, preferably within the higher education sector.
    • Extensive experience with Apache Hadoop is a must.
    • Ability to develop robust and secure APIs.
    • Must be comfortable working with AWS Technologies.
    • Experience working in the higher education industry is desirable but not mandatory.
    • Candidates must have excellent problem-solving abilities and a strong team player mindset.
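
To make the data quality and validation responsibility above concrete, here is a minimal, illustrative sketch in plain Python. The file name, column names, and rules are assumptions for demonstration only, not details of this role's actual stack:

```python
# Minimal data-quality check sketch in plain Python (csv module only).
# File name, expected columns, and validation rules are illustrative assumptions.
import csv

EXPECTED_COLUMNS = {"student_id", "course_id", "enrollment_date"}

def validate(path: str) -> list[str]:
    """Return a list of data-quality problems found in a CSV extract."""
    problems = []
    seen = set()
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = EXPECTED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"missing columns: {sorted(missing)}"]
        for line_no, row in enumerate(reader, start=2):
            if not row["student_id"]:
                problems.append(f"row {line_no}: empty student_id")
            key = (row["student_id"], row["course_id"])
            if key in seen:
                problems.append(f"row {line_no}: duplicate enrollment {key}")
            seen.add(key)
    return problems

if __name__ == "__main__":
    for issue in validate("enrollments.csv"):  # hypothetical extract file
        print(issue)
```

A check like this can be scheduled ahead of each load so that bad extracts are flagged before they reach downstream data products.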

    Technology Doesn't Change the World, People Do.®

    Robert Half is the world's first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.

    Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. Download the Robert Half app and get 1-tap apply, notifications of AI-matched jobs, and much more.

    All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit roberthalf.gobenefits.net for more information.

    © 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking "Apply Now," you're agreeing to Robert Half's Terms of Use.

    422 results for Data Engineer in the USA

Data Engineer

We are seeking a skilled Data Engineer to join a dynamic team in Oklahoma City. The ideal candidate will have hands-on experience working with Snowflake, Python, and SQL Server to design, implement, and maintain a data warehouse. The candidate must be passionate about transforming raw data into actionable insights and possess a strong ability to solve complex data challenges.

Responsibilities:
• Design, develop, and maintain scalable data pipelines to support data integration and real-time processing.
• Implement and manage data warehouse solutions, with a strong focus on Snowflake architecture and optimization.
• Write efficient and effective scripts and tools using Python to automate workflows and enhance data processing capabilities.
• Work with SQL Server to design, query, and optimize relational databases in support of analytics and reporting needs.
• Monitor and troubleshoot data pipelines, resolving any performance or reliability issues.
• Ensure data quality, governance, and integrity by implementing and enforcing best practices.

Data Engineer

We are looking for a skilled Data Engineer to join our team in the Energy/Natural Resources industry, located in The Woodlands, Texas. As a Data Engineer, you will be tasked with developing, maintaining, and testing data architecture, as well as implementing algorithms and utilizing various technologies such as Apache Kafka, Apache Pig, and Apache Spark.

Responsibilities:
• Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
• Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
• Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
• Write complex queries on large data sets and optimize their performance.
• Develop and maintain data architecture, data management standards, and conventions.
• Utilize Apache Kafka, Apache Pig, and Apache Spark to process and analyze large data sets (a minimal Kafka sketch follows this listing).
• Implement algorithms and utilize Cloud Technologies for data visualization.
• Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
• Work with data and analytics experts to strive for greater functionality in our data systems.
• Utilize AWS Technologies and Azure Active Directory for managing and governing the data lake and other architectural components.
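
For the Kafka-based processing duties named above, here is a minimal, illustrative producer sketch using the kafka-python client (one common choice; the listing does not name a specific library). The broker address, topic name, and record fields are placeholders:

```python
# Minimal, illustrative Kafka producer (kafka-python client).
# Broker address, topic name, and record fields are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a few sample sensor readings to a hypothetical topic.
for reading in [
    {"well_id": "W-001", "pressure_psi": 1820.5},
    {"well_id": "W-002", "pressure_psi": 1775.0},
]:
    producer.send("sensor-readings", value=reading)

producer.flush()  # block until all buffered records are delivered
```

Downstream, a Spark job would typically consume the same topic to process and analyze the stream at scale.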
Data Engineer

Robert Half is hiring! We are offering an exciting opportunity for a Data Engineer in the Manufacturing industry. This role will primarily involve designing and maintaining an efficient data architecture, data integration solutions, and data models, as well as managing databases and establishing data governance policies.

Responsibilities:
• Develop a robust and secure data architecture using Azure services, SQL Server, Dynamics 365, and Power Platform
• Design and implement data integration solutions using Azure Data Factory, Azure Synapse Link, Logic Apps, and Power Automate to ensure seamless data flow across multiple systems
• Create and maintain conceptual, logical, and physical data models in line with the business requirements and overall data strategy
• Provide recommendations for database design, performance tuning, and optimization for SQL Server and Azure SQL Database environments
• Establish and enforce data governance policies to ensure data quality, consistency, and security across all data systems
• Continuously monitor and optimize data systems and data pipelines for performance, scalability, and cost efficiency
• Collaborate with cross-functional teams to understand data requirements and deliver effective solutions
• Create detailed documentation for data architecture, including design specifications, data dictionaries, and operational procedures
• Stay updated with industry trends and best practices in data architecture and recommend improvements to enhance data management capabilities
• Troubleshoot and resolve data-related issues, ensuring data availability and reliability

Data Engineer

We are looking for a skilled Data Engineer to join our team in Warren, New Jersey. As a Data Engineer, you will be responsible for designing and implementing data ETL frameworks for a secure Data Lake, maintaining the optimal pipeline architecture, and examining complex data to enhance its efficiency and quality. You will also be required to work hands-on with various AWS services and write advanced SQL scripts.

Responsibilities:
• Design and implement ETL frameworks for a secure Data Lake and maintain the optimal pipeline architecture
• Examine complex data sets to optimize the efficiency and quality of the data and resolve any data quality issues
• Collaborate with database developers to improve system and database designs
• Build data applications using AWS Glue, Lake Formation, Athena, AWS Batch, AWS Lambda, Python, Linux shell, and batch scripting
• Use AWS database services such as Redshift, RDS, DynamoDB, Aurora, etc.
• Write advanced SQL scripts involving self-joins, window functions, correlated subqueries, CTEs, etc. (a minimal sketch follows this listing)
• Utilize data management fundamentals, including concepts such as data dictionaries, data models, validation, and reporting
• Leverage skills in Apache Kafka, Apache Pig, Apache Spark, Cloud Technologies, Data Visualization, Algorithm Implementation, Analytics, Apache Hadoop, API Development, and Amazon Web Services (AWS)
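
To illustrate the kind of advanced SQL named above (a CTE plus a window function), here is a small self-contained sketch using Python's built-in sqlite3 module; the table and column names are invented for the example, and it assumes a SQLite build with window-function support (standard in recent Python releases):

```python
# Self-contained example of a CTE plus a window function, using sqlite3.
# Table and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme',   '2025-01-05', 120.0),
        ('acme',   '2025-02-10',  80.0),
        ('globex', '2025-01-20', 200.0);
""")

query = """
WITH monthly AS (                    -- CTE: aggregate to customer/month
    SELECT customer,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS total
    FROM orders
    GROUP BY customer, month
)
SELECT customer,
       month,
       total,
       SUM(total) OVER (             -- window function: running total
           PARTITION BY customer ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY customer, month;
"""

for row in conn.execute(query):
    print(row)
```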
Data Engineer

We are on the lookout for a Data Engineer in Basking Ridge, New Jersey (1-2 days a week on-site). In this role, you will be required to develop and maintain business intelligence and analytics solutions, integrating complex data sources for decision support systems. You will also be expected to have a hands-on approach towards application development, particularly with the Microsoft Azure suite.

Responsibilities:
• Develop and maintain advanced analytics solutions using tools such as Apache Kafka, Apache Pig, Apache Spark, and AWS Technologies.
• Work extensively with the Microsoft Azure suite for application development.
• Implement algorithms and develop APIs.
• Handle integration of complex data sources for decision support systems in the enterprise data warehouse.
• Utilize Cloud Technologies and Data Visualization tools to enhance business intelligence.
• Work with various types of data including Clinical Trials Data, Genomics and Biomarker Data, Real World Data, and Discovery Data.
• Maintain familiarity with key industry best practices in a regulated "GxP" environment.
• Work with commercial pharmaceutical/business information, Supply Chain, Finance, and HR data.
• Leverage Apache Hadoop for handling large datasets.

Cybersecurity Systems & Data Engineer

We are inviting applications for the position of Cybersecurity Systems & Data Engineer on our team located in West Conshohocken, Pennsylvania. You will play a pivotal role in maintaining and implementing data architecture in accordance with industry security standards and best practices. This role offers an exciting opportunity to contribute to our operations and work in a dynamic industry.

Responsibilities:
• Implement robust data encryption and access controls across critical data platforms including Azure Blob Storage, Databricks, SQL, and third-party data environments.
• Analyze vendor services and data requirements, working both independently and in coordination with key vendors.
• Assist in developing secure capabilities for data delivery and management between platforms, focusing on Data Loss Prevention methods and protections for data in transit, at rest, and under transformation.
• Participate in incident response and troubleshoot complex issues.
• Identify opportunities to enhance network segmentation and protection strategies.
• Perform complex data analysis and suggest new network flows and architectures.
• Support the development of reporting and communication methods for the Director and client/venue IT personnel.
• Stay updated on trends and development opportunities within security regulatory, technology, and operational requirements.
• Implement platform and service configuration changes to meet information security requirements.
• Provide Tier III capabilities as needed to support Operations and GRC teams.

Sr. Data Engineer

This is a fully onsite position.

Are you ready to take your data engineering career to the next level? Do you excel in designing, developing, and implementing scalable analytics solutions on cloud platforms? If so, we are looking for a talented Senior Data Engineer to join our growing team. In this role, you will:
• Collaborate with source data application teams and product owners to design, implement, and support analytics solutions, providing actionable insights for better decision-making.
• Build, test, and deploy data migration and data engineering solutions utilizing Microsoft Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, and Azure Databricks, among others.
• Take ownership of all facets of the development lifecycle, including cloud infrastructure, network security, data ingestion and preparation, data modeling, CI/CD pipeline setup, performance tuning, deployments, and production support.
• Drive the implementation of batch and streaming data pipelines leveraging state-of-the-art Azure technologies.
• Act as a technical leader, working effectively both within team environments and independently.
• Be part of a DevOps team, owning and maintaining the systems and analytics products you help build.
• Solve complex problems and design solutions to optimize data flow and analytics processes.
• Provide support and coordination across geographically dispersed teams while working within an Agile development framework.

Sr. Data Engineer

We are in search of a Sr. Data Engineer to become a part of our team. This opportunity is based in Warminster, Pennsylvania. The role involves the design of data architecture, building data pipelines, and ensuring system security within our industry.

Responsibilities:
• Develop storage solutions by selecting relevant technologies and writing code.
• Enable data flow by connecting systems, databases, and data warehouses.
• Create rules for cleaning, standardizing, and transforming data.
• Implement security controls and access management policies to ensure system security.
• Use monitoring tools to oversee system performance, troubleshoot issues, and optimize performance.
• Collaborate with data scientists, analysts, and business teams to understand needs and deliver solutions.
• Share knowledge by creating technical designs, workflows, and documentation.

Data Engineer

We are offering a long-term contract employment opportunity for a Data Engineer in the Financial Services industry, based in New York. In this role, you will be focused on delivering machine learning solutions, working with different technologies and frameworks, and implementing analytics.

Responsibilities:
• Utilize your proficiency in Python and related AI libraries to deliver solutions
• Implement and maintain LLM frameworks and related technologies
• Work with RAG solutions, including vector embedding (a minimal retrieval sketch follows this listing)
• Use your hands-on experience to deliver LLM solutions to production
• Develop and maintain Apache Kafka, Apache Pig, and Apache Spark systems
• Work with Cloud Technologies and AWS Technologies to enhance our data infrastructure
• Implement algorithms and analytics to extract insights from data
• Develop APIs to facilitate data access and use
• Use Apache Hadoop for large-scale data processing
• Create data visualizations to communicate findings and insights
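
For the retrieval step of the RAG work mentioned above, here is a minimal, schematic sketch: it assumes document and query embeddings have already been produced by some embedding model (the vectors below are toy values) and simply ranks documents by cosine similarity with NumPy. A production system would use a vector store rather than in-memory arrays:

```python
# Schematic retrieval step for a RAG pipeline: rank documents by cosine
# similarity. Embeddings are toy values; a real system would compute them
# with an embedding model and keep them in a vector database.
import numpy as np

documents = ["Q3 risk report", "Trading desk runbook", "Onboarding checklist"]
doc_embeddings = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.8, 0.1],
    [0.1, 0.2, 0.9],
])
query_embedding = np.array([0.85, 0.15, 0.05])

def cosine_similarity(matrix: np.ndarray, vector: np.ndarray) -> np.ndarray:
    """Cosine similarity between each row of `matrix` and `vector`."""
    norms = np.linalg.norm(matrix, axis=1) * np.linalg.norm(vector)
    return matrix @ vector / norms

scores = cosine_similarity(doc_embeddings, query_embedding)
best = int(np.argmax(scores))
print(f"Most relevant context: {documents[best]} (score={scores[best]:.3f})")
```

The retrieved text would then be passed to the LLM as grounding context for its answer.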
Data Engineer - Informatica

We are seeking a skilled Data Analyst to join our team. The ideal candidate will have a strong background in building and executing SQL code, as well as experience with ETL (Extract, Transform, Load) processes. This role will involve working closely with various departments to gather, analyze, and interpret data to support business decisions and strategies.

Key Duties and Responsibilities:
• Develop, execute, and optimize SQL queries to extract and analyze data from various databases.
• Design, implement, and maintain Informatica ETL processes to ensure data is accurately and efficiently loaded into data warehouses.
• Collaborate with stakeholders to understand data requirements and provide actionable insights.
• Create and maintain data, reports, dashboards, and visualizations to support business operations.
• Ensure data integrity and accuracy by performing regular data quality checks.
• Identify trends, patterns, and anomalies in data to support business decision-making.
• Document data processes, methodologies, and findings for future reference.
• Collaborate closely with cross-functional teams to gather requirements and deliver reporting solutions that meet business needs.
• Provide guidance on best practices and ensure compliance with company policies and industry standards.

Data Engineer

We are looking for a highly motivated and experienced Implementation/Technical Lead to oversee the end-to-end deployment and operationalization of our Privates data platform. The ideal candidate will have a strong background in data engineering and platform architecture, sound technical knowledge of full-stack technologies (Java/Python), and will work on diverse projects, from building APIs and web applications to data processing and automation around the data platform, while ensuring alignment with business and technical requirements. This role requires Java/Python programming and related frameworks, along with strong data engineering and problem-solving skills.

Key Responsibilities:
• Architect, develop, and deploy scalable and reliable data pipelines, storage solutions, and analytical tools on Azure and Snowflake.
• Ensure platform scalability, reliability, and security to meet current and future business needs.
• Implement and manage data lakes, warehouses, and ELT/ETL processes.
• Lead code reviews, enforce development best practices, and drive continuous improvement within the team.
• Integrate real-time and batch data processing capabilities.
• Implement and enforce data governance policies, security protocols, and regulatory requirements.
• Establish robust data quality, lineage, and cataloging processes.
• Develop and integrate APIs and backend systems using frameworks like Django, Flask, or FastAPI.
• Create scripts and tools for data analysis, transformation, and automation.
• Write and execute unit and integration tests and perform debugging to ensure software quality.
• Collaborate effectively with developers, product managers, business analysts, and other stakeholders.
• Translate business requirements into scalable technical solutions.
• Manage code deployments through CI/CD pipelines and utilize GitHub for version control.

Snowflake Senior Data Engineer

Role: Snowflake Data Engineer
Location: Remote
Duration: Through June, with potential for long-term extension

Interview Process:
• Two rounds:
  1. Conversational round focused on background and work experience.
  2. Technical interview (one hour) with the Lead Architect and Data Engineers.

Project Details:
• Reason for opening: Implementing a new order management system to manage data flow in and out of the data repository system. The new system will go live, and all data will flow into Snowflake. The role involves building the infrastructure, storing data in Snowflake, and creating data mappings for the new structure.

Day-to-Day Responsibilities:
• Heavy focus on Snowflake and PySpark transformations, data storage, and structuring (a minimal PySpark sketch follows this listing).
• Participate in architectural conversations and design for Snowflake.
• Testing, scrubbing, and cleaning data.

Required Skillsets:
• 10+ years of experience as a data engineer, with recent experience in Snowflake.
• Expertise in Snowflake, Python, and PySpark.
• Familiarity with Airflow and GitHub.
• Experience with version control, Medallion Architecture, and ETL pipelines.
• Experience building DAGs.
• Preferred: experience with Databricks, understanding jobs, and converting them to Snowflake.
• Strong data architecture knowledge.

Work Breakdown:
• 80% development
• 20% architecture
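
As a rough illustration of the PySpark transformation work this listing describes, here is a small, self-contained sketch that builds a DataFrame in memory, applies a basic cleaning filter, and aggregates it. Column names and values are invented; a real job would read from the order management system and write the result to Snowflake:

```python
# Minimal PySpark transformation: build a DataFrame in memory, clean it,
# and aggregate. Column names and values are invented for illustration;
# a real job would read from source systems and write the output to Snowflake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("order-rollup-example").getOrCreate()

orders = spark.createDataFrame(
    [
        ("O-1001", "retail",    250.0),
        ("O-1002", "retail",    125.5),
        ("O-1003", "wholesale", 980.0),
    ],
    ["order_id", "channel", "amount"],
)

channel_totals = (
    orders
    .filter(F.col("amount") > 0)  # basic data scrubbing
    .groupBy("channel")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

channel_totals.show()
spark.stop()
```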
CW - Senior Data Engineer (E)

Position Description:
The Senior Data Engineer is a senior technical role supporting the information management architecture of the Enterprise Data Warehouse solution. The role will be actively responsible for designing the data acquisition, data staging, loading, and transformation into the Enterprise Data Warehouse. This role will be a technical expert and resource, collaborating with the Data Architect, Software Engineers, Product Owners, and Project Team to develop and deliver data storage and movement solutions and to organize and oversee the loading of data into the related systems. Additionally, the Senior Data Engineer will bridge gaps related to Business Intelligence functions, supporting the analytics produced by the organization and providing expertise tying data movement together with data consumption.

Position Accountabilities:
• Design and develop complex ETL solutions using data warehouse design best practices
• Analyze data requirements and data models, and determine the best methods for extracting, transforming, and loading the data into the data staging warehouse and other system integration projects
• Create complex business intelligence reports and data visualizations using tools like Python, Tableau, and Power BI
• Analyze business requirements and outline solutions
• Validate code against business and architectural requirements
• Create and test prototypes
• Troubleshoot applications and resolve defects
• Work within an agile framework
• Plan, prioritize, and deliver resilient, scalable technical solutions
• Communicate ideas in both technical and user-friendly language
• Update and maintain product documentation
• Escalate issues and impediments in a timely manner
• Work within established frameworks and processes (Agile)
• Work collaboratively with Agile teams as well as independently
• Perform and coordinate unit and system integration testing when required
• Participate in peer programming, mobbing, hackathons, and code reviews as required
• Support and occasionally lead business intelligence, data analytics, and data governance/quality efforts

Position Qualifications

Education & Experience:
• A Bachelor's Degree or a combination of equivalent work experience
• 7 years of previous experience in information technology, preferably within financial services or another highly regulated industry
• 5 years of ETL development experience
• 3 years of Business Intelligence, Data Analytics, or Data Science experience
• 3 years of experience in an Agile environment
• Knowledge of ETL and data warehouse design
• Experience using Python for data movement/manipulation
• Extensive experience with data dictionaries, data analysis, and relational databases
• Experience with a business intelligence toolset

Preferred Qualifications:
• A Master's Degree in a technology area of study, preferably Computer Science, MIS, or Analytics

Knowledge & Skills:
• Creative problem solver with excellent communication, leadership, and collaboration ...

Data Engineer Product Owner

Job Description: Data Product Owner
Location: Glendale, CA (onsite 4 days per week)
Experience Level: 5+ years required
Education: Bachelor's Degree required

Position Overview:
The Technology group of a major entertainment studio is seeking a skilled and experienced Data Product Owner to join an internal data engineering team. This role will drive the development and management of datasets, reports, dashboards, and web applications that support marketing and production teams. The team focuses on empowering analysts, data scientists, and operations to make data-driven decisions by delivering innovative technology solutions for storytelling. This role involves coordinating multiple stakeholders, managing critical data products, and designing solutions that make data easily accessible and actionable. The position is customer-facing and requires onsite engagement with internal teams and stakeholders at least four days per week.

Key Responsibilities:
• Collaborate with full-stack, reporting, and data engineers in an agile team environment.
• Define, maintain, and communicate roadmaps internally and to key stakeholders.
• Coordinate the ingestion, aggregation, and materialization of datasets ranging in size from small to large scale for use by data analysts and scientists.
• Lead the design and development of data visualizations and dashboards, primarily leveraging tools like Tableau and Power BI.
• Audit and evaluate existing reports, dashboards, and Tableau usage to identify optimization opportunities and ensure alignment with best practices.
• Act as a bridge between business and engineering teams to foster collaboration and ensure successful project implementation.
• Gather and distill business requirements into precise product roadmaps and user stories.
• Design data schemas and create fabricated test datasets to aid development efforts.

Software Engineer

Seeking a proactive Software Engineer who finds ways to improve efficiency, optimize workflows, and enhance productivity through innovative solutions. You will play a key role in designing, developing, and maintaining scalable and efficient software solutions while ensuring seamless integration with cloud and data platforms.

Key Responsibilities:
• Develop, test, and maintain software applications using Python and other relevant technologies.
• Build and optimize APIs to enable seamless communication between services and external systems (a minimal API sketch follows this listing).
• Work with AWS services to design, deploy, and manage scalable cloud-based applications.
• Implement and manage Snowflake data solutions to support data engineering and analytics initiatives.
• Design and maintain CI/CD pipelines to streamline the deployment process and improve system reliability.
• Automate repetitive tasks and improve system efficiency using scripting and DevOps best practices.
• Ensure software performance, scalability, and security best practices are followed.
• Troubleshoot and resolve technical issues in Linux/Unix environments.
• Collaborate with cross-functional teams, including DevOps, data engineers, and product managers, to deliver high-quality solutions.
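
As a minimal illustration of the API-building work described in this listing (and the Sr. Software Engineer listing that follows), here is a small FastAPI service sketch in Python. The endpoint paths and payload fields are invented for the example, and the in-memory list stands in for whatever storage a real service would use:

```python
# Minimal FastAPI service: one health-check endpoint and one data endpoint.
# Endpoint paths and payload fields are invented for illustration.
# Run with: uvicorn example_api:app --reload
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-data-service")

class Metric(BaseModel):
    name: str
    value: float

# In-memory store standing in for a real database or warehouse table.
metrics: list[Metric] = []

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

@app.post("/metrics")
def add_metric(metric: Metric) -> dict:
    metrics.append(metric)
    return {"stored": metric.name, "count": len(metrics)}
```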
Sr. Software Engineer

Seeking a Sr. Software Engineer who finds ways to improve efficiency, optimize workflows, and enhance productivity through innovative solutions. You will play a key role in designing, developing, and maintaining scalable and efficient software solutions while ensuring seamless integration with cloud and data platforms.

Key Responsibilities:
• Develop, test, and maintain software applications using Python and other relevant technologies.
• Build and optimize APIs to enable seamless communication between services and external systems.
• Work with AWS services to design, deploy, and manage scalable cloud-based applications.
• Implement and manage Snowflake data solutions to support data engineering and analytics initiatives.
• Design and maintain CI/CD pipelines to streamline the deployment process and improve system reliability.
• Automate repetitive tasks and improve system efficiency using scripting and DevOps best practices.
• Ensure software performance, scalability, and security best practices are followed.
• Troubleshoot and resolve technical issues in Linux/Unix environments.
• Collaborate with cross-functional teams, including DevOps, data engineers, and product managers, to deliver high-quality solutions.

Software Application Engineer

We are offering an exciting opportunity for a Software Application Engineer who can sit in either NYC or the Greater Boston area. In this role, you will be working in a life sciences consulting firm, working closely with a team of data analysts, data engineers, software engineers, and product experts. You will be part of a team developing and maintaining software applications, contributing to the AI Software Development team. This role is hybrid, onsite 3 days a week.

Responsibilities:
• Develop and maintain software applications using JavaScript/TypeScript, React, and Python
• Work with SQL, ideally PostgreSQL, for database management and operations
• Contribute to the AI team by participating in GenAI/LLM development
• Collaborate closely with a diverse team including data analysts, data engineers, and product experts
• Interface with clients, showcasing your excellent interpersonal skills
• Utilize your skills in backend development, bug tracking, and API development
• Participate in the development of full-stack software using Node.js and React.js
• Show your entrepreneurial spirit by participating in the creation of new solutions and systems from scratch
• Ensure all solutions are properly tested and debugged before deployment

Data Analytics Engineer

We are on the lookout for an experienced Data Analytics Engineer to join our team based in Mesquite, Texas. In this role, you will be pivotal in transforming intricate employee data into digestible insights to aid strategic HR decision-making, ultimately driving our organization's success.

Responsibilities:
• Accurately gather, clean, and integrate large-scale employee data from various sources like HRIS, surveys, and payroll using data management tools.
• Apply techniques such as regression analysis, predictive modeling, and machine learning to identify workforce trends including attrition, performance, and training needs (a minimal modeling sketch follows this listing).
• Interpret data findings to support various HR functions such as talent acquisition, compensation, workforce planning, learning and development, and retention strategies.
• Establish key performance indicators (KPIs) to evaluate HR initiatives and measure organizational progress.
• Collaborate with HR and IT departments to develop a comprehensive data warehouse and ensure analytics are aligned with business objectives.
• Prepare reports and present insights to leadership, serving as a subject matter expert in people analytics.
• Identify data integrity issues and recommend corrective actions to support HR process optimization.
• Stay updated on best practices in HR analytics and contribute to continuous improvement initiatives.
• Utilize data visualization tools like Tableau, Power BI, matplotlib, ggplot, and d3.js, and programming languages like Python, R, SQL, and DAX.
• Work independently and with cross-functional teams to enhance data-driven decision-making.
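
To make the predictive-modeling bullet above concrete, here is a small, illustrative attrition model using scikit-learn on synthetic data. The features, labels, and the rule that generates them are invented; no real employee data is involved, and a real project would pull features from HRIS and payroll sources:

```python
# Illustrative attrition model on synthetic data using scikit-learn.
# Features and labels are invented; a real pipeline would use HRIS data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
tenure_years = rng.uniform(0, 15, n)
overtime_hours = rng.uniform(0, 20, n)
# Synthetic rule: short tenure plus heavy overtime raises attrition probability.
p_leave = 1 / (1 + np.exp(0.4 * tenure_years - 0.2 * overtime_hours))
attrition = (rng.random(n) < p_leave).astype(int)

X = np.column_stack([tenure_years, overtime_hours])
X_train, X_test, y_train, y_test = train_test_split(X, attrition, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
print("Predicted attrition probability (0.5 yrs tenure, 18 hrs overtime):",
      round(model.predict_proba([[0.5, 18.0]])[0, 1], 2))
```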
Salesforce Data Integration Engineer

Are you a Data Integration Engineer with expertise in integration between multiple layers for real-time live data? Do you have experience integrating data between Salesforce (SOQL), MuleSoft (or similar), Kafka (or similar), and an ODS (SQL)? We have an excellent opportunity with a highly rated organization based in Des Moines, IA. This is an opportunity to work remotely anywhere in the US (CST work schedule). Applicants must be eligible to convert without sponsorship.

What would you be doing in this Data Integration Engineer role?
The Data Integration Engineer will play a crucial role in designing, implementing, and managing data integration solutions to ensure seamless data flow between systems and applications using SSIS, Kafka, MuleSoft, and Salesforce. The candidate should have strong knowledge of, and the capability to ensure, various data quality checks in data and integration platforms using SQL, SOQL, and/or other database query languages. Familiarity with data integration processes and tools is required.

• Data Mapping and Transformation: Create and maintain data mapping documentation and implement rules to standardize and transform data during integration processes, outlining how data flows between systems (a minimal mapping sketch follows this listing).
• Data Integration: Connect and integrate various data sources such as databases, APIs, cloud services, and third-party tools. Document issues related to data integration processes and APIs to report back to developers.
• Data Integrity: Ensure data integrity and accuracy during the integration process by implementing validation checks.
• Workflow Monitoring: Monitor integration workflows to ensure they run smoothly and are on schedule, adapting to changing requirements as needed.
• Collaboration and Deployment: Work closely with data engineers, business analysts, and stakeholders to understand data integration needs and deploy and test integration solutions.
• Documentation & Compliance: Document data integration processes, workflows, and system configurations, and ensure data integration processes comply with relevant data governance policies and regulations.
• Secure Data Transfer: Ensure secure data transfer mechanisms are in place to protect sensitive information.
• Efficiency Improvement: Identify opportunities to improve the efficiency of data integration processes.
• Issue Diagnosis and Resolution: Diagnose and resolve data integration issues promptly to minimize downtime.
• Support and Recommendations: Provide support and recommendations for complex integration challenges.
• API Proficiency: Experience with APIs and data exchange protocols.
• Communication Skills: Strong communication skills with technical and non-technical stakeholders.
• Impact Analysis: Analyze and review potential adjustments or modifications for impacts on other programs.
• Technical Proficiency: Strong skills in SQL Server, Kafka, MuleSoft, and Salesforce.
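
As a minimal illustration of the data-mapping and validation duties above, here is a plain-Python sketch that maps source field names onto a target schema and flags records missing required values. The field names and sample record are invented for the example:

```python
# Minimal field-mapping and validation sketch for an integration pipeline.
# Source/target field names and the sample record are invented.
FIELD_MAP = {
    # source field        -> target field
    "Account_Name__c":   "account_name",
    "Billing_State__c":  "billing_state",
    "Annual_Revenue__c": "annual_revenue",
}
REQUIRED = {"account_name", "billing_state"}

def transform(record: dict) -> tuple[dict, list[str]]:
    """Map a source record onto the target schema and list validation errors."""
    mapped = {target: record.get(source) for source, target in FIELD_MAP.items()}
    errors = [f"missing required field: {f}" for f in REQUIRED if not mapped.get(f)]
    return mapped, errors

row, problems = transform({"Account_Name__c": "Acme Corp", "Annual_Revenue__c": 1200000})
print(row)
print(problems)  # -> ['missing required field: billing_state']
```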
Senior Manager, Data Platform and Tools

We are seeking an experienced, detail-oriented professional to lead and manage a suite of data tools and platforms, ensuring they effectively support business intelligence and data analytics initiatives. This role requires strong leadership, communication, and strategic planning abilities to drive adoption, foster a knowledge-sharing culture, and optimize platform performance. The ideal candidate will be customer-facing, capable of articulating business value, and skilled at guiding teams through change while maintaining a collaborative environment.

Key Responsibilities:
• Act as the primary liaison for data tools and platforms, effectively communicating their business value to stakeholders.
• Lead and develop a team of engineers, fostering a strong culture of growth, mentorship, and technical excellence.
• Establish and oversee a Community of Expertise (COE) for data cataloging, data quality, and database tools.
• Manage vendor relationships, including license renewals and contract negotiations.
• Develop and implement a strategic roadmap for enterprise-wide adoption of data tools such as Snowflake, Power BI, and graph databases.
• Design training programs to enhance the organization's ability to leverage BI and data management tools effectively.
• Guide the execution of BI and data engineering initiatives, ensuring alignment with business needs.
• Provide structure and leadership to transform "doers" into effective leaders within the team.
• Overcome organizational challenges by fostering resilience, accountability, and a results-driven approach.

Data Integration Engineer

Key Responsibilities:
• Develop, maintain, and enhance business applications by providing expert technical and business knowledge.
• Identify, analyze, and resolve complex business problems in multifunctional projects.
• Assess the impact of technological changes and communicate their effects on business operations.
• Apply advanced knowledge of technical concepts to design, implement, and deliver innovative solutions that align with organizational objectives.
• Design integration solutions across cloud-native and microservice architectures, ensuring seamless data flow and system compatibility.
• Lead or participate in discussions with cross-functional teams to build consensus on project or technical solutions.
• Recommend and implement best practices, refined processes, and new methodologies to enhance efficiency and effectiveness.
• Ensure timely product delivery through proper coordination and execution of advanced systems development and integration tasks.

Technical Writer

We are in search of a Technical Writer to join our team in Glendale, California. As a Technical Writer, you will work closely with data engineers, software developers, and data product teams to develop and maintain high-quality documentation that supports the success of our data platform. This contract employment opportunity requires you to create both engineering-focused and user-centered materials, ensuring the usability and adoption of our data platform. This will be an onsite role, available only on a W2 basis.

Responsibilities:
• Collaborate with data engineers, software developers, and other team members to understand project requirements and develop comprehensive documentation.
• Translate complex technical concepts into clear, concise, and engaging content that caters to both technical and non-technical audiences.
• Develop and maintain information architecture templates that support easy updates and data migration while adhering to organizational and legal standards.
• Conduct research and create various types of documentation such as procedure manuals, API developer guides, configuration guides, and in-platform instructions.
• Evaluate, organize, and update existing documentation repositories, working with development leads to establish content management and storage best practices.
• Create alternate formats for documentation, such as video demos and guided tours, to enhance user engagement and understanding.
• Assess the quality and effectiveness of current content, proposing and implementing innovative methods for continuous improvement.
• Gain a deep understanding of data products, services, and technical infrastructure to create accessible, user-friendly content.

Cloud Engineer

We are offering a contract-to-permanent employment opportunity for a Cloud Engineer in the Financial Services industry. The position is based in Plano, Texas, and involves working with Microsoft Azure technologies to build and optimize data pipelines, integrations, and analytics solutions.

Responsibilities:
• Collaborate with data analysts, engineers, and business stakeholders to understand requirements and deliver effective data solutions
• Build and maintain Power BI dashboards and reports to provide business insights
• Develop serverless data processing solutions utilizing your skills in C# and Azure Functions
• Design, develop, and maintain data pipelines using Azure Data Factory (ADF) and Azure Synapse Analytics
• Utilize Azure Synapse to optimize SQL queries and data storage for performance and scalability
• Manage Logic Apps for workflow automation and system integrations
• Implement CI/CD pipelines with GitHub and Octopus Deploy to automate deployment processes
• Uphold data security, governance, and compliance within the Azure ecosystem
• Monitor customer accounts and take appropriate action

Application Support Engineer

We are seeking a Power Generation Application Support Engineer who is passionate about technology and has experience in the integrated energy and commodity trading industry. The role is based in Houston, Texas, and involves supporting power generation management applications. As an Application Support Engineer, you will have the opportunity to work with various systems that are crucial for the real-time operations of our power generation assets.

Responsibilities:
• Manage and maintain SCADA configurations, including RTU/FEP/ICCP config, setpoints, and analog/digital control points
• Set up GMS calculations, AGC control logic, and calculations
• Proactively monitor and fix issues related to connectivity between RTU, GMS, and ICCP links
• Investigate and resolve data discrepancies and telemetry issues impacting dispatch
• Implement market (ISO) changes and build/modify advanced displays
• Manage user access, roles, and permissions, ensuring the system is NERC compliant
• Maintain system and process documentation
• Communicate with stakeholders/vendors on incident progress, impending changes, or agreed outages
• Log and prioritize support tickets in the ticketing system
• Set up and maintain external data interfaces, and create and maintain one-line diagrams