Data Engineer<p>We are offering a short-term contract employment opportunity for a Data Engineer in Chicago, Illinois. As a Data Engineer, you will be instrumental in developing new data pipelines, enhancing current data architecture, and troubleshooting any issues that may arise. Your role will also involve implementing Azure data architectures and environments and collaborating within a project team to solve complex problems.</p><p><br></p><p>Responsibilities:</p><p>• Develop and architect new data pipelines to handle large amounts of data</p><p>• Enhance and optimize current data pipelines and data architecture to meet emerging data and analytic needs</p><p>• Troubleshoot and resolve issues and bottlenecks within the current data architecture</p><p>• Implement and design Azure data architectures and environments, optimizing applications for Azure when necessary</p><p>• Collaborate within a project team environment to solve complex problems, contributing to project success</p><p>• Utilize Azure services for data migration and data processing, including Serverless Architecture, Azure Storage, Azure SQL DB/DW, Data Factory, and more</p><p>• Maintain familiarity with the technology stack available for data management, data ingestion, capture, processing, and curation</p><p>• Work with cloud migration methodologies and processes, including tools like Azure Data Factory, Event Hub, etc.</p><p>• Display a strong knowledge of private and public cloud architectures, their pros/cons, and migration considerations.</p><p>• Contribute to a team-oriented environment, fostering creativity and problem-solving skills.</p>Data EngineerWe are seeking a Data Engineer to join our team based in Cleveland, Ohio. This role primarily involves working with Azure technologies including Azure Data Factory, Azure Data Lake, and Azure Synapse Analytics. The Data Engineer will be responsible for managing and optimizing data pipelines and developing techniques for big data handling and analysis.<br><br>Responsibilities:<br>• Design, construct, install, test and maintain data management systems.<br>• Ensure systems meet business requirements and industry practices.<br>• Build high-performance algorithms, prototypes, and conceptual models.<br>• Identify ways to improve data reliability, efficiency, and quality.<br>• Collaborate with data architects, modelers, and IT team members on project goals.<br>• Use Azure Data Factory for the orchestration and ETL/ELT data movement.<br>• Utilize Azure Data Lake for storing and analyzing data.<br>• Implement Azure Synapse Analytics to blend big data and relational data.<br>• Translate business needs into data solutions.<br>• Conduct testing and troubleshooting of data systems.Data EngineerWe are seeking a Data Engineer to join our team. As a Data Engineer, you will be responsible for designing and implementing data engineering solutions, leveraging tools like Snowflake, DBT, and Fivetran. This role offers a long term contract employment opportunity. 
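<br><br>For a concrete flavor of this stack (purely illustrative and not a requirement of the role), here is a minimal sketch of how a dbt build against Snowflake might be scheduled with Airflow, one of the orchestrators named in the responsibilities below. The DAG id, schedule, and project path are hypothetical placeholders, and the sketch assumes Airflow 2.4 or later.<br>
<pre>
# Illustrative sketch only: assumes Airflow 2.4+ and a dbt project at a hypothetical path.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the warehouse models, then run the project's tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",   # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
</pre>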
<br><br>Responsibilities:<br>• Design, develop, and maintain data engineering solutions using Snowflake<br>• Implement ETL processes using DBT and other tools<br>• Utilize Fivetran for data ingestion tasks<br>• Ensure the efficient scheduling of data warehouse tasks with tools like Airflow or Dagster<br>• Develop and maintain APIs for data interaction<br>• Implement algorithms and analytics for data processing<br>• Leverage Apache Kafka, Apache Pig, and Apache Spark in data processing tasks<br>• Use AWS technologies for cloud-based data engineering solutions<br>• Implement data visualization techniques for data presentation<br>• Mentor entry level team members on data warehouse concepts and practices<br>• Ensure clear and effective communication with team members and stakeholders.Data EngineerWe are offering a long term contract employment opportunity for a proficient Data Engineer in Pittsburgh, Pennsylvania. The selected candidate will be primarily tasked with converting data from legacy Microsoft tools to modern AWS services, employing their expertise within the federal government contractor industry.<br><br>Responsibilities:<br><br>• Execute data migration from legacy Microsoft tools to AWS services with a focus on maintaining data integrity and accuracy.<br>• Utilize data visualization tools like Tableau or Power BI to process, analyze, and present data during migration workflows.<br>• Build and manage comprehensive ETL pipelines using AWS Glue, Athena, and related AWS tools.<br>• Employ AWS Step Functions to automate workflows and synchronize distributed application services.<br>• Establish secure data storage, processing, and access workflows in AWS S3 and RDS.<br>• Leverage AWS Kinesis for real-time data streaming in scalable data migration strategies.<br>• Collaborate with cross-functional teams to ensure all data migrations adhere to federal compliance standards and organizational requirements.<br>• Provide technical expertise on federal regulatory constraints, security policies, and operational best practices unique to the AWS.gov environment.<br>• Troubleshoot and resolve migration-related issues while mitigating risks during the transition process.<br>• Document processes, technical configurations, and migration steps effectively for audits and team reference.Data EngineerWe are offering a permanent employment opportunity for a Data Engineer in the insurance industry, based in Branchville, New Jersey. As a Data Engineer, your role will encompass the development and support of data applications, working in close collaboration with business analysts, application and enterprise architects, and providing technical guidance to the team for implementing complex data solutions.<br><br>Responsibilities:<br>• Collaborate with business analysts to understand data and business processes, making recommendations on best practices or long-term solutions to resolve current issues and for future system design.<br>• Engage in hands-on development and support of new or existing data applications.<br>• Work in partnership with Application and Enterprise Architects to understand high-level data flow designs and create/review low-level implementation designs.<br>• Offer technical guidance to the team for implementing intricate data solutions.<br>• Maintain detailed documentation to support downstream integrations.<br>• Provide support for production issues and execute the activities of a scrum master.<br>• Identify technology trends and explore opportunities for their use within the organization. 
<br>• Participate in the design, development, code reviews, testing, deployment, and documentation of data engineering and data integration applications.Sr Data Engineer<p>We are offering a contract employment opportunity for a Sr Data Engineer, EST time zone preferred. This role is deeply rooted in the data architecture and cloud data warehousing industry and involves extensive interaction with multiple departments such as IT, HR, Finance, Business Operations, Business Development, Sales, and Marketing. </p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Design, develop, and implement enterprise data solutions using tools like Snowflake and DBT.</p><p>• Collaborate with cross-functional teams to ensure data solutions are comprehensive, accessible, accurate, and secure.</p><p>• Lead the establishment and management of the organization's first Enterprise Data Lake and Data warehouse.</p><p>• Manage key reference data for the enterprise, ensuring accuracy and accessibility.</p><p>• Design, develop, and maintain data integration solutions to facilitate seamless data flow between 20+ enterprise systems/applications.</p><p>• Ensure proficiency in stakeholder management, Azure database platforms, and programming languages related to data warehouse management.</p><p>• Handle the processing of customer credit applications efficiently and accurately.</p><p>• Maintain accurate customer credit records and monitor customer accounts.</p>Data EngineerWe are in search of a Data Engineer, ready to utilize their skills in a dynamic setting. The industry we operate in is highly data-driven, and your role will be pivotal in managing and interpreting this data. Our workplace is situated in Grand Blanc, Michigan, United States. <br><br>Responsibilities:<br><br>• Develop and manage streaming applications using Apache Kafka for real-time data processing<br>• Implement algorithms and data structures to solve complex data problems<br>• Utilize Apache Pig and Apache Spark to process large datasets efficiently<br>• Develop APIs to facilitate data access<br>• Leverage cloud technologies to ensure scalable and secure data storage<br>• Perform data visualization to present insights in an easily understandable manner<br>• Use Python and SQL for data extraction and manipulation<br>• Implement Kubernetes and Terraform for orchestration and infrastructure as code<br>• Manage SQL Server and DB2 databases for structured data storage<br>• Ensure continuous integration and continuous delivery (CICD) for efficient software development and deployment<br>• Conduct analytics to derive valuable insights from collected data.Data Engineer<p>We are seeking a highly skilled <strong>Data Engineer</strong> with <strong>cloud experience (Azure, AWS, or Google Cloud)</strong> to design, develop, and maintain scalable data pipelines and infrastructure. You will work closely with data scientists, analysts, and software engineers to optimize data architecture and enable data-driven decision-making. 
If you have expertise in <strong>ETL processes, cloud platforms, and big data technologies</strong>, we’d love to hear from you!</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, build, and maintain <strong>scalable ETL pipelines</strong> to support analytics and machine learning workflows.</li><li>Develop and optimize <strong>data storage solutions</strong> on <strong>Azure, AWS, or Google Cloud</strong>.</li><li>Ensure <strong>data integrity, quality, and security</strong> across cloud-based architectures.</li><li>Work with <strong>SQL and NoSQL databases</strong> to manage structured and unstructured data.</li><li>Automate data workflows and integrate data from multiple sources (APIs, logs, third-party systems).</li><li>Collaborate with cross-functional teams to support <strong>real-time data streaming</strong> and analytics.</li><li>Implement <strong>best practices for data governance, compliance, and security</strong>.</li><li>Monitor and troubleshoot data pipelines to ensure <strong>high availability and performance</strong>.</li></ul><p><br></p>Data Engineer<p>Robert Half is growing its dedicated team of full-time IT consultants. We are experts in developing technology solutions for clients across local and national landscapes. Join us to not only shape innovative solutions but also hone your skills and learn new ones.</p><p>Be a part of a forward-thinking team, passionately providing optimal solutions to our clients. We champion collaboration, nurturing productive and prosperous partnerships to perform as a high-caliber team.</p><p> </p><p>Position: Data Engineer</p><p> </p><p>In this role, you'll use your expertise in data analysis and engineering to help us deliver data-driven solutions that drive client success.</p><p> </p><p>As part of our collaborative team, you'll have the opportunity to make your mark and influence decision-making through the power of data. 
If you have a passion for extracting insights from data and using that knowledge to drive strategic business decisions, we encourage you to apply.</p><p> </p><p> </p><p>Location: Minneapolis, MN (Must be local)</p><p> </p><p>Responsibilities:</p><p> </p><p> </p><p>• Manipulate, analyze, and interpret complex datasets using tools such as SQL/Snowflake.</p><p> </p><p>• Design and create data reports using SQL Server Reporting Services (SSRS).</p><p> </p><p>• Develop, implement, and maintain databases for data collection.</p><p> </p><p>• Create and manage Power BI data visualization and reporting tools.</p><p> </p><p>• Liaise with internal and external clients to fully understand data content.</p><p> </p><p>• Collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.</p><p> </p><p>• Drive data quality, integrity, and reliability throughout data pipelines and processes.</p><p> </p><p>• Adopt and promote a culture of data-driven decision-making in the organization.</p><p> </p><p>Requirements:</p><p> </p><p> </p><p>• Proven experience as a data analyst or business data analyst, with strong analytical skills and the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.</p><p> </p><p>• Experience with data analysis tools and languages such as SQL/Snowflake.</p><p> </p><p>• Experience with Power BI, SSRS, and other visualization tools.</p><p> </p><p>• Demonstrated ability to manage databases and database servers.</p><p> </p><p>• In-depth understanding of database management systems and online analytical processing (OLAP).</p><p> </p><p>• Exceptional data interpretation and data visualization skills.</p><p> </p><p>• Strong written and verbal communication skills, with the ability to simplify complex concepts and messages to the wider team.</p><p> </p><p>• Previous consulting or client-facing experience is a plus.</p>Senior Data EngineerWe are searching for a Senior Data Engineer to join our team. In this role, you will be expected to leverage your analytical and problem-solving skills to manage and administer our DBMS. Working in Irvington, New York, you will have the opportunity to design, develop, deploy, and optimize DB schemas, as well as liaise with developers to improve applications and establish best practices. <br><br>Responsibilities:<br><br>• Manage and administer the database management systems on a daily basis.<br>• Collaborate with application developers and business teams to assist with data-related technical issues and ensure successful design and delivery of data-driven solutions.<br>• Design, develop, and deploy database schemas to meet application functionality and performance requirements.<br>• Create and implement SQL scripts and stored procedures.<br>• Develop and execute data reporting, data extract, data load, and data validation processes and procedures.<br>• Implement processes and procedures for the development and release of products/projects that facilitate high quality and rapid deployment.<br>• Provide solutions to promote data integrity in enterprise systems, including data auditing, archive, backup, and restore solutions.<br>• Participate in code reviews to validate effectiveness and quality of code with internal team members as well as external vendor-supported products and databases.<br>• Publish documentation and collaborative information using internal tools. 
<br>• Troubleshoot and resolve data related issues.Data Engineer<p>We are seeking a skilled <strong>Data Engineer</strong> to join our team and help build, optimize, and maintain our data infrastructure. In this role, you will work closely with data analysts, software engineers, and business stakeholders to design and implement scalable data solutions that support analytics, reporting, and business intelligence initiatives. You will ensure efficient data pipelines, data integrity, and the overall reliability of our data ecosystem.</p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, and maintain scalable data pipelines and ETL processes.</li><li>Build and optimize data architectures to support data warehousing, analytics, and business intelligence.</li><li>Develop and manage database solutions using SQL, NoSQL, and cloud-based technologies.</li><li>Ensure data quality, governance, and compliance with security standards.</li><li>Collaborate with cross-functional teams to identify data needs and deliver actionable insights.</li><li>Implement real-time and batch processing solutions for large-scale data sets.</li><li>Monitor and improve data system performance, troubleshooting bottlenecks as needed.</li><li>Stay up to date with industry trends and emerging data technologies.</li></ul><p><br></p>Data Engineer<p>We are seeking a skilled and detail-oriented <strong>Data Engineer</strong> to join our team! The ideal candidate will have extensive experience with <strong>SQL Server</strong>, <strong>ETL/ELT processes</strong>, <strong>Business Intelligence (BI)</strong> tools, and proficiency in <strong>Python</strong> or <strong>C#</strong>. In this role, you’ll play a vital part in ensuring seamless data processing, analytics, and the development of solutions that empower key business decisions.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and optimize <strong>SQL Server</strong> databases to support data-driven applications and reporting.</li><li>Implement and maintain efficient <strong>ETL (Extract, Transform, Load)</strong> or <strong>ELT</strong> pipelines to move and transform data across systems.</li><li>Develop and manage <strong>Business Intelligence (BI)</strong> solutions, including dashboards and reports, using tools like Power BI or Tableau.</li><li>Write clean, scalable code in <strong>Python</strong> or <strong>C#</strong> to automate data workflows, process external datasets, and manage APIs.</li><li>Collaborate with stakeholders to understand requirements, troubleshoot issues, and provide data-driven recommendations.</li><li>Ensure data accuracy, integrity, and security through proper validation, testing, and governance.</li><li>Optimize existing data architecture, identify performance issues, and provide solutions to improve scalability and reliability.</li><li>Stay up to date on technological advancements, databases, and tools to continuously enhance data practices.</li></ul>Data Engineer<p>We are seeking a Data Engineer to be an integral part of our team in Dublin, Ohio. This role is pivotal in our IT Corporate Finance Reporting department, where you will provide technical and consultative support on complex matters. You will be involved in the design and development of systems based on user specifications, as well as providing assistance related to data and infrastructure needs. 
This role offers a short-term contract employment opportunity.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Provide technical assistance to solve hardware or software problems.</p><p>• Analyze, design, and develop systems based on user specifications.</p><p>• Maintain knowledge of technical industry trends.</p><p>• Troubleshoot database issues related to connectivity and performance.</p><p>• Be involved in source control procedures using Git.</p><p>• Use Visual Studio for .NET and ETL SSIS development.</p><p>• Conduct Manual and Automated Testing.</p><p>• Work on enhancements or cross-impact projects.</p><p>• Utilise cloud technologies such as AWS, Azure, and Google Cloud Platform.</p><p>• Understand the processes and procedures within a corporate environment and work with different stakeholders.</p><p>• Have general database knowledge, including Microsoft SQL, Oracle, Snowflake, and Azure Data Lakes.</p><p>• Communicate effectively with team members and stakeholders.</p><p>• Support treasury applications, ensuring they run smoothly.</p>AWS Data Engineer<p>We are offering an excellent long-term opportunity for an AWS Data Engineer with one of our industry leading clients. This role allows for a hybrid work schedule. As an AWS Data Engineer, you will use your skills in data integration, processing, and optimization to support the organization's data-driven decision-making.</p><p><br></p><p>Responsibilities:</p><p>• Design and implement data integration workflows utilizing AWS tools, including Glue/EMR, Lambda, and Redshift</p><p>• Employ Python and PySpark for processing large datasets, ensuring efficiency and accuracy</p><p>• Validate and cleanse data as part of maintaining high data quality</p><p>• Implement monitoring, validation, and error handling mechanisms within data pipelines to ensure data integrity</p><p>• Enhance the performance optimization of data workflows, identifying and resolving performance bottlenecks</p><p>• Fine-tune queries and optimize data processing to enhance Redshift's performance</p><p>• Translate business requirements into technical specifications and coded data pipelines</p><p>• Collaborate with data analysts and business stakeholders to meet their data requirement needs</p><p>• Document all data integration processes, workflows, and technical system specifications</p><p>• Ensure compliance with data governance policies, industry standards, and regulatory requirements.</p>Data Engineer<p>We are seeking an experienced <strong>Data Engineer</strong> to design, develop, and optimize scalable data pipelines and infrastructure. 
This role requires deep expertise in data architecture, cloud technologies, and automation.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and implement scalable data pipelines, ETL processes, and real-time data streaming solutions.</li><li>Develop and maintain data lakes, data warehouses, and analytical platforms on cloud environments (AWS, Azure, or GCP).</li><li>Optimize database performance, indexing strategies, and query tuning.</li><li>Work with structured and unstructured data to ensure efficient storage and retrieval.</li><li>Automate data workflows and deployment using CI/CD pipelines and infrastructure-as-code (Terraform, CloudFormation).</li><li>Implement data security best practices, including encryption, masking, and role-based access controls.</li><li>Collaborate with data scientists, analysts, and software engineers to provide high-quality data solutions.</li><li>Monitor and troubleshoot data pipelines, ensuring reliability and scalability.</li></ul><p><br></p>Data Engineer<p>Hands-on senior Microsoft-stack Data Engineer / on-prem-to-cloud senior ETL Engineer. Weekly hybrid position with major flexibility, working on the full Microsoft on-prem stack.</p><p><br></p><p>LOCATION: Hybrid weekly in Des Moines. You must reside in the Des Moines area for the weekly onsite; this is not a remote position, and there is no travel back and forth. If you live in Des Moines, you can eventually work mostly remote. This position also has upside with training in Azure.</p><p><br></p><p>IMMEDIATE HIRE! Solve real business problems.</p><p><br></p><p>We are looking for a hands-on senior Microsoft-stack Data Engineer / senior Data Warehouse Engineer / senior ETL Developer / Azure Data Engineer (direct hire) who wants to help modernize, build out a data warehouse, and lead the build-out of a data lake in the cloud, but who will first rebuild an on-prem data warehouse, structuring disparate data for consumable reporting.</p><p><br></p><p>You will be doing all aspects of data engineering, so data warehouse and data lake skills are a must. You will be in the technical weeds day to day, but you could grow into the technical leader of this team. ETL skills such as SSIS, experience working with disparate data, and fact and dimension data warehouse experience are required; SSAS is a plus.</p><p>This is a permanent, direct-hire, hands-on technical Manager of Data Engineering position with one of our clients in Des Moines, paying up to 155K plus bonus.</p><p><br></p><p>PERKS: Bonus and 2.5-day weekends!</p>Azure Data Engineer<p>We are seeking a Data Engineer to join our team in the North Houston area. In this role, you will focus on creating, optimizing, and maintaining data pipeline architecture. This role is essential in supporting data initiatives across the company. 
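To give a concrete flavor of the Azure Data Factory work referenced in the responsibilities below, here is an illustrative sketch only: it assumes the azure-identity and azure-mgmt-datafactory packages, and the subscription, resource group, factory, pipeline name, and parameter are hypothetical placeholders.
<pre>
# Illustrative sketch only: triggers an existing ADF pipeline run and checks its status.
# Assumes azure-identity and azure-mgmt-datafactory are installed; all names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical
RESOURCE_GROUP = "rg-data-platform"                        # hypothetical
FACTORY_NAME = "adf-data-platform"                         # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a pipeline run, passing a runtime parameter the pipeline is assumed to accept.
run = client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    "pl_ingest_sales",                 # hypothetical pipeline name
    parameters={"load_date": "2024-01-01"},
)

# Poll the run once and print its current status (Queued, InProgress, Succeeded, Failed).
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(status.status)
</pre>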
</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Apply data architecture principles for modeling, stored procedures, replication, security, and compliance to meet both technical and business objectives.</p><p>• Construct and maintain scalable data pipelines using Microsoft Fabric components such as Lakehouse, Azure Data Factory (ADF), Data Warehouses, Notebooks, and Dataflows.</p><p>• Develop and implement solutions for data extraction, processing, and analysis of large volumes of structured and unstructured data.</p><p>• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.</p><p>• Optimize and troubleshoot data pipelines to ensure reliability, efficiency, and performance.</p><p>• Implement best practices for data governance and security to ensure data integrity and regulatory compliance.</p><p>• Oversee and maintain data infrastructure to ensure high availability and scalability.</p><p>• Stay updated with advancements in data engineering and Microsoft Fabric technologies.</p><p>• Develop technology architecture strategies informed by various business scenarios and motivations.</p><p>• Assess emerging technology trends and provide guidance on their potential impact on organizational opportunities and risks.</p><p>• Enhance the quality, consistency, accessibility, and security of data across the company continuously.</p><p>• Manage the Data Platform roadmap, ensuring future-proofing, capacity planning, and optimization.</p><p>• Work with IT Operations teams and BI vendors to resolve production issues.</p><p>• Manage stakeholder demands and prioritize reporting requirements and needs.</p><p>• Maintain relationships with key vendors to plan and adapt the Data Platform roadmap and leverage existing capabilities.</p><p>• Monitor Data Platform usage, deprecate unused reports and datasets, and drive continuous improvement.</p><p>• Work with users to address data issues, trace data lineage, and implement data cleansing and quality processes.</p><p>• Manage secure access to data sets for the business.</p><p>• Ensure effective collaboration with BI vendors, IT, and business teams.</p><p>• Manage project schedules, focus on critical path items, define and communicate key milestones, and coordinate with the broader team for tollgate reviews.</p><p>• Monitor and report on emerging technologies, seeking opportunities for continuous improvement.</p><p>• Ensure adherence to data visualization and data modeling processes, procedures, and standards.</p><p>• Manage data sets and work with ETL and Data Warehouse solutions.</p><p>• Launch and drive a data literacy program for business users.</p>Data Engineer - Azure Data Factory / SQL Server<p>We are looking to bring on board a detail-oriented Data Engineer with experience in Azure Data Factory and SQL Server. This position is FULLY REMOTE [with the exception of a 1-week on-site onboarding which is COMPANY PAID. Candidate will NOT need to submit for reimbursement of expenses]. Applicants must be able to work business hours in the Eastern Standard Time Zone. This role is a great fit for individuals passionate about the construction and contractor industry. 
The selected candidate will be heavily involved in various aspects of data management and data pipeline development, working to ensure seamless integration of data from diverse sources into our data warehouse.</p><p><br></p><p>Responsibilities:</p><p>• Build robust data pipelines using tools such as Azure Data Factory, Synapse pipelines, and Fabric pipelines</p><p>• Oversee the management and maintenance of Azure SQL and Azure Synapse environments to ensure accurate data ingestion and storage</p><p>• Develop and implement data models to support business reporting and analytics needs, utilizing the capabilities of Microsoft Fabric and other relevant tools</p><p>• Construct and maintain efficient ETL/ELT processes to ensure data accuracy, integrity, and availability across all projects</p><p>• Continually monitor and optimize data workflows and processes to meet performance and scalability standards</p><p>• Collaborate with cross-functional teams, including Power App developers, Business Intelligence Developers, and business stakeholders, to understand and deliver on data requirements</p><p>• Maintain comprehensive documentation of data processes, configurations, and best practices.</p>Data Engineer-Advent/AxysWe are in search of a meticulous Data Engineer-Advent/Axys to be a part of our Investment Management team located in Evanston, Illinois. This role involves handling customer applications, maintaining customer records, and resolving customer inquiries with precision. The position also requires monitoring customer accounts and taking necessary actions. <br><br>Responsibilities:<br><br>• Efficiently process customer credit applications using Advent - Axys and Microsoft Excel.<br>• Utilize SQL - Structured Query Language to manage SQL databases.<br>• Maintain customer credit records with the highest level of accuracy.<br>• Resolve customer inquiries by demonstrating excellent listening skills.<br>• Monitor customer accounts and take appropriate actions when necessary.<br>• Display a high level of organization and attention to detail in all tasks.Senior Manager Data Engineer<p>We are looking for a highly skilled Data Engineering and Software Engineering professional to design, build, and optimize our Data Lake and Data Processing platform on AWS. 
This role requires deep expertise in data architecture, cloud computing, and software development, as well as the ability to define and implement strategies for deployment, testing, and production workflows.</p><p><br></p><p>Key Responsibilities:</p><ul><li>Design and develop a scalable Data Lake and data processing platform from the ground up on AWS.</li><li>Lead decision-making and provide guidance on code deployment, testing strategies, and production environment workflows.</li><li>Define the roadmap for Data Lake development, ensuring efficient data storage and processing.</li><li>Oversee S3 data storage, Delta.io for change data capture, and AWS data processing services.</li><li>Work with Python and PySpark to process large-scale data efficiently.</li><li>Implement and manage Lambda, Glue, Kafka, and Firehose for seamless data integration and processing.</li><li>Collaborate with stakeholders to align technical strategies with business objectives, while maintaining a hands-on engineering focus.</li><li>Drive innovation and cost optimization in data architecture and cloud infrastructure.</li><li>Provide expertise in data warehousing and transitioning into modern AWS-based data processing practices.</li></ul>Data Engineer<p><strong>Position: Data Engineer</strong></p><p><strong>Location: Des Moines, IA - HYBRID</strong></p><p><strong>Salary: up to $130K for this permanent position, plus exceptional benefits</strong></p><p> </p><p><strong>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. ***</strong></p><p> </p><p>Our client is one of the best employers in town. Come join this successful organization with smart, talented, results-oriented team members. You will find that passion in your career again, working together with some of the best in the business. </p><p> </p><p>Are you an experienced Senior Data Engineer seeking a new adventure that entails enhancing data reliability and quality for an industry leader? Look no further! 
Our client has a robust data and reporting team and needs you to bolster their data warehouse and data solutions and facilitate data extraction, transformation, and reporting.</p><p> </p><p>Key Responsibilities:</p><ul><li>Create and maintain data architecture and data models for efficient information storage and retrieval.</li><li>Ensure rigorous data collection from various sources and storage in a centralized location, such as a data warehouse.</li><li>Design and implement data pipelines for ETL using tools like SSIS and Azure Data Factory.</li><li>Monitor data performance and troubleshoot any issues in the data pipeline.</li><li>Collaborate with development teams to track work progress and ensure timely completion of tasks.</li><li>Implement data validation and cleansing processes to ensure data quality and accuracy.</li><li>Optimize performance to ensure efficient execution of data queries and reports.</li><li>Uphold data security by storing data securely and restricting access to sensitive data to authorized users only.</li></ul><p>Qualifications:</p><ul><li>A 4-year degree related to computer science or equivalent work experience.</li><li>At least 5 years of professional experience.</li><li>Strong SQL Server and relational database experience.</li><li>Proficiency in SSIS and SSRS.</li><li>.NET experience is a plus.</li></ul><p> </p><p><strong>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. Also, you may contact me by office: 515-303-4654 or mobile: 515-771-8142. Or one click apply on our Robert Half website. No third-party inquiries, please. Our client cannot provide sponsorship and cannot hire C2C. *** </strong></p><p> </p>Data Engineer<p>We are offering a contract to permanent employment opportunity for a Data Engineer in Glendale, California. This role is an integral part of our team, where you'll be involved in the creation of data solutions that inspire innovation and stimulate business growth, contributing significantly to our data-driven decision-making processes.</p><p><br></p><p><strong>About the Role</strong></p><p>As a <strong>Senior Analytics Engineer</strong>, you will play a vital role in transforming data into actionable insights. Collaborate with a dynamic team of technologists to develop innovative data solutions that fuel business growth and drive impactful decision-making. This role involves managing complex data structures and delivering scalable, efficient solutions to optimize our data-driven processes. 
If you're passionate about leveraging data to create meaningful impact, this opportunity is for you.</p><p><strong>Responsibilities:</strong></p><ul><li>Architect and design data products using foundational data sets.</li><li>Develop and maintain code for data products.</li><li>Partner with business stakeholders to define data strategies and optimize the use of current data assets.</li><li>Provide specifications for data ingestion, preparation, and transformation workflows.</li><li>Document processes and train teams to utilize data products effectively for automation and decision-making.</li><li>Build and maintain automated pipelines to produce knowledge from models.</li><li>Monitor and refine statistical and machine learning models within data products.</li><li>Collaborate with data scientists to implement solutions for marketing and business problem-solving.</li><li>Work in coordination with other technology and science teams to deliver integrated solutions</li></ul>Data EngineerWe are offering an exciting opportunity for a Data Engineer to join our team in Miami, Florida. This role primarily involves designing, creating, and managing large datasets using a variety of data technologies and tools. The Data Engineer will also be tasked with developing algorithms and deploying data-driven analytics to business problems.<br><br>Responsibilities:<br><br>• Develop and implement algorithms to enhance data processing and analytics.<br>• Utilize tools like Apache Kafka, Apache Pig, and Apache Spark for data management and processing.<br>• Leverage cloud technologies for efficient data storage and retrieval.<br>• Collaborate with the team to develop APIs for data usage and sharing.<br>• Apply AWS Technologies for managing and processing large datasets.<br>• Implement data visualization strategies to represent data in a comprehensible way.<br>• Use Google Data Studio for effective data reporting and representation.<br>• Work with Apache Hadoop for distributed processing of large data sets across clusters.<br>• Ensure the implementation of efficient algorithms for data processing and analytics.<br>• Continuously monitor, refine and report on the performance of data management systems.Data Engineer<p>We are searching for a Data Engineer with skills in Fabric/Azure, Python, and cloud technologies to join our team. This role is in Chicago, Illinois and offers a short-term contract employment opportunity. As a Data Engineer, you will be tasked with developing data pipelines in Fabric, understanding the Fabric medallion architecture, and utilizing Python for various projects. 
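As a purely illustrative sketch of the medallion flow described above (table and column names are hypothetical; inside a Fabric notebook a SparkSession is already provided, so building one here only keeps the sketch self-contained), a bronze-to-silver step might look like this:
<pre>
# Minimal bronze-to-silver sketch; table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

# A Fabric notebook supplies a SparkSession; this keeps the example runnable on its own.
spark = SparkSession.builder.appName("bronze_to_silver_example").getOrCreate()

# Read raw ingested rows from a hypothetical bronze (raw) table.
bronze = spark.read.table("bronze.customer_events")

# Light cleansing for the silver (validated) layer: dedupe, cast, and drop malformed rows.
silver = (
    bronze.dropDuplicates(["event_id"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .filter(F.col("event_id").isNotNull())
)

# Persist the curated result to a hypothetical silver table.
silver.write.mode("overwrite").saveAsTable("silver.customer_events")
</pre>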
</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Develop and manage data pipelines in Fabric, ensuring smooth deployment from development to production.</p><p>• Gain a comprehensive understanding of Fabric medallion architecture, including bronze, silver, and gold layers.</p><p>• Utilize your Python knowledge to enhance and optimize data processing tasks.</p><p>• Utilize your skills in Apache Kafka, Apache Pig, and Apache Spark for data processing and analytics.</p><p>• Use your knowledge of cloud technologies to manage and maintain the data infrastructure.</p><p>• Leverage your skills in data visualization and algorithm implementation to create insightful data reports and visualizations.</p><p>• Work with AWS technologies and API development to enhance data processing capabilities.</p><p>• Maintain excellent communication skills and be able to work remotely and independently under tight deadlines.</p><p>• Use your knowledge of Apache Hadoop and analytics to analyze large data sets and derive insights.</p><p>• Apply your skills in Microsoft Fabric and data pipelines to enhance data processing and storage capabilities.</p>Data Engineer<p>Olivia Quinlan from Robert Half is hiring for a Data Engineer, ideally in Rochester, NY but New York residents may be considered.</p>
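<p>Several of the postings above call for real-time streaming work with Apache Kafka. As a closing illustration (a minimal sketch assuming the kafka-python package; the topic name and broker address are hypothetical), a simple consumer might look like this:</p>
<pre>
# Minimal Kafka consumer sketch; assumes kafka-python, a reachable broker, and a hypothetical topic.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "customer-events",                       # hypothetical topic
    bootstrap_servers=["localhost:9092"],    # hypothetical broker
    group_id="data-engineering-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Print each decoded event; a real pipeline would validate and land these in a warehouse or lake.
for message in consumer:
    print(message.topic, message.offset, message.value)
</pre>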