<p>**** For a faster response on the position, please send a message to Jimmy Escobar on LinkedIn or send an email to Jimmy.Escobar@roberthalf(.com) with your resume. You can also call my office number at 424-270-9193 ****</p><p><br></p><p>My client, a Burbank-based entertainment firm, is looking for a Java Tech Lead Developer to join its application development team. The Java Tech Lead Developer position is hybrid: 3 days a week on-site and 2 days remote. The Java Tech Lead Developer should have at least 7 years of experience in Java and some tech lead experience. Tasks for the Java Tech Lead Developer position include mentoring Java developers, integrating Spring Boot into Java-based applications, and writing unit test cases. This is a great opportunity for a Java Tech Lead Developer to work for an enterprise organization that offers amazing benefits.</p>
<p>Our team is seeking a highly experienced Software Engineer with deep expertise in front-end (ReactJS), back-end (NestJS), and mobile development (iOS - Swift, Android - Kotlin). The ideal candidate has a proven track record of building and scaling modern web and mobile applications, as well as strong experience working within cloud environments and modern DevOps practices (AWS, Terraform, Kubernetes, PostgreSQL, Redis, RabbitMQ, S3, CloudFront).</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Architect, design, and implement scalable, secure, and maintainable solutions across web, mobile, and backend platforms.</li><li>Develop user-friendly web applications using ReactJS.</li><li>Build performant, robust native mobile apps in Swift (iOS) and Kotlin (Android).</li><li>Design and implement RESTful APIs and backend services using NestJS.</li><li>Lead DevOps initiatives leveraging AWS, Terraform, Kubernetes (EKS), S3, and CloudFront.</li><li>Manage data storage and retrieval with PostgreSQL and Redis, and integrate message queues via RabbitMQ.</li><li>Review code, mentor junior developers, and promote best practices in coding, testing, and automation.</li><li>Collaborate cross-functionally with design, product, and business stakeholders to deliver exceptional software solutions.</li><li>Participate in and drive continuous improvement for deployment, monitoring, and reliability.</li></ul>
<p><strong>Senior Front-End Engineer (Hybrid – West LA)</strong></p><p><strong>Compensation:</strong> Up to $150K + bonus & benefits</p><p>We’re seeking a <strong>Senior Front-End Engineer</strong> to drive modern UI development for cloud-based business applications. This hybrid role (3 days onsite in West LA) will work closely with product, design, and offshore teams to deliver scalable, high-quality, and user-centric solutions.</p><p><strong>What You’ll Do:</strong></p><ul><li>Lead front-end architecture and development with Angular and modern web technologies.</li><li>Collaborate across teams to design and implement cloud-based solutions.</li><li>Ensure application performance, scalability, and best practices in code quality.</li><li>Mentor developers and champion user experience through intuitive design.</li><li>Troubleshoot issues and drive improvements through CI/CD and DevOps practices.</li></ul><p>For immediate consideration, direct message Reid Gormly on LinkedIn.</p><p><br></p><p><strong>Why Join:</strong></p><ul><li>Competitive salary up to $165K + bonus.</li><li>Hybrid schedule in West LA.</li><li>Strong career growth opportunities and professional development support.</li><li>Comprehensive benefits: medical, dental, vision, 401K with match, PTO, and more.</li></ul>
We are looking for a skilled Data Engineer to join our team in Los Angeles, California. This role focuses on designing and implementing advanced data solutions to support innovative advertising technologies. The ideal candidate will have hands-on experience with large datasets, cloud platforms, and machine learning, and will play a critical role in shaping our data infrastructure.<br><br>Responsibilities:<br>• Develop and maintain robust data pipelines to ensure seamless data extraction, transformation, and loading processes.<br>• Design scalable architectures that support machine learning models and advanced analytics.<br>• Collaborate with cross-functional teams to deliver business intelligence tools, reporting solutions, and analytical dashboards.<br>• Implement real-time data streaming solutions using platforms like Apache Kafka and Apache Spark.<br>• Optimize database performance and ensure efficient data storage and retrieval.<br>• Build and manage resilient data science programs and personas to support AI initiatives.<br>• Lead and mentor a team of data scientists, machine learning engineers, and data architects.<br>• Design and implement strategies for maintaining large datasets, ensuring data integrity and accessibility.<br>• Create detailed technical documentation for workflows, processes, and system architecture.<br>• Stay up-to-date with emerging technologies to continuously improve data engineering practices.
<p>We are seeking a <strong>Data Engineer</strong> to join our growing data team in West LA. This role is perfect for someone early in their data engineering career who wants to work with modern data stacks, cloud technologies, and high-impact analytics projects in a collaborative, fast-paced environment.</p><p><br></p><p><strong>Compensation:</strong> $100K–$130K + 5% bonus (flexible for strong candidates)</p><p><br></p><p><strong>About the Role</strong></p><p>In this position, you’ll support the full data lifecycle—from ingesting and transforming raw data to building pipelines, reporting tools, and analytics infrastructure that empower teams across the business. You’ll work with Python, SQL, cloud platforms, ETL solutions, and visualization tools, contributing to the evolution of next-generation data systems supporting large-scale digital operations.</p><p><br></p><p><strong>What You'll Do</strong></p><ul><li>Build, maintain, and optimize ETL/ELT pipelines using tools such as Talend, SSIS, or Informatica</li><li>Work hands-on with cloud platforms (any cloud; GCP preferred) to support data workflows</li><li>Develop reports and dashboards using visualization tools (Looker, Tableau, Power BI, etc.)</li><li>Collaborate with product, analytics, and engineering teams to deliver reliable datasets and insights</li><li>Own data issues end-to-end — from collection and extraction to cleaning and validation</li><li>Support data architecture, pipeline resilience, and performance tuning</li><li>Assist in maintaining and scaling datasets, data models, and analytics environments</li><li>Contribute to real-time streaming initiatives (a plus)</li></ul><p><br></p>
<p><strong>Data Engineer – CRM Integration (Hybrid in San Fernando Valley)</strong></p><p><strong>Location:</strong> San Fernando Valley (Hybrid – 3x per week onsite)</p><p><strong>Compensation:</strong> $140K–$170K annual base salary</p><p><strong>Job Type:</strong> Full Time, Permanent</p><p><strong>Overview:</strong></p><p>Join our growing technology team as a Data Engineer with a focus on CRM data integration. This permanent role will play a key part in supporting analytics and business intelligence across our organization. The position offers a collaborative hybrid environment and highly competitive compensation.</p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, and optimize data pipelines and workflows integrating multiple CRM systems (Salesforce, Dynamics, HubSpot, Netsuite, or similar).</li><li>Build and maintain scalable data architectures for analytics and reporting.</li><li>Manage and advance CRM data integrations, including real-time and batch processing solutions.</li><li>Deploy ML models, automate workflows, and support model serving using Azure Databricks (MLflow experience preferred).</li><li>Utilize Azure Synapse Analytics & Pipelines for high-volume data management.</li><li>Write advanced Python and Spark SQL code for ETL, transformation, and analytics.</li><li>Collaborate with BI and analytics teams to deliver actionable insights using Power BI.</li><li>Support streaming solutions with technologies like Kafka, Event Hubs, and Spark Streaming.</li></ul><p><br></p>
<p><strong>Senior Data Engineer</strong></p><p><strong>Location:</strong> Calabasas, CA (Fully Remote if outside 50 miles)</p><p><strong>Compensation:</strong> $140K–$160K</p><p><strong>Reports to:</strong> Director of Data Engineering</p><p>Our entertainment client is seeking a <strong>Senior Data Engineer</strong> to design, build, and optimize enterprise data pipelines and cloud infrastructure. This hands-on role focuses on implementing scalable data architectures, developing automation, and driving modern data engineering best practices across the company.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and maintain ELT/ETL pipelines in Snowflake, Databricks, and AWS.</li><li>Build and orchestrate workflows using Python, SQL, Airflow, and dbt.</li><li>Implement medallion/lakehouse architectures and event-driven pipelines.</li><li>Manage AWS services (Lambda, EC2, S3, Glue) and infrastructure-as-code (Terraform).</li><li>Optimize data performance, quality, and governance across systems.</li></ul><p>For immediate consideration, direct message Reid Gormly on LinkedIn and apply now!</p>
<p><strong><em><u>Client </u></em></strong>= Nationally Recognized Leader in the Travel & Leisure Industry</p><p><strong><em><u>Job Title</u></em></strong> = Data Engineer (mid and senior level candidates being considered)</p><p><strong><em><u>Location</u> </em></strong>= San Fernando Valley</p><p><br></p>
<p>Our team is seeking an experienced Data Engineer to help architect, build, and optimize data infrastructure supporting enterprise decision making and advanced analytics. This position is ideal for hands-on professionals who excel in SQL and Python and have proven expertise across enterprise cloud platforms, databases, and modern data engineering practices.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and maintain scalable data pipelines that ingest, process, and transform structured and unstructured data.</li><li>Write advanced SQL queries—including complex joins, aggregations, window functions, and CTEs—to support ETL, data analysis, and migration projects.</li><li>Develop, optimize, and productionize Python scripts for data movement and validation using vanilla Python; familiarity with PySpark is a plus.</li><li>Collaborate across teams to deliver robust APIs and services for data integration with frontend (React, Angular, Vue) and backend (Node.js, Express) frameworks.</li><li>Architect and maintain data models and schemas, ensuring alignment with business requirements and efficient data retrieval.</li><li>Manage production-level SQL (e.g., SQL Server) and NoSQL (e.g., MongoDB) databases; ensure data quality through continuous monitoring and rigorous testing.</li><li>Deploy and manage automated ETL pipelines and schema migrations using Azure Data Factory, Microsoft Fabric, or comparable technologies.</li><li>Continuously improve performance through query optimization and efficient use of data structures (lists, dictionaries, sets) in Python.</li><li>Implement agile methodologies, continuous delivery, and DevOps best practices for reliable and scalable data infrastructure.</li><li>Enforce rigorous data quality management and automated testing strategies to ensure accuracy and reliability of data assets.</li></ul><p><br></p>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p>**** For a faster response on the position, please send a message to Jimmy Escobar on LinkedIn or send an email to Jimmy.Escobar@roberthalf(.com) with your resume. You can also call my office number at 424-270-9193 ****</p><p><br></p><p>We are looking for an experienced Senior Data Engineer with expertise in Databricks and the Adobe Experience Platform (AEP) to join our team on a long-term contract basis. In this role, you will design, implement, and optimize data solutions that support diverse business domains, leveraging advanced technologies such as Databricks, Azure cloud tools, and AEP. Based in Woodland Hills, California, this position offers the opportunity to collaborate with cross-functional teams and play a pivotal role in transforming data strategies to drive business success.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement scalable data pipelines to support integrations with Adobe Experience Platform and Databricks.</p><p>• Design data architecture solutions that ensure quality, reliability, and consistency across all data flows.</p><p>• Collaborate with business stakeholders and IT teams to deliver technical solutions aligned with organizational objectives.</p><p>• Optimize data workflows for improved performance and reduced latency in Databricks and Adobe environments.</p><p>• Monitor and troubleshoot issues within data pipelines to ensure seamless operations.</p><p>• Create and maintain comprehensive documentation for data architectures, integration processes, and workflows.</p><p>• Partner with analysts and stakeholders to gather requirements and deliver effective data solutions.</p><p>• Provide training and guidance on best practices for Databricks and Adobe data processes.</p><p>• Ensure the efficient use of Azure Synapse Analytics, Azure Data Factory, and related technologies in data integration projects.</p><p>• Drive continuous improvements in data strategies to support business growth.</p>
<p>We are looking for a skilled QA Analyst with expertise in the finance realm to join our team in Southern California. In this long-term contract role, you will play a pivotal part in ensuring the reliability, compliance, and performance of critical account servicing systems. The position requires hands-on experience with mainframe testing environments and a strong ability to work within complex systems.</p><p><br></p><p>Responsibilities:</p><p>• Validate upgrades to the Customer Account Servicing System by executing structured test plans and performing manual script-based testing.</p><p>• Analyze system updates and classify them into enhancements, corrections, or regulatory changes while determining the appropriate testing methods.</p><p>• Conduct regression testing by planning and executing tests for impacted processes, ensuring compatibility between base code and custom modules.</p><p>• Develop and run detailed test scripts in Quality Center for both batch processing and online components, focusing on calculations, error handling, and system integrity.</p><p>• Execute full file production parallel tests by reconciling output files and documenting results.</p><p>• Collaborate with developers and operations teams to identify, record, and resolve defects, ensuring system stability.</p><p>• Coordinate business reviews for regulatory changes and enhancements, capturing approvals and maintaining traceability to test results.</p><p>• Manage and maintain audit-ready documentation, including test scripts, defect logs, and change control records.</p><p>• Prepare and curate test data, batch input files, and account information for mainframe testing environments.</p><p>• Drive root cause analysis for defects and escalate issues as needed while maintaining compliance with established practices.</p>
We are looking for a creative and dynamic Content Creator to join our team in Newport Beach, California. This role focuses on building a strong social media presence and developing engaging content that resonates with our target audience within the luxury real estate market. As a contract-to-permanent position, this opportunity is ideal for someone passionate about showcasing the Orange County lifestyle in a fresh and relatable way.<br><br>Responsibilities:<br>• Develop and implement social media strategies to position the brokerage and agents as local influencers.<br>• Create compelling content that highlights the coastal Orange County lifestyle, blending luxury with authenticity.<br>• Produce engaging videos, blogs, and posts to attract affluent demographics and potential homebuyers.<br>• Monitor industry trends and incorporate them into recruiting campaigns and content strategies.<br>• Collaborate with agents and the brokerage team to align content with brand goals and messaging.<br>• Write and edit web page content, ensuring clarity and appeal to the target audience.<br>• Manage influencer partnerships to amplify brand visibility and engagement.<br>• Utilize analytics tools to track content performance and optimize strategies.<br>• Plan and execute marketing campaigns that reflect the latest trends in real estate and lifestyle.<br>• Stay updated on social media platforms and tools to enhance content delivery.