Job Title: Staff Database Engineer (Contractor)
Location: Remote
Duration: 6 months
Overview
We are looking for an experienced Staff Database Engineer (Contractor) to join our team. This senior-level role focuses on designing, building, and optimizing complex data systems that power scalable, high-performance analytics and operations. The ideal candidate will bring strong technical expertise across data architecture, pipeline development, and system reliability, with a particular focus on healthcare data environments.
Key Responsibilities
- Design and implement scalable, reliable data architectures to support large-scale data processing and analytics.
- Develop, maintain, and optimize ETL/ELT pipelines using modern frameworks to ingest and transform data from various sources (flat files, streaming data, REST APIs, electronic health records (EHRs), etc.).
- Build and manage cloud-based systems for real-time and batch data processing (e.g., data lakes, data warehouses, data mesh).
- Collaborate with engineering, data science, and product teams to understand requirements and deliver impactful data solutions.
- Integrate EHR and other healthcare data, ensuring accuracy and regulatory compliance.
- Ensure operational excellence through effective logging, monitoring, alerting, and incident response.
- Engineer robust data services using SQL and at least one programming language (C#/.NET, Java, or Python).
- Contribute to architectural standards, technical documentation, and best practices.
- Serve as a subject matter expert (SME), mentoring junior engineers and fostering a culture of engineering excellence.
Qualifications
- 7–10+ years of experience in data engineering, software development, or database system design.
- Hands-on experience with SSIS (SQL Server Integration Services) and Azure Data Factory (ADF).
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent professional experience).
- Advanced expertise in SQL and modern data processing frameworks.
- Strong coding proficiency in one or more languages such as C#/.NET, Java, or Python.
- Experience with cloud platforms like Azure, AWS, or GCP.
- Familiarity with streaming and messaging technologies (Kafka, RabbitMQ, SNS, etc.).
- Knowledge of CI/CD practices, infrastructure-as-code (e.g., Terraform), and containerization and orchestration (e.g., Kubernetes).
- Skilled in production system support, debugging, and performance optimization.
- Strong communication, problem-solving, and collaboration skills.
- Understanding of big data architectures and design patterns (data lake, data warehouse, data mesh).
- Experience with RESTful APIs and integrating third-party data into internal systems.