SENIOR DATA ENGINEER

OVERVIEW OF THE ROLE

DashLX is seeking a critical thinker and self-starter to assist with vendor integrations, data pipeline optimization, and the development of scalable data infrastructure. This is a unique opportunity to work with a small startup, using cutting-edge cloud services to enhance data processing and analytics.

For this position, we are seeking a senior-level data engineer who is experienced in Python, Java, and cloud-based data pipelines. The ideal candidate will be responsible for integrating vendors via OAuth1/2, maintaining streaming data applications, and optimizing our AWS-based data infrastructure.

DashLX is not currently sponsoring applicants for work visas.

CORE RESPONSIBILITIES

  • Develop and maintain secure vendor connections using OAuth1/2 protocols.

  • Write and optimize Python code with FastAPI for seamless integrations, deployed via AWS Lambda and API Gateway v2 (see the sketch after this list).

  • Update and maintain our Apache Flink application in Java to parse, process, and map incoming data.

  • Ensure that data streams from Kinesis are correctly ingested and transformed into Apache Iceberg tables on S3.

  • Work closely with cross-functional teams to improve data workflows and integration processes.

  • Troubleshoot, monitor, and optimize the performance of data ingestion and processing pipelines.

  • Act as a subject matter expert (SME) for data engineering best practices within the organization and for clients.

  • Provide occasional insights on native app development (experience with Apple HealthKit and Google Fit is a strong plus).

  • Ensure your home work environment (e.g., work computer, internet connection, phone) is adequate to carry out your responsibilities.
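
Purely as an illustration of the kind of integration work described above, and not DashLX's actual codebase, here is a minimal sketch of a vendor OAuth2 callback built with FastAPI and packaged for AWS Lambda behind API Gateway v2. The Mangum adapter, the httpx client, and the vendor endpoint and environment variable names are assumptions made for this sketch.

```python
"""Minimal sketch: vendor OAuth2 callback on FastAPI + AWS Lambda.

Assumptions (not from the posting): the Mangum ASGI adapter, httpx for
the token exchange, and placeholder vendor endpoints/credentials.
"""
import os

import httpx
from fastapi import FastAPI, HTTPException
from mangum import Mangum

app = FastAPI()

# Hypothetical vendor configuration -- real values would come from a
# secrets manager rather than plain environment variables.
TOKEN_URL = os.environ.get("VENDOR_TOKEN_URL", "https://vendor.example.com/oauth/token")
CLIENT_ID = os.environ.get("VENDOR_CLIENT_ID", "")
CLIENT_SECRET = os.environ.get("VENDOR_CLIENT_SECRET", "")
REDIRECT_URI = os.environ.get("VENDOR_REDIRECT_URI", "")


@app.get("/oauth/callback")
async def oauth_callback(code: str, state: str | None = None):
    """Exchange the authorization code returned by the vendor for tokens."""
    async with httpx.AsyncClient(timeout=10) as client:
        resp = await client.post(
            TOKEN_URL,
            data={
                "grant_type": "authorization_code",
                "code": code,
                "redirect_uri": REDIRECT_URI,
                "client_id": CLIENT_ID,
                "client_secret": CLIENT_SECRET,
            },
        )
    if resp.status_code != 200:
        raise HTTPException(status_code=502, detail="Token exchange failed")
    tokens = resp.json()
    # In a real integration the tokens would be persisted (e.g. encrypted
    # in DynamoDB or Secrets Manager) and used to pull vendor data.
    return {"connected": True, "scope": tokens.get("scope")}


# Entry point for AWS Lambda behind API Gateway v2 (HTTP API).
handler = Mangum(app)
```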

WORK HOURS AND BENEFITS

We are a mostly distributed workforce, so you will primarily work from home. Hours are flexible, though you will need to be available for meetings at some point between 8 a.m. and 5 p.m. Central Time.

BENEFITS

  • Competitive salary and comprehensive benefits (healthcare, PTO, etc.).

  • Flexible working hours with a fully remote work environment.

  • Stock options based on performance and company growth.

  • Opportunity to work in a fast-paced, innovative environment with room for professional growth.

  • Be part of a dynamic team building innovative SaaS solutions from the ground up.

  • Collaborate closely with experienced leaders who value creativity, autonomy, and technical excellence.

  • Shape the future of our cloud services while growing your career in Data Engineering.

EXPERIENCE REQUIREMENTS

  • Formal Computer Science degree, with coursework covering programming, algorithms, data structures, systems design, and computational theory.

  • 6+ years of experience in data engineering or related fields.

  • Strong proficiency in Python, particularly with FastAPI, and experience deploying applications on AWS Lambda and API Gateway v2.

  • Demonstrated expertise with OAuth1/2 protocols for vendor integrations.

  • Solid experience in Java and hands-on work with Apache Flink.

  • Familiarity with Amazon Kinesis (including Amazon Managed Service for Apache Flink and stream processing) and table formats such as Apache Iceberg on S3.

  • Excellent problem-solving skills, with a keen ability to optimize and troubleshoot data pipelines.

  • Strong communication and collaboration skills to work effectively with technical and non-technical teams.

PREFERRED QUALIFICATIONS

  • Experience in native mobile app development (e.g., integrating with Apple HealthKit or Google Fit).

  • Familiarity with CI/CD practices for data engineering projects.

  • Experience with modern data processing architectures and a passion for scalable, high-performance systems.

START YOUR APPLICATION

Thanks for answering a few questions.

On a scale of 1-4, where 1 = little to no prior experience, 2 = moderate experience and excited to learn more, 3 = considerable experience, and 4 = mastery, how would you rate yourself on the following technologies and methodologies:

Domain Driven Design (DDD)
Object Oriented Programming (OOP)
Svelte
Web Workers
Pandas/Numpy
Distributed Computing
Enterprise Architecture Patterns (EAP)
AWS Cloud Services
Business Intelligence (BI)
Data Science
Agile, including Scrum

Even if you have already provided us with a copy of your resume, please upload it here. You may also upload a portfolio.