
Senior Data Engineer, Data Platform - IntelliScript (Remote)

Flexionis

Remote · Full-time · 1 day ago · Senior

About the position

Milliman IntelliScript is a group of a few hundred experts in fields ranging from actuarial science to information technology to clinical practice. Together, we develop and deploy category-defining, data-driven, software-as-a-service (SaaS) products for a broad spectrum of insurance, health IT, and life sciences clients. We are a business unit within Milliman, Inc., a respected consultancy with offices around the world.

Candidates who have their pick of jobs are drawn to IntelliScript's entrepreneurial and collaborative culture of innovation, excellence, exceptional customer service, balance, and transparency. Every single person has a voice in our company, and we challenge each other to push the outer limits of our full, diverse potential. And we've shown sustained growth that ensures you'll have room to grow your skillset, responsibilities, and career. Our team is smart, down-to-earth, and ready to listen to your best ideas. We reward excellence and offer competitive compensation and benefits. Visit our LinkedIn page for a closer look at our company and learn more about our cultural values here.

Milliman invests in skills training and career development and gives all employees access to a variety of learning and mentoring opportunities. Our growing number of Milliman Employee Resource Groups (ERGs) are employee-led communities that influence policy decisions, develop future leaders, and amplify the voices of their constituents. We encourage our employees to give back to their varied professions, including leadership in professional organizations. Please visit our website to learn more about Milliman's commitments to our people, diversity and inclusion, social impact, and sustainability.

Responsibilities

• Creating Databricks Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise.
• Configuring and maintaining Unity Catalog to enable enterprise data lineage and data quality.
• Building out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions.
• Building data solutions for clients while upholding the best standards for reliability, quality, and performance.
• Building solutions within Delta Live Tables and automating transformations.
• Building out performant enterprise-level medallion architectures.
• Building fit-for-purpose near-real-time streaming and batch solutions.
• Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data.
• Building out Infrastructure as Code using Terraform and Asset Bundles.
• Working with the business to build cost-effective and cost-transparent data solutions.
• Architecting, building, and maintaining robust and scalable data pipelines, and monitoring and optimizing their performance.
• Identifying and implementing improvements to enhance data processing efficiency.
• Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based "big data" technologies.
• Leading the design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data.
• Assembling large, complex data sets that meet functional and non-functional business requirements.
• Developing and maintaining data models, ensuring they align with business objectives and data privacy regulations.
• Partnering internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data.
• Working effectively with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions.
• Communicating complex technical concepts to non-technical stakeholders and providing guidance on best practices.
• Ensuring that technology execution aligns with business strategy and provides efficient, secure solutions and systems.
• Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Building analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
• Creating data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader.

Requirements

• 7+ years of relevant experience in the design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products.
• Expert-level experience working in Databricks and AWS.
• Expert-level experience working with both relational and non-relational databases, such as SQL Server, PostgreSQL, and MongoDB.
• Experience managing and standardizing clinical data from structured an

via JSearch