Estimated Pay $56.17 per hour
Hours Full-time, Part-time
Location Raleigh, North Carolina

Compare Pay

We estimate that this job pays $56.17 per hour based on our data, within an estimated range of $33.07 to $79.15.


About this job

Description

About Us

Curi is a full-service advisory firm that serves physicians and medical practices. Equal parts fierce physician advocates, smart business leaders, and thoughtful partners, Curi's advisory, capital, and insurance offerings deliver valued advice that is grounded in client priorities and elevated by their outcomes. From data-driven advisory services to private wealth offerings, to tailored medical malpractice insurance solutions and beyond, we deliver performance that is time-tested and trusted in medicine, business, and life.

The Role

We're looking for a skilled and experienced self-starter to work in our Technology department as a Data Engineer. Reporting to our Manager, Data Management and Engineering, you will be involved in data Lakehouse development including requirements gathering, database modeling, ETL and dashboard/report development activities. You will frequently conduct data analysis, write advanced database queries, and optimize data storage needed by business and/or executive operations. You will enable daily business activities by providing the foundation of all federated querying and analysis produced by the BI, Actuarial, and Data Science teams.

Key Result Areas

Develop and maintain enterprise data warehousing platform

  • Work with business users to establish data warehousing and data lake requirements
  • Develop and maintain ETL to support the Data and Analytics reporting platform
  • Develop SQL procedures, triggers, views, functions, and reports to support enhancements to existing, business-critical SQL-based systems
  • Support existing Data Warehouse (MS SQL Server)
  • Participate in peer code reviews, unit testing, and documentation of developed code
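
To give a concrete sense of the ETL and data-quality work described above, here is a minimal sketch of an extract-transform-load step. All table and column names are hypothetical, and SQLite stands in for the MS SQL Server warehouse named in this posting:

```python
import sqlite3

# Hypothetical source system with raw, untyped claim records.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE claims_raw (claim_id TEXT, amount TEXT, state TEXT)")
src.executemany(
    "INSERT INTO claims_raw VALUES (?, ?, ?)",
    [("C-1", "1200.50", "nc"), ("C-2", "not-a-number", "NC"), ("C-3", "850", "va")],
)

# Hypothetical warehouse target with typed, validated columns.
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE fact_claims (claim_id TEXT PRIMARY KEY, amount REAL, state TEXT)")

def transform(row):
    """Validate and normalize one source row; return None to reject it."""
    claim_id, amount, state = row
    try:
        return (claim_id, float(amount), state.upper())
    except ValueError:
        return None  # a real pipeline would route this to a quarantine table

loaded = [t for t in map(transform, src.execute("SELECT * FROM claims_raw")) if t]
wh.executemany("INSERT INTO fact_claims VALUES (?, ?, ?)", loaded)
print(wh.execute("SELECT COUNT(*) FROM fact_claims").fetchone()[0])  # 2 valid rows
```

The rejected row illustrates the data-quality focus the role calls for: bad records are filtered at load time rather than propagated into reporting.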

Maintain data Lakehouse, ETL, and reporting environments

  • Participate in design, coding, testing, implementation, and documentation of solutions
  • Contribute to, enforce, and document database policies, procedures and standards
  • Provide technical and business knowledge support to the team
  • Participate in data governance
  • Perform tests and evaluations regularly to ensure data security, privacy and integrity

Detect and resolve production performance issues

  • Participate in performance tuning and database optimization
  • Provide ongoing maintenance support through query tuning and optimization
  • Analyze, troubleshoot, and remediate data integrity issues
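
Query tuning of the kind listed above usually starts by reading the execution plan. As an illustrative sketch (SQLite's EXPLAIN QUERY PLAN standing in for SQL Server execution plans, with a made-up table), adding an index turns a full scan into an index seek:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE policy (policy_id INTEGER PRIMARY KEY, holder TEXT, state TEXT)")
db.executemany(
    "INSERT INTO policy (holder, state) VALUES (?, ?)",
    [(f"holder-{i}", "NC" if i % 2 else "VA") for i in range(1000)],
)

query = "SELECT holder FROM policy WHERE state = ?"

# Before: no index on state, so the planner must scan the whole table.
before = db.execute("EXPLAIN QUERY PLAN " + query, ("NC",)).fetchall()
print(before[0][3])  # plan detail mentions a SCAN of policy

# After: a covering index lets the planner seek instead of scan.
db.execute("CREATE INDEX idx_policy_state ON policy (state, holder)")
after = db.execute("EXPLAIN QUERY PLAN " + query, ("NC",)).fetchall()
print(after[0][3])  # plan detail now references the covering index
```

The same before/after discipline applies in SQL Server, where the execution plan (and missing-index hints) guides which indexes to add.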

Skills

  • Strong working knowledge of:
    • SQL (including complex querying)
    • Python
  • Must be a self-starter who requires minimal supervision, with excellent problem-solving skills and a focus on data quality and performance optimization
  • Familiarity with Data Lake and/or Lakehouse concepts
  • Exposure to Cloud Database Management Systems
  • Proven ability to design and develop ETL processes for Kimball star schema data warehouse and reporting platform (MS SQL Server preferred)
  • Solid understanding of relational database theory, principles, and best practices
  • Excellent analytical skills with the ability to identify patterns and insights from large datasets
  • Strong collaboration skills to work effectively with cross-functional teams and stakeholders
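
The Kimball star schema mentioned above organizes a warehouse into a central fact table surrounded by dimension tables. A minimal sketch, with entirely illustrative table names and data (SQLite standing in for the warehouse):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- Dimensions describe the "who/what/when" of each fact row.
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_line (line_key INTEGER PRIMARY KEY, line_of_business TEXT);
    -- The fact table holds measures keyed by the dimensions.
    CREATE TABLE fact_premium (
        date_key INTEGER REFERENCES dim_date(date_key),
        line_key INTEGER REFERENCES dim_line(line_key),
        written_premium REAL
    );
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO dim_line VALUES (1, 'Medical Malpractice'), (2, 'General Liability');
    INSERT INTO fact_premium VALUES
        (20240101, 1, 1000.0), (20240201, 1, 1500.0), (20240101, 2, 700.0);
""")

# Typical analytical query: aggregate the fact table, sliced by dimension attributes.
rows = db.execute("""
    SELECT l.line_of_business, d.year, SUM(f.written_premium) AS total
    FROM fact_premium f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_line l ON l.line_key = f.line_key
    GROUP BY l.line_of_business, d.year
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('Medical Malpractice', 2024, 2500.0), ('General Liability', 2024, 700.0)]
```

This join-then-aggregate shape is the core pattern behind the BI and reporting workloads the role supports.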

Qualifications

  • At least three years of data engineering experience
  • Property and casualty insurance data experience preferred