Overview
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
This position is responsible for the development, enhancement, and maintenance of FDM Power BI reports. It requires working on business requirements in line with Optum coding standards and contributing to new projects involving Microsoft Fabric, Power BI, advanced SQL, Databricks, and Azure DevOps/GitHub. Developer responsibilities include building dashboards and reporting for business processes across projects. To be successful in this role, you should have strong dashboarding and analytical skills and the ability to work well in a team.
Primary Responsibilities
- Work with minimal guidance on routine issues and activities
- Develop, enhance, and maintain FDM Power BI reports and Fabric-based analytics solutions
- Work directly with business teams to understand requirements and deliver high-quality solutions aligned with Optum coding standards and enterprise architectural guidelines
- Contribute to new initiatives using Microsoft Fabric, Power BI, Databricks, Azure DevOps, and advanced SQL-based data engineering practices
- Build end-to-end Power BI data models and visualizations, implement Row-Level Security (RLS) and Object-Level Security (OLS), and design optimized semantic models
- Create and maintain Fabric pipelines, Dataflows, Lakehouse tables, and end-to-end analytics workloads
- Implement best practices related to performance tuning, DAX optimization, and Fabric workload governance
- Extract, transform, and model data using Databricks notebooks (PySpark/SQL)
- Work with large, complex datasets and develop advanced SQL-based transformations
- Perform root cause analysis on data-quality issues and deliver corrective solutions
- Build and maintain Power Automate flows to support data refresh, notifications, and workflow automation
- Reuse or build reusable code, algorithms, and data structures for common business needs
- Understand operational characteristics such as scalability, performance, and security across analytics solutions
- Develop awareness of upstream and downstream impacts of code and data flows
- Use data-driven approaches to solve business problems and provide actionable insights
- Track, identify, and escalate issues affecting data quality, performance, or user experience
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications
- 5+ years of combined hands-on experience across Power BI, advanced SQL, Databricks, Power Automate, and Azure DevOps, with solid expertise in Microsoft Fabric
- Hands-on experience with Azure DevOps, Git integration, and CI/CD workflows
- Experience with relational databases and tools such as Power Apps, Power BI, basic SQL, and Azure DevOps
- Knowledge of GitHub integration with Azure DevOps
- Familiarity with current security controls, governance practices, and compliance frameworks
- Proven analytical skills with the ability to work with structured and unstructured datasets
#NJP