
Enterprise Data Analytics and AI Developer

Dudek
United States
Jan 31, 2026

Location(s): Multi-state
Practice/Department: IT
Work Environment: Remote
Compensation: $140,000-$170,000 annually*

Dudek's journey began in 1980 with a vision to serve Southern California's water and wastewater agencies.
Today, we are a 100% employee-owned firm supporting clients nationwide and delivering projects that improve and protect the built and natural environments of communities throughout the United States. Our work has been recognized by leading industry organizations, and we've been honored with multiple national Top Workplace Awards.
Our employee-owners are unified by a singular commitment to supporting projects that address key societal issues, such as the transition to renewable energy, infrastructure hardening and repair, environmental protection, and community resilience.
Learn more about our award-winning culture, the benefits and perks of being a Dudekian, and the projects you will have the opportunity to shape.

Who You Are

As an employee-owner, you embrace accountability, safe work practices, and collaboration while thinking resourcefully and independently. Like all Dudekians, you are curious and solution-oriented, with the ability to adapt quickly to change and approach challenges with a spirit of innovation.

How You'll Make an Impact

The Enterprise Data Analytics and AI Developer is responsible for the design, implementation, and deployment of enterprise-scale data management, data analytics, and AI capabilities. The role leverages Microsoft Fabric (including OneLake, Data Factory, lakehouses, warehouses, and Power BI) and Microsoft AI Foundry (including the model catalog, Agent Service, evaluations/observability, and control planes) to deliver secure, governed, and reliable solutions. Working across traditional corporate organizations and lines of business, this role ensures solutions are delivered to specification, aligned with enterprise architecture standards, resilient in production, and optimized for cost, performance, and compliance.

Duties and Responsibilities

Strategy & Planning
  • Partner with enterprise architecture & engineering to define data and AI roadmaps that align with business objectives and operating models.
  • Develop reference architectures and patterns for Fabric (OneLake, Lakehouse/Warehouse, Data Factory) and AI Foundry (agents, grounding, evaluations, guardrails).
  • Shape data service standards (naming, domains, data contracts), semantic modeling conventions, and model lifecycle policies.
  • Contribute to backlog planning, estimation, release planning, and solution sizing for enterprise programs.
  • Influence security, privacy, and compliance requirements (RBAC, sensitivity labels, DLP) for data and AI workloads.
Organizational Leadership
  • Provide technical leadership and mentorship to technical teams and practitioners while establishing code review, testing, and deployment standards.
  • Translate business outcomes into technical designs and acceptance criteria; communicate tradeoffs and risks to non-technical stakeholders.
  • Collaborate with corporate governance teams to ensure responsible AI and governed data usage.
  • Enable knowledge transfer with high-quality documentation, runbooks, and enablement sessions for end users and support teams.
Project Leadership
  • Lead end-to-end technical delivery for multiple initiatives, from discovery and design through build, test, release, and operations.
  • Define technical work breakdown structures (WBS), estimates, and resource plans; provide progress updates tied to backlog items and milestones.
  • Own technical quality gates: design reviews, data model reviews, security reviews, and production readiness assessments.
  • Coordinate integration with third-party systems and data providers; support vendor RFP/SOW technical inputs and evaluation criteria.
  • Drive non-functional requirements (performance, availability, observability, cost) and execute performance/scalability tests prior to go-live.
  • Facilitate UAT, cutover planning, and incident response playbooks; ensure smooth transitions to operations.
Technical

Microsoft Copilot
  • Design, configure, and deploy custom copilots using Microsoft Copilot Studio.
  • Train technical users on the design and prototyping of custom copilots.
  • Integrate custom copilots into Teams and SharePoint user interfaces.
Microsoft Fabric
  • Design Lakehouse and Warehouse architectures in OneLake and implement domain-driven data services.
  • Build ETL pipelines for ingestion and transformation with Data Factory, notebooks, shortcuts, and mirroring.
  • Develop Power BI semantic models and datasets; optimize aggregations, partitions, incremental refreshes, and query performance.
  • Implement KQL databases for streaming/operational analytics and monitoring use cases.
  • Harden solutions with OAuth, SAML assertions, RBAC, sensitivity labels, row-level/object-level security, and workspace isolation; integrate with Purview where applicable.
  • Automate CI/CD for Fabric items (Lakehouse, Warehouse, Semantic Models, Data Factory) using deployment pipelines.
Microsoft AI Foundry
  • Select and evaluate models via the model catalog and implement model router policies and versioning/upgrade strategies.
  • Build and host single- and multi-agent solutions with Agent Service, integrating agent frameworks as needed.
  • Implement retrieval-augmented generation (RAG) using Azure AI Search & vector indices while securely grounding agents with Fabric Data Agents where applicable.
  • Instrument tracing, evaluations, and guardrails and configure data leakage prevention per enterprise policy.
  • Operate with the Foundry control plane for fleet governance, cost controls, and policy enforcement while integrating alerts with enterprise monitoring.
  • Apply machine learning tools to design and train AI models that solve business challenges.
  • Build data pipelines for enterprise structured and unstructured data models.
  • Apply deep knowledge of Azure tooling across solutions.
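To give a flavor of the retrieval-augmented generation (RAG) pattern referenced above, here is a minimal, self-contained sketch in plain Python. It is illustrative only: a production system of the kind described in this role would use a real embedding model and a vector index such as Azure AI Search rather than the toy bag-of-words similarity below, and the sample documents are hypothetical.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real system would call an embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by similarity to the query and return the top-k."""
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Ground the model: retrieved passages become context for the generator."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "OneLake stores lakehouse and warehouse data in a single logical lake.",
    "Agent Service hosts single and multi-agent solutions.",
    "Row-level security restricts which rows a user can query.",
]
print(build_prompt("Where does OneLake store data?", docs))
```

The same retrieve-then-ground shape carries over when the corpus lives in a governed store: the retrieval step is what lets access controls and sensitivity labels constrain what the agent ever sees.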
DevOps, Quality, and Operations
  • Establish CI/CD for data and AI assets using GitHub and implement environment promotion, approvals, and rollback strategies.
  • Create automated tests (unit, pipeline, data quality, prompt/agent evals) and define associated runbooks.
  • Set up cost observability and right-size capacity and throughput.
  • Implement telemetry and logging for data pipelines, query performance, agent runs, tool calls, and error handling, and publish operational dashboards.
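As an illustration of the automated data-quality tests mentioned above, here is a minimal sketch in plain Python. The column names (`id`, `amount`) and rules are hypothetical, not drawn from any actual pipeline; in practice such checks would run inside the CI/CD flow before a batch is promoted.

```python
def check_quality(rows: list[dict]) -> list[str]:
    """Return a list of human-readable violations; an empty list means the batch passes."""
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness and uniqueness checks on the key column.
        if row.get("id") is None:
            failures.append(f"row {i}: missing id")
        elif row["id"] in seen_ids:
            failures.append(f"row {i}: duplicate id {row['id']}")
        else:
            seen_ids.add(row["id"])
        # Domain check on a measure column.
        amount = row.get("amount")
        if amount is None or amount < 0:
            failures.append(f"row {i}: amount must be non-negative")
    return failures

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": -5.0},   # duplicate id and negative amount
    {"id": 2, "amount": 3.5},
]
print(check_quality(batch))
```

Wiring a function like this into a test suite lets a deployment pipeline fail fast on bad data instead of publishing it downstream.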
Minimum Qualifications
  • 7+ years of hands-on experience in data engineering, analytics engineering, and/or AI application development in enterprise environments.
  • Deep understanding of data modeling frameworks: Kimball dimensional modeling, enterprise data warehouse (EDW), streaming, and lakehouse architectures.
  • Expertise with Microsoft Fabric: OneLake, Data Factory, Lakehouse, Warehouse, Real-Time Intelligence/KQL, and Power BI semantic models.
  • Expertise with Microsoft AI Foundry: model catalog, Agent Service, evaluations/observability, safety/guardrails, and Control Plane.
  • Proficiency in SQL, Python, and KQL; experience with data modeling (star, data vault), and performance tuning.
  • Experience implementing RAG pipelines (Azure AI Search/vector indices) and securely grounding agents to governed enterprise data.
  • Proven delivery of production-grade solutions with CI/CD, IaC, automated testing, and operations runbooks on Azure.
  • Strong understanding of data governance, privacy, and security (RBAC, sensitivity labels, row-level/object-level security, DLP).
  • Excellent communication skills with the ability to explain complex technical topics to diverse stakeholders.
  • Must possess a valid driver's license and have active personal automobile liability insurance by the first day of employment.
Preferred Qualifications
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field.
  • Certifications such as Microsoft Certified: Azure Data Engineer Associate, Azure AI Engineer Associate, or Microsoft Fabric certifications.
  • Experience with Purview governance, DLP policies, and compliance frameworks in regulated industries.
  • Experience integrating ERPs or other enterprise business systems.
  • Ability to work across multi-platform service environments.
Compensation: $140,000-$170,000 annually*
  • *Final agreed-upon compensation will be based on a variety of factors including, but not limited to, an individual's related experience, education, certifications, skills, and work location. Successful candidates must pass a pre-employment drug test and background check prior to beginning employment.
Working Conditions

Environment
  • This job operates in a remote or office-based environment; the role routinely uses standard office equipment such as computers, phones, and printers.
Physical Requirements

The physical demands described here are representative of those that must be met to successfully perform the essential functions of the job. This job requires the following:
  • Working on a computer, sitting, or standing for long periods of time in an office or remote office setting.
  • Attending meetings, both in person and virtually, and speaking on the phone with peers, clients, etc.
  • Specific vision abilities, including close vision, distance vision, color vision, peripheral vision, depth perception, and the ability to adjust focus.
Dudek is committed to creating a workplace where all employees, regardless of their background, feel valued, respected, and have equal opportunities to succeed. We believe that a diverse and inclusive workforce is essential to our business success, and we are dedicated to fostering a culture where everyone can thrive. We are committed to fair and equitable processes, based on merit, free from any discrimination.
Dudek is genuinely committed to equal employment opportunities within our company and on our project teams. Dudek is also committed to compliance with all applicable laws providing equal employment opportunities. This commitment applies to all persons involved in Dudek's operations and prohibits unlawful discrimination by any employee of Dudek, including supervisors and coworkers. Equal employment opportunities will be extended to all persons (including those with disability and veteran status) in all aspects of the employment relationship, including recruitment, hiring, training, promotion, transfer, compensation, benefits, discipline, layoff, recall, and termination. Any employee who violates this policy and Dudek's commitment to equal employment opportunities will be subject to disciplinary action.
Dudek is a U.S.-based employer. All positions are based in the United States and require U.S. work authorization.
