Larridin Blog

Larridin Release Update Mid-March 2026

Written by Floyd Smith | Mar 31, 2026

Larridin is happy to announce the mid-March 2026 release of the Larridin software platform. Larridin enables measurement and management for AI adoption, user fluency with AI, and AI impact optimization in the enterprise. You can find a detailed description of the platform on the Larridin website.

This release includes new steps toward making Larridin a complete AI intelligence platform. Specifics follow.

The Dashboard: An Improved Command Center

The Dashboard is the default starting point and includes many of the most important features in the new release. It provides a bird's-eye perspective with direct links to deeper analysis.

Key features include:

  • AI Adoption visibility, showing the percentage of active employees using AI tools, with a department-by-department bar chart breakdown.
  • Top AI Use Cases detailed in a donut chart, categorized by work type such as Analysis and Decision Support, Writing and Communication, and Technical Tasks Coding.
  • Top 5 AI Tools ranked by Average Daily Active Users (DAU).
  • Unapproved AI Tool Usage Rate, visualized with a horizontal gauge bar that measures current shadow AI against color-coded risk thresholds: Green (0% to 20%), Yellow (20% to 50%), and Red (50%+). This gauge also lists the top three departments by unapproved usage rate, helping pinpoint areas that warrant immediate governance attention.
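The gauge's color bands reduce to a simple threshold check. A minimal sketch (the band boundaries are the ones listed above; exact boundary handling at 20% and 50% is an assumption, and the function name is invented):

```python
def risk_band(unapproved_rate: float) -> str:
    """Map an unapproved-usage rate (0.0 to 1.0) onto the gauge's color bands."""
    if unapproved_rate < 0.20:
        return "Green"   # 0% to 20%: healthy
    if unapproved_rate < 0.50:
        return "Yellow"  # 20% to 50%: elevated
    return "Red"         # 50%+: warrants immediate governance attention
```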

The Department Snapshot table provides a cross-functional perspective, with month-to-date (MTD) usage metrics, including average daily active users (DAU), total users, AI adoption percentage, and department-level AI engagement and proficiency scores.

The AI Tool Insights table offers an overview of all detected AI tools, showing status (approved/unapproved), DAU, and active events. Every card on the Dashboard is designed as a starting point, linking directly to the full analytical pages for investigation and action.

The revamped AI Adoption section is structured into two critical measurement categories: Reach (how many people are using tools) and Depth (how actively they engage).

  • Reach Metrics track the breadth of usage, including Total Users (installed base of Larridin), AI Users (unique users who engaged with any AI tool), Rolling 30-Day active users, and MAU/WAU/DAU (Monthly, Weekly, and Daily Active Users).
  • Depth Metrics introduce the AI Engagement Score (a Beta metric), which combines signals such as active usage days, interaction frequency, and breadth of tools used. It also reports Total Active Events and Active Events / User (recorded when a user opens a tracked AI tool in a new tab or window).

This separation is intentional: a high adoption rate with shallow engagement, or vice versa, is instantly visible, giving HR, IT, and AI program leads a complete picture that headcount alone cannot provide.
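The Reach metrics above are all windowed unique-user counts over an event log. A minimal sketch of that idea, using an invented four-row event log (the data and function are hypothetical, not Larridin's implementation):

```python
from datetime import date, timedelta

# Invented event log: (user_id, day the user opened a tracked AI tool).
events = [
    ("ana", date(2026, 3, 16)),
    ("ana", date(2026, 3, 10)),
    ("ben", date(2026, 3, 14)),
    ("cam", date(2026, 2, 20)),
]

def active_users(events, as_of, window_days):
    """Count unique users with at least one active event in the trailing window."""
    start = as_of - timedelta(days=window_days - 1)
    return len({user for user, day in events if start <= day <= as_of})

as_of = date(2026, 3, 16)
dau = active_users(events, as_of, 1)    # daily: only "ana" was active today
wau = active_users(events, as_of, 7)    # weekly: "ana" and "ben"
mau = active_users(events, as_of, 30)   # rolling 30-day: all three users
```

The same log drives all three counts; only the window width changes, which is why DAU, WAU, and MAU can diverge so sharply for the same population.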

The dedicated AI Engagement page provides a granular view of user consistency. While the Adoption Overview provides a single engagement score, this page shows the distribution of scores across the organization using percentile lines. This helps leaders determine if moderate engagement is widespread or concentrated among a smaller group of power users.

The AI Engagement Breakdown table lets you drill down by department, showing metrics such as Active Events / User / Week, Active Days / User / Week, and Distinct Apps used.

The new AI Proficiency page (labeled Proficiency Insights) goes beyond simply tracking presence and consistency to measure how effectively the workforce uses AI:

  • AI Proficiency Score (APS): This normalized headline metric (currently Beta) reflects the depth and sophistication of AI usage by evaluating prompt quality and application across different workflows.
  • Prompt Categories: A section showing the distribution of LLM usage across sophisticated work categories, such as Learning And Skill Development and Technical Tasks Coding, helping to identify the most and least used categories.
  • AI Proficiency Distribution: A percentile line chart shows the P25, P50 (Median), and P75 (Upper Quartile) scores over time. The spread between P25 and P75 immediately indicates whether proficiency is consistent or concentrated among a few sophisticated users.
  • AI Proficiency Breakdown: A department-level table driven by two key underlying metrics: Prompt Quality (sophistication/effectiveness) and Use Case Diversity (breadth of application).
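The percentile lines on the distribution chart are standard quartile cut points. A quick sketch with Python's standard library and invented scores (not Larridin's actual scoring data):

```python
import statistics

# Invented per-user proficiency scores for one reporting period.
scores = [10, 20, 30, 40, 50, 60, 70]

# quantiles(n=4) returns the three quartile cut points: P25, P50, P75.
p25, p50, p75 = statistics.quantiles(scores, n=4)

# A wide interquartile spread means proficiency is concentrated
# in a small group of power users rather than broadly shared.
spread = p75 - p25
```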

In short: Adoption asks whether employees are using AI, Engagement asks how consistently, and Proficiency asks how effectively.

The dedicated AI Governance tab offers security, compliance, and IT leaders the tools to monitor AI policy effectiveness through two key subtabs: AI Tool Compliance and Policy Enforcement. The AI Tool Compliance subtab provides a real-time, volume-weighted view of approved versus unapproved AI tool usage. It tracks three key metrics with week-over-week change indicators:

  • Unapproved AI Tools: A count of distinct, unapproved tools detected.
  • Unapproved AI Tool Usage Rate: The share of all AI sessions that involved an unapproved tool.
  • Unapproved AI DAU %: The percentage of daily active AI users who used at least one unapproved tool.
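The three compliance metrics are straightforward aggregates over session records. A hedged sketch with an invented session log (tool names and data are made up; this illustrates the definitions above, not Larridin's internals):

```python
# Invented session log: (user_id, tool, tool_is_approved).
sessions = [
    ("ana", "Claude",    True),
    ("ana", "ShadowGPT", False),  # "ShadowGPT" is a made-up unapproved tool
    ("ben", "Claude",    True),
    ("cam", "Claude",    True),
]

# Unapproved AI Tools: count of distinct unapproved tools detected.
unapproved_tools = {tool for _, tool, approved in sessions if not approved}

# Unapproved AI Tool Usage Rate: share of all sessions on an unapproved tool.
usage_rate = sum(1 for *_, approved in sessions if not approved) / len(sessions)

# Unapproved AI DAU %: share of active AI users who used an unapproved tool.
all_users = {user for user, *_ in sessions}
shadow_users = {user for user, _, approved in sessions if not approved}
unapproved_dau_pct = len(shadow_users) / len(all_users)
```

Note that the usage rate is volume-weighted (it counts sessions) while the DAU percentage counts people, so the two can tell very different stories about the same week.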

The goal is to provide visibility into signals from unsanctioned, "shadow AI" tools and the use of personal, rather than company, logons for AI tools. Leaders can quickly identify high-volume unapproved tools and logons—which may be filling a real, unmet need—and decide whether to fast-track a review or start a conversation, rather than defaulting to a block of the tool in question.

Policy Enforcement closes the governance loop by showing how active block and warning policies are performing.

  • Key Metrics include Blocked Events (user prevented from accessing the tool) and Warned Events (user sees a warning but may proceed). High and rising blocks, for example, signal a persistent unmet need among employees.
  • Breakdowns show the Fastest Growing Policy Enforcement Triggers and the full AI Tool Access Policy log.

Unified Analytics and Advanced Reporting Provide Comprehensive Visibility

The platform is expanding its data capture and reporting capabilities to give you a complete picture of application usage across your organization.

The AI Tools section now features Unified Analytics, a new default tab that consolidates browser and desktop tool usage into a single view. This eliminates the need to cross-reference data when tracking total tool usage.

  • Tool Insights now supports license tracking for select LLMs (e.g., Claude and Perplexity), providing sub-rows to show Enterprise vs. Personal usage separately.
  • Clicking a tool name opens the Tool Detail Side Panel, providing an atomic unit of intelligence: Category (Agent, LLM, Notetaker), Authorization Status, associated Domains, License Tracking status, and a plain-language Reasoning for its AI classification.

A dedicated Reports tab offers deep-dive views that move beyond summary metrics:

  • AI Use Cases Advanced Report: This report details what your workforce is using AI for, tracking prompt category trends (e.g., Analysis and Decision Support) over time, broken down week-by-week and department-by-department. This is foundational for targeted enablement and smarter license decisions.
  • App Usage Analytics Report: This powerful new report tracks usage of all browser applications—not limited to AI tools—by department. It gives IT and operations leaders crucial context for AI adoption patterns and provides essential data for software spend decisions by surfacing applications with consistently low or declining usage.

Identity Provider (IDP) Sync and Org Structure Deliver Foundational Efficiency

Accurate department-level analysis depends on accurate organizational structure data. The platform’s new Identity Provider (IDP) Sync feature in Settings > Org Chart eliminates manual org chart upkeep.

Larridin now integrates directly with your existing identity or HR systems using SCIM-based directory sync, supporting identity providers such as Okta, Google Workspace, Microsoft Azure AD, Workday, and others.

By connecting your identity provider, employee data, departments, and reporting structures are kept continuously in sync, ensuring that all department-level analytics across Adoption, Engagement, Proficiency, and Governance are grounded in your real organizational structure. You maintain control over scope by choosing to sync all groups or select specific ones.
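In SCIM-based directory sync, department and reporting-line attributes travel in the standard enterprise-user extension defined by RFC 7643. A minimal sketch of pulling those fields out of a SCIM user resource (the payload data is invented; the schema URNs and attribute names come from the SCIM 2.0 specification):

```python
# Schema URNs from RFC 7643 (SCIM 2.0 core schema and enterprise extension).
ENTERPRISE = "urn:ietf:params:scim:schemas:extension:enterprise:2.0:User"

# Invented example payload in the shape a SCIM-capable IdP would send.
user = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User", ENTERPRISE],
    "userName": "ana@example.com",
    "active": True,
    ENTERPRISE: {
        "department": "Engineering",
        "manager": {"value": "ben@example.com"},
    },
}

def org_record(user):
    """Flatten a SCIM user into the fields department-level analytics need."""
    ext = user.get(ENTERPRISE, {})
    return {
        "user": user["userName"],
        "department": ext.get("department"),
        "manager": ext.get("manager", {}).get("value"),
    }
```

Because the department and manager fields arrive on every sync, downstream breakdowns stay aligned with the directory rather than with a manually maintained org chart.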

In-Context Surveys Drive Your Feedback Loop

Larridin introduces a powerful new way to gather qualitative data directly from employees using AI: Scout Surveys.

Designed to measure AI adoption, proficiency, and impact, these surveys are delivered in-context to achieve higher response rates and more authentic feedback:

  • Delivery: Surveys can be deployed via a browser plugin pop-up while the employee is actively on a tracked AI site (e.g., Claude, Gemini) or distributed via Slack.
  • Targeting and Cadence: Surveys can be scoped to specific departments. For continuous measurement, Ongoing surveys sample employees over time based on a configurable daily sampling percentage (0-100%) to manage survey fatigue. Snapshot surveys are sent once for benchmarking.
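The daily sampling cadence for Ongoing surveys amounts to drawing a random slice of the target population each day. A hypothetical sketch (function name and data are invented; only the 0-100% sampling parameter comes from the feature description above):

```python
import random

def todays_sample(employees, daily_pct, seed=None):
    """Draw today's random sample of employees for an Ongoing survey.

    daily_pct is the configurable daily sampling percentage (0-100);
    surveying a small slice each day limits survey fatigue while
    still covering the population over time.
    """
    rng = random.Random(seed)
    k = round(len(employees) * daily_pct / 100)
    return rng.sample(employees, k)

# 5% of a 200-person department -> 10 survey recipients today.
recipients = todays_sample([f"user{i}" for i in range(200)], daily_pct=5, seed=42)
```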

This feature complements the platform's quantitative metrics, pairing the "how are you feeling" (qualitative) signal with the "lab work" (quantitative) data delivered by the analytics dashboard.

This release delivers the most comprehensive intelligence available for managing enterprise AI. From a consolidated Dashboard to deep insights into Proficiency, robust Governance controls, and in-context Survey feedback, Larridin is equipping you to master the strategic deployment of AI across your organization.

Ready to see what is actually in your environment?

Schedule a Demo